diff --git a/.agent/skills b/.agent/skills new file mode 120000 index 00000000000..8574c4f65b4 --- /dev/null +++ b/.agent/skills @@ -0,0 +1 @@ +../.cursor/skills \ No newline at end of file diff --git a/.cursor/skills/celo-release/SKILL.md b/.cursor/skills/celo-release/SKILL.md new file mode 100644 index 00000000000..b24e280df62 --- /dev/null +++ b/.cursor/skills/celo-release/SKILL.md @@ -0,0 +1,323 @@ +--- +name: celo-release +description: Deploy Celo core contract releases using Foundry tooling. Use when releasing contracts, testing releases on forks, generating governance proposals, or when user mentions release, deploy, upgrade contracts, CR14, CR15, make-release, or verify-deployed. +--- + +# Celo Core Contracts Release + +Deploy and test Celo core contract releases using Foundry-based tooling. + +## Quick Reference + +| Step | Command | Output | +|------|---------|--------| +| 1. Generate libraries.json | `verify-deployed:foundry -b ` | `libraries.json` | +| 2. Generate report | `check-versions:foundry -a -b ` | `releaseData/versionReports/releaseN-report.json` | +| 3. 
Deploy & create proposal | `make-release:foundry -b ` | `proposal.json` + deployed contracts | + +## Networks + +| Network | Chain ID | RPC URL | Use Case | +|---------|----------|---------|----------| +| Celo Mainnet | 42220 | https://forno.celo.org | Production releases | +| Celo Sepolia | 11142220 | https://forno.celo-sepolia.celo-testnet.org | Testnet releases | +| Local Fork | varies | http://127.0.0.1:8545 | Testing releases | + +## Deployer Keys + +Deployer keys are stored in encrypted mnemonic files in the repo root: + +| Network | Mnemonic File | Encrypted File | +|---------|--------------|----------------| +| Celo Sepolia | `.env.mnemonic.celosepolia` | N/A (manual) | +| Mainnet | `.env.mnemonic.mainnet` | `.env.mnemonic.mainnet.enc` | + +### Decrypting Keys (cLabs employees) + +```bash +# Decrypt all mnemonic files using GCP KMS +yarn keys:decrypt +``` + +### Using Keys + +Each mnemonic file exports `DEPLOYER_PRIVATE_KEY`. Source it before running release commands: + +```bash +# For Celo Sepolia +source .env.mnemonic.celosepolia + +# For Mainnet +source .env.mnemonic.mainnet +``` + +Then use `$DEPLOYER_PRIVATE_KEY` in release commands. + +## Release Workflow + +### Step 0: Confirm Release Parameters with User + +**IMPORTANT**: Before executing any release commands, you MUST: + +1. Query available tags and branches: + ```bash + git tag -l 'core-contracts.*' | sort -V | tail -15 + git branch -a | grep 'release/core-contracts' + ``` + +2. Determine the release source using this priority order: + + **Tag priority (check in order, use first match):** + 1. Post-audit tag (if exists): `core-contracts.vN.post-audit` or `core-contracts.vN.post_audit` + 2. Base tag: `core-contracts.vN` + 3. Release branch: `release/core-contracts/N` (only if no tags exist) + + **Note**: Not all releases have post-audit tags. If one exists, use it; otherwise use the base tag. + + **For the NEW release**: Check for post-audit tag first, then base tag, then branch. 
+ **For the PREVIOUS release**: Same logic - use post-audit tag if available, otherwise base tag. + +3. Present a confirmation to the user with: + - **Previous release tag** (for libraries.json): e.g., `core-contracts.v14` or `core-contracts.v14.post-audit` + - **New release source**: Following the priority above + - **Target network**: e.g., `celo-sepolia` or `celo` + +4. Use the AskQuestion tool to get explicit confirmation: + - Option to confirm the proposed tags/branches + - Option to specify different tag/branch + +5. Only proceed with the release after user confirmation. + +Example confirmation prompt (when post-audit tag exists): +``` +I plan to release with the following parameters: +- Previous release tag: core-contracts.v14.post-audit +- New release tag: core-contracts.v15.post-audit +- Target network: celo-sepolia + +Please confirm or specify different values. +``` + +Example confirmation prompt (when only base tag exists): +``` +I plan to release with the following parameters: +- Previous release tag: core-contracts.v14 +- New release tag: core-contracts.v15 +- Target network: celo-sepolia + +Please confirm or specify different values. +``` + +Example confirmation prompt (when only branch exists): +``` +I plan to release with the following parameters: +- Previous release tag: core-contracts.v14 +- New release branch: release/core-contracts/15 (no tag found) +- Target network: celo-sepolia + +Please confirm or specify different values. 
+``` + +### Step 1: Generate libraries.json + +Verify the **currently deployed** release to get library addresses: + +```bash +cd packages/protocol + +# For Celo Sepolia +yarn release:verify-deployed:foundry -b core-contracts.v${PREVIOUS} -n celo-sepolia + +# For Mainnet +yarn release:verify-deployed:foundry -b core-contracts.v${PREVIOUS} -n celo +``` + +**Output**: `libraries.json` in `packages/protocol/` + +### Step 2: Generate Compatibility Report + +Compare previous release to new release branch: + +```bash +yarn release:check-versions:foundry \ + -a core-contracts.v${PREVIOUS} \ + -b release/core-contracts/${NEW} \ + -r ./releaseData/versionReports/release${NEW}-report.json +``` + +**Output**: `releaseData/versionReports/release${NEW}-report.json` + +### Step 3: Prepare Initialization Data + +Create or verify initialization data exists: + +```bash +# Check if file exists +cat ./releaseData/initializationData/release${NEW}.json + +# If missing, create empty (valid if no new contracts) +echo "{}" > ./releaseData/initializationData/release${NEW}.json +``` + +### Step 4: Deploy Release + +#### On Local Fork (Testing) + +```bash +yarn release:make:foundry \ + -b release/core-contracts/${NEW} \ + -k $DEPLOYER_PRIVATE_KEY \ + -i ./releaseData/initializationData/release${NEW}.json \ + -l ./libraries.json \ + -n celo-sepolia \ + -p ./proposal-fork.json \ + -r ./releaseData/versionReports/release${NEW}-report.json \ + -u http://127.0.0.1:8545 +``` + +#### On Celo Sepolia + +```bash +yarn release:make:foundry \ + -b release/core-contracts/${NEW} \ + -k $CELO_SEPOLIA_DEPLOYER_KEY \ + -i ./releaseData/initializationData/release${NEW}.json \ + -l ./libraries.json \ + -n celo-sepolia \ + -p ./proposal-celo-sepolia.json \ + -r ./releaseData/versionReports/release${NEW}-report.json +``` + +#### On Mainnet + +```bash +# First regenerate libraries.json for mainnet! 
+yarn release:verify-deployed:foundry -b core-contracts.v${PREVIOUS} -n celo + +yarn release:make:foundry \ + -b release/core-contracts/${NEW} \ + -k $MAINNET_DEPLOYER_KEY \ + -i ./releaseData/initializationData/release${NEW}.json \ + -l ./libraries.json \ + -n celo \ + -p ./proposal-mainnet.json \ + -r ./releaseData/versionReports/release${NEW}-report.json +``` + +## Release Artifacts + +| Artifact | Location | Purpose | +|----------|----------|---------| +| `libraries.json` | `packages/protocol/` | Library addresses (network-specific!) | +| Version report | `releaseData/versionReports/releaseN-report.json` | Contract changes & version deltas | +| Init data | `releaseData/initializationData/releaseN.json` | Constructor args for new contracts | +| Proposal | `proposal-*.json` | Governance transactions | + +**Important**: `libraries.json` is network-specific. Regenerate when switching between Celo Sepolia and Mainnet. + +## Determining Release Numbers + +**Priority**: Always prefer git tags over branches. Post-audit tags are preferred over base tags. + +```bash +# List existing tags (check these FIRST) +git tag -l 'core-contracts.*' | sort -V | tail -15 + +# List release branches (fallback if no tag exists) +git branch -a | grep 'release/core-contracts' +``` + +**Tag priority (check in order, use first match):** +1. `core-contracts.vN.post-audit` or `core-contracts.vN.post_audit` - Post-audit (if exists) +2. `core-contracts.vN` - Base tag +3. `release/core-contracts/N` - Branch (fallback only) + +**Note**: Not all releases have post-audit tags. Use it if available, otherwise use the base tag. + +**Selection logic:** +1. For NEW release: Check for post-audit tag first, then base tag, then branch +2. 
For PREVIOUS release: Same logic - use post-audit tag if available, otherwise base tag + +## Starting a Local Fork + +Before testing on a local fork, start Anvil with the required parameters: + +```bash +# Fork Celo Sepolia +anvil --fork-url https://forno.celo-sepolia.celo-testnet.org \ + --code-size-limit 500000 \ + --gas-limit 100000000 + +# Fork Mainnet +anvil --fork-url https://forno.celo.org \ + --code-size-limit 500000 \ + --gas-limit 100000000 +``` + +**Important**: The `--code-size-limit` and `--gas-limit` flags are required for Celo contract deployments due to large contract sizes. + +## Contract Verification + +The release script automatically verifies deployed contracts on: +- **Blockscout** (https://celo-sepolia.blockscout.com or https://celo.blockscout.com) - No API key required +- **Celoscan** via Etherscan V2 API (https://celoscan.io) - **API key required** for production networks + +### Verification Features + +The script handles verification automatically with: +- **Linked libraries**: Contracts using libraries (e.g., Governance with Proposals library) are verified with the `--libraries` flag +- **Foundry profiles**: Sets `FOUNDRY_PROFILE` environment variable (`truffle-compat` for 0.5.x, `truffle-compat8` for 0.8.x) to ensure bytecode matches +- **Full compiler version**: Uses full version with commit hash (e.g., `0.5.14+commit.01f1aaa4`) + +### Celoscan API Key (Required for celo-sepolia and mainnet) + +The API key is **required by default** for production networks. Get your key from https://etherscan.io/myapikey + +**Setup options (in order of precedence):** + +1. **CLI flag**: `-a YOUR_API_KEY` +2. **Environment variable**: `export CELOSCAN_API_KEY=YOUR_API_KEY` +3. **Config file**: `packages/protocol/.env.json` + ```json + { + "celoScanApiKey": "YOUR_API_KEY" + } + ``` + +**Note**: The Etherscan V2 API uses a unified endpoint (`api.etherscan.io`) that works with a single API key for all supported chains including Celo. 
+
+### Skip Verification
+
+To skip verification (e.g., for testing or if you don't have an API key):
+```bash
+yarn release:make:foundry ... -s
+```
+
+Verification is automatically skipped when using a custom RPC URL (local forks).
+
+### Verification Troubleshooting
+
+- **"Address is not a smart-contract"**: Block explorer hasn't indexed the contract yet. The script waits 30s initially, then automatically retries up to 6 times with exponential backoff (5s, 10s, 20s, 40s, capped at 60s).
+- **"Bytecode mismatch"**: Usually caused by the wrong Foundry profile. The script automatically sets `FOUNDRY_PROFILE` based on the contract source path.
+- **Linked library errors**: The script automatically detects and passes library addresses via the `--libraries` flag for contracts that use linked libraries.
+
+## Common Issues
+
+### "libraries.json not found"
+Run `verify-deployed:foundry` first with the previous release tag.
+
+### "Version mismatch detected"
+Update `getVersionNumber()` in the contract to match the expected version from the report.
+
+### "Deployment reverted" or "Out of gas" on Local Fork
+Ensure Anvil is started with `--code-size-limit 500000 --gas-limit 100000000`.
+
+### Testing on Local Fork
+Use Anvil's default test key for local forks:
+```
+0xac0974bec39a17e36ba4a6b4d238ff944bacb478cbed5efcae784d7bf4f2ff80
+```
+
+## Additional Resources
+
+For complete documentation, see [RELEASE_PROCESS_FOUNDRY.md](../../packages/protocol/RELEASE_PROCESS_FOUNDRY.md)
diff --git a/.cursor/skills/celo-release/examples.md b/.cursor/skills/celo-release/examples.md
new file mode 100644
index 00000000000..9263a664c4b
--- /dev/null
+++ b/.cursor/skills/celo-release/examples.md
@@ -0,0 +1,199 @@
+# Release Examples
+
+## Example: Release CR15 to Celo Sepolia
+
+```bash
+cd packages/protocol
+
+# 1. Source deployer key
+source ../../.env.mnemonic.celosepolia
+
+# 2.
Generate libraries.json from v14 (currently deployed) +yarn release:verify-deployed:foundry -b core-contracts.v14 -n celo-sepolia + +# 3. Generate compatibility report (v14 → v15) +yarn release:check-versions:foundry \ + -a core-contracts.v14 \ + -b release/core-contracts/15 \ + -r ./releaseData/versionReports/release15-report.json + +# 4. Ensure init data exists +cat ./releaseData/initializationData/release15.json || echo "{}" > ./releaseData/initializationData/release15.json + +# 5. Deploy to Celo Sepolia (requires Celoscan API key for verification) +# API key can be set via: +# - .env.json: {"celoScanApiKey": "YOUR_KEY"} +# - Environment: export CELOSCAN_API_KEY=YOUR_KEY +# - CLI flag: -a YOUR_KEY +yarn release:make:foundry \ + -b release/core-contracts/15 \ + -k $DEPLOYER_PRIVATE_KEY \ + -i ./releaseData/initializationData/release15.json \ + -l ./libraries.json \ + -n celo-sepolia \ + -p ./proposal-celo-sepolia.json \ + -r ./releaseData/versionReports/release15-report.json +``` + +## Example: Release CR15 on Local Fork (Testing) + +```bash +cd packages/protocol + +# 1. Start anvil fork in another terminal +anvil --fork-url https://forno.celo-sepolia.celo-testnet.org \ + --code-size-limit 500000 --gas-limit 100000000 + +# 2. Generate libraries.json from v14 (currently deployed) +yarn release:verify-deployed:foundry -b core-contracts.v14 -n celo-sepolia + +# 3. Generate compatibility report (v14 → v15) +yarn release:check-versions:foundry \ + -a core-contracts.v14 \ + -b release/core-contracts/15 \ + -r ./releaseData/versionReports/release15-report.json + +# 4. 
Deploy to local fork at 127.0.0.1:8545 (use anvil test key, verification auto-skipped for forks) +yarn release:make:foundry \ + -b release/core-contracts/15 \ + -k 0xac0974bec39a17e36ba4a6b4d238ff944bacb478cbed5efcae784d7bf4f2ff80 \ + -i ./releaseData/initializationData/release15.json \ + -l ./libraries.json \ + -n celo-sepolia \ + -p ./proposal-fork.json \ + -r ./releaseData/versionReports/release15-report.json \ + -u http://127.0.0.1:8545 + +# To skip verification entirely (no API key required), use -s flag: +yarn release:make:foundry \ + -b release/core-contracts/15 \ + -k $DEPLOYER_PRIVATE_KEY \ + -i ./releaseData/initializationData/release15.json \ + -l ./libraries.json \ + -n celo-sepolia \ + -p ./proposal-celo-sepolia.json \ + -r ./releaseData/versionReports/release15-report.json \ + -s +``` + +## Example: Full Mainnet Release + +```bash +cd packages/protocol + +# 1. Source deployer key (decrypt first if needed: yarn keys:decrypt) +source ../../.env.mnemonic.mainnet + +# 2. Ensure Celoscan API key is configured (required for mainnet) +# Set in .env.json: {"celoScanApiKey": "YOUR_KEY"} +# Or: export CELOSCAN_API_KEY=YOUR_KEY + +# 3. Generate libraries.json for mainnet +yarn release:verify-deployed:foundry -b core-contracts.v14 -n celo + +# 4. Deploy to mainnet (with automatic verification) +yarn release:make:foundry \ + -b release/core-contracts/15 \ + -k $DEPLOYER_PRIVATE_KEY \ + -i ./releaseData/initializationData/release15.json \ + -l ./libraries.json \ + -n celo \ + -p ./proposal-mainnet.json \ + -r ./releaseData/versionReports/release15-report.json + +# 5. 
Submit governance proposal +celocli governance:propose \ + --jsonTransactions ./proposal-mainnet.json \ + --deposit 10000e18 \ + --from $PROPOSER_ADDRESS \ + --node https://forno.celo.org +``` + +## Example: Proposal Output + +```json +[ + { + "contract": "EpochManagerProxy", + "function": "_setImplementation", + "args": ["0xNEW_IMPLEMENTATION_ADDRESS"], + "value": "0" + }, + { + "contract": "GovernanceProxy", + "function": "_setImplementation", + "args": ["0xNEW_IMPLEMENTATION_ADDRESS"], + "value": "0" + } +] +``` + +## Example: Version Report Structure + +```json +{ + "report": { + "contracts": { + "Governance": { + "changes": { + "storage": [], + "major": [], + "minor": [{"type": "MethodAdded", "signature": "proposalCount()"}], + "patch": [{"type": "DeployedBytecode"}] + }, + "versionDelta": { + "storage": "=", + "major": "=", + "minor": "+1", + "patch": "0" + } + } + } + } +} +``` + +## Example: Verification Output + +The script automatically verifies all deployed contracts on both Blockscout and Celoscan: + +``` +======================================== +Contract Verification +======================================== +Verifying 4 contract(s) on celo-sepolia... + +Verifying EpochManager at 0xecaf98acf55c598ee8f2e5ebbcc9f683b15a11a8... + Compiler: 0.8.19+commit.7dd6d404, Optimizer: 200 runs, EVM: paris + Foundry profile: truffle-compat8 + ✓ EpochManager verified on Blockscout + ✓ EpochManager verified on Celoscan + +Verifying Governance at 0x299461b1b6a34ad83dab8451e2cd43c6fca3bf80... + Compiler: 0.5.14+commit.01f1aaa4, Optimizer: disabled, EVM: istanbul + Foundry profile: truffle-compat + Linked libraries: Proposals@0x96d4da..., IntegerSortedLinkedList@0x38ff7d... 
+  ✓ Governance verified on Blockscout
+  ✓ Governance verified on Celoscan
+
+----------------------------------------
+Verification Summary:
+  Blockscout: 4/4 verified
+  Celoscan: 4/4 verified
+----------------------------------------
+```
+
+Key verification features:
+- **Foundry profile**: Automatically detected from the source path (`contracts/` → `truffle-compat`, `contracts-0.8/` → `truffle-compat8`)
+- **Linked libraries**: Automatically detected and passed to the verifier for contracts that use libraries
+- **Full compiler version**: Uses the version with commit hash for accurate bytecode matching
+
+## Starting a Local Fork
+
+```bash
+# Fork Celo Sepolia (code-size and gas limits are required for Celo contract deployments)
+anvil --fork-url https://forno.celo-sepolia.celo-testnet.org --port 8545 \
+  --code-size-limit 500000 --gas-limit 100000000
+
+# Fork Mainnet
+anvil --fork-url https://forno.celo.org --port 8545 \
+  --code-size-limit 500000 --gas-limit 100000000
+```
diff --git a/.cursor/skills/node-cache-update/SKILL.md b/.cursor/skills/node-cache-update/SKILL.md
new file mode 100644
index 00000000000..c88605c0496
--- /dev/null
+++ b/.cursor/skills/node-cache-update/SKILL.md
@@ -0,0 +1,35 @@
+---
+name: node-cache-update
+description: Update GitHub Actions node module cache version when dependencies change. Use when modifying package.json, yarn.lock, adding/removing/updating npm packages, or changing node dependencies.
+---
+
+# Node Module Cache Update
+
+When node dependencies are added, removed, or updated in this monorepo, the GitHub Actions cache must be invalidated to ensure CI builds use the correct dependencies.
+
+## When to Update
+
+Update the cache version when:
+- Adding new dependencies to any `package.json`
+- Removing dependencies from any `package.json`
+- Updating dependency versions (including pinning versions)
+- Modifying `yarn.lock`
+
+## How to Update
+
+1. Open `.github/workflows/celo-monorepo.yml`
+2. Find the `NODE_MODULE_CACHE_VERSION` environment variable (around line 27)
+3.
Increment the version number by 1 + +```yaml +env: + # Increment these to force cache rebuilding + NODE_MODULE_CACHE_VERSION: 10 # <-- Increment this number +``` + +## Why This Matters + +GitHub Actions caches `node_modules` based on this version number combined with `yarn.lock` hash. If you change dependencies but don't increment the cache version, CI may: +- Use stale cached dependencies +- Fail with missing module errors +- Have inconsistent behavior between local and CI builds diff --git a/.dockerignore b/.dockerignore index 0e5dacaad96..c145dd0c5ab 100644 --- a/.dockerignore +++ b/.dockerignore @@ -11,15 +11,6 @@ npm-debug.log dockerfiles .vscode -# not docker packages -packages/blockchain-api -packages/docs -packages/faucet -packages/mobile -packages/react-components -packages/notification-service -packages/web - # Ignore generated credentials from google-github-actions/auth gha-creds-*.json diff --git a/.env b/.env index d7c38222261..af423ebd433 100644 --- a/.env +++ b/.env @@ -39,9 +39,6 @@ GETH_BOOTNODE_DOCKER_IMAGE_REPOSITORY="us.gcr.io/celo-testnet/geth-all" # `geth $ git show | head -n 1` GETH_BOOTNODE_DOCKER_IMAGE_TAG="master" -CELOTOOL_DOCKER_IMAGE_REPOSITORY="gcr.io/celo-testnet/celo-monorepo" -CELOTOOL_DOCKER_IMAGE_TAG="celotool-4257fe61f91e935681f3a91bb4dcb44c8dd6df47" - CELOCLI_STANDALONE_IMAGE_REPOSITORY="gcr.io/celo-testnet/celocli-standalone" CELOCLI_STANDALONE_IMAGE_TAG="0.0.30-beta2" diff --git a/.env.alfajores b/.env.alfajores index b243f5ef99e..a96be340372 100644 --- a/.env.alfajores +++ b/.env.alfajores @@ -64,9 +64,6 @@ GETH_ENABLE_METRICS=true # Disable the sidecar that forwards the metrics to stackdriver PROMETHEUS_DISABLE_STACKDRIVER_SIDECAR="true" -CELOTOOL_DOCKER_IMAGE_REPOSITORY="gcr.io/celo-testnet/celo-monorepo" -CELOTOOL_DOCKER_IMAGE_TAG="celotool-4257fe61f91e935681f3a91bb4dcb44c8dd6df47" - CELOCLI_STANDALONE_IMAGE_REPOSITORY="gcr.io/celo-testnet/celocli-standalone" CELOCLI_STANDALONE_IMAGE_TAG="0.0.53" diff --git a/.env.baklava 
b/.env.baklava deleted file mode 100644 index ad905f981f6..00000000000 --- a/.env.baklava +++ /dev/null @@ -1,234 +0,0 @@ -# Don't use "//" for comments in this file. -# This file is meant to be executed as a bash script for testing. -ENV_TYPE="production" - -GETH_VERBOSITY=2 -GETH_ENABLE_METRICS=true - - -KUBERNETES_CLUSTER_NAME="baklavastaging" -KUBERNETES_CLUSTER_ZONE="us-west1-a" -CLUSTER_DOMAIN_NAME="celo-testnet" - -TESTNET_PROJECT_NAME="celo-testnet-production" - -BLOCKSCOUT_DOCKER_IMAGE_TAG="0362f9f4d1d4842f27adb634d628f969f53c046d" -BLOCKSCOUT_DB_SUFFIX=2 - -CELOSTATS_SERVER_DOCKER_IMAGE_REPOSITORY="gcr.io/celo-testnet/celostats-server" -CELOSTATS_SERVER_DOCKER_IMAGE_TAG="master" -CELOSTATS_FRONTEND_DOCKER_IMAGE_REPOSITORY="gcr.io/celo-testnet/celostats-frontend" -CELOSTATS_FRONTEND_DOCKER_IMAGE_TAG="master" -CELOSTATS_TRUSTED_ADDRESSES="" -CELOSTATS_BANNED_ADDRESSES="" -CELOSTATS_RESERVED_ADDRESSES="" - -GETH_NODE_DOCKER_IMAGE_REPOSITORY="us.gcr.io/celo-org/geth" -GETH_NODE_DOCKER_IMAGE_TAG="1.7.0" - -GETH_BOOTNODE_DOCKER_IMAGE_REPOSITORY="us.gcr.io/celo-org/geth-all" -GETH_BOOTNODE_DOCKER_IMAGE_TAG="1.7.0" - -CELOTOOL_DOCKER_IMAGE_REPOSITORY="gcr.io/celo-testnet/celo-monorepo" -CELOTOOL_DOCKER_IMAGE_TAG="celotool-4257fe61f91e935681f3a91bb4dcb44c8dd6df47" - -CELOCLI_STANDALONE_IMAGE_REPOSITORY="gcr.io/celo-testnet/celocli-standalone" -CELOCLI_STANDALONE_IMAGE_TAG="0.0.30-beta2" - -ORACLE_DOCKER_IMAGE_REPOSITORY="us-west1-docker.pkg.dev/celo-testnet-production/celo-oracle/celo-oracle" -ORACLE_DOCKER_IMAGE_TAG="459947a" - -# ---- Full Node Chain Restore ---- - -USE_GSTORAGE_DATA=true -GSTORAGE_DATA_BUCKET=celo-chain-backup/baklava - -# ---- Contexts ---- - -# each context should have its own environment variables, generally of the form -# _* -CONTEXTS=azure-oracle-westus2,azure-oracle-centralus,gcp-forno-europe-west1 - -# ---- Oracle Contexts ---- - -AZURE_ORACLE_WESTUS2_AZURE_SUBSCRIPTION_ID=7a6f5f20-bd43-4267-8c35-a734efca140c 
-AZURE_ORACLE_WESTUS2_AZURE_TENANT_ID=7cb7628a-e37c-4afb-8332-2029e418980e -AZURE_ORACLE_WESTUS2_AZURE_KUBERNETES_RESOURCE_GROUP=baklava-oracles-westus2 -AZURE_ORACLE_WESTUS2_AZURE_REGION_NAME=westus2 -AZURE_ORACLE_WESTUS2_KUBERNETES_CLUSTER_NAME=baklava-oracles-westus2 -# Format should be a comma-separated sequence of: -#
:: -AZURE_ORACLE_WESTUS2_CELOUSD_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0x460b3f8d3c203363bb65b1a18d89d4ffb6b0c981:baklava-oracle2:baklava-oracles-westus2,0x3b522230c454ca9720665d66e6335a72327291e8:baklava-oracle3:baklava-oracles-westus2 -AZURE_ORACLE_WESTUS2_CELOEUR_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0x042aC21b9CDA6171881135884DE16CF6aCa78375:baklava-celoeur-oracle2:baklava-oracles-westus2,0xc496d108C6781Fbc9967E48873Eac2eA14ca119a:baklava-celoeur-oracle3:baklava-oracles-westus2 -AZURE_ORACLE_WESTUS2_CELOBRL_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0x86f9c87d13347e6041e75f3a18533e07d284c419:baklava-celobrl-oracle0:baklava-oracles-westus2,0x3ee0a85e7d8e9d06617986d40623b1493139c5ae:baklava-celobrl-oracle1:baklava-oracles-westus2 -AZURE_ORACLE_WESTUS2_USDCUSD_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0x97ef27cf3ce65b2558161aeb1e3cff8b5f71fd04:baklava-usdcusd-oracle1:baklava-oracles-westus2,0x559702d23983eb29bcf30f2487d477945c0dbc6a:baklava-usdcusd-oracle3:baklava-oracles-westus2 -AZURE_ORACLE_WESTUS2_USDCBRL_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0x8a2d375ae246e305c14c88e6687ff06acd66c9ba:baklava-brlusdc-oracle2:baklava-oracles-westus2,0x72434eca70d5544f8178c1a769762c8c1f0fd940:baklava-brlusdc-oracle5:baklava-oracles-westus2 -AZURE_ORACLE_WESTUS2_USDCEUR_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0x99ce1e35574802e29644ee7a8284f9987fceee3d:baklava-eurusdc-oracle6:baklava-oracles-westus2,0x4c37e2cc2e9105984fef866a3f06aa953cc660d1:baklava-eurusdc-oracle7:baklava-oracles-westus2 -AZURE_ORACLE_WESTUS2_EUROCEUR_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0x6866e306b32acae7310d3b87dd53fc948f0e0629:baklava-euroceur-oracle2:baklava-oracles-westus2,0xe33502b13be6e0444a08de933faa24a59ae9b585:baklava-euroceur-oracle3:baklava-oracles-westus2 -AZURE_ORACLE_WESTUS2_CELOXOF_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0x96eda2cad69c8cd1daeb80da86d24825f45f46b7:baklava-celoxof-oracle2,0x4e9d441fd1c77222395a1853d851fea8a0e3aed8:baklava-celoxof-oracle3 
-AZURE_ORACLE_WESTUS2_EURXOF_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0x7fe5f297dd812ca21e7bf1cbf145a0b59227b35f:baklava-eurxof-oracle2,0x2addc69c2ce3a9d93a8291419319bf7f0a2c6c82:baklava-eurxof-oracle3 -AZURE_ORACLE_WESTUS2_EUROCXOF_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0x729e058e97c099c79af674bbe2f687171432dd17:baklava-eurocxof-oracle2,0xd226aa9ee80ee282339c1ae69f3f811dbe5d895a:baklava-eurocxof-oracle4 -AZURE_ORACLE_WESTUS2_CELOKES_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0x84f0d0c9385de3509cdf6eb2fb168e35b0dbad92:baklava-celokes-oracle2,0x2db4d3bf7e744b422812b63b036c401828be7778:baklava-celokes-oracle3 -AZURE_ORACLE_WESTUS2_KESUSD_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0x94cd5463902630dd22db8ac41242002e6a7a6844:baklava-kesusd-oracle2,0xd3e70b118b674c4db7fde6946b16070bf9ec5ce3:baklava-kesusd-oracle3 -AZURE_ORACLE_WESTUS2_USDTUSD_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0x9829cf05869f1b9770f4ce9d5653909f1f9e4c5e:baklava-usdtusd-oracle2,0xdfbcbae6de4fb7b72dbad402b975e374441395ea:baklava-usdtusd-oracle3 -AZURE_ORACLE_WESTUS2_FULL_NODES_COUNT=2 -AZURE_ORACLE_WESTUS2_FULL_NODES_ROLLING_UPDATE_PARTITION=0 -AZURE_ORACLE_WESTUS2_FULL_NODES_DISK_SIZE=30 -AZURE_ORACLE_WESTUS2_FULL_NODES_RPC_API_METHODS="eth,net,rpc,web3" -AZURE_ORACLE_WESTUS2_FULL_NODES_GETH_GC_MODE="full" -AZURE_ORACLE_WESTUS2_FULL_NODES_USE_GSTORAGE_DATA=false -AZURE_ORACLE_WESTUS2_FULL_NODES_WS_PORT="8546" -AZURE_ORACLE_WESTUS2_PROM_SIDECAR_DISABLED="true" - -AZURE_ORACLE_CENTRALUS_AZURE_SUBSCRIPTION_ID=7a6f5f20-bd43-4267-8c35-a734efca140c -AZURE_ORACLE_CENTRALUS_AZURE_TENANT_ID=7cb7628a-e37c-4afb-8332-2029e418980e -AZURE_ORACLE_CENTRALUS_AZURE_KUBERNETES_RESOURCE_GROUP=baklava-oracles-centralus -AZURE_ORACLE_CENTRALUS_AZURE_REGION_NAME=centralus -AZURE_ORACLE_CENTRALUS_KUBERNETES_CLUSTER_NAME=baklava-oracles-centralus -# Format should be a comma-separated sequence of: -#
:: -AZURE_ORACLE_CENTRALUS_CELOUSD_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0x02b1d1bea682fcab4448c0820f5db409cce4f702:baklava-oracle7:baklava-oracles-centralus,0xe90f891710f625f18ecbf1e02efb4fd1ab236a10:baklava-oracle8:baklava-oracles-centralus -AZURE_ORACLE_CENTRALUS_CELOEUR_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0x4Ab2B594f08403b6324Cb0B172e332d8a08f1d5a:baklava-celoeur-oracle7:baklava-oracles-centralus,0x21d66472604A931211379A32710D26AFA203D190:baklava-celoeur-oracle8:baklava-oracles-centralus -AZURE_ORACLE_CENTRALUS_CELOBRL_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0xe467003845bdcbef710192d86263716815ab39f6:baklava-celobrl-oracle2:baklava-oracles-centralus,0x4b7f9e7e18230a271109f67b60a70fb3b4d0268a:baklava-celobrl-oracle3:baklava-oracles-centralus -AZURE_ORACLE_CENTRALUS_USDCUSD_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0x6a0e5d8a496feb59464028250c88a08341ea0831:baklava-usdcusd-oracle0:baklava-oracles-centralus,0xb15833400aecc72cb759d4e57a3a5a9c2963a5c5:baklava-usdcusd-oracle2:baklava-oracles-centralus -AZURE_ORACLE_CENTRALUS_USDCBRL_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0xa326d557b79422952d708d4eea8523d29b7174ec:baklava-brlusdc-oracle0:baklava-oracles-centralus,0xe73b1f39d1289e608df9a055c459b909f4bc1592:baklava-brlusdc-oracle1:baklava-oracles-centralus -AZURE_ORACLE_CENTRALUS_USDCEUR_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0xa6a5ae0c1c6b0ed52f2c6bb5666b61cfa86ac3ae:baklava-eurusdc-oracle4:baklava-oracles-centralus,0x2679be4034f47219bf9dfcbfb55bad60fe741315:baklava-eurusdc-oracle5:baklava-oracles-centralus -AZURE_ORACLE_CENTRALUS_EUROCEUR_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0x9a0613e8a1ff6cfd72ab692e6b450fbf02deba81:baklava-euroceur-oracle0:baklava-oracles-centralus,0xbe4bbd15177e2857c7c3297da12331033eeacd93:baklava-euroceur-oracle1:baklava-oracles-centralus -AZURE_ORACLE_CENTRALUS_CELOXOF_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0xd056a29e86161a34692c34f4c95933b59de077dc:baklava-celoxof-oracle0,0x5ad07f89176298ae3a0f3d20d0b4a756307d46e7:baklava-celoxof-oracle1 
-AZURE_ORACLE_CENTRALUS_EURXOF_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0xa4a46db00840e6525ffe79aee5990abaebb7479d:baklava-eurxof-oracle0,0x6e537c9462ed968ff08eab430c5f8c11eab7df1a:baklava-eurxof-oracle1 -AZURE_ORACLE_CENTRALUS_EUROCXOF_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0x1a637c38671512866317475d19df5f55b0802276:baklava-eurocxof-oracle0,0x8589f0bb307581b96877f9e1a5ce3fcb05127fd0:baklava-eurocxof-oracle1 -AZURE_ORACLE_CENTRALUS_CELOKES_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0x0468aabc726f2f8d6bc612af99bf994026654a34:baklava-celokes-oracle0,0x8fc0c18b0fc7c11d4af89f0be046ed17dd1fe0f4:baklava-celokes-oracle1 -AZURE_ORACLE_CENTRALUS_KESUSD_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0xf44345ff4ae8a3e18ae3e7d9c6b3de62736fb01c:baklava-kesusd-oracle0,0xb9410ac25ae1424190f6b4e45dcabd4d32168e5f:baklava-kesusd-oracle1 -AZURE_ORACLE_CENTRALUS_USDTUSD_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0x2b4b450daecaf5a011497762380cdf1938791f85:baklava-usdtusd-oracle0,0x179282dcbf4c506332b0376cf5bcebd6ca9ec2f3:baklava-usdtusd-oracle1 -AZURE_ORACLE_CENTRALUS_FULL_NODES_COUNT=2 -AZURE_ORACLE_CENTRALUS_FULL_NODES_ROLLING_UPDATE_PARTITION=0 -AZURE_ORACLE_CENTRALUS_FULL_NODES_DISK_SIZE=30 -AZURE_ORACLE_CENTRALUS_FULL_NODES_RPC_API_METHODS="eth,net,rpc,web3" -AZURE_ORACLE_CENTRALUS_FULL_NODES_GETH_GC_MODE="full" -AZURE_ORACLE_CENTRALUS_FULL_NODES_USE_GSTORAGE_DATA=false -AZURE_ORACLE_CENTRALUS_FULL_NODES_WS_PORT="8546" -AZURE_ORACLE_CENTRALUS_PROM_SIDECAR_DISABLED="true" - -# ---- Forno ---- - -# ---- General Forno ---- - -# A list of every context that forno will use. 
-FORNO_FULL_NODE_CONTEXTS=gcp-forno-europe-west1 -# A list of domains to provision the SSL certificate for -FORNO_DOMAINS=baklava-forno-k8s.celo-testnet.org -# All forno clusters must be in this VPC -FORNO_VPC_NETWORK_NAME=baklava-network -FORNO_BANNED_CIDR="255.255.255.255/32" - -# ---- Forno Contexts ---- - -GCP_FORNO_EUROPE_WEST1_GCP_PROJECT_NAME=celo-testnet-production -GCP_FORNO_EUROPE_WEST1_GCP_ZONE=europe-west1-b -GCP_FORNO_EUROPE_WEST1_KUBERNETES_CLUSTER_NAME=baklava-europe-west1 -GCP_FORNO_EUROPE_WEST1_FULL_NODES_COUNT=2 -GCP_FORNO_EUROPE_WEST1_FULL_NODES_ROLLING_UPDATE_PARTITION=0 -GCP_FORNO_EUROPE_WEST1_FULL_NODES_DISK_SIZE=100 -# NOTE: If these fullnodes are used for static nodes, changing this will result -# in the full nodes having a different nodekey -GCP_FORNO_EUROPE_WEST1_FULL_NODES_NODEKEY_DERIVATION_STRING=GCP_FORNO_EUROPE_WEST1 -# suffix of the static nodes file, e.g. `.` -GCP_FORNO_EUROPE_WEST1_FULL_NODES_STATIC_NODES_FILE_SUFFIX=gcp-europe-west1 -GCP_FORNO_EUROPE_WEST1_FULL_NODES_RPC_API_METHODS="eth,net,rpc,web3" -GCP_FORNO_EUROPE_WEST1_FULL_NODES_GETH_GC_MODE="full" -GCP_FORNO_EUROPE_WEST1_FULL_NODES_USE_GSTORAGE_DATA=false -GCP_FORNO_EUROPE_WEST1_FULL_NODES_WS_PORT="8546" -GCP_FORNO_EUROPE_WEST1_PROM_SIDECAR_DISABLED="true" - -TRANSACTION_METRICS_EXPORTER_DOCKER_IMAGE_REPOSITORY="gcr.io/celo-testnet/celo-monorepo" -TRANSACTION_METRICS_EXPORTER_DOCKER_IMAGE_TAG="transaction-metrics-exporter-dc5e5dfa07231a4ff4664816a95eae606293eae9" - -# Genesis Vars -NETWORK_ID=62320 -CONSENSUS_TYPE="istanbul" -BLOCK_TIME=5 -EPOCH=17280 -LOOKBACK=12 -ISTANBUL_REQUEST_TIMEOUT_MS=10000 - -# the number of load test clients that will be given funds in the genesis & migrations -LOAD_TEST_CLIENTS=20 -# every 7.2 seconds, so that 500 transactions are sent by a client every hour -# to simulate 10,000 total transactions per hour -LOAD_TEST_TX_DELAY_MS=7200 - -# the amount in cUSD wei to give faucet, load test, and attestation bot accounts 
-FAUCET_CUSD_WEI=60000000000000000000000 - -# format: <# of validators>:<# of proxies>,<# of validators>:<# of proxies>;... -# For vm-based testnets: The n proxied validators have to be the n first validators. Only indicate in the list the validators with proxy. -# Example: For a 10 validator testnet, 2:3,2:2,2:1 will result in Validator 0-1 with 3 proxies, 2-3 with 2 proxies, 4-5 with 1 proxy, and 6-9 with 0 proxies -VALIDATORS=6 -VALIDATOR_PROXY_COUNTS=2:0,2:1,1:2,1:3 -SECONDARIES=5 -ELECTION_MIN_VALIDATORS=20 -# TX_NODES are used for forno -TX_NODES=3 - -# Nodes whose RPC ports are only internally exposed. Used by Blockscout and other internal services. -PRIVATE_TX_NODES=2 - -STATIC_IPS_FOR_GETH_NODES=true -# Whether tx_nodes/validators stateful set should use ssd persistent disks -GETH_NODES_SSD_DISKS=true -# Used for validators, proxies, and tx_nodes -NODE_DISK_SIZE_GB=100 -# Used for private tx_nodes (gcmode archive) -PRIVATE_NODE_DISK_SIZE_GB=100 -IN_MEMORY_DISCOVERY_TABLE=false -# PING_IP_FROM_PACKET=false - -# Testnet vars -CLUSTER_CREATION_FLAGS="--enable-autoscaling --min-nodes 3 --max-nodes 8 --machine-type=n1-standard-4" - -GETH_NODE_CPU_REQUEST=2 -GETH_NODE_MEMORY_REQUEST=4Gi - -VERIFICATION_POOL_URL="https://us-central1-celo-testnet.cloudfunctions.net/handleVerificationRequestbaklava/v0.1/sms/" -VERIFICATION_REWARDS_URL="https://us-central1-celo-testnet.cloudfunctions.net/handleVerificationRequestbaklava/v0.1/rewards/" - -# Disable the sidecar that forwards the metrics to stackdriver -PROMETHEUS_DISABLE_STACKDRIVER_SIDECAR="true" - -MOBILE_WALLET_PLAYSTORE_LINK="https://play.google.com/apps/internaltest/4700990475000634666" - -# Number of gold-holding bots that vote for validator groups -VOTING_BOTS=100 -# 10,000 CG -VOTING_BOT_BALANCE=10000000000000000000000 - -# Probability that a given bot account will participate in the current epoch's voting -VOTING_BOT_WAKE_PROBABILITY=0 - -# Baseline probability of changing vote once woken. 
If the group that this bot account -# has currently voted for is 1, this is the probability the bot will change their vote -VOTING_BOT_CHANGE_BASELINE=0 - -# The probability that when choosing a new group to vote for, it will choose a never-elected group -# Should be in the range of 0 to 1. Lower values bias towards incumbency. Higher values bias -# towards novelty. -VOTING_BOT_EXPLORE_PROBABILITY=0.6 - -# This value is used to determine how heavily a group's score weighs in the bot's voting decision -# Should be a positive number. 1 is a linear relationship. Higher values increase the separation -# among high scores. Less than 1 has a steep drop-off for low-scoring groups. -VOTING_BOT_SCORE_SENSITIVITY=1 - -# Schedule for the voting bot process, deployed via celotool, expressed in crontab syntax -# Minimum epoch length is 1 hour. This cron schedule is "every hour at minute 1", in order -# to run once an epoch. -# Notes: -# - if an epoch takes longer, this will occasionally run twice in an epoch -# - Running on minute 1 is arbitrary. This could be any value from 0-59. 
-VOTING_BOT_CRON_SCHEDULE="1 * * * *" - -# For WalletConnect relay -WALLET_CONNECT_IMAGE_REPOSITORY = 'us.gcr.io/celo-testnet/walletconnect' -WALLET_CONNECT_IMAGE_TAG = '1472bcaad57e3746498f7a661c42ff5cf9acaf5a' -WALLET_CONNECT_REDIS_CLUSTER_ENABLED = false -WALLET_CONNECT_REDIS_CLUSTER_USEPASSWORD = false diff --git a/.env.mnemonic.alfajores.enc b/.env.mnemonic.alfajores.enc index dc214d6f47d..75282672614 100644 Binary files a/.env.mnemonic.alfajores.enc and b/.env.mnemonic.alfajores.enc differ diff --git a/.env.mnemonic.baklava.enc b/.env.mnemonic.baklava.enc index 4096501b094..e789a376ec6 100644 Binary files a/.env.mnemonic.baklava.enc and b/.env.mnemonic.baklava.enc differ diff --git a/.env.mnemonic.enc b/.env.mnemonic.enc index f2d0ca89712..5cb0242dbcc 100644 Binary files a/.env.mnemonic.enc and b/.env.mnemonic.enc differ diff --git a/.env.mnemonic.mainnet.enc b/.env.mnemonic.mainnet.enc new file mode 100644 index 00000000000..2ee1d5a359a Binary files /dev/null and b/.env.mnemonic.mainnet.enc differ diff --git a/.env.mnemonic.rc1.enc b/.env.mnemonic.rc1.enc deleted file mode 100644 index 37abb5321b6..00000000000 Binary files a/.env.mnemonic.rc1.enc and /dev/null differ diff --git a/.env.oracledev b/.env.oracledev deleted file mode 100644 index 2cda4367b37..00000000000 --- a/.env.oracledev +++ /dev/null @@ -1,142 +0,0 @@ -ORACLE_DOCKER_IMAGE_REPOSITORY="celotestnet.azurecr.io/testnet/oracle" -ORACLE_DOCKER_IMAGE_TAG="c7215fcbeccc0c61b306fbd0503a67bd0cf509de" - -ORACLE_UNUSED_ORACLE_ADDRESSES= - -# each context should have its own environment variables, generally of the form -# _* -CONTEXTS=azure-eastus,gcp-test,gcp-test-asia - -FORNO_FULL_NODE_CONTEXTS=gcp-test,gcp-test-asia -FORNO_DOMAINS=oracledev-forno.celo-networks-dev.org. 
-# all clusters must be in this VPC -FORNO_VPC_NETWORK_NAME=default - -AZURE_EASTUS_AZURE_SUBSCRIPTION_ID=97e2b592-255b-4f92-bce0-127257163c36 -AZURE_EASTUS_AZURE_TENANT_ID=7cb7628a-e37c-4afb-8332-2029e418980e -AZURE_EASTUS_AZURE_KUBERNETES_RESOURCE_GROUP=testnet-oracle-eastus -AZURE_EASTUS_KUBERNETES_CLUSTER_NAME=testnet-oracle-eastus -# Format should be a comma-separated sequence of: -#
<address>:<key vault name>:<resource group (optional)> -AZURE_EASTUS_CELOUSD_ORACLE_ADDRESS_AZURE_KEY_VAULTS=0x21860ca3a0a6f7e450b8f24bd00eac7ba766b85e:testnet-oracle-eastus,0xfd3738e5e0a020614a9e5253078dda491e77031c:testnet-oracle-eastus -# AZURE_EASTUS_CELOUSD_ORACLE_ADDRESSES_FROM_MNEMONIC_COUNT= -AZURE_EASTUS_FULL_NODES_COUNT=2 -AZURE_EASTUS_FULL_NODES_DISK_SIZE=10 - -# Format should be a comma-separated sequence of: -#
<address>:<key vault name>:<resource group (optional)> -GCP_TEST_GCP_PROJECT_NAME=celo-testnet -GCP_TEST_GCP_ZONE=us-west4-a -GCP_TEST_KUBERNETES_CLUSTER_NAME=federated-dev-us-west4-a -GCP_TEST_FULL_NODES_COUNT=1 -GCP_TEST_FULL_NODES_DISK_SIZE=10 -GCP_TEST_FULL_NODES_STATIC_NODES_FILE_SUFFIX=gcp-test - -GCP_TEST_ASIA_GCP_PROJECT_NAME=celo-testnet -GCP_TEST_ASIA_GCP_ZONE=asia-northeast2-a -GCP_TEST_ASIA_KUBERNETES_CLUSTER_NAME=federated-dev-asia-northeast2-a -GCP_TEST_ASIA_FULL_NODES_STATIC_NODES_FILE_SUFFIX=gcp-test-asia -GCP_TEST_ASIA_FULL_NODES_COUNT=1 -GCP_TEST_ASIA_FULL_NODES_DISK_SIZE=10 - -# ---- General ---- - -ENV_TYPE="development" -CLUSTER_DOMAIN_NAME="celo-networks-dev" -TESTNET_PROJECT_NAME="celo-testnet" -CLUSTER_DOMAIN_NAME="celo-networks-dev" - -# ---- Kubernetes ---- - -KUBERNETES_CLUSTER_NAME="celo-networks-dev" -KUBERNETES_CLUSTER_ZONE="us-west1-a" -CLUSTER_CREATION_FLAGS="--enable-autoscaling --min-nodes 3 --max-nodes 8 --machine-type=n1-standard-4" - -# ---- VM ---- - - -# ---- Blockscout ---- - -BLOCKSCOUT_DOCKER_IMAGE_TAG="ad86714d629c01272e0651dec1fb6a968c3cec71" -BLOCKSCOUT_DB_SUFFIX="-2" - -# ---- Geth ---- - -GETH_NODE_DOCKER_IMAGE_REPOSITORY="us.gcr.io/celo-testnet/geth" -# When upgrading change this to latest commit hash from the master of the geth repo -# `geth $ git show | head -n 1` -GETH_NODE_DOCKER_IMAGE_TAG="8a44c2cd92200bdffce595c7558e84a39ea2bc15" - -GETH_VERBOSITY=2 - -GETH_BOOTNODE_DOCKER_IMAGE_REPOSITORY="us.gcr.io/celo-testnet/geth-all" -# When upgrading change this to latest commit hash from the master of the geth repo -# `geth $ git show | head -n 1` -GETH_BOOTNODE_DOCKER_IMAGE_TAG="8a44c2cd92200bdffce595c7558e84a39ea2bc15" - -# ---- Celotool ---- - -CELOTOOL_DOCKER_IMAGE_REPOSITORY="gcr.io/celo-testnet/celo-monorepo" -CELOTOOL_DOCKER_IMAGE_TAG="celotool-4257fe61f91e935681f3a91bb4dcb44c8dd6df47" - -# ---- Transaction Metrics Exporter ---- - -TRANSACTION_METRICS_EXPORTER_DOCKER_IMAGE_REPOSITORY="gcr.io/celo-testnet/celo-monorepo" 
-TRANSACTION_METRICS_EXPORTER_DOCKER_IMAGE_TAG="transaction-metrics-exporter-8e69cf86010b62e283d5f9285f181fca5483733e" - -# ---- Genesis Vars ---- -NETWORK_ID=1101 -CONSENSUS_TYPE="istanbul" -PREDEPLOYED_CONTRACTS="REGISTRY" -BLOCK_TIME=5 -EPOCH=720 - -# ---- Network Vars ---- -VALIDATORS=2 -PROXIED_VALIDATORS=0 -TX_NODES=1 -# Nodes whose RPC ports are only internally exposed -PRIVATE_TX_NODES=1 -STATIC_IPS_FOR_GETH_NODES=false -PING_IP_FROM_PACKET=true -IN_MEMORY_DISCOVERY_TABLE=false -ISTANBUL_REQUEST_TIMEOUT_MS=3000 -# Used for validators, proxies, and tx_nodes -NODE_DISK_SIZE_GB=10 -# Used for private tx_nodes (gcmode archive) -PRIVATE_NODE_DISK_SIZE_GB=10 -ADMIN_RPC_ENABLED=false -LOAD_TEST_CLIENTS=100 -# every 36 seconds, so that 100 transactions are sent by a client every hour -LOAD_TEST_TX_DELAY_MS=36000 - -GETH_NODE_CPU_REQUEST=400m -GETH_NODE_MEMORY_REQUEST=1.75G - -VERIFICATION_POOL_URL="https://us-central1-celo-testnet.cloudfunctions.net/handleVerificationRequesttrevor1/v0.1/sms/" -VERIFICATION_REWARDS_URL="https://us-central1-celo-testnet.cloudfunctions.net/handleVerificationRequesttrevor1/v0.1/rewards/" - -MOBILE_WALLET_PLAYSTORE_LINK="https://play.google.com/apps/internaltest/4700990475000634666" - -NOTIFICATION_SERVICE_FIREBASE_DB="https://console.firebase.google.com/u/0/project/celo-org-mobile/database/celo-org-mobile-int/data" - -AUCTION_CRON_SPEC="*/5 * * * *" - -SMS_RETRIEVER_HASH_CODE=l5k6LvdPDXS -# empty string is false for helm -GETH_NODES_BACKUP_CRONJOB_ENABLED= -CONTRACT_CRONJOBS_ENABLED= - -# Schedule for an oracle deployed via celotool, expressed in crontab syntax -# This schedule is "every 5th minute" -MOCK_ORACLE_CRON_SCHEDULE="*/5 * * * *" - -MOCK_ORACLE_DOCKER_IMAGE_REPOSITORY="gcr.io/celo-testnet/oracle" -MOCK_ORACLE_DOCKER_IMAGE_TAG="baklava" - -# The number of oracles to add during contract migrations. 
-# Their keys/addresses are generated using the mnemonic -ORACLES=10 - -CELOCLI_STANDALONE_IMAGE_REPOSITORY="gcr.io/celo-testnet/celocli-standalone" -CELOCLI_STANDALONE_IMAGE_TAG="0.0.30-beta2" diff --git a/.env.rc1 b/.env.rc1 index 1012a6eb2da..2a0f83289a6 100644 --- a/.env.rc1 +++ b/.env.rc1 @@ -34,9 +34,6 @@ GETH_BOOTNODE_DOCKER_IMAGE_REPOSITORY="us.gcr.io/celo-org/geth-all" # `geth $ git show | head -n 1` GETH_BOOTNODE_DOCKER_IMAGE_TAG="1.7.0" -CELOTOOL_DOCKER_IMAGE_REPOSITORY="gcr.io/celo-testnet/celo-monorepo" -CELOTOOL_DOCKER_IMAGE_TAG="celotool-4257fe61f91e935681f3a91bb4dcb44c8dd6df47" - CELOCLI_STANDALONE_IMAGE_REPOSITORY="gcr.io/celo-testnet/celocli-standalone" CELOCLI_STANDALONE_IMAGE_TAG="0.0.42" diff --git a/.env.rc1staging b/.env.rc1staging deleted file mode 100644 index 8d651268dfe..00000000000 --- a/.env.rc1staging +++ /dev/null @@ -1,23 +0,0 @@ -# Don't use "//" for comments in this file. -# This file is meant to be executed as a bash script for testing. -ENV_TYPE="staging" - -GETH_VERBOSITY=2 -GETH_ENABLE_METRICS=true - -KUBERNETES_CLUSTER_NAME="rc1staging" -KUBERNETES_CLUSTER_ZONE="us-west1-a" -CLUSTER_DOMAIN_NAME="celo-testnet" - -TESTNET_PROJECT_NAME="celo-testnet-production" - -BLOCKSCOUT_DOCKER_IMAGE_TAG="0362f9f4d1d4842f27adb634d628f969f53c046d" -BLOCKSCOUT_DB_SUFFIX="" - -# The archive nodes are managed through the Helm charts. 
- -NETWORK_ID=42220 -PRIVATE_TX_NODES=2 - -# Disable the sidecar that forwards the metrics to stackdriver -PROMETHEUS_DISABLE_STACKDRIVER_SIDECAR="true" \ No newline at end of file diff --git a/.eslintrc.js b/.eslintrc.js index 12d0b16b922..b9baeac8b2f 100644 --- a/.eslintrc.js +++ b/.eslintrc.js @@ -25,7 +25,6 @@ module.exports = { '**/*.js', 'packages/protocol/build/**', 'packages/protocol/types/**', - '/packages/protocol/migrations_ts/**', 'packages/protocol/scripts/truffle/**', // ignoring any files that for which "TSConfig does not include this file" error was given 'packages/protocol/scripts/utils.test.ts', diff --git a/.gitbook.yaml b/.gitbook.yaml deleted file mode 100644 index a75f0198461..00000000000 --- a/.gitbook.yaml +++ /dev/null @@ -1,36 +0,0 @@ -root: ./packages/docs/ - -redirects: - operations-manual/key-management/using-a-ledger-wallet: celo-holder-guide/ledger.md - operations-manual/key-management/ledger: celo-holder-guide/ledger.md - operations-manual/summary/ledger: celo-holder-guide/ledger.md - operations-manual/using-a-ledger-wallet: celo-holder-guide/ledger.md - validator-guide/key-management/using-a-ledger-wallet: celo-holder-guide/ledger.md - validator-guide/key-management/ledger: celo-holder-guide/ledger.md - validator-guide/summary/ledger: celo-holder-guide/ledger.md - validator-guide/using-a-ledger-wallet: celo-holder-guide/ledger.md - getting-started/running-a-full-node: getting-started/running-a-full-node-in-baklava.md - getting-started/running-a-validator: getting-started/running-a-validator-in-baklava.md - getting-started/using-the-wallet: getting-started/using-the-mobile-wallet.md - celo-codebase/protocol/release-gold: celo-holder-guide/release-gold.md - validator-guide/governance: celo-holder-guide/voting-governance.md - developer-guide/start/celo-truffle-box: developer-resources/walkthroughs/hello-mobile-dapp.md - developer-guide/overview/introduction: developer-resources/overview.md - developer-guide/start/development-chain: 
developer-guide/development-chain - important-information/rc-network-disclaimer: important-information/mainnet-network-disclaimer.md - getting-started/rc1: getting-started/mainnet.md - getting-started/rc1/running-a-full-node-in-rc1: getting-started/running-a-full-node-in-mainnet.md - getting-started/rc1/running-a-validator-in-rc1: getting-started/running-a-validator-in-mainnet.md - celo-gold-holder-guide/ledger: celo-holder-guide/ledger.md - celo-gold-holder-guide/quick-start: celo-holder-guide/quick-start.md - celo-gold-holder-guide/release-gold: celo-holder-guide/release-gold.md - celo-gold-holder-guide/voting-governance: celo-holder-guide/voting-governance.md - celo-gold-holder-guide/voting-validators: celo-holder-guide/voting-validators.md - celo-owner-guide/quick-start: celo-holder-guide/quick-start.md - celo-owner-guide/cusd: celo-holder-guide/cusd.md - celo-owner-guide/ledger: celo-holder-guide/ledger.md - celo-owner-guide/release-gold: celo-holder-guide/release-gold.md - celo-owner-guide/voting-validators: celo-holder-guide/voting-validators.md - celo-owner-guide/voting-governance: celo-holder-guide/voting-governance.md - celo-owner-guide/eth-recovery: celo-holder-guide/eth-recovery.md - diff --git a/.github/CODEOWNERS b/.github/CODEOWNERS index 48daa60d9e8..ccaa02a6a2d 100644 --- a/.github/CODEOWNERS +++ b/.github/CODEOWNERS @@ -3,20 +3,15 @@ # For details on acceptable file patterns, please refer to the [Github Documentation](https://help.github.com/articles/about-codeowners/) # default owners, overridden by package specific owners below -* @celo-org/default-owners -# directory and file-level owners. Feel free to add to this! 
+* @celo-org/primitives + +# directory and file-level owners .github/actions/ @celo-org/devopsre .github/workflows/ @celo-org/devopsre +.github/renovate.json5 @celo-org/devopsre /dockerfiles/ @celo-org/devopsre -/packages/celotool/ @celo-org/devopsre -/packages/env-tests/ @celo-org/primitives -/packages/helm-charts/ @celo-org/devopsre -/packages/helm-charts/mock-oracle/ @celo-org/mento @celo-org/devopsre -/packages/helm-charts/oracle-rbac/ @celo-org/mento @celo-org/devopsre -/packages/helm-charts/oracle/ @celo-org/mento @celo-org/devopsre -/packages/metadata-crawler/ @celo-org/devopsre -/packages/protocol/ @celo-org/primitives +/packages/ @celo-org/primitives diff --git a/.github/CONTRIBUTING.md b/.github/CONTRIBUTING.md index 51d0f28518c..8f34a3484f6 100644 --- a/.github/CONTRIBUTING.md +++ b/.github/CONTRIBUTING.md @@ -10,4 +10,4 @@ If you wish to submit more complex changes, please sync with a core developer fi This will help ensure those changes are in line with the general philosophy of the project and enable you to get some early feedback. -See the [contributing guide](https://docs.celo.org/community/contributing) for details on how to participate. +See the [contributing guide](https://docs.celo.org/what-is-celo/joining-celo/contributors/overview) for details on how to participate. 
diff --git a/.github/workflows/celo-monorepo.yml b/.github/workflows/celo-monorepo.yml index 425296b4a58..7607ddf3efc 100644 --- a/.github/workflows/celo-monorepo.yml +++ b/.github/workflows/celo-monorepo.yml @@ -24,14 +24,12 @@ defaults: env: # Increment these to force cache rebuilding - NODE_MODULE_CACHE_VERSION: 8 + NODE_MODULE_CACHE_VERSION: 10 NODE_OPTIONS: '--max-old-space-size=4096' TERM: dumb GRADLE_OPTS: '-Dorg.gradle.daemon=false -Dorg.gradle.parallel=false -Dorg.gradle.configureondemand=true -Dorg.gradle.jvmargs="-Xmx4096m -XX:+HeapDumpOnOutOfMemoryError"' # Git Tag for contract release to use - RELEASE_TAG: core-contracts.v12-renamed - # CELO_BLOCKCHAIN_BRANCH_TO_TEST: master - CELO_BLOCKCHAIN_BRANCH_TO_TEST: release/1.8.x + RELEASE_TAG: core-contracts.v14.anvil # EXAMPLE on debug ssh step # - name: Setup tmate session @@ -52,8 +50,7 @@ jobs: # Adding a initial comma so ',' matches also for the first file all_modified_files: ',${{ steps.changed-files.outputs.all_modified_files }}' artifacts_to_cache: ${{ steps.get_artifacts_to_cache.outputs.artifacts_to_cache }} - # runs-on: ubuntu-latest - runs-on: ['self-hosted', 'monorepo-node18'] + runs-on: ['self-hosted', 'org', 'ubuntu22-node18'] timeout-minutes: 30 steps: - uses: actions/checkout@v4 @@ -71,6 +68,10 @@ jobs: key: node-${{ runner.os }}-${{ runner.arch }}-${{ env.NODE_MODULE_CACHE_VERSION }}-${{ hashFiles('**/yarn.lock') }} restore-keys: | node-${{ runner.os }}-${{ runner.arch }}-${{ env.NODE_MODULE_CACHE_VERSION }}- + - name: Double check node version + run: | + echo "Node version: $(node --version)" + echo "Yarn version: $(yarn --version)" - name: Install yarn dependencies run: git config --global url."https://".insteadOf ssh:// && yarn generate-lock-entry && yarn install --frozen-lockfile --network-timeout 1000000 if: steps.cache_node.outputs.cache-hit != 'true' @@ -141,7 +142,7 @@ jobs: lint-checks: name: Lint code - runs-on: ['self-hosted', 'monorepo-node18'] + runs-on: ['self-hosted', 'org', 
'ubuntu22-node18'] timeout-minutes: 30 needs: install-dependencies steps: @@ -157,7 +158,7 @@ jobs: protocol-test-release: name: Protocol Test Release - runs-on: ['self-hosted', 'monorepo-node18'] + runs-on: ['self-hosted', 'org', 'ubuntu22-node18'] timeout-minutes: 500 needs: [install-dependencies, lint-checks] if: | @@ -181,10 +182,14 @@ jobs: workflow: protocol-devchain.yml name: devchain-${{ env.RELEASE_TAG }} path: packages/protocol/.tmp/devchain - - name: Copy DevChain and Build generated from released tag - run: | - BUILD_AND_DEVCHAIN_DIR=$(echo build/$(echo $RELEASE_TAG | sed -e 's/\//_/g')) - (cp -r packages/protocol/.tmp/devchain packages/protocol/$BUILD_AND_DEVCHAIN_DIR) + - name: Install Foundry + uses: foundry-rs/foundry-toolchain@8f1998e9878d786675189ef566a2e4bf24869773 + with: + version: 'v1.0.0' + + - name: Install forge dependencies + working-directory: packages/protocol + run: forge install - name: Test against current release run: | echo "Comparing against $RELEASE_TAG" @@ -194,7 +199,7 @@ jobs: protocol-test-matrix: # Keeping name short because GitHub UI does not handle long names well name: ${{ matrix.name }} - runs-on: ['self-hosted', 'monorepo-node18'] + runs-on: ['self-hosted', 'org', 'ubuntu22-node18'] timeout-minutes: 60 needs: [install-dependencies, lint-checks] if: | @@ -217,15 +222,9 @@ jobs: echo "If these changes are intended, update the 'releaseData/versionReports' accordingly" exit 1 fi - - name: Protocol Common tests - command: | - yarn --cwd packages/protocol test common/ - name: Protocol Compatibility command: | - yarn --cwd packages/protocol test compatibility/ - - name: Protocol scripts test - command: | - yarn --cwd packages/protocol test:scripts + yarn --cwd packages/protocol test:ts steps: - uses: actions/checkout@v4 with: @@ -235,95 +234,17 @@ jobs: with: rebuild-package: 'true' artifacts_to_cache: ${{ needs.install-dependencies.outputs.artifacts_to_cache }} - - name: Execute matrix command for test - uses: 
nick-fields/retry@7152eba30c6575329ac0576536151aca5a72780e - with: - timeout_minutes: 40 - max_attempts: 3 - command: | - ${{ matrix.command }} - - end-to-end-geth-matrix: - # Keeping name short because GitHub UI does not handle long names well - name: e2e ${{ matrix.name }} - runs-on: ['self-hosted', 'monorepo-node18'] - timeout-minutes: 60 - needs: [install-dependencies, lint-checks] - if: | - github.base_ref == 'master' || contains(github.base_ref, 'release') || contains(github.base_ref, 'production') || - contains(needs.install-dependencies.outputs.all_modified_files, 'packages/celotool') || - contains(needs.install-dependencies.outputs.all_modified_files, 'packages/protocol') || - contains(needs.install-dependencies.outputs.all_modified_files, 'packages/typescript') || - contains(needs.install-dependencies.outputs.all_modified_files, ',package.json') || - contains(needs.install-dependencies.outputs.all_modified_files, ',yarn.lock') || - false - strategy: - fail-fast: false - matrix: - include: - - name: Transfer test - command: | - set -e - # Forcing to load go and rust paths - export PATH="/usr/local/go/bin:$HOME/.cargo/bin:${PATH}" - cd packages/celotool - ./ci_test_transfers.sh checkout ${CELO_BLOCKCHAIN_BRANCH_TO_TEST} - - name: Blockchain Parameters test - command: | - set -e - export PATH="/usr/local/go/bin:$HOME/.cargo/bin:${PATH}" - cd packages/celotool - ./ci_test_blockchain_parameters.sh checkout ${CELO_BLOCKCHAIN_BRANCH_TO_TEST} - - name: Slashing test - command: | - set -e - export PATH="/usr/local/go/bin:$HOME/.cargo/bin:${PATH}" - cd packages/celotool - ./ci_test_slashing.sh checkout ${CELO_BLOCKCHAIN_BRANCH_TO_TEST} - - name: Governance test - command: | - set -e - export PATH="/usr/local/go/bin:$HOME/.cargo/bin:${PATH}" - cd packages/celotool - ./ci_test_governance.sh checkout ${CELO_BLOCKCHAIN_BRANCH_TO_TEST} - - name: Replica test - command: | - set -e - export PATH="/usr/local/go/bin:$HOME/.cargo/bin:${PATH}" - cd packages/celotool - 
./ci_test_replicas.sh checkout ${CELO_BLOCKCHAIN_BRANCH_TO_TEST} - - name: Sync test - command: | - set -e - export PATH="/usr/local/go/bin:$HOME/.cargo/bin:${PATH}" - cd packages/celotool - ./ci_test_sync.sh checkout ${CELO_BLOCKCHAIN_BRANCH_TO_TEST} - - name: CIP35 eth compatibility test - command: | - set -e - export PATH="/usr/local/go/bin:$HOME/.cargo/bin:${PATH}" - cd packages/celotool - echo "Test is skipped because migrations somehow fail" - # ./ci_test_cip35.sh checkout ${CELO_BLOCKCHAIN_BRANCH_TO_TEST} - - name: Validator order test - command: | - set -e - export PATH="/usr/local/go/bin:$HOME/.cargo/bin:${PATH}" - cd packages/celotool - - ./ci_test_validator_order.sh checkout ${CELO_BLOCKCHAIN_BRANCH_TO_TEST} - steps: - - uses: actions/checkout@v4 + - name: Install Foundry + uses: foundry-rs/foundry-toolchain@8f1998e9878d786675189ef566a2e4bf24869773 with: - submodules: recursive - - name: Sync workspace - uses: ./.github/actions/sync-workspace - with: - artifacts_to_cache: ${{ needs.install-dependencies.outputs.artifacts_to_cache }} + version: 'v1.0.0' + - name: Install forge dependencies + working-directory: packages/protocol + run: forge install - name: Execute matrix command for test uses: nick-fields/retry@7152eba30c6575329ac0576536151aca5a72780e with: - timeout_minutes: 30 + timeout_minutes: 40 max_attempts: 3 command: | ${{ matrix.command }} @@ -331,7 +252,9 @@ jobs: # NOTE: This has not been fully tested as we don't have a license for certora certora-test: name: Certora test ${{ matrix.name }} - runs-on: ['self-hosted', 'monorepo-node18'] + runs-on: ['self-hosted', 'org', '8-cpu'] + container: + image: us-west1-docker.pkg.dev/devopsre/actions-runner-controller/celo-monorepo:34b1c5e90f9613964160cdf6bc292b410362ec6d timeout-minutes: 30 needs: [install-dependencies, lint-checks] # Disable as certora license is not active diff --git a/.github/workflows/containers.yaml b/.github/workflows/containers.yaml index 4e75ad04134..0ba35b94663 100644 --- 
a/.github/workflows/containers.yaml +++ b/.github/workflows/containers.yaml @@ -5,14 +5,12 @@ on: push: paths: - 'dockerfiles/**' - - 'packages/celotool/**' branches: - master - 'release/**' pull_request: paths: - 'dockerfiles/**' - - 'packages/celotool/**' workflow_dispatch: permissions: @@ -33,50 +31,9 @@ jobs: with: # Using comma as separator to be able to easily match full paths (using ,) separator: ',' - - # Celotool images - celotool-build-dev: - uses: celo-org/reusable-workflows/.github/workflows/container-cicd.yaml@v2.0.4 - name: Build us-west1-docker.pkg.dev/devopsre/dev-images/celotool:${{ github.sha }} - needs: changed-files - if: | - github.ref != 'refs/heads/master' && ( - contains(needs.changed-files.outputs.all_modified_files, ',dockerfiles/celotool/Dockerfile') || - contains(needs.changed-files.outputs.all_modified_files, ',packages/celotool') || - github.event_name == 'workflow_dispatch' - ) - with: - workload-id-provider: projects/1094498259535/locations/global/workloadIdentityPools/gh-celo-monorepo/providers/github-by-repos - service-account: 'celo-monorepo-dev@devopsre.iam.gserviceaccount.com' - artifact-registry: us-west1-docker.pkg.dev/devopsre/dev-images/celotool - tags: ${{ github.sha }} - platforms: linux/amd64 - context: . 
- file: dockerfiles/celotool/Dockerfile - trivy: false - celotool-build: - uses: celo-org/reusable-workflows/.github/workflows/container-cicd.yaml@v2.0.4 - name: Build us-west1-docker.pkg.dev/devopsre/celo-monorepo/celotool:${{ github.sha }} - needs: changed-files - if: | - github.ref == 'refs/heads/master' && ( - contains(needs.changed-files.outputs.all_modified_files, ',dockerfiles/celotool/Dockerfile') || - contains(needs.changed-files.outputs.all_modified_files, ',packages/celotool') || - github.event_name == 'workflow_dispatch' - ) - with: - workload-id-provider: projects/1094498259535/locations/global/workloadIdentityPools/gh-celo-monorepo-master/providers/github-by-repos - service-account: 'celo-monorepo@devopsre.iam.gserviceaccount.com' - artifact-registry: us-west1-docker.pkg.dev/devopsre/celo-monorepo/celotool - tags: ${{ github.sha }} - platforms: linux/amd64 - context: . - file: dockerfiles/celotool/Dockerfile - trivy: false - # All monorepo celomonorepo-build-dev: - uses: celo-org/reusable-workflows/.github/workflows/container-cicd.yaml@v2.0.4 + uses: celo-org/reusable-workflows/.github/workflows/container-cicd.yaml@v2.0.5 name: Build us-west1-docker.pkg.dev/devopsre/dev-images/monorepo:${{ github.sha }} needs: changed-files if: | @@ -92,7 +49,7 @@ jobs: file: dockerfiles/all-monorepo/Dockerfile trivy: false celomonorepo-build: - uses: celo-org/reusable-workflows/.github/workflows/container-cicd.yaml@v2.0.4 + uses: celo-org/reusable-workflows/.github/workflows/container-cicd.yaml@v2.0.5 name: Build us-west1-docker.pkg.dev/devopsre/celo-monorepo/monorepo:${{ github.sha }} needs: changed-files if: | @@ -107,37 +64,3 @@ jobs: context: . 
file: dockerfiles/all-monorepo/Dockerfile trivy: false - - # Blockscout Metadata crawler images - metadata-crawler-build-dev: - uses: celo-org/reusable-workflows/.github/workflows/container-cicd.yaml@v2.0.4 - needs: changed-files - name: Build us-west1-docker.pkg.dev/devopsre/dev-images/blockscout-metadata-crawler:testing - if: | - github.ref != 'refs/heads/master' && ( - contains(needs.changed-files.outputs.all_modified_files, ',dockerfiles/metadata-crawler') - ) - with: - workload-id-provider: projects/1094498259535/locations/global/workloadIdentityPools/gh-celo-monorepo/providers/github-by-repos - service-account: 'celo-monorepo-dev@devopsre.iam.gserviceaccount.com' - artifact-registry: us-west1-docker.pkg.dev/devopsre/dev-images/blockscout-metadata-crawler - tags: testing - context: . - file: dockerfiles/metadata-crawler/Dockerfile - trivy: false - metadata-crawler-build: - uses: celo-org/reusable-workflows/.github/workflows/container-cicd.yaml@v2.0.4 - needs: changed-files - name: Build us-west1-docker.pkg.dev/devopsre/celo-monorepo/blockscout-metadata-crawler:latest - if: | - github.ref == 'refs/heads/master' && ( - contains(needs.changed-files.outputs.all_modified_files, ',dockerfiles/metadata-crawler') - ) - with: - workload-id-provider: projects/1094498259535/locations/global/workloadIdentityPools/gh-celo-monorepo-master/providers/github-by-repos - service-account: 'celo-monorepo@devopsre.iam.gserviceaccount.com' - artifact-registry: us-west1-docker.pkg.dev/devopsre/celo-monorepo/blockscout-metadata-crawler - tags: latest - context: . 
- file: dockerfiles/metadata-crawler/Dockerfile - trivy: false diff --git a/.github/workflows/protocol-devchain-anvil.yml b/.github/workflows/protocol-devchain-anvil.yml index 4170e20e528..4baa62ad956 100644 --- a/.github/workflows/protocol-devchain-anvil.yml +++ b/.github/workflows/protocol-devchain-anvil.yml @@ -32,7 +32,7 @@ jobs: run: working-directory: packages/protocol name: Generate anvil - runs-on: ['self-hosted', 'org', 'npm-publish'] + runs-on: ['self-hosted', 'monorepo-node18'] permissions: contents: read pull-requests: read @@ -95,7 +95,7 @@ jobs: - name: Install Foundry uses: foundry-rs/foundry-toolchain@8f1998e9878d786675189ef566a2e4bf24869773 with: - version: 'nightly-fa0e0c2ca3ae75895dd19173a02faf88509c0608' + version: 'v1.0.0' - name: Install forge dependencies run: forge install @@ -126,32 +126,19 @@ jobs: - name: 'Install packages' shell: bash run: yarn - - # Starting L1 from scratch instead of JSON state to circumvent this Anvil bug https://github.com/foundry-rs/foundry/issues/7502 - # Install `lsof` dependency, because it's not readily available on CI, but is required by - # `create_and_migrate_anvil_l2_devchain.sh`, because it uses `stop_anvil.sh` to kill - # existing anvil servers. 
- - name: Generate L1 migrations and run migration tests against L1 devchain + + - name: Generate L2 migration and run migration tests if: success() || failure() run: | - sudo apt-get update - sudo apt-get install -y lsof - source ./scripts/foundry/constants.sh - echo "Starting L1 from scratch to circumvent Anvil bug" - source ./scripts/foundry/create_and_migrate_anvil_devchain.sh + echo "Starting L2 from scratch to circumvent Anvil bug" + ./scripts/foundry/create_and_migrate_anvil_l2_devchain.sh - FOUNDRY_PROFILE=devchain forge test -vvv \ + FOUNDRY_PROFILE=devchain forge test \ --match-path "test-sol/devchain/migration/*" \ --fork-url $ANVIL_RPC_URL - ./scripts/foundry/stop_anvil.sh - - - name: Generate L2 migration - if: success() || failure() - run: ./scripts/foundry/create_and_migrate_anvil_l2_devchain.sh - - name: Sanitize ref name id: sanitize-ref-name run: | @@ -161,7 +148,7 @@ jobs: - name: Determine release type and version (or dry run) # This is what sets the RELEASE_TYPE and RELEASE_VERSION env variables - run: yarn --silent determine-release-version >> "$GITHUB_ENV" + run: yarn --silent release:determine-release-version >> "$GITHUB_ENV" working-directory: packages/protocol env: GITHUB_TAG: ${{ github.ref_name }} @@ -169,7 +156,7 @@ jobs: NPM_TAG: ${{ inputs.npm_tag }} - name: Prepare package for publishing - run: yarn prepare_devchain_anvil_publishing + run: yarn utils:prepare-devchain-anvil-publishing working-directory: packages/protocol env: RELEASE_TYPE: ${{ env.RELEASE_TYPE }} diff --git a/.github/workflows/protocol-devchain.yml b/.github/workflows/protocol-devchain.yml index 5b2d5b0ca76..49b692439ae 100644 --- a/.github/workflows/protocol-devchain.yml +++ b/.github/workflows/protocol-devchain.yml @@ -6,7 +6,6 @@ on: # monthly on 1 at 0:00 UTC - cron: 0 0 1 * * workflow_dispatch: - permissions: contents: read @@ -19,11 +18,7 @@ jobs: fail-fast: false matrix: include: - - tag: core-contracts.v9 - node-version: 12 - - tag: core-contracts.v11 - 
node-version: 18 - - tag: core-contracts.v12-renamed + - tag: core-contracts.v14.anvil node-version: 18 steps: - uses: actions/checkout@v4 @@ -35,25 +30,29 @@ uses: actions/setup-node@v4 with: node-version: ${{ matrix.node-version }} - - name: Install yarn dependencies - run: git config --global url."https://".insteadOf ssh:// && yarn install - - name: Build packages - run: yarn build --ignore docs --include-dependencies + - name: install foundry + uses: foundry-rs/foundry-toolchain@v1 + with: + version: 'v1.0.0' + - name: Install forge dependencies + run: forge install + working-directory: packages/protocol # Workaround for https://stackoverflow.com/questions/72978485/git-submodule-update-failed-with-fatal-detected-dubious-ownership-in-repositor # This is needed because some runners mess up with permissions of git files and submodules. Particularly problematic for DinD runners (e.g. self-hosted+container arg) - name: Configure git safe directories run: git config --global --add safe.directory '*' - name: Generate devchain of previous release run: | - mkdir devchain - GRANTS_FILE=scripts/truffle/releaseGoldExampleConfigs.json - yarn --cwd packages/protocol devchain generate-tar devchain/devchain.tar.gz --release_gold_contracts $GRANTS_FILE - mv packages/protocol/build/contracts* devchain/ + yarn anvil-devchain:start-L2 + mkdir .tmp/devchain + mv .tmp/devchain.json .tmp/devchain + mv .tmp/l2-devchain.json .tmp/devchain + working-directory: packages/protocol - name: Upload devchain as artifact uses: actions/upload-artifact@v4 with: name: devchain-${{ matrix.tag }} - path: devchain + path: packages/protocol/.tmp/devchain # Max retention time is 90 days for public repos # https://docs.github.com/en/actions/learn-github-actions/usage-limits-billing-and-administration#artifact-and-log-retention-policy retention-days: 90 diff --git a/.github/workflows/protocol_tests.yml index 477bf89b60d..aeb42b53409 100644 --- 
a/.github/workflows/protocol_tests.yml +++ b/.github/workflows/protocol_tests.yml @@ -57,7 +57,7 @@ jobs: - name: Install Foundry uses: foundry-rs/foundry-toolchain@v1 with: - version: 'nightly-fa0e0c2ca3ae75895dd19173a02faf88509c0608' # TODO: revert back to env var + version: 'v1.0.0' # TODO: revert back to env var - name: Install forge dependencies run: forge install @@ -132,7 +132,7 @@ jobs: - name: Generate migrations and run devchain if: success() || failure() - run: ./scripts/foundry/create_and_migrate_anvil_devchain.sh + run: ./scripts/foundry/create_and_migrate_anvil_l2_devchain.sh - name: Run migration tests against local anvil devchain run: | diff --git a/.github/workflows/publish-contracts-abi-release.yml b/.github/workflows/publish-contracts-abi-release.yml index 57576dbe7c0..1cf043f8233 100644 --- a/.github/workflows/publish-contracts-abi-release.yml +++ b/.github/workflows/publish-contracts-abi-release.yml @@ -78,7 +78,7 @@ jobs: - name: Determine release type and version (or dry run) # This is what sets the RELEASE_TYPE and RELEASE_VERSION env variables - run: yarn --silent determine-release-version >> "$GITHUB_ENV" + run: yarn --silent release:determine-release-version >> "$GITHUB_ENV" working-directory: packages/protocol env: GITHUB_TAG: ${{ github.ref_name }} @@ -86,9 +86,9 @@ jobs: NPM_TAG: ${{ inputs.npm_tag }} - name: 'Build packages' shell: bash - run: yarn build --ignore @celo/celotool --ignore @celo/env-tests --include-dependencies + run: yarn build --include-dependencies - name: Compile solidity contracts and typescript files - run: yarn prepare_contracts_and_abis_publishing + run: yarn utils:prepare-contracts-and-abis-publishing working-directory: packages/protocol env: RELEASE_TYPE: ${{ env.RELEASE_TYPE }} @@ -96,7 +96,7 @@ jobs: # a safety check especially useful if some package is upgraded - name: 'Validate ABIS Exports' shell: bash - run: yarn validate_abis_exports + run: yarn utils:validate-abis-exports working-directory: 
packages/protocol - name: 'Get git commit hash' id: get_COMMIT_HASH diff --git a/.gitignore b/.gitignore index 5cf4870976c..1ad36763b09 100644 --- a/.gitignore +++ b/.gitignore @@ -88,8 +88,6 @@ tsconfig.tsbuildinfo # git mergetool *.orig -packages/docs/_book/ - */.next/* .env.local diff --git a/.gitmodules b/.gitmodules index f3440efa785..7690307ca10 100644 --- a/.gitmodules +++ b/.gitmodules @@ -6,10 +6,7 @@ url = https://github.com/OpenZeppelin/openzeppelin-contracts [submodule "packages/protocol/lib/mento-core"] path = packages/protocol/lib/mento-core - url = https://github.com/mento-protocol/mento-core -[submodule "packages/protocol/lib/memview.sol"] - path = packages/protocol/lib/memview.sol - url = https://github.com/summa-tx/memview.sol + url = https://github.com/celo-org/mento-core [submodule "packages/protocol/lib/openzeppelin-contracts8"] path = packages/protocol/lib/openzeppelin-contracts8 url = https://github.com/OpenZeppelin/openzeppelin-contracts @@ -23,4 +20,4 @@ [submodule "packages/protocol/lib/solidity-bytes-utils-8"] path = packages/protocol/lib/solidity-bytes-utils-8 url = https://github.com/GNSPS/solidity-bytes-utils - branch = master + branch = master \ No newline at end of file diff --git a/.prettierignore b/.prettierignore index 269beb0221b..7ca4693db8c 100644 --- a/.prettierignore +++ b/.prettierignore @@ -8,20 +8,12 @@ package.json packages/**/dist packages/**/lib -# Needed because we have packages/celotool/src/lib -!packages/celotool/src/** - -packages/docs/_book - packages/protocol/build/ packages/protocol/types/ !packages/protocol/lib/**/*.ts packages/protocol/scripts/**/*.js packages/protocol/migrations/**/*.js packages/protocol/test/**/*.js +packages/protocol/test-ts/**/*.js packages/protocol/contractPackages.js packages/protocol/abis/src-generated/ - -# prettier eats Latex underscore escapting and doesn't seem to have an option to disable -packages/docs/celo-codebase/protocol/proof-of-stake/epoch-rewards.md - diff --git 
a/.vscode/settings.json b/.vscode/settings.json index c461df645d5..b2e95fd86b4 100644 --- a/.vscode/settings.json +++ b/.vscode/settings.json @@ -48,5 +48,7 @@ "editor.codeActionsOnSave": { "source.organizeImports": "explicit" } - } + }, + "git.detectSubmodules": false, + "git.ignoreSubmodules": true } diff --git a/CLAUDE.md b/CLAUDE.md new file mode 100644 index 00000000000..c28b43fb72e --- /dev/null +++ b/CLAUDE.md @@ -0,0 +1,69 @@ +# Celo Monorepo - AI Instructions + +This is the Celo protocol monorepo containing core smart contracts and tooling. + +## Project Structure + +- `packages/protocol/` - Core Solidity contracts (0.5.x and 0.8.x) +- `packages/protocol/contracts/` - Solidity 0.5.x contracts +- `packages/protocol/contracts-0.8/` - Solidity 0.8.x contracts +- `packages/protocol/scripts/` - Deployment and release scripts +- `packages/protocol/releaseData/` - Release artifacts (version reports, init data) +- `.github/workflows/` - CI/CD pipelines + +## Skills / Detailed Instructions + +For complex tasks, refer to these detailed skill files: + +### Contract Releases +**File:** `.cursor/skills/celo-release/SKILL.md` + +Use when: releasing contracts, testing releases on forks, generating governance proposals, or when mentioning release, deploy, upgrade contracts, CR14, CR15, make-release, or verify-deployed. + +Quick reference: +1. `yarn release:verify-deployed:foundry` - Generate libraries.json +2. `yarn release:check-versions:foundry` - Generate version report +3. `yarn release:make:foundry` - Deploy contracts and create proposal + +### Node Cache Updates +**File:** `.cursor/skills/node-cache-update/SKILL.md` + +Use when: modifying package.json, yarn.lock, adding/removing/updating npm packages, or changing node dependencies. 
+ +Quick reference: Increment `NODE_MODULE_CACHE_VERSION` in `.github/workflows/celo-monorepo.yml` + +## Key Files + +| File | Purpose | +|------|---------| +| `packages/protocol/libraries.json` | Library addresses (network-specific) | +| `packages/protocol/.env.json` | Config including Celoscan API key | +| `.env.mnemonic.*` | Deployer keys per network | +| `packages/protocol/releaseData/versionReports/` | Contract change reports | +| `packages/protocol/releaseData/initializationData/` | Constructor args for new contracts | + +## Networks + +| Network | Chain ID | RPC URL | +|---------|----------|---------| +| Celo Mainnet | 42220 | https://forno.celo.org | +| Celo Sepolia | 11142220 | https://forno.celo-sepolia.celo-testnet.org | + +## Common Commands + +```bash +# From packages/protocol/ +yarn release:verify-deployed:foundry -b -n +yarn release:check-versions:foundry -a -b -r +yarn release:make:foundry -b -k -n ... + +# Decrypt deployer keys (cLabs employees) +yarn keys:decrypt +``` + +## Code Conventions + +- Solidity 0.5.x for existing contracts in `contracts/` +- Solidity 0.8.x for new contracts in `contracts-0.8/` +- All upgradeable contracts use proxy pattern +- Version numbers follow `getVersionNumber()` convention diff --git a/README-dev.md b/README-dev.md deleted file mode 100644 index 2ad0e1a9602..00000000000 --- a/README-dev.md +++ /dev/null @@ -1,17 +0,0 @@ -# README GUIDE FOR CELO DEVELOPERS - -## How to run a local testnet - -Often when developing, it is useful to create a test network localy using the full celo-blockchain binary to go beyond what can be done with other options such as [Ganache](https://www.trufflesuite.com/ganache) - -The quickest way to get started with a local testnet is by running `yarn celotool local-testnet` from the `monorepo` root. - -This command will create a local testnet with a single validator node and deploy all smart contract migrations to it. 
-Once the network is initialized a NodeJS REPL is provided to help interact with the running nodes. -For more options, consult `yarn celotool local-testnet --help`, which provides an overview of the tool and its options. - - -### Verify installation in Docker - -Test installation in isolation using Docker. -This confirms that it is locally installable and does not have implicit dependency on rest of the `celo-monorepo` or have an implicit dependency which is an explicit dependency of another `celo-monorepo` package. diff --git a/README.md b/README.md index a2845f8e200..e18e7ad4a46 100644 --- a/README.md +++ b/README.md @@ -1,142 +1,46 @@ -

- - celo logo - -

+
+ + + + Celo logo + +
-**Celo Monorepo - Official repository for core projects comprising the Celo platform** +# Celo Monorepo -This repository contains the source code for the Celo core projects including the [smart contracts](https://github.com/celo-org/celo-monorepo/tree/master/packages/protocol) -and other packages. The source code for the Celo Blockchain which operates a node on the Celo Network is kept in a separate repo [here](https://github.com/celo-org/celo-blockchain). +This repository contains the Core Contracts for the Celo Blockchain. Most Celo projects have moved to dedicated repositories. - +- **Celo Blockchain Execution Client**: [celo-org/op-geth](https://github.com/celo-org/op-geth) +- **Developer Tooling**: [celo-org/developer-tooling](https://github.com/celo-org/developer-tooling) +- **Helm Charts**: [celo-org/charts](https://github.com/celo-org/charts) +- **SocialConnect**: [celo-org/social-connect](https://github.com/celo-org/social-connect) +- **Valora Wallet**: [valora-inc](https://github.com/valora-inc) +- **Mento Protocol**: [mento-protocol](https://github.com/mento-protocol) -[![GitHub Actions](https://github.com/celo-org/celo-monorepo/actions/workflows/container-all-monorepo.yml/badge.svg)](https://github.com/celo-org/celo-monorepo/actions/workflows/container-all-monorepo.yml) -[![GitHub contributors](https://img.shields.io/github/contributors/celo-org/celo-monorepo)](https://github.com/celo-org/celo-monorepo/graphs/contributors) -[![GitHub commit activity](https://img.shields.io/github/commit-activity/w/celo-org/celo-monorepo)](https://github.com/celo-org/celo-monorepo/graphs/contributors) -[![GitHub Stars](https://img.shields.io/github/stars/celo-org/celo-monorepo.svg)](https://github.com/celo-org/celo-monorepo/stargazers) -![GitHub repo size](https://img.shields.io/github/repo-size/celo-org/celo-monorepo) -[![GitHub](https://img.shields.io/github/license/celo-org/celo-monorepo?color=blue)](https://github.com/celo-org/celo-monorepo/blob/master/LICENSE) +For 
a full list of Celo repositories, visit the [Celo GitHub organization](https://github.com/celo-org). - +## 🌱 Mission -[![Website celo.org](https://img.shields.io/website-up-down-green-red/https/celo.org.svg)](https://celo.org) -[![Blog](https://img.shields.io/badge/blog-up-green)](https://medium.com/celoorg) -[![docs](https://img.shields.io/badge/docs-up-green)](https://docs.celo.org/) -[![Youtube](https://img.shields.io/badge/YouTube%20channel-up-green)](https://www.youtube.com/channel/UCCZgos_YAJSXm5QX5D5Wkcw/videos?view=0&sort=p&flow=grid) -[![forum](https://img.shields.io/badge/forum-up-green)](https://forum.celo.org) -[![Discord](https://img.shields.io/discord/600834479145353243.svg)](https://discord.gg/RfHQKtY) -[![Twitter CeloDevs](https://img.shields.io/twitter/follow/celodevs?style=social)](https://twitter.com/celodevs) -[![Twitter CeloOrg](https://img.shields.io/twitter/follow/celoorg?style=social)](https://twitter.com/CeloOrg) -[![Subreddit subscribers](https://img.shields.io/reddit/subreddit-subscribers/CeloHQ?style=social)](https://www.reddit.com/r/CeloHQ/) +**[Celo](https://celo.org/)'s mission is to build a _regenerative_ digital economy that creates conditions of _prosperity_ for all.** - +## 📚 Documentation -[![GitHub pull requests by-label](https://img.shields.io/github/issues-pr-raw/celo-org/celo-monorepo)](https://github.com/celo-org/celo-monorepo/pulls) -[![GitHub Issues](https://img.shields.io/github/issues-raw/celo-org/celo-monorepo.svg)](https://github.com/celo-org/celo-monorepo/issues) -[![GitHub issues by-label](https://img.shields.io/github/issues/celo-org/celo-monorepo/1%20hour%20tasks)](https://github.com/celo-org/celo-monorepo/issues?q=is%3Aopen+is%3Aissue+label%3A%221+hour+tasks%22) +- [Celo Docs](https://docs.celo.org/) +- [Developer Guide](https://docs.celo.org/developer) -Contents: +## 🚀 Getting Started - +```bash +git clone https://github.com/celo-org/celo-monorepo.git +cd celo-monorepo +yarn +yarn build --ignore docs +``` -- 
[Celo's Mission - Prosperity for All](#mission) -- [The Celo Stack](#stack) -- [Documentation](#docs) -- [Issues](#issues) -- [Repo Structure](#repo) -- [Contributing](#contributing) -- [Ask Questions, Find Answers, Get in Touch](#ask) -- [License](#license) - +## 💬 Community & Support -## 🥅 Celo's Mission - Prosperity for All - -Celo, pronounced /ˈtselo/, means ‘purpose’ in Esperanto. In a similar spirit, we are aiming to create a new platform to connect people globally and bring financial stability to those who need it most. We believe blockchain technology is one of the most exciting innovations in recent history and as a community we look to push the boundaries of what is possible with it today. More importantly, we are driven by purpose -- to solve real-world problems such as lack of access to sound currency, or friction for cash-transfer programs aimed to alleviate poverty. Our mission is to build a monetary system that creates the conditions for prosperity for all. - - -

- - Play on Youtube - What if money were beautiful - -
- What if money were beautiful? -

- -## 🧱 The Celo Stack - -Celo is oriented around providing the simplest possible experience for end users, who may have no familiarity with cryptocurrencies, and may be using low cost devices with limited connectivity. To achieve this, the project takes a full-stack approach, where each layer of the stack is designed with the end user in mind whilst considering other stakeholders \(e.g. operators of nodes in the network\) involved in enabling the end user experience. - -The Celo stack is structured into the following logical layers: - - -

- Celo protocol -
- The Celo Blockchain and Celo Core Contracts together comprise the Celo Protocol -

- -- **Celo Blockchain**: An open cryptographic protocol that allows applications to make transactions with and run smart contracts in a secure and decentralized fashion. The Celo Blockchain has shared ancestry with [Ethereum](https://www.ethereum.org), and maintains full EVM compatibility for smart contracts. However it uses a [Byzantine Fault Tolerant](http://pmg.csail.mit.edu/papers/osdi99.pdf) \(BFT\) consensus mechanism rather than Proof of Work, and has different block format, transaction format, client synchronization protocols, and gas payment and pricing mechanisms. The network’s native asset is Celo Gold, exposed via an ERC-20 interface. - -- **Celo Core Contracts**: A set of smart contracts running on the Celo Blockchain that comprise much of the logic of the platform features including ERC-20 stable currencies, identity attestations, Proof of Stake and governance. These smart contracts are upgradeable and managed by the decentralized governance process. - - -

- Celo network -
- Topology of a Celo Network -

- -- **Applications:** Applications for end users built on the Celo platform. The Celo Wallet app, the first of an ecosystem of applications, allows end users to manage accounts and make payments securely and simply by taking advantage of the innovations in the Celo protocol. Applications take the form of external mobile or backend software: they interact with the Celo Blockchain to issue transactions and invoke code that forms the Celo Core Contracts’ API. Third parties can also deploy custom smart contracts that their own applications can invoke, which in turn can leverage Celo Core Contracts. Applications may use centralized cloud services to provide some of their functionality: in the case of the Celo Wallet, push notifications and a transaction activity feed. - -## 📚 Documentation - -Follow the instructions in [SETUP.md](SETUP.md) to get a development environment set up. - -See [Developer's Guide](https://docs.celo.org/developer) for full details about the design of the Celo protocol and other information about running these projects. - -## 🙋 Issues - -See the [issue backlog](https://github.com/celo-org/celo-monorepo/issues) for a list of active or proposed tasks. Feel free to create new issues to report bugs and/or request features. - -## 📂 Repo Structure - -The repository has the following packages (sub projects): - -- [celotool](packages/celotool) - scripts for deploying and managing testnets -- [helm-charts](packages/helm-charts) - (DEPRECATED) templatized deployments of entire environments to Kubernetes clusters. Check [celo-org/charts](https://github.com/celo-org/charts) instead. -- [protocol](packages/protocol) - identity, stability and other smart contracts for the Celo protocol ([docs](https://docs.celo.org/protocol)) - -Code owners for each package can be found in [.github/CODEOWNERS](.github/CODEOWNERS). - -## ✍️ Contributing - -Feel free to jump on the Celo 🚂🚋🚋🚋. Improvements and contributions are highly encouraged! 
🙏👊 - -See the [contributing guide](https://docs.celo.org/community/contributing) for details on how to participate. -[![GitHub issues by-label](https://img.shields.io/github/issues/celo-org/celo-monorepo/1%20hour%20tasks)](https://github.com/celo-org/celo-monorepo/issues?q=is%3Aopen+is%3Aissue+label%3A%221+hour+tasks%22) - -All communication and contributions to the Celo project are subject to the [Celo Code of Conduct](https://celo.org/code-of-conduct). - -Not yet ready to contribute but do like the project? Support Celo with a ⭐ or share the love in a [![Twitter URL](https://img.shields.io/twitter/url?style=social&url=https%3A%2F%2Fcelo.org%2F)](https://twitter.com/intent/tweet?url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DkKggE5OvyhE&via=celohq&text=Checkout%20celo%21%20Love%20what%20they%20are%20building.&hashtags=celo) - - - -## 💬 Ask Questions, Find Answers, Get in Touch - -- [Website](https://celo.org/) -- [Docs](https://docs.celo.org/) -- [Blog](https://medium.com/celohq) -- [YouTube](https://www.youtube.com/channel/UCCZgos_YAJSXm5QX5D5Wkcw/videos?view=0&sort=p&flow=grid) -- [Forum](https://forum.celo.org) -- [Discord](https://discord.gg/vRbExjv) -- [Twitter](https://twitter.com/CeloDevs) -- [Reddit](https://www.reddit.com/r/CeloHQ/) -- [Community Events](https://celo.org/community) - -## 📜 License - -All packages are licensed under the terms of the [Apache 2.0 License](LICENSE) unless otherwise specified in the LICENSE file at package's root. 
+| **Resources** | **Social** | **Discussion** | +|:------------:|:----------:|:--------------:| +| [🌐 Website](https://celo.org/) | [🐦 X (Twitter)](https://x.com/Celo) | [💬 Discord](https://discord.com/invite/celo) | +| [📚 Docs](https://docs.celo.org/) | [📰 Blog](https://blog.celo.org/) | [🗣️ Forum](https://forum.celo.org) | +| [🎥 YouTube](https://www.youtube.com/channel/UCCZgos_YAJSXm5QX5D5Wkcw/videos?view=0&sort=p&flow=grid) | [👾 Reddit](https://www.reddit.com/r/CeloHQ/) | [💡 GitHub Discussions](https://github.com/celo-org/celo-monorepo/discussions) | diff --git a/SETUP.md b/SETUP.md deleted file mode 100644 index de5033e8cd8..00000000000 --- a/SETUP.md +++ /dev/null @@ -1,165 +0,0 @@ -# Celo Engineering Setup - -- [Celo Engineering Setup](#celo-engineering-setup) - - [Reading](#reading) - - [Getting everything installed](#getting-everything-installed) - - [Common stuff](#common-stuff) - - [Install Go](#install-go) - - [Install Node](#install-node) - - [MacOS](#macos) - - [Xcode CLI](#xcode-CLI) - - [Homebrew](#homebrew) - - [Install Yarn](#install-yarn) - - [Linux](#linux) - - [Install Yarn](#install-yarn-1) - - [Optional](#optional) - - [Install Rust](#install-rust) - - [Building celo-monorepo](#building-celo-monorepo) - - [Running the mobile wallet](#running-the-mobile-wallet) - -This is a living document! Please edit and update it as part of your onboarding process :-) - -## Reading - -Review the README from each directory in [packages](packages/). The [protocol](packages/protocol) is a good starting point. - -## Getting everything installed - -Follow these steps to get everything that you need installed to build the celo-monorepo codebase on your computer. - -### Common stuff - -#### (Optional) Install Go - -We need Go for [celo-blockchain](https://github.com/celo-org/celo-blockchain), the Go Celo implementation, and `gobind` to build Java language bindings to Go code for the Android Geth client. 
- -For go installation instructions see [celo-blockchain instructions](https://github.com/celo-org/celo-blockchain#building-the-source). - -Once you have go installed run the following to install gobind - -`go get golang.org/x/mobile/cmd/gobind` - - -#### Install Node - -Currently Node.js v18 is required in order to work with this repo. - -Install `nvm` (allows you to manage multiple versions of Node) by following the [instructions here](https://github.com/nvm-sh/nvm). - -Once `nvm` is successfully installed, restart the terminal and run the following commands to install the `npm` versions that [celo-monorepo](https://github.com/celo-org/celo-monorepo) will need: - -```bash -# restart the terminal after installing nvm -nvm install 18 -nvm alias default 18 -``` - -### MacOS - -#### Xcode CLI - -Install the Xcode command line tools: - -```bash -xcode-select --install -``` - -#### Homebrew - -Install [Homebrew](https://brew.sh/), the best way of managing packages on OSX: - -```bash -/usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)" -``` - -#### Install Yarn - -We use Yarn to build all of the [celo-monorepo] repo. Install it using [Homebrew](#homebrew): - -```bash -brew install yarn -``` - -### Linux - -#### Install Yarn - -We use Yarn to build all of the [celo-monorepo](https://github.com/celo-org/celo-monorepo) repo. Install it by running the following: - -```bash -# for documentation on yarn visit https://yarnpkg.com/en/docs/install#debian-stable -curl -sS https://dl.yarnpkg.com/debian/pubkey.gpg | sudo apt-key add - -echo "deb https://dl.yarnpkg.com/debian/ stable main" | sudo tee /etc/apt/sources.list.d/yarn.list -sudo apt-get update && sudo apt-get install yarn -``` - -### Optional - -#### Install Rust - -We use Rust for some [cryptography repositories](https://github.com/celo-org?q=&type=&language=rust) This is not -required if you only want use the blockchain, monorepo, and mobile wallet. 
- -```bash -curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -``` - -Now lets add Rust to the `PATH`: - -``` -echo "export PATH=$PATH:~/.cargo/bin/" >> ~/.bashrc -source ~/.bashrc -``` - -With Rust binaries in your PATH you should be able to run: - -```bash -rustup install 1.42.0 -rustup default 1.42.0 -``` - -If you're building Geth for Android, you need a NDK that has a cross-compilation toolchain. We need version 19. - -On Mac (darwin): - -```bash -brew cask install https://raw.githubusercontent.com/Homebrew/homebrew-cask/a39a95824122da8448dbeb0b0ca1dde78e5a793c/Casks/android-ndk.rb - -export ANDROID_NDK=/usr/local/share/android-ndk -``` - -In `celo-blockchain`, define the relevant environment variables, e.g.: - -```bash -export NDK_VERSION=android-ndk-r19c -``` - -and run `make ndk_bundle`. This will download the NDK for your platform. - -## Building celo-monorepo - -Clone the [celo-monorepo](https://github.com/celo-org/celo-monorepo) repo: - -```bash -mkdir ~/celo -cd celo -git clone https://github.com/celo-org/celo-monorepo.git -``` - -Then install the packages: - -```bash -cd celo-monorepo -# install dependencies and run post-install script -yarn -# build all packages -yarn build --ignore docs -``` - -> Note that if you do your checkouts with a different method, Yarn will fail if -> you haven’t used git with ssh at least once previously to confirm the -> github.com host key. Clone a repo or add the github host key to -> `~/.ssh/known_hosts` and then try again. - -> The docs package relies on gitbook which has problems off of a fresh install. Running -> `yarn build --ignore docs` is a known workaround. 
\ No newline at end of file diff --git a/baklava_genesis_balances.json b/baklava_genesis_balances.json deleted file mode 100644 index 29039bd4246..00000000000 --- a/baklava_genesis_balances.json +++ /dev/null @@ -1,255 +0,0 @@ -{ - "value": "12000000000000000000000", - "addresses": [ - "028fad6c3142681c21517cd4414a6be4438f2556", - "0328b30dd55baa5a5fc1be23c4575a7fb779dc25", - "035ef01395e2cc1219482f0542a05b627fac36b6", - "0474fe22b801e5684ac54f65c2c5c34352c1e479", - "0537d935b50124b3bb9277d0356ce45400579c57", - "060b52f506579625ad73b99f1be8d6b06259953d", - "06d335ecd7ebe9a5bce97d7835de1fdf2858fa71", - "07894441a09086351ebb5b428a5717b7d2c7c592", - "07b5206afea553d49dbcffbbba23fca5896cc603", - "080d324a83ce58f99cb11535ae5e51f0b8374f8d", - "097623488573f87a02bbf292f9e78c0c751752e1", - "0ab74005a6017533d2f38f74bd029d18426bda0b", - "0b3654a18513c76eee3a576241f328a3514f26e8", - "0b3b9a58056d786e42a024225a507f26a6cd698c", - "0b993f51c1d60a68b9e55b03a0fbf4ab777cf6e7", - "0bcdf73c45d99f33d1d8354980d7ca73c4cf72db", - "0c0ae152fcbeabe3585198755ffe4f7d22bb5039", - "0c561cfe5616e01467ba9792f0bc320314c7b1c6", - "0e447c8d3369e0e95a081ddb9e4c181d56f3685e", - "0e5f835af9f0c32bd48b6540207844dd08bca3e9", - "0ebd974acb2dc4906508356a3f23c0701c4d9206", - "0f28312466e1a2190bdf458f43820c5a6b2a8292", - "10e7e7616304773bb6a6584ad7b7e9ae17cc8352", - "13b3e17f8cf507d8fee3532c3826d1b9d84d8bb2", - "1569c74093b9393ba5d5a684013458b1f3000185", - "1674e1f2c0b6d6f1b569f95795f623b76919bb91", - "192e049d4a4a0789dc964e1d46c2e3cd3b825480", - "19f79eaa92d20e0a2e7bf24e32fffc5a1239bc8c", - "1a7a7af9e78a690979963f1641f9694853226d0a", - "1b4ea26c03baea69e85e354af4c695be43a510d3", - "1dbe73b0480058106798adda62e25d5279233c1b", - "207202b4876e9452a3e10a0480b76d9f3ac6c4da", - "208767b8d773aec8fc56ab0345809b7fe56344de", - "21b3c28b7bd2b41fd78b181cf1a97b27e6ff2014", - "22031b9ba9c5fe20bedcf559e7d1966e2ff03173", - "255f0de37b99d52ad4189d06ab4f63a748b2d235", - "257105a729cedb5701dad1e0073ae751d0b788a1", - 
"27840F3C97F96311E9510EfCF9Ea28aC94e01f78", - "28c9c08659984625584e8df09e119b4d157f86aa", - "2947ea31ab65b1fee886dc6c113f79b222c804b7", - "29dd8b1a27f9381aea07f24c97f17e9df5442101", - "2c8a412026454d1e56b9687dd88d4a6080c10e87", - "2dc4051191c05c00821168256084455a5eb1ba1e", - "2ee2a7fd94a31cf1e6db3875f4fd3d961b15de17", - "2fbaaf12676191eab6bf9d92e84eabf11f303d7f", - "307501123022f0e78a523fffc40788ee2381ac22", - "30fd0740b4a6dfa31924982d377ee634b86d7a95", - "320ba815c2d43e44e6e7e8926d7c384cb4c12ea2", - "3216695f7a483f1275c419bb0d914412509821a0", - "333d3ad4628e36db96bec8b3292f41bace58fc33", - "3521a14aae3a4fac3c6d0d6ceb8642cdd21323a2", - "35537e972e63a72e6dcbd93d092110adee296ce5", - "3616626057d9012469ed3926e626ae8333904482", - "362f0dd270e96a8ab333c4283541fb15dc817126", - "3a60dfe47fb866306934d40e0934158e98ec8cea", - "3b63ae3002c0fab6e1481d4a33ec89650872b3d5", - "3cb7762c05231f62c5703849a685f0817073e2f2", - "3de7dddd7ea601097c78e874e914550c9ceb6950", - "3fba989dbd94d5742c5d27e241ef5472a940f8ff", - "412f2910970f81e49af9b46067d3fd4d7a281679", - "41716bcf7c9a97179a7a276b153a7563e5f777eb", - "43c8cc9884d8e2ea870c33bdc25f495c25f269df", - "43dda21951e06e5c70bd9196a49b2d2eb47018d8", - "4418a99ac82d042971668a78604d1baaa92b895f", - "44e11b8eb591a5b8b0cf4db6b02ad60e6ed89a62", - "44edd5027f894058cfb4b519beb04f247336bc22", - "4500b26a6f52836510bfd013241df203f03456f8", - "453f92064cd11cfc185b058757fd5ab23f98b978", - "48cedc58b10af13d688631bc3cb78a05b8a6e56a", - "49c3c6a9b0c17ac32169ffc45998aa80029128b3", - "4a36b1eeb0178c9948cffb183a2333e98ebb888d", - "4a47f329f9429ee621e6b359e46ff3b5ac79e5e7", - "4bacb36b23afb8d66dd1a85d5a0d3a4455db22ea", - "4d2f9739e6d82663bbd1fc3fd3cee041d631af7d", - "4d838eaefe4234307e66f8d3f3019bc305b0d1b5", - "4da496af48238fefea8b4ffe6183310750cca6ea", - "4dbda5f6d756b6288aa7a8be0d45a1644465c166", - "4df2ad5fbad3978b0330232c30b1a47f02303466", - "4e0585b243c3b434c846b050354cf216e76d1ee4", - "4f584db9ec38df930185df315d941d42b3186b9f", - 
"4f6e39afb99896f5f463eb516f9edde3361f1ac5", - "4f9f7fa5db4a44a469f95115e0525f548605c7fd", - "506423ab317013b94d1ddb2adc15cda413096987", - "522f59f216e9468cdb859a96f025348a237a4935", - "5279bd79792c588e2553c56580c3ab058809e67a", - "556b94105721e92331ef09e5689a72481a590dc1", - "5582268e275fc3407f92fa80dd3f0d05904a723a", - "558e42639b924b15a68982b52ae30e9e5505eaf7", - "575240f049aea7bc783f60bebd7fcb5584a54b4a", - "5862cf5a76efb0315d34de733e0300279add5d52", - "5a4b9e7bf6ff475c52e372025c5b161014f5f70d", - "5eee762a960560cb6a625f280e53f92d93ec9b78", - "608fe2dbe23917f523e157fe7d8044243417a1e8", - "60E8dd93413a473a7BF354E292057688F179E8c0", - "61a092b2d98d9f821e9089aae1d490bf77db1ce5", - "61d7e2d39b924744febc65d141f30e9f644bd286", - "62646f78bf550b801b619983747599e23deabe3a", - "62ced47f08c104920ffa023c8d8f75ff18fdfc1d", - "62f959f22dbe968d1a41d426ae9dad5596b93236", - "65c5bf635e57d6791a5a81fd64c186fc6b43f459", - "66d4902810a03f45015998a8e92be8e2fa49c05a", - "673ec125d05f8aa71f735407a87a14836d775512", - "675e639e4e19a22204bc1f5ea006e948413f796b", - "6835cd60eccbf73437e3a62cd460a52e1e9755fa", - "686ee1620230c237eef7963f68c9e5d29565a0e8", - "694fb0e26fc592984c8a310eb4fe9bcc3037ca48", - "69f52b7f0492889e18285f516b71cbaa1b67981f", - "6a2998527cd699982364c9399d17b63c6ca22a41", - "6a2beea75f32f2e7b264e9f25ac970c5210e9230", - "6af7f878dc47d3cd83507b760aba82af45c9f902", - "6bb1e8c0ecc1bdb3d6ce3b3359f2ffaa3cfb6f28", - "6c5035a2a6f4818d7451747f3ef5edfb8265e776", - "6d289a49d8e3fb5dc8c22be2648c57fcbb7e7ed2", - "6e580d964c7fb4338b59093b574950e3999bf9a6", - "6e80dea424a522b0257adc7fb3b8e7c6e29b02e5", - "6f46d495a492a212e330d20189a0656cf072836a", - "73b49da41223585a650712363ba7878ed60a964d", - "75bda3ac305484c50df6c66014b11eff13216f26", - "775a5c899fd4a7cdd54e9f41738087748cfcb953", - "77611f4e37c0fea6dbf9237c0549b5e4e231bf3f", - "7a914a1b9a025eca6a2c06936c94101184741ea2", - "7b8ce1efc66fb0730b95436d66d89a5b0888d656", - "7bf109e20bae23c8a68c3a975c9e3d7bea941658", - 
"7f53f21f0fd99983fc1d0d527ac5ff6fed52b9a0", - "818c54fe6ab2cec008ba64a7ec249f60b0c37a48", - "81a9cc4103e2c66a652b2c17baaf134fdaeafca9", - "84599e63cdf1f592a00be4c624587e1c59ff8ca0", - "8482688f494a359d9efe5c37c5e16071f97890bd", - "8792e294df1d75eaa215a2e6e18e9b28cfc48d07", - "87d7ca6e380fe9c785cab4d6ffc2bf7fec625855", - "89a2bb2f22e84aedf0c4152b9ef80cc2f015df51", - "8b4a3adeb377d478708aee2f0dacdaaf889845bc", - "8b5b77c229e405c33c7f99acbfb70b1f52c17462", - "8cc0160e1b17fb885cb918021621820642d87797", - "8fb800b94d2b1258236e598fcd5e65eaf45ca671", - "8fe0973e8e40cce79a142b7b1a9e5de987095adf", - "909adbc46da983140af4d4228e07b217f2ee5864", - "91d44695a988bf5ac192c8f1f0493d48724d15ad", - "91f5f037efdb97d0c7a840276837d4226b91a258", - "92fb67ace261e016678cd221972f21f5c03d781a", - "93c110dc9812dc075c1541b2c5d1030b2ad6c293", - "940f7e5e603377f58ffa019ea2ad7200c88c5086", - "94befc20ca6bb6b21a6e6d3757b6f2335675f65c", - "95785ce0ca05f23484342ca3c2337d7660b68d36", - "95b664befd3de2de8e8bdf1b1313aae02cc24d5b", - "96b473b7a71f1a3804ad306af4f07f86c796eff5", - "9880279fa2a2170b9f3d89181249a34a291e6547", - "99100ada9e90a4b1e8e7754fe6b4f3fb1d049113", - "9913e776ed11e9247ef89b4e8b2b7c062faa5281", - "9a9dbbb96fa2c49247360e5eacd72e80b6129458", - "9abd6739153d95512d0d68ec0960d296e2f174bc", - "9b6828d728ec0aedd7e8fe1da93979d66f70968b", - "9dba468122b12cd04bf4828c57f67a87fe03bbf0", - "9e60a9b6380562874c9bfc1d38bece0bcd1d9b30", - "9f51e74987043df2c2f461cabd7d1c93cd79f498", - "9f79acea4994cc09f31f2fc87105aa619fa22020", - "9f85baefb7942f8b6d03c28af32d09756830c1e8", - "a250b8d79653de86cf7431ff0719dc77fe08ccfb", - "a3e05f1152e832744047e231c7aea4e8cbb5db26", - "a5317a01d642fa1979056e886504513b28e31f05", - "a6db023dfb73785f3321cd409622e6b7df69e9b2", - "a866efed4e0af66a1fbf2c0cbc3e39197cbc5b01", - "a87c05c4598fcf6d7bb9f9e41ebaa05c2cb46abd", - "aace0cefe666cdf4d63f7b9d57a6e6cf5cf84d9f", - "ad67bae6526873c5a6df80639ac16d89e627641c", - "ae3aaffa0f2b5d8490bfa212c39a032852fb20de", - 
"aeac20cdb7f9cdda85ad09ac0920047e41bf0bcc", - "af8911392621c7881f4ef1a83021c651e0656f2a", - "b060351f01492a56287f0d9e07da18116c93728f", - "b0c9fa734e0fac63d9051554e869c33ebcfcec6d", - "b1beb7832f72b27d045a9078d8adb0a3e42b3f99", - "b1cb7ea2d15ef69d01aaf8185edbef0076d83454", - "b57bc132a8bd43cdb57f8d46f7f4c7ae18620e46", - "b584f6a8e1196b27592e3ec69446bed1b76d52d9", - "b7b3c854b00d65208e410a5d16eb54f58dbebbc7", - "b84fa68db82af1bfcbb4b876e017657afb1dd615", - "ba4d18be33b5d5f5a80fba1206c3e82944665d76", - "bcd25468e4d433ac442b87f32b37b39678bf7077", - "bf453cc91a56da43fe6ecb834679cb4522a8889d", - "bfea1fc9870ee47c517faf53da39edd9cb2142fa", - "c189de40c8af704581e80c5e74a79751ea6ca1e0", - "c1b1230bbf9590232e6d2d28c4c1c418a32339ef", - "c1c8432a07183e96043d00ce2a60d637cb509590", - "c2d5c2ac49fb7511584e567a48ef96917110ad9d", - "c37021cc0ec286891a203d19901b0572b7bead1d", - "c3a0ac447e6e2ba795cbdd8a3648cb2779203ea4", - "c50fbd7054e3eef480fb8c61a6a5798b3a8bcbbb", - "c53f5d047ea8a7725a4fd76002739d76548196f4", - "c5636b98fd6ddff3a7bb239e9729d75338e9a61f", - "c566b9f9b5551588b93a8bc00d34d2123d318ce3", - "c6034975ee23bb19685bffd3c4e818e964e0d7c7", - "c8c8e28dbe45a95b70d3f19b9a225b3531642a79", - "c9c349bb76a657f9210da350d664929c5946c90e", - "c9d6fe14d8e93269bd3487d1d1a7693ef60001e3", - "c9d9a8dbbc238ce17134dc9c12447dcc137630a7", - "ca21a04402913ad8a0c9fd05c48e7c5865b75ade", - "cad07ebf3c922bfe1292d4fbfce015a934e520b9", - "cbebe96dd4811243e23b1f7b43d7c3266c59af2f", - "cccd3999d5b421f906c4a35c0c95bcd533e1cfbb", - "cf08368e0aa6d6017a2fede771ea72abb555f266", - "cf5d9c9514e7217a19cd975cecc0257c9b69db1b", - "cf946b26fff0c2c1d3da93b7647034b90b5fcf99", - "d0604078265249629759f6f342cc5d19461b2c8b", - "d1d4c8b640ce83faf89c60bf47e7319f0ffe57f0", - "d4e068fb4467816c768fa3460ce9d5ae5642de37", - "d5c181972763b082d4d6ca84662878674c0bd35c", - "d660eb4c3d84beafe836e351d6c65f9cf62bd51f", - "d6f05649c5350c20030de1dc57ed31510141f487", - "d99bedef76c9caa7ec0cfdaa79c1b9330bb02061", - 
"db904be629d8d064238f65cc04e95166a36d9357", - "dbc52f9e934145e2e3172aead59303328bbad2d1", - "dbdb0c678342d4752c82df89ad8408902a55591a", - "dd30983c073d20116477ff672543e3eec3f491af", - "df155c157f2f53a653a6a62d0458647261514093", - "df917121efdc92e1ea4414731ad3fd5d07bde322", - "e01886c1c29f5e1ca8f6de0c02108967712ae216", - "e193d7ccb6d5644bff9c9bf41ff1e62765b382e8", - "e2b8269e836c0f2ede96802405410203fb0de3a6", - "e32c4fc7e75091197e7ce24f0b1257ea858c6a0d", - "e33722565fadde6225a26586f48d8625c50147c4", - "e49515934d5eb7e8a2df1f6b00a819de9938f2c6", - "e4c1da47b33abd25578becb8f90d2480d9575f89", - "e4dc20758df2b3dab5dbbfd3239ccdc0b9682206", - "e556d76384334a7138ca85e5eb4281c909628ae1", - "e59b4d7b0cf1a2b73a439894dc3d6fa08a53ce84", - "e5DE9f2d1535D5b2CF8D0BA91d1F92C475f88a3b", - "e609163306c42d60080c2c87e80881c4dc52e115", - "e6234077285ce68da7d6c62ec68894bd83a15398", - "e67a310436b8a3a23a994e4a0b29e8d82721be7c", - "e6f62ed66bf346e2dad9619c18c66e2a35c77da0", - "e7c85a3f18d18d40d713041a69d4795f36339f7d", - "e81ece7f8f962bd1076f1134d1fd523d64e43702", - "e820238dc7b75b7602dafc7ffccd35c9ef5e2fdf", - "e89897510bfb2fc0647325fb6d580594c061acbf", - "ea6a71634f7228798ec8ac6395525a632ec4945e", - "ea6f11e720ca72af1d8a94fff813617ee21be08e", - "ec4e104bd412aaea47c4ae753a8b3a194e9939ac", - "edf3dacb0fda9a955956cfc7f833ea4dbe26c4f2", - "ee9d99071e541e19eae7f1063539ef5b39199bd6", - "ef621af83c70695daac497d7959d2848c8a2acea", - "f327386ef9d6be057d1c720fef633938743660f3", - "f3ee14dfbbbb7f0ef1ef787be39bf4814741e300", - "f4691164b53141bd4082592c0e979689a0486500", - "f5b15312e6407988372d9280d064daa0248181be", - "f7084f320f3779dd2a92a2f58d23abb35f8a9c15", - "f75daae125083fca31583fef2d0a84e0344a1534", - "f823e8a4ba6adddb02e97b5b8886d18e41b2723e", - "fb92c612be9c9758d7111daa07954eb0455d224b", - "ff0559d9062650213623968d2bd4e25f8705910c", - "ffac3dcf14dc8e678204e2388031d4cc3e366261" - ] -} diff --git a/cloudbuild.yaml b/cloudbuild.yaml deleted file mode 100644 index baf4cb878b5..00000000000 --- 
a/cloudbuild.yaml
+++ /dev/null
@@ -1,29 +0,0 @@
-# We use kaniko for building docker images
-# More details: https://github.com/GoogleContainerTools/kaniko
-
-steps:
-
-- id: "docker:celotool"
-  name: gcr.io/kaniko-project/executor:v1.23.1
-  args: [
-    "--dockerfile=dockerfiles/celotool/Dockerfile",
-    "--cache=true",
-    "--destination=gcr.io/$PROJECT_ID/celo-monorepo:celotool-$COMMIT_SHA"
-  ]
-  waitFor: ['-']
-
-- id: "docker:cli"
-  name: gcr.io/kaniko-project/executor:v1.23.1
-  args: [
-    "--dockerfile=dockerfiles/cli/Dockerfile",
-    "--cache=true",
-    "--destination=gcr.io/$PROJECT_ID/celocli:$COMMIT_SHA",
-    "--build-arg",
-    "celo_env=alfajores"
-  ]
-  waitFor: ['-']
-
-options:
-  machineType: 'N1_HIGHCPU_8'
-
-timeout: 3000s
\ No newline at end of file
diff --git a/developer_key_publishing.md b/developer_key_publishing.md
index f1a8e6f45e5..e337fc90b22 100644
--- a/developer_key_publishing.md
+++ b/developer_key_publishing.md
@@ -6,7 +6,7 @@ In support of these uses, [OpenPGP](https://www.openpgp.org/) public keys can be
 
 If you want to read more about OpenPGP keys, including their structure and metadata, check out [Anatomy of a GPG Key by Dave Steele](https://davesteele.github.io/gpg/2014/09/20/anatomy-of-a-gpg-key/).
 
-> Note: This guide assumes you have an @clabs.co email address, but if you do not, simply change the email domain to your primary developer email (e.g. @example.com for alice@example.com) Some additional setup will be required in the DNS records for your domain to configure [OpenPGP WKD]((https://gnupg.org/blog/20161027-hosting-a-web-key-directory.html)).
+> Note: This guide assumes you have an @clabs.co email address, but if you do not, simply change the email domain to your primary developer email (e.g. @example.com for alice@example.com) Some additional setup will be required in the DNS records for your domain to configure [OpenPGP WKD](https://gnupg.org/blog/20161027-hosting-a-web-key-directory.html).
 
 ## Setup
 
@@ -60,7 +60,7 @@ We recommend generating your developer key pair with a YubiKey. Generating your
 4. You may add a comment, but it is not required.
 4. `quit`
 
-See the [official YubiKey documentation](https://support.yubico.com/support/solutions/articles/15000006420-using-your-yubikey-with-openpgp) for more information.
+See the [official YubiKey documentation](https://support.yubico.com/hc/en-us/articles/360013790259-Using-Your-YubiKey-with-OpenPGP) for more information.
 
 ### On your machine
 
@@ -87,7 +87,7 @@ If you've generated a key on your local machine, it can be imported onto your Yu
 2. `keytocard` and select `3` to set the authentication key on the YubiKey.
 3. `quit` and save your changes.
 
-See the [official YubiKey documentation](https://support.yubico.com/support/solutions/articles/15000006420-using-your-yubikey-with-openpgp)
+See the [official YubiKey documentation](https://support.yubico.com/hc/en-us/articles/360013790259-Using-Your-YubiKey-with-OpenPGP)
 
 #### Verify your signing key
 
@@ -133,13 +133,13 @@ This command will query [clabs.co](https://clabs.co) over HTTPS and retrieve the
 
 ### Manually fetching a key
 
-If you want to check a key is correctly hosted, or want to fetch a key manualy for some reason, you can do so with:
+If you want to check a key is correctly hosted, or want to fetch a key manually for some reason, you can do so with:
 
 ```bash
 curl https://openpgpkey.clabs.co/.well-known/openpgpkey/clabs.co/hu/$USER_HASH
 ```
 
-`$USER_HASH` can be obtained by observing the output of the WKD utility. It is the filename of the resulting key file that is added to the website repsitory for publishing. It can also be calculated directly following the specification in the WKD specification, [draft-koch-openpgp-webkey-service](https://datatracker.ietf.org/doc/draft-koch-openpgp-webkey-service/?include_text=1), section 3.1. (Warning: It uses an obscure varient of base32 :| )
+`$USER_HASH` can be obtained by observing the output of the WKD utility. It is the filename of the resulting key file that is added to the website repository for publishing. It can also be calculated directly following the specification in the WKD specification, [draft-koch-openpgp-webkey-service](https://datatracker.ietf.org/doc/draft-koch-openpgp-webkey-service/?include_text=1), section 3.1. (Warning: It uses an obscure variant of base32 :| )
 
 ## Document Signing
diff --git a/dockerfiles/celotool/Dockerfile b/dockerfiles/celotool/Dockerfile
deleted file mode 100644
index 9d731f74d8d..00000000000
--- a/dockerfiles/celotool/Dockerfile
+++ /dev/null
@@ -1,49 +0,0 @@
-FROM node:18
-LABEL org.opencontainers.image.authors="devops@clabs.co"
-
-WORKDIR /celo-monorepo
-
-# Needed for gsutil
-RUN apt-get update && \
-    apt-get upgrade -y && \
-    apt-get install -y lsb-release && \
-    apt-get install -y curl build-essential git python3 && \
-    export CLOUD_SDK_REPO="cloud-sdk-$(lsb_release -c -s)" && \
-    echo "deb http://packages.cloud.google.com/apt $CLOUD_SDK_REPO main" | tee -a /etc/apt/sources.list.d/google-cloud-sdk.list && \
-    curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | apt-key add - && \
-    apt-get update -y && \
-    apt-get install -y google-cloud-sdk kubectl netcat-openbsd && \
-    curl https://raw.githubusercontent.com/helm/helm/master/scripts/get-helm-3 | bash && \
-    rm -rf /var/lib/apt/lists/*
-
-RUN alias python=python3
-
-# ensure yarn.lock is evaluated by kaniko cache diff
-COPY lerna.json package.json yarn.lock ./
-COPY packages/celotool/package.json packages/celotool/
-COPY packages/env-tests/package.json packages/env-tests/package.json
-COPY packages/protocol/package.json packages/protocol/
-COPY scripts/ scripts/
-# Makes build fail if it doesn't copy git, will be removed after build
-COPY .git .git
-COPY .gitmodules .gitmodules
-
-RUN yarn install --network-timeout 100000 --frozen-lockfile && yarn cache clean
-
-COPY packages/celotool packages/celotool/
-COPY packages/env-tests packages/env-tests
-COPY packages/helm-charts packages/helm-charts
-COPY packages/protocol packages/protocol/
-
-RUN yarn build
-
-RUN rm -r .git
-RUN rm .gitmodules
-
-ENV PATH="/celo-monorepo/packages/celotool/bin:${PATH}"
-
-COPY --from=golang:1.18-stretch /usr/local/go/ /usr/local/go/
-
-ENV PATH="/usr/local/go/bin:${PATH}"
-
-CMD ["celotooljs.sh"]
diff --git a/dockerfiles/metadata-crawler/Dockerfile b/dockerfiles/metadata-crawler/Dockerfile
deleted file mode 100644
index 4df83640847..00000000000
--- a/dockerfiles/metadata-crawler/Dockerfile
+++ /dev/null
@@ -1,25 +0,0 @@
-FROM node:18
-LABEL org.opencontainers.image.authors="devops@clabs.co"
-
-WORKDIR /celo-monorepo
-
-# Monorepo dependencies
-RUN apt-get update && \
-    apt-get upgrade -y && \
-    apt-get install -y lsb-release && \
-    apt-get install -y curl build-essential git python3 && \
-    alias python=python3 && \
-    rm -rf /var/lib/apt/lists/*
-
-RUN npm install -g typescript npm
-
-COPY lerna.json package.json yarn.lock ./
-COPY scripts scripts/
-COPY packages/metadata-crawler packages/metadata-crawler/
-
-RUN yarn install --network-timeout 100000 --frozen-lockfile && \
-    yarn cache clean && \
-    yarn build
-
-WORKDIR /celo-monorepo/packages/metadata-crawler
-CMD ["node", "lib/crawler.js"]
diff --git a/package.json b/package.json
index 8c20f992118..f800d85e230 100644
--- a/package.json
+++ b/package.json
@@ -17,15 +17,11 @@
     "test": "yarn run lerna run test",
     "build": "yarn run lerna run build",
     "clean": "yarn run lerna run clean",
-    "docs": "yarn run lerna run docs",
     "check-licenses": "yarn licenses list --prod | grep '\\(─ GPL\\|─ (GPL-[1-9]\\.[0-9]\\+ OR GPL-[1-9]\\.[0-9]\\+)\\)' && echo 'Found GPL license(s). Use 'yarn licenses list --prod' to look up the offending package' || echo 'No GPL licenses found'",
-    "report-coverage": "yarn run lerna run test-coverage",
-    "test:watch": "node node_modules/jest/bin/jest.js --watch",
     "postinstall": "yarn run lerna run postinstall",
     "keys:decrypt": "bash scripts/key_placer.sh decrypt",
     "keys:encrypt": "bash scripts/key_placer.sh encrypt",
     "check:packages": "node ./scripts/check-packages.js",
-    "celotool": "yarn --cwd packages/celotool run --silent cli",
     "prepare": "husky install"
   },
   "workspaces": {
@@ -63,9 +59,7 @@
     "typescript": "^5.3.3"
   },
   "resolutions": {
-    "ganache": "npm:@celo/ganache@7.8.0-unofficial.0",
-    "bip39": "https://github.com/bitcoinjs/bip39#d8ea080a18b40f301d4e2219a2991cd2417e83c2",
-    "blind-threshold-bls": "npm:@celo/blind-threshold-bls@1.0.0-beta",
+    "bip39": "https://github.com/bitcoinjs/bip39#a7ecbfe2e60d0214ce17163d610cad9f7b23140c",
     "@types/bn.js": "4.11.6",
     "bignumber.js": "9.0.0"
   }
diff --git a/packages/celotool/.eslintrc.js b/packages/celotool/.eslintrc.js
deleted file mode 100644
index 63096d24e87..00000000000
--- a/packages/celotool/.eslintrc.js
+++ /dev/null
@@ -1,8 +0,0 @@
-module.exports = {
-  rules: {
-    'no-underscore-dangle': 'off',
-    '@typescript-eslint/no-unsafe-argument': 'off',
-    '@typescript-eslint/no-unsafe-return': 'off',
-    'no-bitwise': 'off',
-  },
-}
diff --git a/packages/celotool/.gitignore b/packages/celotool/.gitignore
deleted file mode 100644
index 3aab66ec184..00000000000
--- a/packages/celotool/.gitignore
+++ /dev/null
@@ -1,4 +0,0 @@
-/lib/
-.tmp
-
-twilio-config.js
\ No newline at end of file
diff --git a/packages/celotool/CHANGELOG.md b/packages/celotool/CHANGELOG.md
deleted file mode 100644
index c58c315ced5..00000000000
--- a/packages/celotool/CHANGELOG.md
+++ /dev/null
@@ -1,116 +0,0 @@
-# @celo/celotool
-
-## 2.0.1
-
-### Patch Changes
-
-- Updated dependencies [9ab9d00eb]
-- Updated dependencies [1c9c844cf]
-- Updated dependencies [9ab9d00eb]
-  - @celo/contractkit@6.0.0
-  - @celo/env-tests@1.0.3
-  - @celo/explorer@5.0.7
-  - @celo/governance@5.0.7
-
-## 2.0.1-beta.0
-
-### Patch Changes
-
-- Updated dependencies [1c9c844cf]
-- Updated dependencies [86bbfddf1]
-  - @celo/contractkit@6.0.0-beta.0
-  - @celo/governance@5.0.7-beta.0
-  - @celo/env-tests@1.0.3-beta.0
-  - @celo/explorer@5.0.7-beta.0
-
-## 2.0.0
-
-### Major Changes
-
-- 97d5ccf43: Remove lookup command, use `celocli identity:identifier --phoneNumber`
-
-### Patch Changes
-
-- 22ea7f691: Remove moment.js dependency
-- Updated dependencies
-- Updated dependencies [679ef0c60]
-- Updated dependencies [32face3d8]
-- Updated dependencies [22ea7f691]
-- Updated dependencies [97d5ccf43]
-- Updated dependencies [87647b46b]
-  - @celo/contractkit@5.2.0
-  - @celo/connect@5.1.1
-  - @celo/env-tests@1.0.2
-  - @celo/base@6.0.0
-  - @celo/cryptographic-utils@5.0.6
-  - @celo/explorer@5.0.6
-  - @celo/governance@5.0.6
-  - @celo/utils@5.0.6
-
-## 2.0.0-beta.0
-
-### Major Changes
-
-- 97d5ccf43: Remove lookup command, use `celocli identity:identifier --phoneNumber`
-
-### Patch Changes
-
-- 22ea7f691: Remove moment.js dependency
-- Updated dependencies
-- Updated dependencies [32face3d8]
-- Updated dependencies [22ea7f691]
-- Updated dependencies [97d5ccf43]
-- Updated dependencies [87647b46b]
-  - @celo/contractkit@5.2.0-beta.0
-  - @celo/env-tests@1.0.2-beta.0
-  - @celo/base@6.0.0-beta.0
-  - @celo/explorer@5.0.6-beta.0
-  - @celo/governance@5.0.6-beta.0
-  - @celo/connect@5.1.1-beta.0
-  - @celo/cryptographic-utils@5.0.6-beta.0
-  - @celo/utils@5.0.6-beta.0
-
-## 1.0.1
-
-### Patch Changes
-
-- Updated dependencies [d48c68afc]
-- Updated dependencies [d48c68afc]
-- Updated dependencies [53bbd4958]
-- Updated dependencies [d48c68afc]
-- Updated dependencies [53bbd4958]
-- Updated dependencies [d48c68afc]
-- Updated dependencies [d48c68afc]
-- Updated dependencies [d48c68afc]
-  - @celo/contractkit@5.1.0
-  - @celo/connect@5.1.0
-  - @celo/cryptographic-utils@5.0.5
-  - @celo/governance@5.0.5
-  - @celo/explorer@5.0.5
-  - @celo/utils@5.0.5
-  - @celo/base@5.0.5
-  - @celo/network-utils@5.0.5
-  - @celo/env-tests@1.0.1
-
-## 1.0.1-beta.0
-
-### Patch Changes
-
-- Updated dependencies [d48c68afc]
-- Updated dependencies [d48c68afc]
-- Updated dependencies [53bbd4958]
-- Updated dependencies [d48c68afc]
-- Updated dependencies [53bbd4958]
-- Updated dependencies [d48c68afc]
-- Updated dependencies [d48c68afc]
-- Updated dependencies [d48c68afc]
-  - @celo/contractkit@5.1.0-beta.0
-  - @celo/connect@5.1.0-beta.0
-  - @celo/cryptographic-utils@5.0.5-beta.0
-  - @celo/governance@5.0.5-beta.0
-  - @celo/explorer@5.0.5-beta.0
-  - @celo/utils@5.0.5-beta.0
-  - @celo/base@5.0.5-beta.0
-  - @celo/network-utils@5.0.5-beta.0
-  - @celo/env-tests@1.0.1-beta.0
diff --git a/packages/celotool/README-e2e.md b/packages/celotool/README-e2e.md
deleted file mode 100644
index bfef3698e96..00000000000
--- a/packages/celotool/README-e2e.md
+++ /dev/null
@@ -1,30 +0,0 @@
-# Celo-Blockchain End-to-End Tests
-
-This package contains a number of end-to-end tests that depend both on the
-monorepo protocol package and the Golang celo-blockchain implementation.
-
-## Setup
-
-1. Run `yarn` to install node dependencies
-2. Other dependencies:
-   1. `nc`, the [netcat](https://en.wikipedia.org/wiki/Netcat) networking utility
-
-## Usage
-
-The tests are run using bash script wrappers. They are the
-`ci_test_.sh` files in this package. Each requires a version of
-celo-blockchain to be specified, which can be done in two ways.
-
-### Celo-blockchain built from local source
-
-```
-./ci_test_governance.sh local PATH
-```
-
-Where `PATH` is a path to a local source repository for celo-blockchain.
-
-### Celo-blockchain built from a specific GitHub branch
-
-```
-./ci_test_governance.sh checkout BRANCH
-```
diff --git a/packages/celotool/README.md b/packages/celotool/README.md
deleted file mode 100644
index 2a86142903c..00000000000
--- a/packages/celotool/README.md
+++ /dev/null
@@ -1,90 +0,0 @@
-# Celotool
-
-A useful tool for various scripts that we as an engineering team might run.
-This is the only remaining version, in Typescript. There used to be a Python version too.
-Hence the references to celotooljs.
-
-## Setup
-
-```bash
-# Install packages
-yarn
-```
-
-If you want to use this tool from anywhere, add an alias to your ~/.bash_profile.
-
-`alias celotooljs=/packages/celotool/bin/celotooljs.sh`
-
-## Usage
-
-Running `celotooljs` should give you the output like the following that let's you know what you can do:
-
-```bash
-
-celotooljs
-
-Commands:
-celotooljs account                  commands for fauceting,
-                                    looking up accounts and users
-celotooljs backup                   command for backing up a miner's
-                                    persistent volume (PVC)
-celotooljs copy-contract-artifacts  command for copying contract
-                                    artifacts in a format to be easily
-                                    consumed by other (typescript)
-                                    packages. It will use the ABI of a
-                                    particular contract and swap the
-                                    address for the address of the
-                                    Proxy.
-celotooljs deploy                   commands for deployment of various
-                                    packages in the monorepo
-celotooljs geth                     commands for geth
-celotooljs links                    commands for various useful links
-celotooljs port-forward             command for port-forwarding to a
-                                    specific network
-celotooljs restore                  command for restoring a miner's
-                                    persistent volume (PVC) from
-                                    snapshot
-celotooljs switch                   command for switching to a
-                                    particular environment
-celotooljs transactions             commands for reading transaction
-                                    data
-Options:
---version    Show version number [boolean]
---verbose    Whether to show a bunch of debugging output like stdout and
-             stderr of shell commands [boolean] [default: false]
---yesreally  Reply "yes" to prompts about changing staging/production
-             (be careful!) [boolean] [default: false]
- --help      Show help [boolean]
-```
-
-### How to Faucet an Account
-
-Run this command:
-`celotooljs account faucet --celo-env --account --gold 10 --dollar 10`
-
-### How to Setup a Local Celo Blockchain Node
-
-You might need to setup a local node for some reasons, therefore `celotooljs` provides you with
-a few useful commands to make running a node really easy.
-
-- Clone [Celo Blockchain repo](https://github.com/celo-org/celo-blockchain)
-- Build `celotooljs geth build --geth-dir -c`
-- Init `celotooljs geth init --geth-dir --data-dir -e `
-- Run `celotooljs geth run --geth-dir --data-dir --sync-mode `
-
-### How to Deploy a Test Network to the Cloud
-
-- Setup the environment variables: MNEMONIC, and GETH_ACCOUNT_SECRET.
-
-- Deploy: `celotooljs deploy initial testnet -e yourname`
-
-- Get pods: `kubectl get pods -n yourname`
-
-- Start shell: `kubectl exec -n podname -it podname /bin/sh`
-
-- Tear down: `celotooljs deploy destroy testnet -e yourname`
-
-#### MacOS Setup
-
-- Install Helm 3.4 or higher (available on Homebrew)
-  To get past the Unidentified Developer error: open the directory containing helm, then ctrl-click helm and select Open then Open again. Repeat for tiller.
diff --git a/packages/celotool/bin/celotooljs.sh b/packages/celotool/bin/celotooljs.sh
deleted file mode 100755
index 9692a4fb935..00000000000
--- a/packages/celotool/bin/celotooljs.sh
+++ /dev/null
@@ -1,5 +0,0 @@
-#!/usr/bin/env bash
-set -euo pipefail
-
-cd $(dirname $0)
-yarn run --silent cli "$@"
diff --git a/packages/celotool/ci_test_blockchain_parameters.sh b/packages/celotool/ci_test_blockchain_parameters.sh
deleted file mode 100755
index 032c8fd7394..00000000000
--- a/packages/celotool/ci_test_blockchain_parameters.sh
+++ /dev/null
@@ -1,21 +0,0 @@
-#!/usr/bin/env bash
-set -euo pipefail
-
-# This tests that the geth node will exit if its version is too low
-
-# For testing a particular branch of Geth repo (usually, on Circle CI)
-# Usage: ci_test_exit.sh checkout
-# For testing the local Geth dir (usually, for manual testing)
-# Usage: ci_test_exit.sh local
-
-export TS_NODE_FILES=true
-if [ "${1}" == "checkout" ]; then
-  # Test master by default.
-  BRANCH_TO_TEST=${2:-"master"}
-  echo "Checking out geth at branch ${BRANCH_TO_TEST}..."
-  ./node_modules/.bin/mocha --node-option loader=ts-node/esm src/e2e-tests/blockchain_parameters_tests.ts --branch ${BRANCH_TO_TEST}
-elif [ "${1}" == "local" ]; then
-  export GETH_DIR="${2}"
-  echo "Testing using local geth dir ${GETH_DIR}..."
-  ./node_modules/.bin/mocha --node-option loader=ts-node/esm src/e2e-tests/blockchain_parameters_tests.ts --localgeth ${GETH_DIR}
-fi
diff --git a/packages/celotool/ci_test_cip35.sh b/packages/celotool/ci_test_cip35.sh
deleted file mode 100755
index 5001c52c586..00000000000
--- a/packages/celotool/ci_test_cip35.sh
+++ /dev/null
@@ -1,21 +0,0 @@
-#!/usr/bin/env bash
-set -euo pipefail
-
-# This tests that the geth node will exit if its version is too low
-
-# For testing a particular branch of Geth repo (usually, on Circle CI)
-# Usage: ci_test_exit.sh checkout
-# For testing the local Geth dir (usually, for manual testing)
-# Usage: ci_test_exit.sh local
-
-export TS_NODE_FILES=true
-if [ "${1}" == "checkout" ]; then
-  # Test master by default.
-  BRANCH_TO_TEST=${2:-"master"}
-  echo "Checking out geth at branch ${BRANCH_TO_TEST}..."
-  ./node_modules/.bin/mocha --node-option loader=ts-node/esm src/e2e-tests/cip35_tests.ts --branch ${BRANCH_TO_TEST}
-elif [ "${1}" == "local" ]; then
-  export GETH_DIR="${2}"
-  echo "Testing using local geth dir ${GETH_DIR}..."
-  ./node_modules/.bin/mocha --node-option loader=ts-node/esm src/e2e-tests/cip35_tests.ts --localgeth ${GETH_DIR}
-fi
diff --git a/packages/celotool/ci_test_governance.sh b/packages/celotool/ci_test_governance.sh
deleted file mode 100755
index 27cfd5bdfec..00000000000
--- a/packages/celotool/ci_test_governance.sh
+++ /dev/null
@@ -1,21 +0,0 @@
-#!/usr/bin/env bash
-set -euo pipefail
-
-# This test starts a standalone Geth node and runs transactions on it.
-
-# For testing a particular branch of Geth repo (usually, on Circle CI)
-# Usage: ci_test_governance.sh checkout
-# For testing the local Geth dir (usually, for manual testing)
-# Usage: ci_test_governance.sh local
-
-export TS_NODE_FILES=true
-if [ "${1}" == "checkout" ]; then
-  # Test master by default.
-  BRANCH_TO_TEST=${2:-"master"}
-  echo "Checking out geth at branch ${BRANCH_TO_TEST}..."
-  ./node_modules/.bin/mocha --node-option loader=ts-node/esm src/e2e-tests/governance_tests.ts --branch ${BRANCH_TO_TEST}
-elif [ "${1}" == "local" ]; then
-  export GETH_DIR="${2}"
-  echo "Testing using local geth dir ${GETH_DIR}..."
-  ./node_modules/.bin/mocha --node-option loader=ts-node/esm src/e2e-tests/governance_tests.ts --localgeth ${GETH_DIR}
-fi
diff --git a/packages/celotool/ci_test_replicas.sh b/packages/celotool/ci_test_replicas.sh
deleted file mode 100755
index 469c3bbed98..00000000000
--- a/packages/celotool/ci_test_replicas.sh
+++ /dev/null
@@ -1,21 +0,0 @@
-#!/usr/bin/env bash
-set -euo pipefail
-
-# This test tests rotating validators with zero downtime.
-
-# For testing a particular branch of Geth repo (usually, on Circle CI)
-# Usage: ci_test_replicas.sh checkout
-# For testing the local Geth dir (usually, for manual testing)
-# Usage: ci_test_replicas.sh local
-
-export TS_NODE_FILES=true
-if [ "${1}" == "checkout" ]; then
-  # Test master by default.
-  BRANCH_TO_TEST=${2:-"master"}
-  echo "Checking out geth at branch ${BRANCH_TO_TEST}..."
-  ./node_modules/.bin/mocha --node-option loader=ts-node/esm src/e2e-tests/replica_tests.ts --branch ${BRANCH_TO_TEST}
-elif [ "${1}" == "local" ]; then
-  export GETH_DIR="${2}"
-  echo "Testing using local geth dir ${GETH_DIR}..."
-  ./node_modules/.bin/mocha --node-option loader=ts-node/esm src/e2e-tests/replica_tests.ts --localgeth ${GETH_DIR}
-fi
diff --git a/packages/celotool/ci_test_slashing.sh b/packages/celotool/ci_test_slashing.sh
deleted file mode 100755
index e350181c958..00000000000
--- a/packages/celotool/ci_test_slashing.sh
+++ /dev/null
@@ -1,21 +0,0 @@
-#!/usr/bin/env bash
-set -euo pipefail
-
-# Slashing tests
-
-# For testing a particular branch of Geth repo (usually, on Circle CI)
-# Usage: ci_test_exit.sh checkout
-# For testing the local Geth dir (usually, for manual testing)
-# Usage: ci_test_exit.sh local
-
-export TS_NODE_FILES=true
-if [ "${1}" == "checkout" ]; then
-  # Test master by default.
-  BRANCH_TO_TEST=${2:-"master"}
-  echo "Checking out geth at branch ${BRANCH_TO_TEST}..."
-  ./node_modules/.bin/mocha --node-option loader=ts-node/esm src/e2e-tests/slashing_tests.ts --branch ${BRANCH_TO_TEST}
-elif [ "${1}" == "local" ]; then
-  export GETH_DIR="${2}"
-  echo "Testing using local geth dir ${GETH_DIR}..."
-  ./node_modules/.bin/mocha --localgeth ${GETH_DIR} --node-option loader=ts-node/esm,experimental-specifier-resolution=node ./src/e2e-tests/slashing_tests.ts
-fi
diff --git a/packages/celotool/ci_test_sync.sh b/packages/celotool/ci_test_sync.sh
deleted file mode 100755
index 261e843bd63..00000000000
--- a/packages/celotool/ci_test_sync.sh
+++ /dev/null
@@ -1,21 +0,0 @@
-#!/usr/bin/env bash
-set -euo pipefail
-
-# This test starts a Geth mining node and checks that other nodes can sync with it.
-
-# For testing a particular branch of Geth repo (usually, on Circle CI)
-# Usage: ci_test_sync.sh checkout
-# For testing the local Geth dir (usually, for manual testing)
-# Usage: ci_test_sync.sh local
-
-export TS_NODE_FILES=true
-if [ "${1}" == "checkout" ]; then
-  # Test master by default.
-  BRANCH_TO_TEST=${2:-"master"}
-  echo "Checking out geth at branch ${BRANCH_TO_TEST}..."
-  ./node_modules/.bin/mocha --node-option loader=ts-node/esm src/e2e-tests/sync_tests.ts --branch ${BRANCH_TO_TEST}
-elif [ "${1}" == "local" ]; then
-  export GETH_DIR="${2}"
-  echo "Testing using local geth dir ${GETH_DIR}..."
-  ./node_modules/.bin/mocha --node-option loader=ts-node/esm src/e2e-tests/sync_tests.ts --localgeth ${GETH_DIR}
-fi
diff --git a/packages/celotool/ci_test_transfers.sh b/packages/celotool/ci_test_transfers.sh
deleted file mode 100755
index ab91e473489..00000000000
--- a/packages/celotool/ci_test_transfers.sh
+++ /dev/null
@@ -1,21 +0,0 @@
-#!/usr/bin/env bash
-set -euo pipefail
-
-# This test starts a standalone Geth node and runs transactions on it.
-
-# For testing a particular branch of Geth repo (usually, on Circle CI)
-# Usage: ci_test_transfers.sh checkout
-# For testing the local Geth dir (usually, for manual testing)
-# Usage: ci_test_transfers.sh local
-
-export TS_NODE_FILES=true
-if [ "${1}" == "checkout" ]; then
-  # Test master by default.
-  BRANCH_TO_TEST=${2:-"master"}
-  echo "Checking out geth at branch ${BRANCH_TO_TEST}..."
-  ./node_modules/.bin/mocha --node-option loader=ts-node/esm src/e2e-tests/transfer_tests.ts --branch ${BRANCH_TO_TEST}
-elif [ "${1}" == "local" ]; then
-  export GETH_DIR="${2}"
-  echo "Testing using local geth dir ${GETH_DIR}..."
-  ./node_modules/.bin/mocha --node-option loader=ts-node/esm src/e2e-tests/transfer_tests.ts --localgeth ${GETH_DIR}
-fi
diff --git a/packages/celotool/ci_test_validator_order.sh b/packages/celotool/ci_test_validator_order.sh
deleted file mode 100755
index 7a70bf2e889..00000000000
--- a/packages/celotool/ci_test_validator_order.sh
+++ /dev/null
@@ -1,21 +0,0 @@
-#!/usr/bin/env bash
-set -euo pipefail
-
-# This test starts a standalone Geth node and runs transactions on it.
-
-# For testing a particular branch of Geth repo (usually, on Circle CI)
-# Usage: ci_test_validator_order.sh checkout
-# For testing the local Geth dir (usually, for manual testing)
-# Usage: ci_test_validator_order.sh local
-
-export TS_NODE_FILES=true
-if [ "${1}" == "checkout" ]; then
-  # Test master by default.
-  BRANCH_TO_TEST=${2:-"master"}
-  echo "Checking out geth at branch ${BRANCH_TO_TEST}..."
-  ./node_modules/.bin/mocha --node-option loader=ts-node/esm src/e2e-tests/validator_order_tests.ts --branch ${BRANCH_TO_TEST}
-elif [ "${1}" == "local" ]; then
-  export GETH_DIR="${2}"
-  echo "Testing using local geth dir ${GETH_DIR}..."
-  ./node_modules/.bin/mocha --node-option loader=ts-node/esm src/e2e-tests/validator_order_tests.ts --localgeth ${GETH_DIR}
-fi
diff --git a/packages/celotool/cloudbuild.yaml b/packages/celotool/cloudbuild.yaml
deleted file mode 100644
index 8dad1f34cd3..00000000000
--- a/packages/celotool/cloudbuild.yaml
+++ /dev/null
@@ -1,16 +0,0 @@
-steps:
-
-# Unshallow clone
-- name: gcr.io/cloud-builders/git
-  args: ['fetch', '--unshallow']
-
-# build docker image for google container registry
-- name: gcr.io/kaniko-project/executor:latest
-  args: [
-    "--dockerfile=dockerfiles/monorepo/Dockerfile.celotool",
-    "--cache=true",
-    "--destination=gcr.io/$PROJECT_ID/celo-monorepo:celotool-$COMMIT_SHA"
-  ]
-  id: Build celotool docker image
-  waitFor: ['-']
-timeout: 1000s
\ No newline at end of file
diff --git a/packages/celotool/genesis_baklava.json b/packages/celotool/genesis_baklava.json
deleted file mode 100644
index 3d4dc7fdd79..00000000000
--- a/packages/celotool/genesis_baklava.json
+++ /dev/null
@@ -1,523 +0,0 @@
-{
-  "config": {
-    "homesteadBlock": 0,
-    "eip150Block": 0,
-    "eip150Hash": "0x0000000000000000000000000000000000000000000000000000000000000000",
-    "eip155Block": 0,
-    "eip158Block": 0,
-    "byzantiumBlock": 0,
-    "constantinopleBlock": 0,
-    "petersburgBlock": 0,
-    "istanbulBlock": 0,
-    "chainId": 62320,
-    "istanbul": {
-      "policy":
2, - "blockperiod": 5, - "requesttimeout": 10000, - "epoch": 17280, - "lookbackwindow": 12 - } - }, - "timestamp": "0x5b843511", - "extraData": "0xecc833a7747eaa8327335e8e0c6b6d8aa3a38d0063591e43ce116ccf5c89753ef90f08f90276940cc59ed03b3e763c02d54d695ffe353055f1502d943f5084d3d4692cf19b0c98a9b22de614e49e147094ef0186b8eda17be7d1230eeb8389fa85e157e1fb94edddb60ef5e90fb09707246df193a55df3564c9d94d5e454462b3fd98b85640977d7a5c783ca16222894a4f1bad7996f346c3e90b90b60a1ca8b67b51e4b945b991cc1da0b6d54f8befa9de701d8bc85c92324946dfdaa51d146ecff3b97614ef05629ea83f4997e94d2b16050810600296c9580d947e9d919d0c332ed94fe144d67068737628effb701207b3eb30ef93c699482e64996b355625efeaad12120710706275b5b9a94241752a3f65890f4ac3eaec518ff94567954e7b5941bddeaf571d5da96ce6a127feb3cadadb531f43394f86345e9c9b39ab1cbe82d7ad35854f905b8b835945c3512b1697302c497b861cbfda158f8a3c5122c94a02a692d70fd9a5269397c044aebdf1085ba090f94ac91f591f12a8b6531be43e0ccf21cd5fa0e80b094718a8ac0943a6d3ffa3ec670086bfb03817ed54094b30980ce21679314e240de5cbf437c15ad459eb89499eca23623e59c795eceb0edb666eca9ec27233994c030e92d19229c3efd708cf4b85876543ee1a3f7945c98a3414cb6ff5c24d145f952cd19f5f1f56643941979b042ae2272197f0b74170b3a6f44c3cc5c0594db871070334b961804a15f3606fbb4fac7c7f93294c656c97b765d61e0fbcb1197dc1f3a91cc80c2a494ad95a2f518c197dc9b12ee6381d88bba11f2e0e5944d4b5bf033e4a7359146c9ddb13b1c821fe1d0d3949c64da169d71c57f85b3d7a17db27c1ce94fbde494b5f32e89ccad3d396f50da32e0a599e43ce87dd794ba40db8ab5325494c9e7e07a4c4720990a39305cf90b7cb860d5b174e904a2dfbcc181640651bc4c627c86fc9611d92795f18bfe779efb377ae799329419f4ee5a1e5c12103f9a5201257da7a821439d4c86731efe1959bf4076fccf1404c6bca78b6065a466571ef3428f9e203a8ccef519aad1622c9e7401b86078e6e671e6b9cf6f8fc21fb3f9eb8cf1d64bbdc1fd1213856c7fad711e9b26600f609a2e9cf6322f5cac415eaca793018b9fa902b49d0828258aed10b247e4409c320244adcfc941af6a09e87df18edd5239df5d12e4c38858f9159f5cabb680b860dea0e3506854e5d08b932c558da137c231185169d8490cebe070eee539eea00842a3a9787c79859ee82fec298bad0e012e8e4d859b9a13bbd7c
97af622a1409276b01d51cb6e8cc36c8ff7a4d2492c35d9c258d37e8eb2cd59b9c9dcd7cc3e80b860dd794eab9568cea95207f500651867934d5c923bec817eb11536ac8ca4107b127d266ff9e67d407c10a82c4f529ac100080357a69ec30d5e7cf0c68109312b6137ca43cbe7ca4a58a7ff34d1d9c70be34fbf3dce59edd1248709091b80c8b580b86049ab79869c44658468d010f831680cf798bc9b7a26145be11b7f6ce547d1e7c13781548663cf710b1d9dfde04fa44700865048c711d0bf2715dc089ed9337e822fb8ce2b889e1512d1de58b17004548e246c14f0fa37988c9edc5cc8f895b400b860855b3dc7efe86a6907eb56a0ab2b663123e28f469359bdf558864b12a867b52e73acf0833534303236c350e2f3489101289db1d0d3f9a9ebe00e18b20109a91e012dda871a62cd513b85f42fbe5684c8131067402c7e944d27ad10c76458fa80b8601e8b9ec07ff41b4a7b9636470920c7dbc175ac79441f12b3f93da55c26a48c17367ac3421106c955b4913460c9558501410c7dca5cdcd9372a82fa22eed8b9b7c725e390fb97b3a354cafeae18b99b24e07609122ac17924ae40362c48ca3b80b86021eafc9e179b482c450eb092766e79c74bddc02126eb07a56954a74d9cae7bf9aabacd22178a4646fcb2e57ba8826d00c71802db142b2f9d62de6295a9c21835965e0ea5564f37e57241561341bb870723fc828f20100250af23f59f32fa3900b8603044b1df058de34a35bf47d35a83e346e961a4cb4426fb5ad0ea9fc4d7a2d1f74ca6aa9ff3218f41e23b2c097181a80150f3e56f6ad0aa9d57ec9e71b1538a22761cf430783b73c12c0aed2232318374fdc18347270c1f4ce0fb032a788c0f00b860796cdfcfcc45ac40ce1c476d419805d59bb3511aebfc8e2c5bfbeb757ae5005713994fa21ced588c3b73d4e469dd1201a41c53b85c2a16a0582b9c0e375dc35b92b4e3fd4fc8a8802337565b840a9537368e16c4c076deebef8e81d463c77101b86070534c36b6f06081d3e2228167b4d3f0cf8e235b63850f8b2cbff9ded8c6bb62abb424a9aabfed54ce07143c374b5800d80b39dd3fcce57961858fff9f306bd74dfc576c55a6f8710bd9f35db772a53331fe69cda78ce9ab3ad4c2d2e08a8981b86051360cc471b15b3d2b1c88d1ed8595cfd89d9b9fab605408ffd1a98b0af0bc1293cbc2fe360a4362ffea907ac22bac0067e3259fb5d4064cc4813ec9449297f7b924034e9631ddc91e088b467eeae97b6133fd275f5f5830a936ba81b2bc9681b860ae05dd90d739565eae5aca380bef7a2dd3a6bdeb429f2a514864e52d8e60c0a9bc2e022292167ae221adea3ed2d19b009d715d34e3d5924a2cbc11503439c0bbe669e097698cf49a91012640c95
dd7a9c77d2e06311a08d338b9541726162000b860c71dde0acd7a9014a2b4fbc0eee9d56711fa9683b6dfc9469c34ff61d52ba49f983c2de743aae2fa2bf7520de5bb9800333d40b75108cb4208d42597b60600de1bd83b282bed596e36783821af3675fd6267b578eba2613f186a43264d410801b86059e868a7daeb24dae2a81a8a397c8b3b4c80b74dbc566c0e8813194c4a0a94850abbc5fb63b56c2c55ad2a65d8534900c9810ecffa3fe2e3b5b759c9716ee3dba16eff9a1785c8b2f1c637322e951a4cc752c31ac400d8222c8ba25c47557680b860f5e06a742ce9ee4408a75a38eeb5c06803a87b97709e43308e14fd6a027e048706a31c4ef62084c8758754f95b897801b316a4cb159f60880f69a806ba64cd243727d644d1b10b8890935c2d1d753321f10ea9543418aa7e2b324ff76ab1cc80b86042cf85912f3b0420d43bc42775d51e77693fcb33a862fbfba18a05d5a8da3cdf7c5f105ee2e3413a9bed07e502af44002e3f48bec159a159db5253ea243d37a2090ff6e9c6214632100f364df129732e57512161b3957cb6f84603afc4859701b860d6592adca17cd2958a84bc1ed35e04aa33abc6ab93de2d3f78cb9e48f3036885fab3282ab7318e4103a4c078659fd9008e4ccde2494eb493f9f851eb1e0d338e2124b393d9491ad2f17dd4f468fa2ab80a7c0c2c4142f6bbb8eb6b33efcf3d01b8609787a55f5dfa9539d1b0146177f6cd56e7ca3b58a63c36cfcff3b14a78b68d7a5096219162a55e84525080dd834fb000e4157ec4baa6e21c51161f3e8b4aa8f828edcc93e1d8c6fa9a6608d01880064f86c60014c38087a83d8d8acd54375000b860effce8fdc3099723aa7be8d75a6600d3aca652f52aed0191dcb6bfb02cc88b8993fb29627ae5459f8c7affe1349a92001acdc1745ab09d073001521a9ec27158c5bf242db7fdf5745de83e340d7dbe7a9940c207a3e32d27d6406cca25552e00b8604c61d0bf7d775a61a61a5ef721663de02d0b379a1b589fe62d05917f67962459f0f8854f91384ad02b4efeed68ab3301ac31ec73f5f7e4546f329dcfab7666acbcd5022a51b229b7a9fe979d486b83984659a6cddee2bc49317e56af54498781b860d21657eb1b8e3d38c32873ce326195ee957059968d8926ea219852700d9a884289cd45c5dff6baf2528cb20bb3dc2f00f53c8221eb6c8056b14da363a98e573e4eaf46bc84eed3468364aa24848951c1de38cbe9945e47995e23ce3455dd1880b860da25af7a34042ca2e366d359dede467837181b89c6bd8a4d68c47f15952808f7a8c625a524d604a02936b7467f2a6f016a6a9bdf109c3aa4d3c6f6314dc04ff64bc4297e85232b16071eda952b0cf2b297f9a6cbba24e9805c6ea440c04d3f80b86
0ced170a03c9d7e505d04e13307eed25e1af3b96317799088881154ab6891fb13ca05a4ddbf7a63c233957502c2ff67002ee5015e2083b498773be347939124008c2f7f856ced900701077b5c4c6a1622c08b0f69fbf7e2e97f92ca8f1c870901b860a4c7328ef48a970211572e76c1c3bb2b8a1440233774e12479125808290fbc0e57f276034fa2735f1550fdf7a1b56101c97845c23dba5023aa922c6152028bbc93f39f222b4c58a015db70094efde2f5567f03c7eda6355e8ef9ad9c43065c81b8603b07a92815d862c4be414d89f03c9f5354190954ed4c3dd0282e5964aef7954469dee36726340db09aadfba322354c016077b90479c8c5301b362f83b71d9c351ffb57ca39ef611ca4f5d6a8d614411e77a2fcab5de1e6adb9f1a6e3c5f77a80b860aa6f0e66fe03056fdd2a7a003e0b9f29133f2ebab7c26d3699500b6ac51248f1e3f747a3c3a5b2d11d817b1252679400291d023ae71f3a9f5723b122081a6932075a974f75467aef517944c8d206d307c4a5ebe8fc8aa374baf75dfbece96c00b86038e7f70fab621ef588c0fe24d9f288387913faaf52a5813f7da47d5fa8a9e32a098e6323ad2c1133221a43bd97b40c01d6cad22a6c7c2969beb123a42b41539c3a38da1b0ed7d762aa9ab67239a1d0afdfc3466287ca4ad33bb69b647f144600b860acde65212b2f8f1e15bba36f0b2cc0d345678718695e78c7bfbe1566c4691c0352237896e812217184cdc897bf96f500cf616132520278666d00d27a07610de649687cfc28f8c568dce232583004b21d0ab8ea4361d84eb6f9540a1d0b3c3800b860761f7d6dd8c2df51b358a60bd809c693a364b0aefcfa4cb492525ab82b9a3c506e1ba17347f82cff57411ddd6bfd9d01cf9bd337de5916008358cf6f1b411131f43818607db22f439ecb431ae773cea4d39b8c3057b35c64b40beedfdd9e6d8080b8410000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000f86480b86000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000080f86480b86000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000080", - "coinbase": "0x0000000000000000000000000000000000000000", - "alloc": { - 
"fCf982bb4015852e706100B14E21f947a5Bb718E": { - "balance": "200000000000000000000000000" - }, - "0xd71fea6b92d3f21f659152589223385a7329bb11": { - "balance": "1000000000000000000000" - }, - "0x1e477fc9b6a49a561343cd16b2c541930f5da7d2": { - "balance": "1000000000000000000000" - }, - "0x460b3f8d3c203363bb65b1a18d89d4ffb6b0c981": { - "balance": "1000000000000000000000" - }, - "0x3b522230c454ca9720665d66e6335a72327291e8": { - "balance": "1000000000000000000000" - }, - "0x0AFe167600a5542d10912f4A07DFc4EEe0769672": { - "balance": "1000000000000000000000" - }, - "0x412ebe7859e9aa71ff5ce4038596f6878c359c96": { - "balance": "1000000000000000000000" - }, - "0xbbfe73df8b346b3261b19ac91235888aba36d68c": { - "balance": "1000000000000000000000" - }, - "0x02b1d1bea682fcab4448c0820f5db409cce4f702": { - "balance": "1000000000000000000000" - }, - "0xe90f891710f625f18ecbf1e02efb4fd1ab236a10": { - "balance": "1000000000000000000000" - }, - "0x28c52c722df87ed11c5d7665e585e84aa93d7964": { - "balance": "1000000000000000000000" - }, - "0Cc59Ed03B3e763c02d54D695FFE353055f1502D": { - "balance": "103010030000000000000000000" - }, - "3F5084d3D4692cf19b0C98A9b22De614e49e1470": { - "balance": "10011000000000000000000" - }, - "EF0186B8eDA17BE7D1230eeB8389fA85e157E1fb": { - "balance": "10011000000000000000000" - }, - "edDdb60EF5E90Fb09707246DF193a55Df3564c9d": { - "balance": "10011000000000000000000" - }, - "d5e454462b3Fd98b85640977D7a5C783CA162228": { - "balance": "10011000000000000000000" - }, - "a4f1bad7996f346c3E90b90b60a1Ca8B67B51E4B": { - "balance": "10011000000000000000000" - }, - "5B991Cc1Da0b6D54F8befa9De701d8BC85C92324": { - "balance": "10011000000000000000000" - }, - "6dfdAa51D146eCff3B97614EF05629EA83F4997E": { - "balance": "10011000000000000000000" - }, - "D2b16050810600296c9580D947E9D919D0c332ed": { - "balance": "10011000000000000000000" - }, - "Fe144D67068737628efFb701207B3eB30eF93C69": { - "balance": "10011000000000000000000" - }, - "82E64996B355625efeAaD12120710706275b5b9A": { - 
"balance": "10011000000000000000000" - }, - "241752a3f65890F4AC3eAeC518fF94567954e7b5": { - "balance": "10011000000000000000000" - }, - "1bdDeaF571d5da96ce6a127fEb3CADaDB531f433": { - "balance": "10011000000000000000000" - }, - "F86345e9c9b39aB1cbE82d7aD35854f905B8B835": { - "balance": "10011000000000000000000" - }, - "5c3512b1697302c497B861CBfDA158f8a3c5122C": { - "balance": "10011000000000000000000" - }, - "a02A692d70Fd9A5269397C044aEBDf1085ba090f": { - "balance": "10011000000000000000000" - }, - "aC91f591F12a8B6531Be43E0ccF21cd5fA0E80b0": { - "balance": "10011000000000000000000" - }, - "718A8AC0943a6D3FFa3Ec670086bfB03817ed540": { - "balance": "10011000000000000000000" - }, - "b30980cE21679314E240DE5Cbf437C15ad459EB8": { - "balance": "10011000000000000000000" - }, - "99eCa23623E59C795EceB0edB666eca9eC272339": { - "balance": "10011000000000000000000" - }, - "c030e92d19229c3EfD708cf4B85876543ee1A3F7": { - "balance": "10011000000000000000000" - }, - "5c98A3414Cb6Ff5c24d145F952Cd19F5f1f56643": { - "balance": "10011000000000000000000" - }, - "1979b042Ae2272197f0b74170B3a6F44C3cC5c05": { - "balance": "10011000000000000000000" - }, - "Db871070334b961804A15f3606fBB4fAc7C7f932": { - "balance": "10011000000000000000000" - }, - "C656C97b765D61E0fbCb1197dC1F3a91CC80C2a4": { - "balance": "10011000000000000000000" - }, - "aD95a2f518c197dc9b12eE6381D88bba11F2E0E5": { - "balance": "10011000000000000000000" - }, - "4D4B5bF033E4A7359146C9ddb13B1C821FE1D0d3": { - "balance": "10011000000000000000000" - }, - "9C64dA169d71C57f85B3d7A17DB27C1ce94FBDE4": { - "balance": "10011000000000000000000" - }, - "B5f32e89ccaD3D396f50da32E0a599E43CE87dd7": { - "balance": "10011000000000000000000" - }, - "Ba40Db8ab5325494C9E7e07A4c4720990A39305c": { - "balance": "10011000000000000000000" - }, - "8B7852DA535df3D06D6ADc1906778afd9481588a": { - "balance": "10011000000000000000000" - }, - "a8F41EA062C22dAFFc61e47cF15fc898517b86B1": { - "balance": "10011000000000000000000" - }, - 
"66a3Fc7E8fd6932568cDB6610F5a67BeD9F5beF8": { - "balance": "10011000000000000000000" - }, - "10301d9389653497F62876f450332467E07eEe1F": { - "balance": "10011000000000000000000" - }, - "6c3ac5fcb13E8DCd908C405Ec6DAcF0EF575D8FC": { - "balance": "10011000000000000000000" - }, - "85226637919D3d47E1A37b3AF989E9aE1a1C4790": { - "balance": "10011000000000000000000" - }, - "43BCa16603c56cb681d1da3636B7a1A225598bfc": { - "balance": "10011000000000000000000" - }, - "E55d8Bc08025BDDF8Da02eEB54882d0586f90700": { - "balance": "10011000000000000000000" - }, - "40E1C73f6228a2c15e10aF2F3e890098b777ED15": { - "balance": "10011000000000000000000" - }, - "DbbF476089a186a406EA13a4c46813f4BccC3660": { - "balance": "10011000000000000000000" - }, - "7baCEA66a75dD974Ad549987768bF8d8908b4917": { - "balance": "10011000000000000000000" - }, - "fbF4C2362a9EB672BAC39A46AFd919B3c12Ce44c": { - "balance": "10011000000000000000000" - }, - "A8dB96136990be5B3d3bfe592e5A5a5223350A7A": { - "balance": "10011000000000000000000" - }, - "1Dd21ED691195EBA816d59B3De7Fab8b3470Ae4B": { - "balance": "10011000000000000000000" - }, - "058A778A6aeEfacc013afba92578A43e38cc012D": { - "balance": "10011000000000000000000" - }, - "13f52Ab66871880DC8F2179d705281a4cf6a15fB": { - "balance": "10011000000000000000000" - }, - "eD1Ed9a71E313d1BCe14aB998E0646F212230a33": { - "balance": "10011000000000000000000" - }, - "c563F264f98e34A409C6a085da7510De8B6FE90B": { - "balance": "10011000000000000000000" - }, - "c6D678fC6Cc1dA9D5eD1c0075cF7c679e7138e02": { - "balance": "10011000000000000000000" - }, - "5179fc80CaB9BB20d5405a50ec0Fb9a36c1B367a": { - "balance": "10011000000000000000000" - }, - "0d473f73AAf1C2bf7EBd2be7196C71dBa6C1724b": { - "balance": "100110000000000000000" - }, - "6958c5b7E3D94B041d0d76Cac2e09378d31201bd": { - "balance": "10011000000000000000000" - }, - "628d4A734d1a2647c67D254209e7B6471a11a5cb": { - "balance": "10011000000000000000000" - }, - "E1601e3172F0ef0100e363B639Bd44420B7E5490": { - "balance": 
"10011000000000000000000" - }, - "3337F2Cd103976F044b55D3E69aB06d1ebB142Db": { - "balance": "10011000000000000000000" - }, - "8D0D5c57dC232Be15Df4A1a048EF36162C853b94": { - "balance": "10011000000000000000000" - }, - "14800c28F3cF1Dd17AaC55263ef4e173b0e8e3Ef": { - "balance": "10011000000000000000000" - }, - "f3996A0f0f593BfD5E39780059C5430fab7359FD": { - "balance": "10011000000000000000000" - }, - "2217FeBe31Aea6C771AF163dCc453F9f060a4a00": { - "balance": "10011000000000000000000" - }, - "f426CC817400766cd6b44F13Cb63Ca648e323484": { - "balance": "10011000000000000000000" - }, - "B2C4913e257a34445Ec31685E625bb4060FB8e1f": { - "balance": "10011000000000000000000" - }, - "9438dbD05dfC19F049a469185c7599daa82646e8": { - "balance": "10011000000000000000000" - }, - "4BeD66Bf507f3CF524704267908Ea4ee3cDe3053": { - "balance": "10011000000000000000000" - }, - "9a850fe8105e9CCfBD9d1D06D535BB4948f3f6Cf": { - "balance": "10011000000000000000000" - }, - "1277eE554565542A8d0553E1e54006d006db75bd": { - "balance": "10011000000000000000000" - }, - "D7e829bE8E374D3fBbd2F68D9A916cB2f769BA89": { - "balance": "10011000000000000000000" - }, - "3691b847eD14E296afC90Ff3E37D21e518306170": { - "balance": "10011000000000000000000" - }, - "c4C703357B01672cF95bFa0450a5717812Bc7ffb": { - "balance": "10011000000000000000000" - }, - "0c9369077836353A8D92aeD29C72A7DfD300B354": { - "balance": "10011000000000000000000" - }, - "856DF2A3bdBb8086cE406C469dDE94d12C1E3176": { - "balance": "10011000000000000000000" - }, - "E40B3e5c59e2157037b699895329DBe4aA33C039": { - "balance": "10011000000000000000000" - }, - "edb47aF3aC2325735722450D1E7DA082bDDad58c": { - "balance": "10011000000000000000000" - }, - "315D669866E13fA302B76c85481F9181e06304Ce": { - "balance": "10011000000000000000000" - }, - "A5185E3328592428d5989422e0339247dD77e10D": { - "balance": "10011000000000000000000" - }, - "85Fd1d1Cd6655EbB89db7D6cA0a5C9c62F7a4CFf": { - "balance": "10011000000000000000000" - }, - 
"ACC9E4430EC1011673547395A191C6b152763EA4": { - "balance": "10011000000000000000000" - }, - "3824967C172D52128522dD257FE8f58C9099166B": { - "balance": "10011000000000000000000" - }, - "5542aDEA3092da5541250d70a3Db28Ad9BE7Cfc7": { - "balance": "10011000000000000000000" - }, - "c61Cd4477f0A98BfC97744481181730f7af7c14f": { - "balance": "10011000000000000000000" - }, - "5D7Ffd0fC6DAA67AbF7d48ae69f09dbe53d86983": { - "balance": "10011000000000000000000" - }, - "350914ABD4F095534823C1e8fA1cfD7EF79e7E4c": { - "balance": "10011000000000000000000" - }, - "ECa6f058B718E320c1D45f5D1fb07947367C3D4B": { - "balance": "10011000000000000000000" - }, - "9C577D0795Ed0cA88814d149c2DC61E8Fc48Ad81": { - "balance": "10011000000000000000000" - }, - "72fE8bC8E3Ff1e56543c9c1F9834D6dfC31BEDDC": { - "balance": "10011000000000000000000" - }, - "6Ff2CFa7899073CD029267fd821C9497811b5f7E": { - "balance": "10011000000000000000000" - }, - "4685D123aE928a7912646681ba32035ad6F010a6": { - "balance": "10011000000000000000000" - }, - "4799946c8B21fF5E58A225AeCB6F54ec17a94566": { - "balance": "10011000000000000000000" - }, - "1D7dA5a23a99Fc33e2e94d502E4Fdb564eA0B24C": { - "balance": "10011000000000000000000" - }, - "DFc9719cD9c7982e4A1FFB4B87cC3b861C40E367": { - "balance": "10011000000000000000000" - }, - "0c1F0457ce3e87f5eA8F2C3A007dfe963A6Ff9a7": { - "balance": "10011000000000000000000" - }, - "7dC23b30dFDc326B9a694c6f9723DC889fe16b7d": { - "balance": "10011000000000000000000" - }, - "3F0c4cFDD40D16B7C15878AcCdc91Be9ca4DeE79": { - "balance": "10011000000000000000000" - }, - "B984a83416F560437C7866e26CdDb94bDB821594": { - "balance": "10011000000000000000000" - }, - "138EA4C57F5b3984EFacd944b3b85dfDd5A78Dcc": { - "balance": "10011000000000000000000" - }, - "AD4f16F3435E849505C643714C9E5f40f73c4a5a": { - "balance": "10011000000000000000000" - }, - "6b38E861ec0b65fd288d96d5630711C576362152": { - "balance": "10011000000000000000000" - }, - "AE15D05100CE807d0aC93119f4ada8fa21441Fd2": { - "balance": 
"10011000000000000000000" - }, - "e0e25c5734bef8b2Add633eAa2518B207DAa0D66": { - "balance": "10011000000000000000000" - }, - "9039Ce107A9cD36Ed116958E50f8BDe090e2406f": { - "balance": "10011000000000000000000" - }, - "089bE2dD42096ebA1d94aad20228b75df2BeeBC7": { - "balance": "10011000000000000000000" - }, - "E3a79AEee437532313015892B52b65f52794F8a2": { - "balance": "10011000000000000000000" - }, - "Cc38EE244819649C9DaB02e268306cED09B20672": { - "balance": "10011000000000000000000" - }, - "eb0357140a1a0A6c1cB9c93Bf9354ef7365C97d9": { - "balance": "10011000000000000000000" - }, - "44370D6b2d010C9eBFa280b6C00010AC99a45660": { - "balance": "10011000000000000000000" - }, - "762438915209d038340C3Af9f8aAb8F93aDc8A9A": { - "balance": "10011000000000000000000" - }, - "9CBa7aD50fa366Ff6fC2CAe468929eC9AD23Ea2B": { - "balance": "10011000000000000000000" - }, - "4f4F159826b2B1eE903A811fCd86E450c9954396": { - "balance": "10011000000000000000000" - }, - "3C132B8465e2D172BB7bab6654D85E398ee7c8AD": { - "balance": "10011000000000000000000" - }, - "0582426C929B7e525c22201Bd4c143E45189C589": { - "balance": "10011000000000000000000" - }, - "fb542740B34dDC0ADE383F2907a1e1E175E0BF5a": { - "balance": "10011000000000000000000" - }, - "184Ca91AfE8F36bC5772b29cE2A76c90fCef34D0": { - "balance": "10011000000000000000000" - }, - "0C6f48B50B166ddcE52CEE051acCAfFB8ecB4976": { - "balance": "10011000000000000000000" - }, - "3aD2bE38fA3DFa7969E79B4053868FD1C368eAb2": { - "balance": "10011000000000000000000" - }, - "a6A690637b088E9A1A89c44c9dC5e14eD4825053": { - "balance": "10011000000000000000000" - }, - "C224B131Ea71e11E7DF38de3774AAAAe7E197BA4": { - "balance": "10011000000000000000000" - }, - "d3C18531f0879B9FB8Ed45830C4ce6b54dC57128": { - "balance": "10011000000000000000000" - }, - "02a272d17E1308beF21E783A93D1658f84F2D414": { - "balance": "10011000000000000000000" - }, - "57A1aC8167d94b899b32C38Ff9D2B2bD0e55C10d": { - "balance": "10011000000000000000000" - }, - 
"F8fc7D740929E5DD4eBA8fd5a6873Be6a4151087": { - "balance": "10011000000000000000000" - }, - "B2AfC45838b364240dE17D3143AA6096d3340A91": { - "balance": "10011000000000000000000" - }, - "eAf133d1e0Dd325721665B19f67C9b914EE2469F": { - "balance": "10011000000000000000000" - }, - "B7660F1B075e56780e7E026ff66995765f5f1f7F": { - "balance": "10011000000000000000000" - }, - "F25087E27B7a59003bb08d2cAc7A69E7c15a4be8": { - "balance": "10011000000000000000000" - }, - "E65054681206658A845140331459A057C4EB3CA7": { - "balance": "10011000000000000000000" - }, - "e7569A0F93E832a6633d133d23503B5175bEa5Db": { - "balance": "10011000000000000000000" - }, - "a9f6102BCf5351dFdC7fA0CA4Fa0A711e16605c3": { - "balance": "10011000000000000000000" - }, - "1AB9aA0E855DF953CF8d9cC166172799afD12a68": { - "balance": "10011000000000000000000" - }, - "6C04aA35c377E65658EC3600Cab5E8FFa95567D9": { - "balance": "10011000000000000000000" - }, - "6b82AD37e64c91c628305813B2DA82F18f8e2a2B": { - "balance": "10011000000000000000000" - }, - "AD5D1DeD72F0e70a0a5500B26b82B1A2e8A63471": { - "balance": "10011000000000000000000" - }, - "72B3589771Ec8e189a5d9Fe7a214e44085e89054": { - "balance": "10011000000000000000000" - }, - "74F57dA8be3E9AB4463DD70319A06Fb5E3168211": { - "balance": "10011000000000000000000" - }, - "b6f7F57b99DB21027875BEa3b8531d5925c346cE": { - "balance": "10011000000000000000000" - }, - "279d05241d33Dc422d5AEcAc0e089B7f50f879c3": { - "balance": "10011000000000000000000" - }, - "d57FEfe1B634ab451a6815Cd6769182EABA62779": { - "balance": "10011000000000000000000" - }, - "e86C8538Bdfb253E8D6cC29ee24A330905324849": { - "balance": "10011000000000000000000" - }, - "2C58D7f7f9CDF79CF3Cd5F4247761b93428A4E9e": { - "balance": "10011000000000000000000" - }, - "37326cEfAFB1676f7Af1CcDcCD37A846Ec64F19d": { - "balance": "10011000000000000000000" - }, - "f01DCf91d5f74BDB161F520e800c64F686Eb253F": { - "balance": "10011000000000000000000" - }, - "Ba85246bc2A4fdaC1cB2e3C68383Fe79A6466fd9": { - "balance": 
"10011000000000000000000" - }, - "4A76f81eA26381981a3B740975fb4F605989b585": { - "balance": "10011000000000000000000" - }, - "00ee7168618BaE4F4d2900D5063c62948c6F0566": { - "balance": "10011000000000000000000" - }, - "E1aD0B232B4262E4A279C91070417DAAF202623F": { - "balance": "10011000000000000000000" - }, - "f611173319b22080E0F02eE724781d85f4b39Ae6": { - "balance": "10011000000000000000000" - }, - "158659458dff3a9E5182cA0e8Ba08F53463FA5e7": { - "balance": "10011000000000000000000" - }, - "FEB11610ad367b0c994274A8153E50F4557e473F": { - "balance": "10011000000000000000000" - }, - "e1eB2279f45760Ab9D734782B1a0A8FD3d47D807": { - "balance": "10011000000000000000000" - }, - "8667d005eCF50Eb247890a11FCdCfC321DC1Da9f": { - "balance": "10011000000000000000000" - }, - "5Ce612A664C2f35558Dcab7edb999619e155CD07": { - "balance": "10011000000000000000000" - }, - "aD95f88cCd3aBC12ddd6cD0b9a777B95339b747b": { - "balance": "10011000000000000000000" - }, - "6E5a5A2963F6d0C2EA26682a152fE3ac7CBC1227": { - "balance": "10011000000000000000000" - }, - "000000000000000000000000000000000000ce10": { - "code": 
"0x60806040526004361061004a5760003560e01c806303386ba3146101e757806342404e0714610280578063bb913f41146102d7578063d29d44ee14610328578063f7e6af8014610379575b6000600160405180807f656970313936372e70726f78792e696d706c656d656e746174696f6e00000000815250601c019050604051809103902060001c0360001b9050600081549050600073ffffffffffffffffffffffffffffffffffffffff168173ffffffffffffffffffffffffffffffffffffffff161415610136576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260158152602001807f4e6f20496d706c656d656e746174696f6e20736574000000000000000000000081525060200191505060405180910390fd5b61013f816103d0565b6101b1576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260188152602001807f496e76616c696420636f6e74726163742061646472657373000000000000000081525060200191505060405180910390fd5b60405136810160405236600082376000803683855af43d604051818101604052816000823e82600081146101e3578282f35b8282fd5b61027e600480360360408110156101fd57600080fd5b81019080803573ffffffffffffffffffffffffffffffffffffffff1690602001909291908035906020019064010000000081111561023a57600080fd5b82018360208201111561024c57600080fd5b8035906020019184600183028401116401000000008311171561026e57600080fd5b909192939192939050505061041b565b005b34801561028c57600080fd5b506102956105c1565b604051808273ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff16815260200191505060405180910390f35b3480156102e357600080fd5b50610326600480360360208110156102fa57600080fd5b81019080803573ffffffffffffffffffffffffffffffffffffffff16906020019092919050505061060d565b005b34801561033457600080fd5b506103776004803603602081101561034b57600080fd5b81019080803573ffffffffffffffffffffffffffffffffffffffff1690602001909291905050506107bd565b005b34801561038557600080fd5b5061038e610871565b604051808273ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff16815260200191505060405180910390f35b60008060007fc5d2460186f7233c927e7db2dcc703c
0e500b653ca82273b7bfad8045d85a47060001b9050833f915080821415801561041257506000801b8214155b92505050919050565b610423610871565b73ffffffffffffffffffffffffffffffffffffffff163373ffffffffffffffffffffffffffffffffffffffff16146104c3576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260148152602001807f73656e64657220776173206e6f74206f776e657200000000000000000000000081525060200191505060405180910390fd5b6104cc8361060d565b600060608473ffffffffffffffffffffffffffffffffffffffff168484604051808383808284378083019250505092505050600060405180830381855af49150503d8060008114610539576040519150601f19603f3d011682016040523d82523d6000602084013e61053e565b606091505b508092508193505050816105ba576040517f08c379a000000000000000000000000000000000000000000000000000000000815260040180806020018281038252601e8152602001807f696e697469616c697a6174696f6e2063616c6c6261636b206661696c6564000081525060200191505060405180910390fd5b5050505050565b600080600160405180807f656970313936372e70726f78792e696d706c656d656e746174696f6e00000000815250601c019050604051809103902060001c0360001b9050805491505090565b610615610871565b73ffffffffffffffffffffffffffffffffffffffff163373ffffffffffffffffffffffffffffffffffffffff16146106b5576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260148152602001807f73656e64657220776173206e6f74206f776e657200000000000000000000000081525060200191505060405180910390fd5b6000600160405180807f656970313936372e70726f78792e696d706c656d656e746174696f6e00000000815250601c019050604051809103902060001c0360001b9050610701826103d0565b610773576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260188152602001807f496e76616c696420636f6e74726163742061646472657373000000000000000081525060200191505060405180910390fd5b8181558173ffffffffffffffffffffffffffffffffffffffff167fab64f92ab780ecbf4f3866f57cee465ff36c89450dcce20237ca7a8d81fb7d1360405160405180910390a25050565b6107c5610871565b73fffffffffffffff
fffffffffffffffffffffffff163373ffffffffffffffffffffffffffffffffffffffff1614610865576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260148152602001807f73656e64657220776173206e6f74206f776e657200000000000000000000000081525060200191505060405180910390fd5b61086e816108bd565b50565b600080600160405180807f656970313936372e70726f78792e61646d696e000000000000000000000000008152506013019050604051809103902060001c0360001b9050805491505090565b600073ffffffffffffffffffffffffffffffffffffffff168173ffffffffffffffffffffffffffffffffffffffff161415610960576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260118152602001807f6f776e65722063616e6e6f74206265203000000000000000000000000000000081525060200191505060405180910390fd5b6000600160405180807f656970313936372e70726f78792e61646d696e000000000000000000000000008152506013019050604051809103902060001c0360001b90508181558173ffffffffffffffffffffffffffffffffffffffff167f50146d0e3c60aa1d17a70635b05494f864e86144a2201275021014fbf08bafe260405160405180910390a2505056fea165627a7a72305820959a50d5df76f90bc1825042f47788ee27f1b4725f7ed5d37c5c05c0732ef44f0029", - "storage": { - "0xb53127684a568b3173ae13b9f8a6016e243e63b6e8ee1178d6a717850b5d6103": "0x0Cc59Ed03B3e763c02d54D695FFE353055f1502D" - }, - "balance": "0" - } - }, - "number": "0x0", - "gasUsed": "0x0", - "mixHash": "0x63746963616c2062797a616e74696e65206661756c7420746f6c6572616e6365", - "parentHash": "0x0000000000000000000000000000000000000000000000000000000000000000" -} diff --git a/packages/celotool/genesis_baklavastaging.json b/packages/celotool/genesis_baklavastaging.json deleted file mode 100644 index a0a350df7ad..00000000000 --- a/packages/celotool/genesis_baklavastaging.json +++ /dev/null @@ -1,223 +0,0 @@ -{ - "config": { - "homesteadBlock": 0, - "eip150Block": 0, - "eip150Hash": "0x0000000000000000000000000000000000000000000000000000000000000000", - "eip155Block": 0, - "eip158Block": 0, - 
"byzantiumBlock": 0, - "constantinopleBlock": 0, - "petersburgBlock": 0, - "istanbulBlock": 0, - "chainId": 31420, - "istanbul": { - "policy": 2, - "blockperiod": 5, - "requesttimeout": 10000, - "epoch": 17280, - "lookbackwindow": 12 - } - }, - "timestamp": "0x5b843511", - "extraData": "0xecc833a7747eaa8327335e8e0c6b6d8aa3a38d0063591e43ce116ccf5c89753ef90a62f901a4944588abb84e1bbefc2bcf4b2296f785fb7ad9f28594a83dbbd6565fffa42e0439e08c97d253427ff86c940d8ccae442ef5e452e4948feffe0b8295073e51f94c955ad5b9292647669f7e6d544125e4632360eb094f77e9dd83a22cab28cf4cedad25792348249e8de94fffa11ef0475ba651a3bcb9ba42184fbcdb9e69e94c58ddc41e3af815d5c3c2d8a8465e0832e87005f948baaf7c944452b8873c70af106d129dcdad99eec946da12f24f14ad0efeea4fd0b1b14b2bf26009f17942379ad55e7d3175674efa7a02e1c30ca423de9b794a49557af4be0d55c1c1c419c3f3f0067fdc9f51c94a517e5ac9e2d409cb055f35f030c4598b28b3b769429974d953f6fdda44c9c90677426719e086a03fc94f9fd3859a9b3e0758328d7767c540c3397f28664944de1f59d382ae77048ce9ec9412f41af64e82016944cb3107cacf8272363cee5862724089f60f8c38694153ca761ba7316718ab80a0c019a545c9a71e332942a847599f069e3745a1fc5b6bd1cb4185151be999431ca13d06166304c4e227f016ebdd1a1a50e77269447de51228ec2650b4fe5dbcfeec208f68f332073f907a8b86031edb83bd19da2fd32959e52710d5a8177d7f3853fc6ab192cd0754ce84486cebbe0586c585a5a2255739aff3fce48009443dd17ee38ef23582b41560c70c088e37d686c5672146a93005563382c16644c6cab9d534ba200599cb9fce9390001b860deb530af5ca9bde08587ee01e0c92e9f62bc01c833a474b703a9dd957368e0fbf4585d6e3a849c34c4074c5ff6f27200e6df0c941f66650ff381a1c8b78d50d900de531199d999d8344e2f4423f9058bd42940aec18e789e3a23ba032901de00b860b6744672c6b69cb122321dc6b07d1c279a712db3e165901b6d89d28a1adc64076b3577c31d09879c55b69cb050e00a000a8f76604433c7ce68b1483cfa6b6ac198ae8b47986a140f6810b2f23d7c39ad6e18df8c4ef737f065d72cb350c17980b860454c4022c00a991579229ee764cfee95ed396bf92e366eab56366639237dcd06e0d73b687a89f96565849bb382d43f01e9949e065efbe9c433751cde2d720578973e4c73b3077495c24ed109c9554a2a8bef27ff1d35d501a32e5c3b95f10880b860
c8129042c92534bbec19e52f4684991376925d7275e47c6c90b609b9cf732871526ce0773fbd29a71e51d9e9ca6c220027123bbbc9bb843936fd67a02a36b95ee83ced06e3f34bc608f1b81b44cc2f2a8131d2028e9df4715b3652ec43ef0800b860a69c450526807639775c4592cd439cc7fc628a9660dacce203fc3072bd524d7bf2e9f21ee7b67fd15ea53ba0a99921018e788c3dc7885e413e802eaddcd763527374b834cbd8650402daf798b0e1c258ab4784286fe41649776ab63200e1a981b8609de03d06d5f1d3222b67b0b65e7d9eec35685fa69e24f5c429e7ce1250efff812e9e19444f9c684cc524fbce7c851600f0ea65c91c3b2117f6828ca73e62134913571afc73d1743f3a79d067742cc9782138d5d95f45040cf2ed633a7d868581b8609c316c12d0f68667be982b2342340a051368a5cd199054efcb230263f8a1a3c0e7e6dc8d8c097a2e21bfe600022f7f01a85730c9cfab81f111020c0692a0e9f30afced6fe9dd377eff8aaec4af030d947325c5ddf8a7ef2c02769512e107c080b860d57d25ef5cacd7e3da9aff67523302ae423b8af7452129656ac13f1a8b2ae7dcb4a985655656ba0c20e0d21c83b7410065db6501fa38208540824459b71d8c1ed96b53b10d359b414a8e4b5f02558d3abebfdebc8fd30fcef252474a57e52781b86055601af6d309e6ca30c7b1c59bb0a2441ba1917f80851f0997dd79a35f90226734ff8268eb848d52000fa26fca006b014931f58eaf8f8af16faf7f196021bbfc014c21b01b27b95eb9a163b817261d4c2ceac14d37020889277f748122a60400b860af6427cad4bcc9b5c336abfdf14d34d1874506539cb5c492ad6210127783962e3729a5826c94d4aa2f81e74a6b824a00a211bb4e6a79db9adbcb334d00c87c9c838388b59d4aa3e2810acf23359451a0d5b9e6ac370b3dcbaf7c7b36e8fd9f01b86078e54d72c1f2494d4352a0b47ddeade7b88a8d959e691e480369c76cb244d7993d6917e6803b2e077c74ad54f2746a011246370f83a4283820de29f14c1c4fc3564d29615ea3f8317dd266a4555739575ee7428b30c27ef85aa68509d1946300b86054970d70b996b31c7fab533c7466fd606036c05515342407169f231b367c0f3456a11c26ff1d79edeb8fda0911be080127aa5a206f102a9fe1133ac5a875712a786983d81259a1ddb4e8161529c44b6599082a72aa2b0d713669bbacd5030b80b86032561cefa7f227f627fcea569a9332c4bfdf59653460abc712d7bf3a4b8dd087148564be38df63796346dfddfa49bb0008e8fada39f805891982ec250f33ccff7f7016b3bc86fab5b20c605c85860ac785707f186f91f36a3339d079ee440181b8607ec523dd6fcc98709acd8563035ddfca983e5faa
725d90328ae7a4fafa8ba576ea5724a7f74ee056e4717d248527c3003c69b7f38d1250ced33dbbd74acadcfc0ffce6595982bcd47c97ea8f6a802ba581f8167a343e0e241a1950faa28f1d80b8606dda7632cc74de8c6071f78379c70dbfc9751ccf2e87c2fe1ccb101d513d220700c70f6ef2a4320aa95899f9b7f2530079b49ac64420c3da998cc76fc2a840b055da01b8e7894f5561b47c2e5f0de6b492e61bdaa3f602ddf1794a4f1509c400b860083290fafc9e80fa11523e24f7edbda1e55beda51c73892173c3221aeccb5b0755d926984ef2ed84e85a8ee33429a501853bd27b74be9da9f5951bc5b7e798c57470031fd6605a6688ec349e3fbf289aaea27482e7ef770ebbc924332d08f380b86062418bca58d3bac7c37dffb5b749ae7275710d697685f1310c4454295c15a424bf9eda516ffe39208d959db2b4121e01f16ef48dffc817443c8026a2d209ac9f1cfef13c25e7629485467fe4b44ae6e76ec074a51dbd0655404379087626be00b860076edd818b100cfa22cab56df2b15a7d37e36f8c3843e104f85027a253f5fa76a40c58f6698b92a395f09979146220010b5a9c5b96fac074eacf4bcb6265fe1d6c75a9d707307f81677cefec3fc5c6b0809c1da3126309e5cb33107525096b01b8600133959a8f45c1fbde384fa586e9208616795abe799c6d1c81bc99a380eb47cf3055652bb82b40639456d032b2af1c0065ece9eae2b2a18e3dd7ac32f9b2807b444a47b17512c5146545e37bf3259d08fec5ea57d46f8b177f7e0e61af7c848180b8410000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000f86480b86000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000080f86480b86000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000080", - "coinbase": "0x0000000000000000000000000000000000000000", - "alloc": { - "fCf982bb4015852e706100B14E21f947a5Bb718E": { - "balance": "200000000000000000000000000" - }, - "0d473f73AAf1C2bf7EBd2be7196C71dBa6C1724b": { - "balance": "1000000000000000000000" - }, - "8F7ca85A9E4A18B551b765706bd0B6f26927D86F": { - "balance": 
"1000000000000000000000" - }, - "3EaEe6C693420Ae86643EB2837978da8eEbf973f": { - "balance": "1000000000000000000000" - }, - "Dd3E5FcE22938c0f482004527D468a8799C4a61E": { - "balance": "1000000000000000000000" - }, - "Fb2Ee4Da251fC6A9DF7eb8d5c4ea1DeC99d127eA": { - "balance": "1000000000000000000000" - }, - "d321C7356DFB5b6F4AD9e5B58C51B46409fe1442": { - "balance": "1000000000000000000000" - }, - "bbbC38f6a383293522d4aEDaA98b7d2D73E90A73": { - "balance": "1000000000000000000000" - }, - "B9E0b0B8fdA1001392c8fFd19f6B7ad5286589F2": { - "balance": "1000000000000000000000" - }, - "44740e3eedfD3a2A2e7662de9165a6E20bBcC72C": { - "balance": "1000000000000000000000" - }, - "7a2cb0438e7B9801C29B39Ff94439aFf930CDf9F": { - "balance": "1000000000000000000000" - }, - "4588ABb84e1BBEFc2BcF4b2296F785fB7AD9F285": { - "balance": "103010030000000000000000000" - }, - "A83DBBd6565fFfa42e0439e08c97D253427Ff86c": { - "balance": "10011000000000000000000" - }, - "0d8ccaE442eF5e452E4948FEFfe0b8295073E51f": { - "balance": "10011000000000000000000" - }, - "C955AD5b9292647669F7e6d544125E4632360Eb0": { - "balance": "10011000000000000000000" - }, - "f77E9dd83a22CAb28Cf4cEDad25792348249E8de": { - "balance": "10011000000000000000000" - }, - "fFfA11Ef0475ba651A3Bcb9BA42184fbcDB9E69e": { - "balance": "10011000000000000000000" - }, - "C58Ddc41e3af815d5C3c2d8a8465e0832E87005F": { - "balance": "10011000000000000000000" - }, - "8BAaF7C944452B8873C70aF106d129DCdAd99eEC": { - "balance": "10011000000000000000000" - }, - "6dA12F24F14Ad0eFeEA4fd0b1b14B2BF26009f17": { - "balance": "10011000000000000000000" - }, - "2379AD55e7d3175674EFa7A02E1c30cA423De9B7": { - "balance": "10011000000000000000000" - }, - "a49557Af4BE0d55c1C1C419C3F3f0067fdc9F51c": { - "balance": "10011000000000000000000" - }, - "A517e5AC9e2d409CB055F35F030c4598B28B3B76": { - "balance": "10011000000000000000000" - }, - "29974D953F6fDDA44c9c90677426719E086a03FC": { - "balance": "10011000000000000000000" - }, - 
"F9FD3859a9B3E0758328D7767c540C3397F28664": { - "balance": "10011000000000000000000" - }, - "4dE1f59d382ae77048CE9ec9412f41AF64e82016": { - "balance": "10011000000000000000000" - }, - "4cb3107cAcf8272363CEE5862724089F60f8c386": { - "balance": "10011000000000000000000" - }, - "153cA761ba7316718ab80A0C019A545c9A71e332": { - "balance": "10011000000000000000000" - }, - "2A847599F069E3745A1fc5b6bD1cB4185151bE99": { - "balance": "10011000000000000000000" - }, - "31Ca13d06166304c4E227f016EbDd1a1a50e7726": { - "balance": "10011000000000000000000" - }, - "47DE51228ec2650b4Fe5DbCFEec208f68f332073": { - "balance": "10011000000000000000000" - }, - "8CC7a93e38b1DA573b41E48E5e47bA16E1273a47": { - "balance": "10011000000000000000000" - }, - "027c60c37Bada21d549a3B34B6B8e049f828e0AB": { - "balance": "10011000000000000000000" - }, - "363FCf294ABD4c8CdEa84175EcCDa950942921AC": { - "balance": "10011000000000000000000" - }, - "C233C86dCe075e78F0e36d0B84552495672A6948": { - "balance": "10011000000000000000000" - }, - "756b80b05A9CCE51bdFE91570e54CB09DEF9812e": { - "balance": "10011000000000000000000" - }, - "6b65877717370650aBA981b88A99e18b556F122D": { - "balance": "10011000000000000000000" - }, - "3C4f9eFd8700C223D3B7AE797FA78A522D961A7F": { - "balance": "10011000000000000000000" - }, - "30A8b35193818CcD401C14BcFA4FCd4d30bD4562": { - "balance": "10011000000000000000000" - }, - "7ce9B5FA976e0Eb4273a15FBc11aB3114396dBd5": { - "balance": "10011000000000000000000" - }, - "0cF60c0aa43cdcacfB50D589D9B1C2972094d4cf": { - "balance": "10011000000000000000000" - }, - "7b107F6A7E2342e1b36A14E34Ade428A62A0B305": { - "balance": "10011000000000000000000" - }, - "bF57d286f91b4ec3c0C7520e80FfeB40544d3b19": { - "balance": "10011000000000000000000" - }, - "8E4224DA8ecD76956695345a1addC671dE73f70b": { - "balance": "10011000000000000000000" - }, - "c0875151593247CbBEf78e9fd80Ffef102F8Ca8b": { - "balance": "10011000000000000000000" - }, - "37165522688De1Fc5D9CD17e836cad13570D60A9": { - "balance": 
"10011000000000000000000" - }, - "c8AD9F1A616Ea390318894770729572896319e92": { - "balance": "10011000000000000000000" - }, - "b531171f24f987644e3084251C2F2B22b25Df5aD": { - "balance": "10011000000000000000000" - }, - "c51aa01A1795e3DF7eF0e511bE2184397E659F8f": { - "balance": "10011000000000000000000" - }, - "B192511F6CFB728f90424f40145Db9d201e50591": { - "balance": "10011000000000000000000" - }, - "0E60b3040A367fcda85DBCC8eBC6589f188e4938": { - "balance": "10011000000000000000000" - }, - "2305daC99567d5810498C194Df27959ca4FE5bD6": { - "balance": "10011000000000000000000" - }, - "d410913B5c05e6eFd3FAb69D22B85301EEbC37b7": { - "balance": "10011000000000000000000" - }, - "4C1aE1e2120aAaC99331DfbBf4DcC0Eb1430b55D": { - "balance": "10011000000000000000000" - }, - "f72A61C0ebFE1b646452381967377eF6E5378012": { - "balance": "10011000000000000000000" - }, - "B932b6e0445011A5a2F572A64227FfD645B3A833": { - "balance": "10011000000000000000000" - }, - "2315C905Bc60AD5b3aF0d72eD203e1475C39E38D": { - "balance": "10011000000000000000000" - }, - "fF8dFf8f27ac481d9C66740eDAadE82734f392bc": { - "balance": "10011000000000000000000" - }, - "Ef72830EA68559fea1F22D46d02A4418D8444730": { - "balance": "10011000000000000000000" - }, - "7dB7a7e101C213ff24378f490c4BBD3F52575a0e": { - "balance": "10011000000000000000000" - }, - "e7843bdBd67EF741d41d4a0EBb14B36C1B8C6AbE": { - "balance": "10011000000000000000000" - }, - "0646451cAea32ba86E0Cf9509704ADE400c57AAA": { - "balance": "10011000000000000000000" - }, - "000000000000000000000000000000000000ce10": { - "code": 
"0x60806040526004361061004a5760003560e01c806303386ba3146101e757806342404e0714610280578063bb913f41146102d7578063d29d44ee14610328578063f7e6af8014610379575b6000600160405180807f656970313936372e70726f78792e696d706c656d656e746174696f6e00000000815250601c019050604051809103902060001c0360001b9050600081549050600073ffffffffffffffffffffffffffffffffffffffff168173ffffffffffffffffffffffffffffffffffffffff161415610136576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260158152602001807f4e6f20496d706c656d656e746174696f6e20736574000000000000000000000081525060200191505060405180910390fd5b61013f816103d0565b6101b1576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260188152602001807f496e76616c696420636f6e74726163742061646472657373000000000000000081525060200191505060405180910390fd5b60405136810160405236600082376000803683855af43d604051818101604052816000823e82600081146101e3578282f35b8282fd5b61027e600480360360408110156101fd57600080fd5b81019080803573ffffffffffffffffffffffffffffffffffffffff1690602001909291908035906020019064010000000081111561023a57600080fd5b82018360208201111561024c57600080fd5b8035906020019184600183028401116401000000008311171561026e57600080fd5b909192939192939050505061041b565b005b34801561028c57600080fd5b506102956105c1565b604051808273ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff16815260200191505060405180910390f35b3480156102e357600080fd5b50610326600480360360208110156102fa57600080fd5b81019080803573ffffffffffffffffffffffffffffffffffffffff16906020019092919050505061060d565b005b34801561033457600080fd5b506103776004803603602081101561034b57600080fd5b81019080803573ffffffffffffffffffffffffffffffffffffffff1690602001909291905050506107bd565b005b34801561038557600080fd5b5061038e610871565b604051808273ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff16815260200191505060405180910390f35b60008060007fc5d2460186f7233c927e7db2dcc703c
0e500b653ca82273b7bfad8045d85a47060001b9050833f915080821415801561041257506000801b8214155b92505050919050565b610423610871565b73ffffffffffffffffffffffffffffffffffffffff163373ffffffffffffffffffffffffffffffffffffffff16146104c3576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260148152602001807f73656e64657220776173206e6f74206f776e657200000000000000000000000081525060200191505060405180910390fd5b6104cc8361060d565b600060608473ffffffffffffffffffffffffffffffffffffffff168484604051808383808284378083019250505092505050600060405180830381855af49150503d8060008114610539576040519150601f19603f3d011682016040523d82523d6000602084013e61053e565b606091505b508092508193505050816105ba576040517f08c379a000000000000000000000000000000000000000000000000000000000815260040180806020018281038252601e8152602001807f696e697469616c697a6174696f6e2063616c6c6261636b206661696c6564000081525060200191505060405180910390fd5b5050505050565b600080600160405180807f656970313936372e70726f78792e696d706c656d656e746174696f6e00000000815250601c019050604051809103902060001c0360001b9050805491505090565b610615610871565b73ffffffffffffffffffffffffffffffffffffffff163373ffffffffffffffffffffffffffffffffffffffff16146106b5576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260148152602001807f73656e64657220776173206e6f74206f776e657200000000000000000000000081525060200191505060405180910390fd5b6000600160405180807f656970313936372e70726f78792e696d706c656d656e746174696f6e00000000815250601c019050604051809103902060001c0360001b9050610701826103d0565b610773576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260188152602001807f496e76616c696420636f6e74726163742061646472657373000000000000000081525060200191505060405180910390fd5b8181558173ffffffffffffffffffffffffffffffffffffffff167fab64f92ab780ecbf4f3866f57cee465ff36c89450dcce20237ca7a8d81fb7d1360405160405180910390a25050565b6107c5610871565b73fffffffffffffff
fffffffffffffffffffffffff163373ffffffffffffffffffffffffffffffffffffffff1614610865576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260148152602001807f73656e64657220776173206e6f74206f776e657200000000000000000000000081525060200191505060405180910390fd5b61086e816108bd565b50565b600080600160405180807f656970313936372e70726f78792e61646d696e000000000000000000000000008152506013019050604051809103902060001c0360001b9050805491505090565b600073ffffffffffffffffffffffffffffffffffffffff168173ffffffffffffffffffffffffffffffffffffffff161415610960576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260118152602001807f6f776e65722063616e6e6f74206265203000000000000000000000000000000081525060200191505060405180910390fd5b6000600160405180807f656970313936372e70726f78792e61646d696e000000000000000000000000008152506013019050604051809103902060001c0360001b90508181558173ffffffffffffffffffffffffffffffffffffffff167f50146d0e3c60aa1d17a70635b05494f864e86144a2201275021014fbf08bafe260405160405180910390a2505056fea165627a7a72305820959a50d5df76f90bc1825042f47788ee27f1b4725f7ed5d37c5c05c0732ef44f0029", - "storage": { - "0xb53127684a568b3173ae13b9f8a6016e243e63b6e8ee1178d6a717850b5d6103": "0x4588ABb84e1BBEFc2BcF4b2296F785fB7AD9F285" - }, - "balance": "0" - } - }, - "number": "0x0", - "gasUsed": "0x0", - "mixHash": "0x63746963616c2062797a616e74696e65206661756c7420746f6c6572616e6365", - "parentHash": "0x0000000000000000000000000000000000000000000000000000000000000000" -} diff --git a/packages/celotool/genesis_rc0.json b/packages/celotool/genesis_rc0.json deleted file mode 100644 index 39332896408..00000000000 --- a/packages/celotool/genesis_rc0.json +++ /dev/null @@ -1,87 +0,0 @@ -{ - "config": { - "homesteadBlock": 0, - "eip150Block": 0, - "eip150Hash": "0x0000000000000000000000000000000000000000000000000000000000000000", - "eip155Block": 0, - "eip158Block": 0, - "byzantiumBlock": 0, - "constantinopleBlock": 0, 
- "petersburgBlock": 0, - "chainId": 200312, - "istanbul": { - "policy": 2, - "blockperiod": 5, - "requesttimeout": 10000, - "epoch": 17280, - "lookbackwindow": 12 - } - }, - "nonce": "0x0", - "timestamp": "0x5b843511", - "gasLimit": "0x8000000", - "extraData": "0x0000000000000000000000000000000000000000000000000000000000000000f91854f9041a9404aceda1f7909187796d6d59f842ed4aeea3e4b694d832461f6c870780263f9e8986b2cd1bb428290f941a13ab893ce00b72e8f146e2bb348e81d8e0910e945006b13983d03aac5b5ad95958cd6fd5d734bb2b94de7d3354eec658c6cbbe4a4b3f6aa6da63a121a8945c99cf35a86a2b495ec4dde1cca717ac341f945e9433753436f0586c77f8aa76d27a1486d8ec6047cf9413970bd076704cf24e7a88543ce0db073e3bbeb79493d94614c4cca93a97cdce21db5c674696e6dd74946ca0bf8bc24c5a0175a3ec133037814604f4fffd941354bc1df8844c82183646d4d6474a147b2a1be894e79bbc523556ba2bb457ade14d3394ba75d22cf194b589bb9df58e3fb545f094aaf07d2ec6a584abbc94e9efd6b2dcf46f4677f6e067869821a6717a7ad994c75404c5f4abea37b663a8ee5c1d76f3e35c4a1d9430ff181acfed6956e3b18a812f94849d98ff163594c72b2b51c72db728e8cf35b0e5c04938451a942294fc6092e4184014b8954612363052957fb38b695c943092bde74f475396a6c41dae1211dd26624a06e6941766e8a5d2ab98ead916bee43fabed08ce0c73dd94e8f113cccf0913b3bc69ae0e038194e285c6171c949669e3e907c90caafccbc668fe99c074c9e7ab6f94e1466d089588412920c3c3deb796ceb0c5235dc7944f7b30c72344c4270811a634a5bfe5cea1aa48fe946750196385695f6a693b40d689a18767590f5e2b949bcef1e207ffad55a35f19b6a0a8c282f16cb08a94b294662f6788a8b09f47bfe9f90fc87e52f0b9ca94ae49f41c0b5120149e5944703261ee8ccb86f58d9417c89224e78aba10546a0dc0563b73fc7056103994466bb2037f82eee8701f2f5013bfa9ad38fe421894bcf279d32fad69ba5169b3bc027ae7aac5e62c2094e03d02d1686b4e2d1bc09968825fe29458bf631f94b839fdde1578591000a2e5a77b50e3fa6f9bf7f394be480459079791fc3ed66d8e75892539d18426a59404e130dd50fc495210ff59d023c43c1d053d6e6d9475c12b971fef93f8a420b8724c60880a8806aed594c6782b9c3da0faacc7cd7fa5a0654485f3b7f5389493fbf71e611d10414707cc43b5bd8ab8380f3c0c94812b104f97fe0eb3db732a73f886ed809d7c9b7c94fede68be72688e4ceef
312459d6922b549e91ce3943925e708970e83a33bbd60586c180e24f3d4abe3945a80e4c6703d70013f64521ff59c34f1ef2d548b94fbff194380133db49891da3aafa39f1e4ac3ed9494aa4fc38b65706138f3455a7dc93351bf9f852ec3946bca4159a155b9604ed05132ac0e95a36c9a6ee5944280119413a36456bdfa511fb1fdc9f659a53e55946bad169434301c27088c8fcc663715e2a2947582943a5e24148144b0b2c1fd8418c4dacc9e823303339410a6c36ba0c4b61fd05ce93da7b2b380a34ab5089445fb47864ba49e7e1c3eb9d4f5c9e4a39025d5d3f91324b8601471fa8b5ffbe112c6d894494a8f8f779e3bae02b3a2623924d0f98dd0e0445a64951bf59365a2aa542a10b2dc8e0c003c1d2b17bbe146ba12da2b7b4ce28d17bce7c5d099bd59b8b372159c3b156e0908f478f473762b0dc7c71146f8e74000b860fb11c8d67fd2634619b3ecf3feea2c616ecfc42bcd841f88191fa15d197399b00e193658de8a9bc7ed98e395a13a9b00bbd4a84341856a3e10f63f351992dce6cb7b8013a44a6309960ddde3b51782074d55a0401a81b3b6689b2224b3e60880b860ec18716b5a0b7db9f7d61481173fcf2e6b640fb13b6f06694552d471814e3386aee01af7e44c8e8c8b98999094f32e01c86646d21dcbf54b660bf5b16dd38a65e7a21d72f9c918b641da2a87c73a4630053177b2119f86f947895ff9ea000400b860abc62865f1b21c10040e53c048fcf1386b33ed914b1a1e229bc8b239ae1e1a7f645cf2a45ef2b1eb495545536a323701266d0d38a1ddc1251a83534d40ae9f5d370f813a127c288c1cb272da938e5af20795afb9322e30447add8e1ef30a5f80b8606904af78424d3d4743001d790d8a3a6f96c5464369f8d65005fb59eb840803208a7334720985eee63e8727417be2e50089e4d9076a8e4aafdee89461223cc4cd7b79ad4ee43f4eb5661ae335d95e90549a43c4b39f9497fd59f48ec426b99f01b860b8d5cb9651caa4533c9b4a91db9dc16f7e02bfdef77f73ec8fa36f64dc19e46858d6fa05d6270337e950f4b257499e00a6fefc966bfc28bd2109bc5f8830b5ad891bef81a000cc65259ccdad1fb850234de80da88e4e4870c7dadf85c5de5a80b860162afae7824ae865664766453ee3e595b2a2258715d9743ffcbeb21ceffa6b5df8b53c631c4da52d7a728df6739525001090659f2a58fa425334a594dd43cb3d7009814672cd2ff980eb5643e397477253f8e4d34aa413332efd1ce88cc34f80b86040970e807f3eb28288448dc5487f382dcb5750cb103cf2c71432cd3867dc022d75649880f29dc0dd2bc0fb675621ab0181a7d087faa9f1bbe85f47d3a890dd741c32bc3fc708a1b3af1e239309bd37ac8647a4c948eeacbb3
8512201b7df1301b860abec982653a35765ca6dbf813c5199922b2b97c5ab177829153902a24dfe2d34a89446357420111ebdb0fa794d550b00b4572f1e0197394cfd3857a4fd133e8e420b78d2e26d16506285fc87dde2f48c06033f03891fb10fc4dcece988c4be00b86037bd28a0bb83192056f1217be6991d3a2268cbd20f4a3b7da0d5c21a11ff9b67a4ded41e171c4075d15d35ac6f9cc4005040306c5385895114ae73a79d18860a078015a7695bd80751a0075c08013f45a78691146422875edb32701c5b330700b86062febf1d5e7b4372c973108e220c4010634afa3521996982e562379fe453a5dc068752889eea30339f1b7524cfbe140157ebcfa712313883ca59534ef17847ab394e0ba4ab412be660a1eeeccf7d32df80f25ee3bd23285bb278274f4f169880b8609a9a7809c80f53af392363b4ca8bae4ad5dbcaa79b8f742ac67cf1907c06ff3d99eee5298cce0f8658611d3aedccb7003528c49ef308b6d3079651751e27780ac6386dbf690fe01f085f18a59321d110cabe425786dbe7141cdd0ee79a70ad01b86032052e6f9f54800cca7ccaa3fcfb564ee577dc7f03a216c564d3fe4821f08aa2abb29ff5ed857ea36a9e1e54b2c64c00ce20ed5c771e38056e35502e1680d821e2d1aee6897365bd7bac665f7011b546dcad9231ac2fa2c83b2eef8b984df280b86042e5c9147aedbcae2c5e662fb3e74655f47cce7c8e4a3f64b97dbe6ed0089a375518f7547275952bc253797c35b74f01f25de8bfca9a0d7fa7d48f9ebe0f98f987383ab02c207ec5067275cd34dc4dad11e61d1c881c70da4135d656ccb52081b86051f56e8ff81ace6ca4da7374c60037f9773da298da52e9fa5a7d03865d33aca335d1ebc5540e92a9641f924d25be81018fc65ba8514e3270ce8215f89fc87978e8b31d50f62e2ad3455b964a72279fa7e816c64a4c713ed1a0979c4d60fe9901b86018d34f82f1e3eba03281b1ca3709989985a0e5ad36197326a251bf4325042bf01d8c479ca25fe29765b0d8de2813e600f2b30cf3672bf3effea745fd7f068b1361bc9b40f4e3f4a91f7f217fc248a8eb3fb1942e00e78ecfbe299c7918571e81b86062d109d216febdef5211998b27e9fb7fb4044b75cbbdaabc4acd4674e1b99e3ce94b1d8fedb163a208457512eccfa5008d1ac6b392dd6ac18062922af75cefa080e4624e847e843b1f9cac2833cd548890d50ecdd25532366ca51fc99afc8200b8601eb16de3bb6aaa2a1bd915e839996b9b542dca6e572aafdb40e33107418f878ab98ef93ab3356be7b3c3b56bcdb20d014247e669ff6c1123f66e4bf5ed7bf034eead1dcd600b0662ca786470cba77806b1a658fb08b345dcabc3de2579a49801b86061eee673d944148fdfced
4a78fe3d68609ee58ffc63b4edeecbc9e19b0c204a1c7bdb54e7e1a8cc5b5b82f28799b2d00b1fdac61430b7f10f30dee9ac3c1467b3bdfcc3911c3a1edf461d5996dbc837e45f4b4221b893e52cb589f2f72460380b8601f464fd2676b15885a65d50fb35d79d3938ea43f7d27ac62d7b234c6aa671d0441cd5becef867d50573dddf7596599016da243a1b26115797b9a7ab7e7a5d0dd237053a2b23d1f6aa05f519ee298ca793221c3e08031fa844785626a35dde480b860ed753a819e47259cd8a0ca333f5c53260692bf438abfbda1e52518035bc72a33da1c99bd84655594dbb3b344b0506e00e8d60a2bdc2304ebcc704547c6ac230137ae4aaad2abeca9046879e434cb0cb28e80407d9c2c88b59bce0da2e53be100b86000380f86c503d051da502e5c881da7a25832215d239c66a55859353fc370fe8957f6975c99d98bf3a99bab5bcd560f01095308369e997f514c269665c70a7c3c24b50fe76e4dbe4d06368ef4eef5d100af992bffae5be31bb7391243a5d8a900b8603867d54f0eef3ea518e1eb47136caccc9fc6e2462846b8d516cda21618942708ea94022a7e7015111331883d79c35b00b58d519c4f5a2878230056b4079fb8d86756ba153128a9b6655dfa0eb511be3214ca5fdb2737efeb0c7cf4cad4e74400b860106f14f677bc4b406aba98e3c53db4f1af54d2136af68251e73368c3b232fa22b4a84f0a42d782a148190a8ec4569b00a942693b745a5969aaf23ba50eb30dfe302adbbdfc9b426e3480a60e254078665b8539db4be6ccda80b2fd8a37378081b86060f1356d4daa6a974a36c09da90b74a881acd8f2ad011b2501815cc05385f7f45f19a8ea9b69a61564295c00ca870601c8a2b2103b495088790c6930dd4050462546a5cd5c5d2f6eb96092584dc922a57530086a6190c131bb24cbeb012e1881b8605b07536bcbda46c75f5d9aa172fa987288e32678b4d61f02ce1045d7947fc38eeeac4512e827452491f8441f78ca8701881ff2fb683555a9b540853ef7f924d532ef045e49992ca1194d7a1a19fe01276e9981133b6a8e8df23d112bc411a181b86094384f0d1aabbb74f4142f4912ac6be043d9188e37d10780bc18ad18ecdf7adea9b6f92330d08a8b36a39f7e537e9300c6cbec405680fed4d3ac301b29bc29400309b39899011c1b950eec082c189f8be87e9790c7364d470654052cea20a780b86040e0cd46282a1f3475f347af631cc407b2b8b16c332e87024ad048b1289066506ace966a9185695bb72c485c020d62009ab316ec66e3773d6f63f425b9d54cde8ac0168d56eb18bcfd4b47527bcbf2d55a1df5012a27a2b59a7db07a760aff80b86012e57b96d7bbfa33f1b4cb066c962edf6e136b728b8e7ebbfa806165e93a1
b45c534d00ec4344c70965340e87b459d00c9d54ca07c109d116ecd0422829cabc327ada8086085d7ca848cf615974f1d5b64041682fe1f913ec6fc7b1cd107a100b860beaa533350443f1a6f87987b29bcebf023ca1c9ba89b22d353b4abe7bce9a973190334ccfedb7d76767cbdcce9974701c64fbd05646fa4c28489f69c8fb27a84cf72c63f2b58c4eb0b15aecb6788c93da276f622c3c99ba5ce61a27da47b2a00b8603eb674128b8d4b487a5cda866e7e257768b516718528bfda6c444457fdf460deb230fd4b50718ed3d1d72b0094787c014a36b3662cad9b88469d774a5dfaf1f02b508adec452f60768ffccea4a389005f2abf2f5694ea932568d5fede4427200b860f46afe6d3b8ad94feb0fa9a20a4877d6b0ff82c2f9542d86957f32ec1399d2987c63dfe41b208200ff0c86e4722cf000ebddcc9700b46dde2b69eb52b9eb9307febdc8e830eb12725298aa686c5fa588d069d578e000d12a58f17fe5907d0a80b860acbdfb53273119e76ca26130ba32c66333d41aa0586ef415bde0b7cd2af2f298f37303836694d0983fb35d4a2dbcf600b0bdd4d5704c9055fb59684f537459dacbae8a5bb5fc8c1333cd612946c2f5195ebe9b19736ec1aa9c55c730b0935a01b86032a91a2907a2007d9160bdf16cecd0702f97e7ae4657cb1c2087c8da76eacc4c041916cb348e7f8e60b04794d3d21a019d8ad0d1819afd9161b0b33251e7c016f44cedc4fa3d76b4676bccc7533e3bfa213649383c5b3e2c6c83b31755752381b86088657485203bdb3490aac2df454f72e2c652abb570ffd28dadb5fa4772eaedd56f9af322b27fd92a815b90301fce210194cbd1030ada7556caec5e9fd4343eac25f6306d51e5176416f8dde221e319f43ea3c0bfbc7c9ada83b222f4448c8b01b8601ef3b3873a45e5ae5147333641c3d6892c445ee1a3c43fddea12b15dff203ad487e5d1800382aec4185cec0e2114c50098cbf0a1728b5c1ce87baccfff3e72bff2c51bd275e26ca9f624c313f02731b03725ba88fa8f3bb21c015f295b281200b86077c3557f788f31a6571ef0ab7492b4a50a27a70689c5eb9b60f01853ea6bc9edcb833a34eae060f5f90cae8501667800235c67723b1ec7002bcf1acb594b8d1c8df2f2654e7eed0cfd8cd979e1325386490552f107201d940011516090045500b86077aa64da8d4c5790e88381a5010e04f2b6833656b59656b7c82de481ba30dc4722e6cffa3f1f4279e1d2feb52c253f002ddcd8f12723f20c026800b1cd92fbc197fe526a6894f1f242b8d5db49a8328fd8219e1f5e9637df733f849826fcb680b860f10b21bf22dc2b49895224b0c3ebd135eb90dcc5754fce587de7a57825cb5317fc7ab79d6da70c5128bf92f0281d7901e7a08
cf964bd241c348f5a2d9ee77dc4e52b1cd18ff2a9297442c7a39b31bce6da19cff512172cb64c96fecd41625a80b860b704769c2cad92f0935fe5591ce267ee42f7295e986ad5f9aacf521c3c424a18a8e3e14fc3f4a98c1d3ddaec5e3cee004302721b546f8ab1dfdb7164699717b126c2fdadffb64a5a30a545dea31a02a88dfebbc477a92dd5d7cc5c66b76f2480b860c2467f28f785cff88cf0b3c5243886def3409c2def841e905145a5717ea60994e3e7495220ba96a5f4f469850a9ef700b3a8f5a1747c57de8e88bfbbc839be7f2acaa40ba53858e2d37b35250829c6089ba1fef98fedf6e744325c90ea85e800b86055419af0800fecf25e6d92574bde4bf41168c7550ba600f156a9979f1af32b77f5b63368b39f92ca03ac1cc4d6b37d011240c13b007863c43076870e56117b9d4026b1268d7e5802ab1d28a492d974564de4bbea06d2c8db1c471286e9150d00b860336a9e16ae5745ad94a766ee1be7fc7ba3ce45dcdd78965cd238f2c63e5bf722052c8b436c7ce5512bee110e97677700323ada1620ab275f22938dc4a7ab2e07d14864524d84ba87db039ac19a18eafbabb287639f1177f4dda404bb566d4400b860a00fb9bd0eda3da5d11309fc2b7d2eb21533d6db0080c9ee4a75e3acf5c08e8e4e6565a3befc47729f0cee9c816b6200427720e0cba9b44bb20ff59752226c77941c70952bb02189e48a41b040c786d7d1f8cb69b8aa22ac42f3b8d2988f8381b860e88d29ab31574603fe9ac15e78924dbff6b48035208284e02099fa25e6d6e984dd3bf3148631a7544ff9582fbe204600e62b4fe24725064324b08c1773a900df6272fa5caab21947112b886fbcffed78cbe1e94c5caa28782cc2ae80c1c32c80b8607bc9faed4576390b7e26760c27746e8c1def4dd58e50ef1060434022f239aaf8f350f919d848dd313ab9c6497277360025ffcbc4e1f6a5346ddf9d940c0fd9da861759b998ceff0756cce7721801f31abb2487b5411d91fd440f5c0839b01100b8607e896229363f4e8f305b1ba8afbbc37d28d0295305fe4bd24fc74416e0bcbe9f2f21b8b67d190d4d0e1b158c1a705401f95347503220f265be2973511cd0da96e7070503c688f3ae20d932addcfd6156857473f083a195d38e4fff6031f6d700b86079e2e7d5d170514d1be612ed99c7f44c3b32e40faeb8b063e903439850aeb2563342223f1893fd71569f00a9fceb5d014310a61950969f89768a1219562119234173d6b1b48dada6e7385576c7ca501756179c22342eadfe25b57449812c7881b860c63e20aa60fb50e67f4759c812788e767b307b6e0fc9f04fc532bf59830d496f40e6ef4773b3a3158170ed2355e80c008feaa696564f896babd926d4eeff2625d20ddecd90caa
1779e6f0316917733f5f3b4c89267b95a779e3bcd8e2b9d3b00b860c7c82587e95c7c13eafc5b30ae4986b1907dc586f0283309d15f23fe5b7d600a8dd8583cde4eaa82e1063293ee361b00ec9eb79f8e5e2e0fd84fd56cbd0bd472b9265217cc399d4f967f37e56880b4b1fed159ccc44fc52e7700bdfa0494310080b8410000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000f86480b86000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000080f86480b86000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000080", - "difficulty": "0x1", - "coinbase": "0x0000000000000000000000000000000000000000", - "alloc": { - "0x36940810BfDB329B31e38d3e97aFD673081B497C": { - "balance": "120000000000000000000000000" - }, - "0xfCf982bb4015852e706100B14E21f947a5Bb718E": { - "balance": "200000000000000000000000000" - }, - "0xbA8761304CEc7bE0f83C6F8Fa7EBBa3eE0b6Ae97": { - "balance": "27375005000000000000000000" - }, - "0xDb39DBE5abE42466F122b24c44518b1089ef8fC8": { - "balance": "57291662000000000000000000" - }, - "0x671D520ae3E89Ea5383A5d7162bCed79FD25CdEe": { - "balance": "20000000000000000000000000" - }, - "0x868E39ef6c51b979526c15Fb801a7dD567CBCffF": { - "balance": "20000000000000000000000000" - }, - "0x4eC7a9e67FB05f555d775604cF591Ccb3C47d7b9": { - "balance": "20000000000000000000000000" - }, - "0xaB5FD29Ce411C7b5c0c41d04a11d40f8fbCdA7a2": { - "balance": "20000000000000000000000000" - }, - "0x14F40ca7600B5605D9fE704A949DA41806509497": { - "balance": "19053338000000000000000000" - }, - "0x469be98FE71AFf8F6e7f64F9b732e28A03596B5C": { - "balance": "20000000000000000000000" - }, - "0x1cd43c11CaEaf4a43413258eF7E0704e157F7811": { - "balance": "22000000000000000000000000" - }, - 
"0x9268f359CAD800a2746FBC96C4A5CF68eE1c4EC2": { - "balance": "22000000000000000000000000" - }, - "0x10E0833be035607feEA41d32b6567D15D12511Ec": { - "balance": "22250000000000000000000000" - }, - "0x8f55CE88b4F62F22c663f5A539414dcCeF969c32": { - "balance": "11000000000000000000000000" - }, - "0xF607d4dd519B4bc963C9c48E8650E67C51DbC35b": { - "balance": "11000000000000000000000000" - }, - "0x515033209a0A29034DC3F037cC72a6014b902341": { - "balance": "10000000000000000000000000" - }, - "000000000000000000000000000000000000ce10": { - "code": "0x60806040526004361061004a5760003560e01c806303386ba3146101e757806342404e0714610280578063bb913f41146102d7578063d29d44ee14610328578063f7e6af8014610379575b6000600160405180807f656970313936372e70726f78792e696d706c656d656e746174696f6e00000000815250601c019050604051809103902060001c0360001b9050600081549050600073ffffffffffffffffffffffffffffffffffffffff168173ffffffffffffffffffffffffffffffffffffffff161415610136576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260158152602001807f4e6f20496d706c656d656e746174696f6e20736574000000000000000000000081525060200191505060405180910390fd5b61013f816103d0565b6101b1576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260188152602001807f496e76616c696420636f6e74726163742061646472657373000000000000000081525060200191505060405180910390fd5b60405136810160405236600082376000803683855af43d604051818101604052816000823e82600081146101e3578282f35b8282fd5b61027e600480360360408110156101fd57600080fd5b81019080803573ffffffffffffffffffffffffffffffffffffffff1690602001909291908035906020019064010000000081111561023a57600080fd5b82018360208201111561024c57600080fd5b8035906020019184600183028401116401000000008311171561026e57600080fd5b909192939192939050505061041b565b005b34801561028c57600080fd5b506102956105c1565b604051808273ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff16815260200191505060405180910390f35b
3480156102e357600080fd5b50610326600480360360208110156102fa57600080fd5b81019080803573ffffffffffffffffffffffffffffffffffffffff16906020019092919050505061060d565b005b34801561033457600080fd5b506103776004803603602081101561034b57600080fd5b81019080803573ffffffffffffffffffffffffffffffffffffffff1690602001909291905050506107bd565b005b34801561038557600080fd5b5061038e610871565b604051808273ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff16815260200191505060405180910390f35b60008060007fc5d2460186f7233c927e7db2dcc703c0e500b653ca82273b7bfad8045d85a47060001b9050833f915080821415801561041257506000801b8214155b92505050919050565b610423610871565b73ffffffffffffffffffffffffffffffffffffffff163373ffffffffffffffffffffffffffffffffffffffff16146104c3576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260148152602001807f73656e64657220776173206e6f74206f776e657200000000000000000000000081525060200191505060405180910390fd5b6104cc8361060d565b600060608473ffffffffffffffffffffffffffffffffffffffff168484604051808383808284378083019250505092505050600060405180830381855af49150503d8060008114610539576040519150601f19603f3d011682016040523d82523d6000602084013e61053e565b606091505b508092508193505050816105ba576040517f08c379a000000000000000000000000000000000000000000000000000000000815260040180806020018281038252601e8152602001807f696e697469616c697a6174696f6e2063616c6c6261636b206661696c6564000081525060200191505060405180910390fd5b5050505050565b600080600160405180807f656970313936372e70726f78792e696d706c656d656e746174696f6e00000000815250601c019050604051809103902060001c0360001b9050805491505090565b610615610871565b73ffffffffffffffffffffffffffffffffffffffff163373ffffffffffffffffffffffffffffffffffffffff16146106b5576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260148152602001807f73656e64657220776173206e6f74206f776e657200000000000000000000000081525060200191505060405180910390fd5b6000600160405180807f
656970313936372e70726f78792e696d706c656d656e746174696f6e00000000815250601c019050604051809103902060001c0360001b9050610701826103d0565b610773576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260188152602001807f496e76616c696420636f6e74726163742061646472657373000000000000000081525060200191505060405180910390fd5b8181558173ffffffffffffffffffffffffffffffffffffffff167fab64f92ab780ecbf4f3866f57cee465ff36c89450dcce20237ca7a8d81fb7d1360405160405180910390a25050565b6107c5610871565b73ffffffffffffffffffffffffffffffffffffffff163373ffffffffffffffffffffffffffffffffffffffff1614610865576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260148152602001807f73656e64657220776173206e6f74206f776e657200000000000000000000000081525060200191505060405180910390fd5b61086e816108bd565b50565b600080600160405180807f656970313936372e70726f78792e61646d696e000000000000000000000000008152506013019050604051809103902060001c0360001b9050805491505090565b600073ffffffffffffffffffffffffffffffffffffffff168173ffffffffffffffffffffffffffffffffffffffff161415610960576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260118152602001807f6f776e65722063616e6e6f74206265203000000000000000000000000000000081525060200191505060405180910390fd5b6000600160405180807f656970313936372e70726f78792e61646d696e000000000000000000000000008152506013019050604051809103902060001c0360001b90508181558173ffffffffffffffffffffffffffffffffffffffff167f50146d0e3c60aa1d17a70635b05494f864e86144a2201275021014fbf08bafe260405160405180910390a2505056fea165627a7a723058206808dd43e7d765afca53fe439122bc5eac16d708ce7d463451be5042426f101f0029", - "storage": { - "0xb53127684a568b3173ae13b9f8a6016e243e63b6e8ee1178d6a717850b5d6103": "0x469be98FE71AFf8F6e7f64F9b732e28A03596B5C" - }, - "balance": "0" - } - }, - "number": "0x0", - "gasUsed": "0x0", - "mixHash": 
"0x63746963616c2062797a616e74696e65206661756c7420746f6c6572616e6365", - "parentHash": "0x0000000000000000000000000000000000000000000000000000000000000000" -} diff --git a/packages/celotool/genesis_rc1.json b/packages/celotool/genesis_rc1.json deleted file mode 100644 index 7cdd781b32a..00000000000 --- a/packages/celotool/genesis_rc1.json +++ /dev/null @@ -1,133 +0,0 @@ -{ - "config": { - "homesteadBlock": 0, - "eip150Block": 0, - "eip150Hash": "0x0000000000000000000000000000000000000000000000000000000000000000", - "eip155Block": 0, - "eip158Block": 0, - "byzantiumBlock": 0, - "constantinopleBlock": 0, - "petersburgBlock": 0, - "istanbulBlock": 0, - "chainId": 42220, - "istanbul": { - "policy": 2, - "blockperiod": 5, - "requesttimeout": 3000, - "epoch": 17280, - "lookbackwindow": 12 - } - }, - "nonce": "0x0", - "timestamp": "0x5ea06a00", - "gasLimit": "0x1312d00", - "extraData": "0xecc833a7747eaa8327335e8e0c6b6d8aa3a38d0063591e43ce116ccf5c89753ef91f4df905559427f326ab753d62cdb55018c62e9ee7e671bbb121947eec94733d16b96c6fe877464630bb5be1e5c3f294f0d17b624521c0a599b063d73a73f9719307b48f94a7681958b7f07f4fb8bd5b0e075fb2aa2b021d5994ecc20b5f3d6cdf41fc12707ad7872ff64256e29794e5617da4dfa6ce4912f1d39889fe2dec2f7e66b39439ec4f2a82f9f0f39929415c65db9ea5df54e41d94341dec14b7a56c242ce9cf939815ec7bb11042449469901924e6c045a03cc163c6b8ae8af80fa1ee80940a6641d4736767d1f2bcf2450200ef384391b4419466bdb4d2ff2ee4c68517e6fcd25cc3034c86916094606311948f7426ddfd23c1521b15eddb52e83b29944cb90ebba92141ed3021f5dc4e6c8bb642095846948c6f9aad8281a21e7f6522602f2d6469c950e0bb942a1bff2452aca1ca5ffabbd34b2744109d11e4f594b4fa2d21b238e12ee4a863517cb5092f2330cb1b9442d441b6793e6162b979fbf6ad0af0063cbec96a94436d12f639a32509685080161ff0365fc15545f094e0c5f6172673ad70a76ff264cbc0df783930b47d9421245b0a2c3235f1108d1aa01ae376849d36e68494f27bb4eabc4400a1abe9d80d7537ab0ef1b058bf948440e805b89f48c932265e3c4ad033813669d87a942eb79345089ca6f703f3b3c4235315cbeaad6d3c943aa1fa695aa89958ebdb5346d6760b72250dc1d794173c75c8f1be201ce89c
f426fe12c9997d7096269474288dbefa3a55986c039953b67139a466474fc49482f0e7879314516952f7961b15c63fc6b2734dfb9459f7b67e6beae0223ddc91eec010b670c553e8e094e10a8cc6c22cdc320c67bd600a1d8a0a46d7f40094e3020350aceea29b783e0c947ae001692b8f624894b952930a3656a9cbab21df5919f94c61a495bf79942289a63b4b4700eeeca35323ea51785f366dd705940223e40d1f93a6fe5bef63605992ada10740e13b940610b8b4e6f5c3241d53ed3374ddca8969cd053c9463b4b616c5345e3dcc9e21db69297e2129447f4e94c9a7781729f95b88239c3bfc91fb52f92b44a1169456259f876eb6a7264d9f1a59952baad599ff964094f139e74adec329e715ab49a68c5548a00e40cbc29433657019d60a0a41f1b9970bd4b28a3a83dbeafc940f5640bd556b0be19262d1817b213ddb3424d91d94ed8bf82d2e579ff6363aef139f8b147a0105f17e94d8c68ebecb6f074ac5c4fb66a690ac0ad38a5a3c94c6f916ad6e360651bb95f8e67c1c28805745d084943ed95d6d4ce36ea7b349cd401e324316d956331a944a03c4c2e101ac4612d89b79f61c9c5bdd51929d94dd0f3f7beb37fe9d4496f8098446b65ddfb1fa0294642cbe89a58909ba712dd11ed4c4b2359bd8c85d94ffbcf262c1d5c4392ef469ba79f2cd195d2affda9468e0104fd2b5a2c93e97c2ba172c4d2a4223f76f945cab520442de9babc290b25e5e2e6a1194ec670794bc6963fc0e2f5547ba949ed39e80b8388321104f9443882141555003b3e71110f567373b59ac4cb0bd9448cedc58b10af13d688631bc3cb78a05b8a6e56a9465698c9ec5af10345cd1e39472d60fb6133bad6f94d507309fd69635aa37810a65a4da27ec47a1ba0594c46dc0741ff61af883e284daea062ae7382e709194198958f0b860ab0e3937f468fe366aac9eebad2e940ec5a403212d732d8d7ced050e9510f6327453c694464cce7999b3d90a8f1ffaf94a71fbdea1e6543594097095b8cf5cabf0c39e548d8df55dd635d84d0794aa937da037e617e868795eac1dcd43c663014d3294f11073eb2d259b90a91954cae30d0e6e9ecc7f11945b55452bdba5971d606f47647bd383f3c3fa728594439d5e4d7578ecc9efa52a8cf1887b11fd0fb9009492f628b0157d47c992f5c69dbbd038b110e27826f918e2b86005f784c52234d0c40ab13b0636c5728217a8f3100046593c30271d39d59fcdb9a0053a874710e9c65e0d8a030c16a700b4ec430207cc5037c5e2698ef839fa0b7a4372f4b5217deda2a8df087e3b552f300016ce9147a2ff4db4c8a87d77b080b860d3f640c3da685893c41eb6301fdde2920c5a09296438abe5894074592f27547fb4ad02d222b1a330ff5aea065eff52
00973d4fcb57808d8d9b09337e09aec279dc3337e758c51a3219596699006ee3683ff0c733f16eb917fad37f7f5d505201b86057b5158540d1672de49073e469e5426766094ce6514ac29580803ff9d22cdef3b6fed28a8ad9de63bf938c1c7b0e7a0156fb03f5363d09bf337713d8d45475df48ccea0fa793f591fbb62673545fc846c04e6aa9e2eaaab72c3b9ba4069ea280b860ca75a4c4e1f2898c0ae0ed53698b0271d1f8869ecd945891ae6a9b02682335af625c29f80572b440f91a3a343f3ea600a49a426960b6c1c5dd7b24540ecb745329312dfdd525fd173418c846f0096acba466a27446114f3a2b2a887986846b01b860b3dff5505fe9ec48fa11a67dbf166e7d5ab3473ff9f5fdc90c690eb1142ad687b17eb8ab0c08615c9397b351c53e79016d593d588f8c607b2d1136713d05066ffd8a30589357afa54a0a7ac8777372787261dde126c4f0c04bf741f809c3be80b860451da71ff5b1e9b4fd587fd76f53808191096cc5869596eda8f3ad697426a98a659f5f4818e57866d4b0dd90f0684900742a167b314f0276e5e6a8bc6f40ce1cd1d35c7f9be56f1240de596b30a4b40010dfd21b7fc9aef400faaa421e27a080b860277d5eedd8cf7d549296a6f8ad2869ec6e90183b9e8ebbf49a9d3ecd817757b50c6acc6cab9f0dbcf4b87251b6e82a0192a00c1e509225ec1ec08dcbeb04ba2808b16f72af90f33277e614dc19383256b3980421441ce3e75356ee35e5f62d80b86007431d3192f7a273382b645bc300cb0161c2395c005bf389698ce4a3ac56cffb9405456102f5b6e41b42bb28e7cb1f0141d3e1be1b066a94cc2befe88b0f5eef63c60a1ab8a1f0cc02e73c0ce5aac464332d3a2c1074a5126807a75323280301b8600d39e1cd00ff4ce5e41fa9ea90526f3e3540b64ba52954e5682f31ec20360b2995d5caac7cee778ff7c13aaca74b9b004032f87a331867e9ebc7ed06c5b77707c88614714f2e0ed30c85c519aa823aa9d1a7a56766a294c7802aad300b408280b860ce9efb226a3c085071c7aea84f66040dc16a1600893eb1e6ef507fa72c6684f24d2c3b4ffee9881142fc41cfbd6b3b01933838ed7bd05c16450cab703b12ee623f8745f7f618981601a9c4de1337fe881975bc8572536281b9dcc51400177680b860075724b0f2fa9ced3440a76107a53b02a0b8b5a5a510d94b02b5c0696db96910204d54ee3138b3e5949464585d98de00a2964d480ca31adf09c962802895ea160d195bce583e59094bf7c11d33eaffef12cb999284bcf842bbab2c87647d8801b8603b0ffa0a101f959879b7e14d830862981d6c311ffd440f214eee0a2891743a25355e9230266aeadbd20f3bde0ccb30008bf326df8fba9b701d4764ad49663da1941713
2edf4281df4560715649491d9794dc64e9540d16b51b362e139c3aa701b8601a65e75f0f47d7c92da92de15372a0052338b952cc1307d5ee1596a670fedeb3b5ac8bc690f3f5e5f34fb8d18a5441011d6023e3c2b343bffc655972102a7181dffcab47bd0a2d6d46eedc6f2b70cd64025715de2b8ae23c032e61b380663c80b86071ca0f2be942a733074e8e0d92ae0adc2617bcfe503b97f8339c8f1c498d85608fa5966da7abdd9651c9ce4e3411e900c4d65d5c7d858f1e5ee212c9e543d3727aeaa9abbdcb68848e7c990c06aa70a4bf2585959943935549e7a9e2c2ba0d80b8600122d3f6ae5e2bc6fdee3cd35a3522ea519bfa6f365451fa275c4688bcfb5cec418fa053a8859981e973ea273ca68f014e840d0ccf18cc7c498742103c3116732ef197a9f90f50bf64f83754a86ad441fe9691bfdf384a1aaacf38430a2b1001b8601135b86a2587c75c55db8274e498bd3f2887a1a455a665b8314655124c5e8197d3ef892f3dea2bd80b047530f837ae00dc263da929f0135a4f8ee73d47d5c2ffcaaa7b1e3a9cc59e496feeea2323e5eb16c9de2e3b973c35c0a82ca587477201b860e162bf706b79405326d52e29e2ea247a55ebf69ac34900e4e1249c3fe28fd3dabf40d7022f57bc7b16bedbc78bb49000ba7cd92e123b69309a0572cf49b1bdf4c5525ba656decffa5da91ed4fb565d384979e9a58f72986b654908bcc7308580b860e9dbc5c1d073954ad6ace9afb19d8679e8c87fd12cd1356999a8e167390f3de966f3485b500c40c61f829e12f56512001b4d4c4fcb19737735eba15d24d43231ce64cf348de3765fdbd01f4612af6ad7ebf9fafb594ade18a7717c189e9c2080b860bd8e3d6ad24e5a7e4f084b3142c6bebe26d248e09363a9bbfc9cd07ed59a65bc140a39564f6ecdf287c2ee1ed57a930035a92306b6d5925af8785c813bad6113ba42da88d3c0586e438df75bee506bdf9f10c927493524897ba485946f976f00b86030feb59a6804a2df697b3a40c691f176575ffb0d09fa167a59642d041dfec97e97414ea7abc8bc7bdad5f1a684e2bc00fb2c01bc03aa9e8caaaa82ca3210de31f9c7abff6baba5e08e2dbdfa85953ac39bf3b895627e7be493fbb6e346061400b860f24f6f1e6423fd83d1ef275484a5950fed90794d7bfb6224de768bf716646dee263de0d9ed4c681c0def78d16df0f000f1e17b8657f1d8e30dcb7e94cd1c79b9c82d8f04252b078ccf3719c7658dd82b9d93e02671925b5aeb4ae38163129100b860f6a586f1ceb85a23980bdfea738678f5906de5303a7ff6ceb3ad675f93a84fcc502275fd7bd8fde4de89fd381cb8eb000b3d496e248208d00eae4f421ce041d73789e794a372ca788655eb5413c3b7cab1c993e47ff6c5
68ca4da484bac88b80b86086ab977d6064531b136aba2fbcc50cfb332b939e0af22c7ff3800c804b33aad998ef69a97204076028a6946d627d2f0086992cba351fd538a91986253884325aba6d66ec1ac10eb8e18330180820cc388c22de7d64f9f0592b74ee5b35745a80b860fda02a60f25f87c0fc86be0bf52bf1d0be37db80661a2fa35c7f52a4c5f5edccca04768c762caaa2303f8cac2ba93e016dce7c8382e3d002e751e0bf33563eb8f6e7937ab2998300e96f5ed3d857d081ff389c495febaabe836a7cd06012ae80b8609271a103e5971edd694b3c6363a2517b0676849b05660094baf240ea9677e171169a8eebed5ad6000871b4f42d4ca701b74b10f91c0de059d225ced0222e391cf7bdb5934d1677193837d5a1f17bb710f4747956d819db51c817cbba53324b81b860719b2de30cbcdee6e62445b7c2eedb63520beb4cb6e4a55f7e121a5bc6a1143e1322d3ffea28ab624d30f6da8962b8003eeb056df8e46a49035c0aaf139a372e954d8ebc3cbd2d0b2e5727d54f3f4fcd0e95dc29bfa9f9ddcda98bbe8107ae00b860b054d545899f55cbb84853125d50ed06211ace3eeb6160a85583de508569e65592f5a221a000cb99997c0dd2738527010f2b1e83eeb7ca41c40ad9295b35f1172004a63567f140c1d2b0e04338161e8f1be88c1543ac75d587e42c3ed7dc1880b860e8de954d3100b125b2a3c1e4a81e3ac5f4b38f85c2ed91d752c507d4a37c338ca215e474a48511c599407b85c01d37012cd5012c74c5c0c3f88b4480a73bb52d1ff49848ca014f54cde1a891e6efa483b5f64016a9303515806c45d9f79a7e81b860bfce6153e2c2dabc724d9ccf5cad8916753dce1a495a29053ff7e5b74b0eace4889d5fd2d5859400d82b6bf29cecc9006b6e37152cb1f4f72e7d0c0897439f4ef886e26b4345d2f3e73bcf08d47f02c02976730620d7f844313c099894176980b860d56a6034613c0db14cad99e0d2dc3a2482ff8167a032218ac10508bb04fb8ef91d40a6ef0731f1e162761f09be237800363199f518a1bad98f9f2d179d2028137f9a86588c5cd91534a53d4edcc718b40aab61343571636d052eb7b288578c80b86025138a99de6619b2ad03707f847ac0f5d79b0fba2c67a02f5430a57c7a651d6cd63988cb91aa331acd8b7221a41fc4007be73ad86365de5029c22186d2c90a864fd3d386cc074b25f33b4e5f17839d2780b62390f1fe63ec690c1022c039f500b860c1e12e9c0b26d73c27dd5f60f08aa15c952afaa395dbc8b4cb99453bbf3f14ede8a9fc579f64500497344cb3637aa900ac8618d237580558d9ab5b042ac411fd121713744378c9f4d897907f09a71af9331bfecb3853e3f0577be558c9019881b860c8e6cf45b640fa9138
5b4f7e891fe68e5ef6847d032d6ee1b52eef0ff577ad796b881537f94adcbd1b52ba3e0e5d7e014e97406029095598a373fd6d9b6c7754aa87b87560e363faee89c11482cae4a063d7cbe65747f2b095cd2f6688a8bd00b86008fb22b9fe04caf62059216dfa1d2274ade1df9e8dc0b3c37c2f04a8b3d30a73f8b7ddd873d08c9b9302b936d303380147ce0e19c4600b780a1814730462147fb8280fbd7fe0aa7290d325a2b7b5092e05c037664a0aaea127b0d95fd13e3d81b86068dd5abe43baf4f4817a49b094c0bb19ed4971c49a6f91863163f655b6f312d80b7c7843b4cb7a27f7e54d48c41c7a01c5e22e1b0e38a5dbb03dfee76076a9af56146359add6003410005332417f6b65db718acce693cfa21a8fd92f6ad72800b86091785b64cfd3ac6025b6b536499ac63868344b8f86660ed73b574b000477807d53dacf1d9da258ee1b113485a01579001c5a510dd3b3c5ff1f417c4d97faf997aecedf7602bd96155f8396f5c9fa53d2c601b0de0241fff49939b8a97ce59f81b86000d869e78871139e8a5ff6d590593d94ea39ea73e87ac8d901f0f524372b93752a488b34e05afe05ab53b1c1b5b9a50040341dc01361110551358c9e982f30755e19ec2ad1c1a40afba9998f763599149b976caaded14c1793fe8c469b17c900b860e87f2bfc49c076239d7522280098c9c4808524c67f3891608c42a510340d64c7e3c1c074c34050007837c13536ee4001108efc87fb2eb8c3236a738122ec086a7b8e6857e82454ff6284a81bf0362424ba96be762d4c06e6aa1b59f20c870780b8608bd2cb218f465c82702aba73a061227cd83dd8e3115ae592811a4d9a8236e3cb99cfe81b2e22e3620635d42f63b7db00cd546bdf9fad7b49d0b4686962c8e0f4544c8df9f41d30d7d489c86745ef9b61a7dfb498f7554eeaf7e24654e5b2e080b8602a5d5145b7a8100575e96cadf2513e7e64dc1354bd891a24f985c2574c9f5f136c6837eab290543d769227785520970133022637837b6e711918f3d2e98291bf3b518247b54e925a0cd080d517731b2725be888fca298ae9ec6f5d2c19146b01b8609cedde7d49e2e115896b789ece09a4e8816fae153de891774f0f716a3088e01754498f8ab82d3a4ef5d53dbe17338f0089d1de1eb60bfe3ec2b9701cd882b1c174bdcfaac2225f57481676bd3ee845825be823b9f9317e407b015159b9ce7e80b8609124b91b223bf9bcb33f75ab8fad0180fdc57536d34349945b80458abc8e59cb06005fabea656b2cd0dc74c0d06160019706965ec3ee0d3b9dd76eb9d39158bd512fe7384bb9e139bc6391d5f7b969397bd85a91bb91670880d861e8fcbd0700b860338fa194daae9ba1245931653fa349ce85d37e2f42f871c6b67397401e
190b497d15ff23e8f6e9d8637f84fc2dd4640066fee692f5d38208233e7570c3ed49a4ba032d79c7a974883fa5b5c2a113259b924a346bc4edaf717b3d272ceb152980b8606150a62f3bae3a35a639d66dc37ffbc979dda85d6f5c278a5e3a563ed865018c86563be9b878261ef8d34edd3c313500a4f170e22f386674d538b9e15930f39216a38c2a0b60380a140e5897f42d37077784cd0f1a93746e4da42a5da8de1081b8605fd1581e41fa22e7ea0d3ee0674e80e2168fdb42c1aaa3eb8aa395fdde11de5ca98ee71c50b365ef6ad9850bacf7ab011eb446da79efba3ae91fe1f7d9274dacda39b0b3bfaec7d7904eac59d72683c52b6ea091ca1374d8b744fd1d595c5700b860e04a671c7f9af919093a9c7aaf5c3d04eff31189d65f5a2aef8a08119ed1c150464c40085bfe2fc1b598fc2ebb095101d0103b995e3a762a9063d823ef6686422caa5a390f9cef5ee6c4b911fd2a818f43fa47ab4d21b01cc7b071b517438d81b86014cfe4d455820e0b9189ccb32b565b90d414ed7d315d3e61141f085b6aaea34b0c90d1f502f544408e7ed0573e1d1701d38063b239cf847018daa41b7ef75b4458d2fa23e459e9e8039668aca6bc70df1051a6466a5a5bc636e68bc8e2380a80b86045bab7cfb2814e1e3e1a40779219bb48c5d51419ce4c27359d0dfea9fbf1d193ca1cf07934ebf3d6699093387d0e0f000dc5fd2058e7ac50e67c26e5172567c2e7ba9fefa6376e44de800c8955cb330df27476d8ba9c8ec06bba46fddeb95e81b860a9fc64c64c039cbd8b23c10c706e51e0013289201111cf8c50e22781161df397fa9c1b91dbab44969aa4f8819f222e01bab90190bebfaecff7076de325dae8080bfe921fbb4ca474d45126fe5ae699dadea98a5255c048b955ab9ba1de8f4c01b860aa5729ac27a01f26924e863eea7d7b49e4f8ccb15c758ea5ac4d19691a8df31f70138e7f3011fc1e53d52d7063dd8d01a9b3800a66914956769a13ff8d7960725bf1b9073052f3ff85e57b34165702baede05d4e807bfb565a38c24e3542c780b86014dafe41c0d96b22188f0cf6cb99274e09d8ed0546bb57b28a3590449726aedf809c049663a2fe4e40b91830ac864d006a9c5b41cbddeabeef99cfddf6c298de0d482b6e945993241d8987f7fac004fe159f2206060b69d1f8f7f67f63f26280b860878790bb38b1d31707aa37b5af439d64d4a29ea64418b7c01139a26e4f297da6c61a806361e6fee8e40556500c8a6000183b2a42c54d714a143fe12492081f373e3e27e3921adeb276a8daac8b87a534c0fa1d45195d8a35cbda27f883128680b8607983d25868e1915ca8c20603bb70475b398acb04fef221106475f0fc25a3df25d181050214da347bfcd8f8292e5257014b
a77cf67ee5f42202257317808f126930b4eaaf4bec801c4465ced54bd78331f8d3dadfdf2f91de294836c13e946001b860c700f20fab4ebb8ba2f47adf177e3a9dbbfb8d999acbaf0055cc540758f8c00c4564813d910e7be02e4ceaa97fb88000b3e8e6fb4308fb943fa0a3d8ee32c3a16e67eae0817370de54bec1e13193cc845cedb70e48171e097550475684cc6600b8603a3857a5e4a709e1891a13797d55c1698f931755492099926314feec955da31fcea8a8c8e52ce2efd74008f32dbc1c00c1b3d36c1f3097d806f1687eff4a979ff967a8233f923a1af6a3104c12d9d44388bece91a248bd86f253e709cea7a380b860d8704fbf6968ffd9f39f78173726dfebbaaffebd8a33dfe29a72b9a64204debf2016ca94cbecb1b84cab9d5cc168f70082530b8cfab0225a37c8cc3044c288ab0fea2f484b56a33fe6c92d6c50f254e6b1689ac6bb1429cc36a0a7dfa85cf880b86089bc906b5ea07752b1981558fd5d329f6d428bc14dfde022745303eae0077612b9c473cd66e4d91778282ec98bbd74001b61cd71448750df642fb580ce81a62ff2f5ab4378528ccd65acff67ac2175678f598529c768247c77938ca2d529aa01b860fb9de050c059a11175be23daeef1bfc374a688d8fd776e7d2dc81fe301068a3af03ad522bd915a154dc8b5b8e12d38003f17bf9f3001e8bc682e38cfc93c6901417afe1ddbfe11b04f42ca4f321f7ba73fa98f0e8c8d75781b14c0dea847a500b860bef17561ef747e0f11fd339987a9813fa8ac1cbbd6457ab8fae4841f2217de019de7caf421631372b7eec730d9210d0128e6c560948c19ab6141d6699e0283a5b98abb8266d731f2b6435a07b44360a3f5622cff59103ec95938ab74fe1d8f01b860a47fd022dda04d8dd36e0a8127be68736e47e490b946d2d88a9fdcca0a5114a930db2319e7e9f378d1b81da865d38b01a063e9994e620483b08047da39ee2fba35f66df5734e1545d297be0225ddfda34e494c1f09825eed303d166a8b702f01b8605966b41d69c0e94e202f65823338c3ee0357b024a6ef625765bcf3c20e1786258a18221e0b9a34b43380e06984d92f0183d0f86d0dfe0cac237ee38b28876dada3ae9d1a7c1e6627c3b524d0e5d4270bbe159661622821412b3c1162881ead80b8608fad3c2111494834f30e6d8a3368f2aa156c991fa4b9d55c31da88310a61248305697ad5e885404bc474419383de4f006c89c84955321716f6ceb4d5db433fcabe843d566a5e9526d29c58b38fda9ff2a13f7841f25cda8ff091c965bdf91780b8607f9df4fdb96e20e0a480a62c48896646d96982d7fccccdf70c61f2f52ab4ea2901b5ef468203f39d9d8325843ada0d00e9562c64f79ec16d69b5921ad36306ae9ae2ab9780
6b354af8cb63722a96aed5fa0542fff58d2e41f6aee3cb641f7781b8609cc39207797e5c37f6336c4444cf275d0afd43a9b257a08086f15867886833744c9b5753fcff76fce3790d7f25a45e004af9831464bf43e39634ecfce8da4641da78ad4ba20e540c007dbc19e391ae69cfb7809b2d20627d553a16d7d71b5a80b860e6631bd0b82c41327b582d7b239c9893cd1162cae05f360d8808700b38ee0ee965e9bd9246d79e00c234c87aa7699800c73428f7eccd77337bf79bf6b43f48345be265c38d128d2cd7f37167cadb3565d66e6a7f67b12f4bed63e5d50748010080b8410000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000f86480b86000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000080f86480b86000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000080", - "difficulty": "0x1", - "coinbase": "0x0000000000000000000000000000000000000000", - "alloc": { - "0x11901cf7eEae1E2644995FB2E47Ce46bC7F33246": { - "balance": "120000000000000000000000000" - }, - "0xC1cDA18694F5B86cFB80c1B4f8Cc046B0d7E6326": { - "balance": "20000000000000000000000000" - }, - "0xa5d40D93b01AfBafec84E20018Aff427628F645E": { - "balance": "20000000000000000000000000" - }, - "0x8d485780E84E23437f8F6938D96B964645529127": { - "balance": "20000000000000000000000000" - }, - "0x5F857c501b73ddFA804234f1f1418D6f75554076": { - "balance": "20000000000000000000000000" - }, - "0xaa9064F57F8d7de4b3e08c35561E21Afd6341390": { - "balance": "20000000000000000000000000" - }, - "0x7FA26b50b3e9a2eC8AD1850a4c4FBBF94D806E95": { - "balance": "20000000000000000000000000" - }, - "0x08960Ce6b58BE32FBc6aC1489d04364B4f7dC216": { - "balance": "20000000000000000000000000" - }, - "0x77B68B2e7091D4F242a8Af89F200Af941433C6d8": { - "balance": "20000000000000000000000000" - }, - 
"0x75Bb69C002C43f5a26a2A620518775795Fd45ecf": { - "balance": "20000000000000000000000000" - }, - "0x19992AE48914a178Bf138665CffDD8CD79b99513": { - "balance": "20000000000000000000000000" - }, - "0xE23a4c6615669526Ab58E9c37088bee4eD2b2dEE": { - "balance": "20000000000000000000000" - }, - "0xDe22679dCA843B424FD0BBd70A22D5F5a4B94fe4": { - "balance": "10200014000000000000000000" - }, - "0x743D80810fe10c5C3346D2940997cC9647035B13": { - "balance": "20513322000000000000000000" - }, - "0x8e1c4355307F1A59E7eD4Ae057c51368b9338C38": { - "balance": "7291740000000000000000000" - }, - "0x417fe63186C388812e342c85FF87187Dc584C630": { - "balance": "20000062000000000000000000" - }, - "0xF5720c180a6Fa14ECcE82FB1bB060A39E93A263c": { - "balance": "30000061000000000000000000" - }, - "0xB80d1e7F9CEbe4b5E1B1Acf037d3a44871105041": { - "balance": "9581366833333333333333335" - }, - "0xf8ed78A113cD2a34dF451Ba3D540FFAE66829AA0": { - "balance": "11218686833333333333333333" - }, - "0x9033ff75af27222c8f36a148800c7331581933F3": { - "balance": "11218686833333333333333333" - }, - "0x8A07541C2eF161F4e3f8de7c7894718dA26626B2": { - "balance": "11218686833333333333333333" - }, - "0xB2fe7AFe178335CEc3564d7671EEbD7634C626B0": { - "balance": "11218686833333333333333333" - }, - "0xc471776eA02705004C451959129bF09423B56526": { - "balance": "11218686833333333333333333" - }, - "0xeF283eca68DE87E051D427b4be152A7403110647": { - "balance": "14375000000000000000000000" - }, - "0x7cf091C954ed7E9304452d31fd59999505Ddcb7a": { - "balance": "14375000000000000000000000" - }, - "0xa5d2944C32a8D7b284fF0b84c20fDcc46937Cf64": { - "balance": "14375000000000000000000000" - }, - "0xFC89C17525f08F2Bc9bA8cb77BcF05055B1F7059": { - "balance": "14375000000000000000000000" - }, - "0x3Fa7C646599F3174380BD9a7B6efCde90b5d129d": { - "balance": "14375000000000000000000000" - }, - "0x989e1a3B344A43911e02cCC609D469fbc15AB1F1": { - "balance": "14375000000000000000000000" - }, - "0xAe1d640648009DbE0Aa4485d3BfBB68C37710924": { - "balance": 
"20025000000000000000000000" - }, - "0x1B6C64779F42BA6B54C853Ab70171aCd81b072F7": { - "balance": "20025000000000000000000000" - }, - "000000000000000000000000000000000000ce10": { - "code": "0x60806040526004361061004a5760003560e01c806303386ba3146101e757806342404e0714610280578063bb913f41146102d7578063d29d44ee14610328578063f7e6af8014610379575b6000600160405180807f656970313936372e70726f78792e696d706c656d656e746174696f6e00000000815250601c019050604051809103902060001c0360001b9050600081549050600073ffffffffffffffffffffffffffffffffffffffff168173ffffffffffffffffffffffffffffffffffffffff161415610136576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260158152602001807f4e6f20496d706c656d656e746174696f6e20736574000000000000000000000081525060200191505060405180910390fd5b61013f816103d0565b6101b1576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260188152602001807f496e76616c696420636f6e74726163742061646472657373000000000000000081525060200191505060405180910390fd5b60405136810160405236600082376000803683855af43d604051818101604052816000823e82600081146101e3578282f35b8282fd5b61027e600480360360408110156101fd57600080fd5b81019080803573ffffffffffffffffffffffffffffffffffffffff1690602001909291908035906020019064010000000081111561023a57600080fd5b82018360208201111561024c57600080fd5b8035906020019184600183028401116401000000008311171561026e57600080fd5b909192939192939050505061041b565b005b34801561028c57600080fd5b506102956105c1565b604051808273ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff16815260200191505060405180910390f35b3480156102e357600080fd5b50610326600480360360208110156102fa57600080fd5b81019080803573ffffffffffffffffffffffffffffffffffffffff16906020019092919050505061060d565b005b34801561033457600080fd5b506103776004803603602081101561034b57600080fd5b81019080803573ffffffffffffffffffffffffffffffffffffffff1690602001909291905050506107bd565b005b34801561038557600080fd5b5061
038e610871565b604051808273ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff16815260200191505060405180910390f35b60008060007fc5d2460186f7233c927e7db2dcc703c0e500b653ca82273b7bfad8045d85a47060001b9050833f915080821415801561041257506000801b8214155b92505050919050565b610423610871565b73ffffffffffffffffffffffffffffffffffffffff163373ffffffffffffffffffffffffffffffffffffffff16146104c3576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260148152602001807f73656e64657220776173206e6f74206f776e657200000000000000000000000081525060200191505060405180910390fd5b6104cc8361060d565b600060608473ffffffffffffffffffffffffffffffffffffffff168484604051808383808284378083019250505092505050600060405180830381855af49150503d8060008114610539576040519150601f19603f3d011682016040523d82523d6000602084013e61053e565b606091505b508092508193505050816105ba576040517f08c379a000000000000000000000000000000000000000000000000000000000815260040180806020018281038252601e8152602001807f696e697469616c697a6174696f6e2063616c6c6261636b206661696c6564000081525060200191505060405180910390fd5b5050505050565b600080600160405180807f656970313936372e70726f78792e696d706c656d656e746174696f6e00000000815250601c019050604051809103902060001c0360001b9050805491505090565b610615610871565b73ffffffffffffffffffffffffffffffffffffffff163373ffffffffffffffffffffffffffffffffffffffff16146106b5576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260148152602001807f73656e64657220776173206e6f74206f776e657200000000000000000000000081525060200191505060405180910390fd5b6000600160405180807f656970313936372e70726f78792e696d706c656d656e746174696f6e00000000815250601c019050604051809103902060001c0360001b9050610701826103d0565b610773576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260188152602001807f496e76616c696420636f6e7472616374206164647265737300000000000000008152506020019150506040518091
0390fd5b8181558173ffffffffffffffffffffffffffffffffffffffff167fab64f92ab780ecbf4f3866f57cee465ff36c89450dcce20237ca7a8d81fb7d1360405160405180910390a25050565b6107c5610871565b73ffffffffffffffffffffffffffffffffffffffff163373ffffffffffffffffffffffffffffffffffffffff1614610865576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260148152602001807f73656e64657220776173206e6f74206f776e657200000000000000000000000081525060200191505060405180910390fd5b61086e816108bd565b50565b600080600160405180807f656970313936372e70726f78792e61646d696e000000000000000000000000008152506013019050604051809103902060001c0360001b9050805491505090565b600073ffffffffffffffffffffffffffffffffffffffff168173ffffffffffffffffffffffffffffffffffffffff161415610960576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004018080602001828103825260118152602001807f6f776e65722063616e6e6f74206265203000000000000000000000000000000081525060200191505060405180910390fd5b6000600160405180807f656970313936372e70726f78792e61646d696e000000000000000000000000008152506013019050604051809103902060001c0360001b90508181558173ffffffffffffffffffffffffffffffffffffffff167f50146d0e3c60aa1d17a70635b05494f864e86144a2201275021014fbf08bafe260405160405180910390a2505056fea165627a7a723058206808dd43e7d765afca53fe439122bc5eac16d708ce7d463451be5042426f101f0029", - "storage": { - "0xb53127684a568b3173ae13b9f8a6016e243e63b6e8ee1178d6a717850b5d6103": "0xE23a4c6615669526Ab58E9c37088bee4eD2b2dEE" - }, - "balance": "0" - } - }, - "number": "0x0", - "gasUsed": "0x0", - "mixHash": "0x63746963616c2062797a616e74696e65206661756c7420746f6c6572616e6365", - "parentHash": "0x0000000000000000000000000000000000000000000000000000000000000000" -} diff --git a/packages/celotool/genesis_validators_baklava.json b/packages/celotool/genesis_validators_baklava.json deleted file mode 100644 index 0aeb8dad977..00000000000 --- a/packages/celotool/genesis_validators_baklava.json +++ /dev/null @@ -1,206 +0,0 @@ -[ 
- { - "address": "893c4d601ed879b4ad36fc31f0c0214d547113eb", - "blsPublicKey": "3695297deb0f970e1cd8c64a852b45ad788e151734ab05a20206be63f7f42817fb5e3e035d7f9dba56917a0bc75a3f0061e767a5cc5a60469709f170f4f7d76dcf6e30f04a520745ac8f3f131d6b854ea2a899c3317cc48bf06cdb07f4514a81" - }, - { - "address": "75af50cac2b2eb330b975c1b999fef571c870870", - "blsPublicKey": "6b628400822daba0c52481cd1f8708056c40c392b362a9028d17032b6324b114620118a2770e0ddccc717685c2380600d035299ff9fc354180d251acaa778cac2a2eb64466e977204e582ea1342c0100c01ff5ea558be47b1f36cfbaee7a1e80" - }, - { - "address": "ffff741c41fb487f4d64fcc9e32fbb38e2a8372c", - "blsPublicKey": "f6c310b565f6f975faf4326707551a9154b848a222c14c6e18b87c0bd9bcf3524d95b70d5faffe1d911955186f160d01c82d4df63cc8dbc52ef2e106c89d2b838c57a4e6fd033d7d4cd49698111fd78321a962d03b1a31eefdc84397c0d51b80" - }, - { - "address": "c496e9791d39a6f0ae54ed52897e581d168a5b45", - "blsPublicKey": "baa5538d5579ce32668fbf4e80cd9fd31f09447ba473da1ae44a51ce6feb4f46d8a14008fc515fa7e0e9d720c83b7001bdbfbce56284c382c5a267e7a245daeaa9e2914dc92cf2aa2ebcde339e309c5d1a0df1533cbd631168fc3e4a1d248801" - }, - { - "address": "b182f37daee2285f14b4091b702ceccb00d50312", - "blsPublicKey": "5a13510aa9b74ecd0bb7d8bd7f65dd5622ff4a61eebc1976529a26954d70732d591589dcf49e3f8a87f450642336e200391e1e8b57afe3d7240b80020deef9db2d9b260fc3f30ab4168521de586c211baccb505b75af030d47bd1d6fd49ab480" - }, - { - "address": "beeb24d10c9fe58715a0c853db8ddd8e00191771", - "blsPublicKey": "382d76c49b99ba0d0aab030afe569b89288f399c727e3c1dcefbe1208f0c2d86498f62e1fee2d50486e2d6aa0d61130172cd07019fdfe79cbbec315f53465d9a8121fcdcfe71efede293111915281e82add7252b88b540b3a0dcd2fcfec91c81" - }, - { - "address": "aa3b76b6618afe574b278da9b71af0e66aa6f646", - "blsPublicKey": "348aa54a71033d583a29d00e3dfb1f696633c6520bc71099fd26a228d1882f380ab86ef19b7efe544d47ba8aeddf770125075f6c15b82611a31e746668c1aa8af2355efe6e4cd297c0f570fce18ecc2f274bef5115a8d8ba9a18a6563e6c6181" - }, - { - "address": 
"e2368b04a1d14f286faf8c90153e33dc0b0879fb", - "blsPublicKey": "17a5c4c6a8aa852ab1ac7771e3caa6546da92938bda6d4e0c0ea05da85adbe08ca2ee58e657a82b619d1a729b6510401686ca15382abdbd856716fa867573c35ab8605d2d57b0573ff589c087cf57e5408cbf7be57ce5946a6455b1955d9d300" - }, - { - "address": "72bda6988c4551083c14dddd2709edee96a9b4a8", - "blsPublicKey": "9887deb26c20c4b5b448e04efe7b18781355f9648feb47d978382aa3c789f545416d20bca757df66b70867ccffeba0014bb1b45fafe0366aff4d513c023a55ee35ccc5a86d5214f813996e2ded810e3cac11cb15b7ca0b97aad1a307897d4d01" - }, - { - "address": "f0d17b624521c0a599b063d73a73f9719307b48f", - "blsPublicKey": "57b5158540d1672de49073e469e5426766094ce6514ac29580803ff9d22cdef3b6fed28a8ad9de63bf938c1c7b0e7a0156fb03f5363d09bf337713d8d45475df48ccea0fa793f591fbb62673545fc846c04e6aa9e2eaaab72c3b9ba4069ea280" - }, - { - "address": "38f3eebb5a820ba4b83b1f6a324c09ec1db2d2f9", - "blsPublicKey": "720216212176780b12cd297d1feff6a964329448d821fd69e0a4fcc53c4edc00ca9c89ab2c35abe7b68cd47bc230bd00597450ca0bea04562f78b0102d406f4b651957989c1471674eb16a531c2f762c84c3432ccd7fab87b0a185a942d00301" - }, - { - "address": "2f916156a2dcd5dfb6054eb62a677c55c3a4037a", - "blsPublicKey": "68438d70fb61a2d0adc3710b619aa5f9a603168f93bf0504401464df3acb14cc1815bf449b7f8791fef7e309b60e7b0145d425515d1f4f380b66c256a4218700ae48e2d2185a636487d6f531f546b9127b957084d0821acccf5ad0776b06be80" - }, - { - "address": "ee80387e93e4d6d41c4bf51623bc8a42ba55a449", - "blsPublicKey": "92b17fb70e62710f3b87642830e23258a0fd1041da9606c7eea7019f86da50ff4cd06ad48a4d0ca93e62a6c7ed6c5e0036f4443d0cfe9122156a346836a9163404d58856bbd447939189bf1cc8a7ee8be156415642992f8e0659526d15bb0c00" - }, - { - "address": "d32e429FE2155971825E14E2fd89785301ab6Dc3", - "blsPublicKey": "bfd4b2392664c68bbd30d6a77cd1c7230e340ac65ecc07449c0d704096af06926238c60e7764ffcd0b9a53d82d7050009cfd9481ebd7630fd2540b834cef70b5276d07b75913a20222d74f36db8beabfcd39f7c5d57c4a28f87e8e737123bd80" - }, - { - "address": "8c517b3bab0e2d65d5cdf3750ca360ff05b3fc1d", - 
"blsPublicKey": "2a10c24fb6cfb029650a90f10f6586f26e61e3a51d68342785862e666f6c730bb929ff2c5a20f5ceff41cb53a2333c00aa0e4b0e47c821c668d6b84fac85c5b5cd360c53dfdf9a6c48e0f4ff8c3997ff191463f90fa644fc2870174d67885b80" - }, - { - "address": "62d56ba5bb3b841fade256d97e7d4d57364fb881", - "blsPublicKey": "8aed791b4381c7bd38f8358cbb7f64bb6b74c34b85b27026745df7cae91f5cdd50ba6cadd919494ec9849e628138870188ef21da8282207be4acd30803288e2674a7c46322f41f3ecf685f5b635970c0d19802defc344cc4023f8e41e50e3c00" - }, - { - "address": "beb2b0ea812caaaea3a1f894fa9138231e2c7d38", - "blsPublicKey": "5008d7aece8407c044c4a7a77b37690a08b0edbde0942446f31e3362031e689b30872d8732f74dd3e45bbe85c072a4005ad20cf2d9e0824eed9a607b980d0cd5d160bcbd5ffc34a6a87176e257c927cd419ac080df27cce888eb94c7bc5a3880" - }, - { - "address": "430c9d55CF6116c65482379c039584B9b965f323", - "blsPublicKey": "a4ef4602775c99871c1f910af02cc997cb6f6b1524450898a1de0a3369750ce17deda194ce40c80650025770e1c60a008916618d314d7008a44e312f2f032fedd08177ef854cf78dc269122dd7eef415989e975f13d501830df4ecd3fe248700" - }, - { - "address": "821a6b00d9f658e1732ce741d91984c688e25c5c", - "blsPublicKey": "f01e9ee01d1a76e596827b897e39a1e497b00d6ed276675201e81a07509d4043354c89d0ed515098b0730958cc9dd200fd9cf9d1f2fdd7823937af05e03eb1b043de63e2c8ff7bfebc9cd4f3983243bd275d0266bbaee5e70244126687470f00" - }, - { - "address": "8339eae8ee5b8ea60d80711c9fe30ae92c0eba2f", - "blsPublicKey": "1f599e48d2f04e623c42ae66d379f15ff8c1a3bc4584a27f234fc00e7feb64dcee6a0faa4cfc3737e507b9c38d640401b23b2c6fb9bb7f00a2144b8031b9b2446ac121e7ca7f06e0b6e084b082ada9fc31837d094b0d6c3d24965a9a61fa7a01" - }, - { - "address": "a5f837aa6be966d9f293de6f8cc65fac7064220f", - "blsPublicKey": "c2cc4ba57abb3768f552b30768ad8de8d6b934d8fd2eca699a2b2f712e8d56b50ea2e59993175a396e471e3a6028b30054cf156b0c75b9df6ea2e15e03811960aa5999411522b01286bc1cf17e4ef366a48e3e075998394e090f46c1d8212c81" - }, - { - "address": "d2462858d04dfdb0c7ad26dfa21933526491c1c3", - "blsPublicKey": 
"3201cd2f5f5f41b6949a03a600347bf9920c891b883dc4c8a530188fec4f98ac96d6caf9e87a05c81962760c20c30c0198745f18efc87a19ec94a06cb01d72313a71f4ff30e94f04504e609cd495ed4b78dccdbdc6ece12d0200996763439c00" - }, - { - "address": "42a237fd620f714536d6f63c0f46d092c9ea0122", - "blsPublicKey": "b1d7e87a67d355c42cf7db5c599da2ce9624372cb941f338d68b80bb8376fc8600e6d925d9902f32270d6372d4215c0068879d52d16dda84ac2efbebbde57d7ca5d451f269aa9903926163d7ce75b68d2547c34e07303b9db69d0044c9219200" - }, - { - "address": "dce0096c358e90c4645d68a53e897416e786d05f", - "blsPublicKey": "872c5c11351fbe7f31bb38ff5f7b07aa8bc4902a06b81f4fe70e7bc5be86ad260be53c26ef3d1e6293e3308c33ef8d000c13182ad835adf86e816496d6519ae00bd607f6f5d974fab0792802d69688e6ef37078285a1a2e9f8489bfe5a87e600" - }, - { - "address": "d10ddf202de766dbaaea7c436a90f5aa71e5bdc4", - "blsPublicKey": "6afab29c7f64915ebcd2851799af2f43cebbab8fe051965103989c47e576181533f7bf64619a4ee50bd787bcb65f5b012d60c2eadaad3118cba6088cfb9204f5cc465d47dd4c834b568f313d5b0689c6b6f3734cd19d4df40185d007b22d3681" - }, - { - "address": "2ec7df44dbbc3d458acc4108aac8d1cb0bec11c0", - "blsPublicKey": "55470892fbef78fa87bf31aa738a58fdca928a16d99fff12e8ec3d71fb89489286206e32aed5063f1fdd6294f8a3a2011e19cb83873a777741f3d8521ea33edac732704e36f1aac01f212eaf72405cf8d980dd88bcb88751a6bbe7207a5e1081" - }, - { - "address": "b8f90aed575898c38c11deda226e1294f4d9dfcd", - "blsPublicKey": "207a4c7a86274921d013932e2f88ec795dfcb1f85df6bba7039fd14526cd44245f29d5916d17bebf40ed08be3d0541006fb3443238819cdf36eff79d8706ceda92b0b3bf05d6dfd9a723da2685acc34e59928fed096dea64f742fb55c7f34c80" - }, - { - "address": "8a63fE13A4be506e8B51d296f60F192458672E62", - "blsPublicKey": "3891197bd1712a3cf13fc13f2f2308bb4acdd4e46e3a62abd28a8e3add64ebcabfb939c743e8e4d54da2a32c39719d013ca29923eb2c685ab851a4f644824d70b1fd36812f513f5f23fa8f82ea228ba2d8f8b771b109e99e1905e16b1899eb00" - }, - { - "address": "80be4e53c6bf959445598965c2910c2c91ec81f7", - "blsPublicKey": 
"efff9b35b865ad56dea6f6c80d96ae4ba88997a60451b1535ea51ef6ae1d26a1c25246f33139f64046909d4a30a90f00a782c6a5a910499b65d7f2c3064dca95cb6dda83c10c2f9dc8f60d1afeb91c067205dff722dfe964f66a9a98e4047800" - }, - { - "address": "4663bf9f8c6b32f095126e138678732c0d183f7e", - "blsPublicKey": "6dd6f3f642e00ffc73a2d76c8f58128aa434b526c0487487f193b6717665a2eccff314d7d0769dc17fad47db19981a00c8a11f156a2bdb36164b479cb1bcf636896066e3c3b561c478d357b343bfbc6aed51b08f60b8f804ebd78e2efb525c80" - }, - { - "address": "bafa573a3d5333fc88560de1c312992c56d517aa", - "blsPublicKey": "8202e06fb2d18ad9a85b3af760290ca4aae79fe2d354ec4c462f65e98339f9bfbf6c24503c73b054bfa740ea595241011c0d599d95697238c700a380326dea544cd7b085a3b8db22ae900ac82a5c23ade8c4290e022b0e425a55f742352dc480" - }, - { - "address": "8a0d880f275d3741f507a0180E843a2805eC1e2e", - "blsPublicKey": "acc4826a9317688bdf782adc5acdf12656ffc2f327ca699418c287a15c2c22e0d16e632e592f5b0748b40a8e374e33009d8cfdfef7f81493a77843054cd006858b1b217043954423bb0924d0387b6609a504e15e6c054fe1288e0bd337253100" - }, - { - "address": "cc8c30b74073f537a6f790ae68649788a66f2d71", - "blsPublicKey": "6c93b4d43d69c248efadf8d03a4c9672ea222830a1530b29deab346933fc4492a211223c446ea24af059d9528ab3c100f0c4769fb226335489969277140da7c8c7f036d76f73c1d5f359744018b76b13c0eb07812f08923649e4a45b3b69fc80" - }, - { - "address": "ba7d40c43dc59e2686fb38edcbdd69c3e53eba34", - "blsPublicKey": "86ef363fcabe2e3c27769de9974f4b8604b47cdb2fe57208f0a59899b77306b7770f7403b222b6222efeb67747d06e01fc2ac657a21f37ab982d8463ba9c7fafaeda20370ec3f6cc565dbec4a69f59f9d7404725e3b701681d41a870b1f42281" - }, - { - "address": "1db8f844aa731e4a20e139c7e7ebfeda26888cc9", - "blsPublicKey": "51026c0f52d8612414359ee4e6049a275387d3033cf51a8e23b3e037bd7006d916017b5253da7068532e6242b1d96900d8130a60b9b0b3ecb06a08472728bddd55f778e485188c3717b911039ebe3b2cdd89573229bdacaecba9a220b2decb80" - }, - { - "address": "3acce236f234aa330395d50d6155f86c51697318", - "blsPublicKey": 
"6a2cc88ae37c395815dcedb56f5ee6b701e06e97bdb6232c2b0e21f719a1504d0f3b259a63e2e54d6b1a27cbb0b051007f0c39d42f699cfc608c25b636ea221500d0eb75fe0f26d35beec47e1161675a7f571ed70684778ee7c4bc6221d8af80" - }, - { - "address": "cd97f1e7cddff4523f03a5072e9be8ea7edbdaec", - "blsPublicKey": "c5f05101b67ffd245abb878fc8d68058e3a65b4672453b90cb116a0cb8f59879a73944d8a2c25e8a2865e07e1e4e0100cd255dbd7055b5228eb1d4cccea7a1288ef83bab770633b0054f8c8d9ad4e7fc15af631c2b2282337faddb2220e54901" - }, - { - "address": "537bf2c6227ea6ce4eb250da02e6a6aaa3de4f7b", - "blsPublicKey": "3178c750f9274f0f1ea4103997e7d469b2ef4b9dd2f1dff7dc33ae5aaa5fdfa32a9e44e844de658502fbd8925e9a8a00d4422f22a7a5c3e4a790d21334eb70a744da6a863c4e060481781b8bb3fd1e9291ebfb548f7d8ec122c2c0c43eb46000" - }, - { - "address": "48cedc58b10af13d688631bc3cb78a05b8a6e56a", - "blsPublicKey": "7983d25868e1915ca8c20603bb70475b398acb04fef221106475f0fc25a3df25d181050214da347bfcd8f8292e5257014ba77cf67ee5f42202257317808f126930b4eaaf4bec801c4465ced54bd78331f8d3dadfdf2f91de294836c13e946001" - }, - { - "address": "f410a55e2f4b49996a1cb884c3107490aee09330", - "blsPublicKey": "61810f2366f415b7b67e0ff9c82ae55fa213a95ff61df8718e806931ce135c0324462a33ccbde34ff901025ac334640064b41f7854c6eabcf3d9f7a4e4ff90522a4ba2681a0df03d08c67fb79f0fe98cd8c6f5ad3df155a0210ff84fc93d1500" - }, - { - "address": "4ec65178abe63805b2b0ec2718280454d9b6353d", - "blsPublicKey": "6a50718246d5c167429ce48b757a28441b217eb0903d762a3039d0ee59c00584b05418920be4eacb2fe2051d492a3d003a2a00065986e15815c26434c68caf177cbea44a38a886e86b34dba8a17dca43b729b9d69007e25d735e0ec62f7b6d81" - }, - { - "address": "23836969f0d095afeb652c75523f740898e0c462", - "blsPublicKey": "424b8239213c6f2c3e181b815294b9dd78663e29ee1234e1511e584b7b736571a6799135381cced36dcbd12dd19a5901f479679592608ab7f98aa1f607a3fe4b5b29a6580f724086610c901f143be653dcaf9fd477ac4d21d263e8ad8d048681" - }, - { - "address": "23e15d949c13e3010b75afd19b1d3b294310c7e6", - "blsPublicKey": 
"f03d1c933feb791285892c9e6552bc6ccf07dd7f66862ff5554844db033c17a005087b5847718d4611e3ecb293ce670144a99a13ad3356ca56535815a0c6c21c2f84bbc0b40c78f6b63137a17709fa4be124be10e90a42bda72aee977ca86881" - }, - { - "address": "0ec5a403212d732d8d7ced050e9510f6327453c6", - "blsPublicKey": "fb9de050c059a11175be23daeef1bfc374a688d8fd776e7d2dc81fe301068a3af03ad522bd915a154dc8b5b8e12d38003f17bf9f3001e8bc682e38cfc93c6901417afe1ddbfe11b04f42ca4f321f7ba73fa98f0e8c8d75781b14c0dea847a500" - }, - { - "address": "0ec5a403212d732d8d7ced050e9510f6327453c6", - "blsPublicKey": "fb9de050c059a11175be23daeef1bfc374a688d8fd776e7d2dc81fe301068a3af03ad522bd915a154dc8b5b8e12d38003f17bf9f3001e8bc682e38cfc93c6901417afe1ddbfe11b04f42ca4f321f7ba73fa98f0e8c8d75781b14c0dea847a500" - }, - { - "address": "e224ceb83a39d2ffa7eaa0c501e6954db5667e59", - "blsPublicKey": "0fa447ead6255657aa79229944a3ddf36984c7b9011e859f90bc6567ab61133b27b6bfa3a5e322dcf36abff543cc0d0131066520ad92d12b8603c25c561bf48806f178a995d233862038c46e81933e1ab0a7211b9b0933499cbc5867db736481" - }, - { - "address": "3c8afff5c9332161a20db995e2c831baf02d9f91", - "blsPublicKey": "2e45f40c799aec72e87e0ba4713056dd17cc9bdb29bb221c8fd6377336ae773afa84b8c0bb8f1f2edc88975b8ec986006f321462e28706618d690d24337a9e1c7b96e74d8c8f2fe740bac461635b4828046227f92f97b13da8540bdcc2f1a380" - }, - { - "address": "0f409814CAd27cE583De0d6a10Ecf53d48b9535f", - "blsPublicKey": "2b8ab0795a174e96061dbb8c32bd38468497e56379b2ac79f6a7506f711b817cb57c8c1d8743d88c436cdcde3b51980132c5508f79cfc477bbd1b94fefd5c408a7072e85041f17042c1ad055c04e67a780f497e60aac3c38ddb2f0004237aa01" - }, - { - "address": "f6a969790936c5285f7a7d1af642707d0c8e2418", - "blsPublicKey": "3729df3ecff9e168c1ab2b2698251797494847ee14bffe9c4ce121d90d381b0d51f7208c793f55a9cac9d50e25d63d0018ee5daa523319bbd5087dc35cef8ec53338d810adf8b4565f39fccdddd7596a63198f045a2f3d92bd0b72b4487bcd00" - }, - { - "address": "7a450c89257bf0ead22f143493250907411cd1d6", - "blsPublicKey": 
"44a466fb5cf01c878071e12dc391753cb59005e39c6604fa0d1444ae36518df771f252891072c26652e12a5a5b1f3c00f479f3e3ace6412768ee5729825bac3529cfde3b44cc258b847dbd87f6d55dd68124d054314eea0a9169b36e46e34081" - }, - { - "address": "fb041f37e22b25e30e10cfd662e6a5f2d1305476", - "blsPublicKey": "c6ec185bce68a0a1fc78f78a5a9da12d39635f94b6981a64d7ed7a095839b6839da577383ddf9ef940d93409ec305801cda7a06f3c3adde48e9060ca396bd8a2821fbc435842bba5ba1dc0b81cfef04cdcf0ca7b02b6e4d70ca9caed84387080" - } -] diff --git a/packages/celotool/genesis_validators_rc1.json b/packages/celotool/genesis_validators_rc1.json deleted file mode 100644 index 98379df5e28..00000000000 --- a/packages/celotool/genesis_validators_rc1.json +++ /dev/null @@ -1,328 +0,0 @@ -[ - { - "address": "27F326ab753D62CDB55018c62E9ee7E671Bbb121", - "blsPublicKey": "05f784c52234d0c40ab13b0636c5728217a8f3100046593c30271d39d59fcdb9a0053a874710e9c65e0d8a030c16a700b4ec430207cc5037c5e2698ef839fa0b7a4372f4b5217deda2a8df087e3b552f300016ce9147a2ff4db4c8a87d77b080", - "blsPop": "8df6590b8a584b3f4bd47d208fc7061584188c6f74c9271dcaead50adc6017c3e4a552820077f8971858a5a1aa8e3680" - }, - { - "address": "7eEC94733d16B96C6fE877464630BB5Be1E5c3f2", - "blsPublicKey": "d3f640c3da685893c41eb6301fdde2920c5a09296438abe5894074592f27547fb4ad02d222b1a330ff5aea065eff5200973d4fcb57808d8d9b09337e09aec279dc3337e758c51a3219596699006ee3683ff0c733f16eb917fad37f7f5d505201", - "blsPop": "782c90e7fc352f0a99ad5a793e7dd90a7417d2c9f69c02f4cb6821b147a1efaf1dda74b7666fb27b5e23082bb9138a80" - }, - { - "address": "f0d17b624521c0a599b063d73a73f9719307b48f", - "blsPublicKey": "57b5158540d1672de49073e469e5426766094ce6514ac29580803ff9d22cdef3b6fed28a8ad9de63bf938c1c7b0e7a0156fb03f5363d09bf337713d8d45475df48ccea0fa793f591fbb62673545fc846c04e6aa9e2eaaab72c3b9ba4069ea280", - "blsPop": "326ca5e3a23a45e4c89e5876682ebf6c53ded5807fbe758c66523c8a3bb335d0d8acca0b39ddc971d269e4bf2bc98c01" - }, - { - "address": "A7681958B7f07f4fB8BD5b0e075FB2Aa2b021d59", - "blsPublicKey": 
"ca75a4c4e1f2898c0ae0ed53698b0271d1f8869ecd945891ae6a9b02682335af625c29f80572b440f91a3a343f3ea600a49a426960b6c1c5dd7b24540ecb745329312dfdd525fd173418c846f0096acba466a27446114f3a2b2a887986846b01", - "blsPop": "0007ee1e065a5d71e524dc532824e2e35b2c3788455eb6866f375cf738cce0abfa533d9071b864f5b8046b09a9049001" - }, - { - "address": "ECC20b5f3D6cDF41fc12707AD7872Ff64256e297", - "blsPublicKey": "b3dff5505fe9ec48fa11a67dbf166e7d5ab3473ff9f5fdc90c690eb1142ad687b17eb8ab0c08615c9397b351c53e79016d593d588f8c607b2d1136713d05066ffd8a30589357afa54a0a7ac8777372787261dde126c4f0c04bf741f809c3be80", - "blsPop": "b33378a91d0bea36d2977443185e8c8da2951e1cc7dd24121d66d71b19bd47b6c9fe4c7f28e544956c62126850637901" - }, - { - "address": "e5617da4dfa6ce4912f1d39889fe2dec2f7e66b3", - "blsPublicKey": "451da71ff5b1e9b4fd587fd76f53808191096cc5869596eda8f3ad697426a98a659f5f4818e57866d4b0dd90f0684900742a167b314f0276e5e6a8bc6f40ce1cd1d35c7f9be56f1240de596b30a4b40010dfd21b7fc9aef400faaa421e27a080", - "blsPop": "c4f53cededae924fbaaa44019f122e75d9b1f8d98a177da7d52109aafa25cb33c272d839cc6fbffe4afdb5a5bca35981" - }, - { - "address": "39ec4f2a82f9f0f39929415c65db9ea5df54e41d", - "blsPublicKey": "277d5eedd8cf7d549296a6f8ad2869ec6e90183b9e8ebbf49a9d3ecd817757b50c6acc6cab9f0dbcf4b87251b6e82a0192a00c1e509225ec1ec08dcbeb04ba2808b16f72af90f33277e614dc19383256b3980421441ce3e75356ee35e5f62d80", - "blsPop": "1880725666ec3175a7721142a808bbb654a555dd381d563f907cc1e7aa82fc04ade06a471543de9001b6ba6f003a7900" - }, - { - "address": "341Dec14b7A56c242CE9Cf939815ec7bb1104244", - "blsPublicKey": "07431d3192f7a273382b645bc300cb0161c2395c005bf389698ce4a3ac56cffb9405456102f5b6e41b42bb28e7cb1f0141d3e1be1b066a94cc2befe88b0f5eef63c60a1ab8a1f0cc02e73c0ce5aac464332d3a2c1074a5126807a75323280301", - "blsPop": "1e913c644b1699c791805527bfbd539de4b7148e28f9999efa0948136c623d8a0139493cbd8322ab247c40fb61397c01" - }, - { - "address": "69901924e6c045a03cc163c6b8ae8af80fa1ee80", - "blsPublicKey": 
"0d39e1cd00ff4ce5e41fa9ea90526f3e3540b64ba52954e5682f31ec20360b2995d5caac7cee778ff7c13aaca74b9b004032f87a331867e9ebc7ed06c5b77707c88614714f2e0ed30c85c519aa823aa9d1a7a56766a294c7802aad300b408280", - "blsPop": "8cb085ac2186dfa767ae27fcf9a81f78e7c8d26dfe10a238bb1cb6cf708ef0b3a62a08b3510162d97fea4578e9326b01" - }, - { - "address": "0A6641d4736767D1f2bCf2450200eF384391B441", - "blsPublicKey": "ce9efb226a3c085071c7aea84f66040dc16a1600893eb1e6ef507fa72c6684f24d2c3b4ffee9881142fc41cfbd6b3b01933838ed7bd05c16450cab703b12ee623f8745f7f618981601a9c4de1337fe881975bc8572536281b9dcc51400177680", - "blsPop": "d9e2c53fc104b58d04fe62288f432f14c7dc6658c27acb803cdc98743cd18a8b7dfa53b0e7d77105b9ed0d2c0a22fc80" - }, - { - "address": "66bdb4D2ff2Ee4C68517e6fCd25Cc3034C869160", - "blsPublicKey": "075724b0f2fa9ced3440a76107a53b02a0b8b5a5a510d94b02b5c0696db96910204d54ee3138b3e5949464585d98de00a2964d480ca31adf09c962802895ea160d195bce583e59094bf7c11d33eaffef12cb999284bcf842bbab2c87647d8801", - "blsPop": "3833035453c03ca035586f536eb32b33995e9326647b8ec308b4e0b9ab8eaec13c4130e858dbf7a3a01aecd157960a00" - }, - { - "address": "606311948f7426Ddfd23C1521b15eDDB52E83B29", - "blsPublicKey": "3b0ffa0a101f959879b7e14d830862981d6c311ffd440f214eee0a2891743a25355e9230266aeadbd20f3bde0ccb30008bf326df8fba9b701d4764ad49663da19417132edf4281df4560715649491d9794dc64e9540d16b51b362e139c3aa701", - "blsPop": "363688de4b6c0aa6a53cc337dcee033ad9f73fb981930e78be0bbeea593b9cf94553a8512f8049a1ffa17d88d9774381" - }, - { - "address": "4CB90Ebba92141eD3021F5dC4e6C8bb642095846", - "blsPublicKey": "1a65e75f0f47d7c92da92de15372a0052338b952cc1307d5ee1596a670fedeb3b5ac8bc690f3f5e5f34fb8d18a5441011d6023e3c2b343bffc655972102a7181dffcab47bd0a2d6d46eedc6f2b70cd64025715de2b8ae23c032e61b380663c80", - "blsPop": "7bbc9c317c7a406cd3ed651f81308615d26ec8701d2f714a0b11f63e7a67a779f43813436827e9aeb1f011310b962881" - }, - { - "address": "8c6F9aAd8281A21e7f6522602f2D6469c950e0BB", - "blsPublicKey": 
"71ca0f2be942a733074e8e0d92ae0adc2617bcfe503b97f8339c8f1c498d85608fa5966da7abdd9651c9ce4e3411e900c4d65d5c7d858f1e5ee212c9e543d3727aeaa9abbdcb68848e7c990c06aa70a4bf2585959943935549e7a9e2c2ba0d80", - "blsPop": "672a89e0df5f3c3b470d9cbf24650102c5f54cf61281b8df542d70dd81c4f5f4c1cc27cdf59b2e6ad6a133a965982181" - }, - { - "address": "2a1BfF2452AcA1ca5FFABbD34b2744109d11E4f5", - "blsPublicKey": "0122d3f6ae5e2bc6fdee3cd35a3522ea519bfa6f365451fa275c4688bcfb5cec418fa053a8859981e973ea273ca68f014e840d0ccf18cc7c498742103c3116732ef197a9f90f50bf64f83754a86ad441fe9691bfdf384a1aaacf38430a2b1001", - "blsPop": "bfcafd40590a90c488358efe8cae948f248224bbf587c0755820a6c0870bdccbf211238484178828c74afc08868e9301" - }, - { - "address": "b4fa2d21b238e12ee4a863517cb5092f2330cb1b", - "blsPublicKey": "1135b86a2587c75c55db8274e498bd3f2887a1a455a665b8314655124c5e8197d3ef892f3dea2bd80b047530f837ae00dc263da929f0135a4f8ee73d47d5c2ffcaaa7b1e3a9cc59e496feeea2323e5eb16c9de2e3b973c35c0a82ca587477201", - "blsPop": "a085173ab5a1dd464ead8dc7c54719dcba8a61c432db686c68a883e3aca410d16b84fb611e0d080c6218cb3f94fc8e01" - }, - { - "address": "42d441b6793e6162b979fbf6ad0af0063cbec96a", - "blsPublicKey": "e162bf706b79405326d52e29e2ea247a55ebf69ac34900e4e1249c3fe28fd3dabf40d7022f57bc7b16bedbc78bb49000ba7cd92e123b69309a0572cf49b1bdf4c5525ba656decffa5da91ed4fb565d384979e9a58f72986b654908bcc7308580", - "blsPop": "77167f6e2a1160d6956c5c96a66b98f7409e32969e5d24ca2d28092fa5fc603c2ff7a220806ea52dcdd010f7e28ce680" - }, - { - "address": "436d12F639A32509685080161Ff0365Fc15545f0", - "blsPublicKey": "e9dbc5c1d073954ad6ace9afb19d8679e8c87fd12cd1356999a8e167390f3de966f3485b500c40c61f829e12f56512001b4d4c4fcb19737735eba15d24d43231ce64cf348de3765fdbd01f4612af6ad7ebf9fafb594ade18a7717c189e9c2080", - "blsPop": "c25e9708b2a3a6e1e5fc625a8a45048fb8b2fae9209346643d3b2ce4ba00948714eb4131a065309d8690f3aa689a7980" - }, - { - "address": "e0c5f6172673ad70a76ff264cbc0df783930b47d", - "blsPublicKey": 
"bd8e3d6ad24e5a7e4f084b3142c6bebe26d248e09363a9bbfc9cd07ed59a65bc140a39564f6ecdf287c2ee1ed57a930035a92306b6d5925af8785c813bad6113ba42da88d3c0586e438df75bee506bdf9f10c927493524897ba485946f976f00", - "blsPop": "4c65e9a51cde701d3abcd015b6787f9931a57e3913118befebaf6d7bffcb74ef8b66ab8e4abcf0504ce2fdd33b277c01" - }, - { - "address": "21245b0A2c3235F1108d1Aa01AE376849D36e684", - "blsPublicKey": "30feb59a6804a2df697b3a40c691f176575ffb0d09fa167a59642d041dfec97e97414ea7abc8bc7bdad5f1a684e2bc00fb2c01bc03aa9e8caaaa82ca3210de31f9c7abff6baba5e08e2dbdfa85953ac39bf3b895627e7be493fbb6e346061400", - "blsPop": "1e953a19e0d18da7b791b7cad62a57e3ab0954ab7f3cc912a9b6a7b56801183ceaed7482a826ff97c9507cf320ad3d81" - }, - { - "address": "f27bb4eabc4400a1abe9d80d7537ab0ef1b058bf", - "blsPublicKey": "f24f6f1e6423fd83d1ef275484a5950fed90794d7bfb6224de768bf716646dee263de0d9ed4c681c0def78d16df0f000f1e17b8657f1d8e30dcb7e94cd1c79b9c82d8f04252b078ccf3719c7658dd82b9d93e02671925b5aeb4ae38163129100", - "blsPop": "4de6ec6e768781729d0ef2936ee8f64df678a58ae58bdfa90d9897e1a23471c95af3aa88271863d55d180f9f2088be80" - }, - { - "address": "8440e805b89f48c932265e3c4ad033813669d87a", - "blsPublicKey": "f6a586f1ceb85a23980bdfea738678f5906de5303a7ff6ceb3ad675f93a84fcc502275fd7bd8fde4de89fd381cb8eb000b3d496e248208d00eae4f421ce041d73789e794a372ca788655eb5413c3b7cab1c993e47ff6c568ca4da484bac88b80", - "blsPop": "e9b48e137fa2b0277085b34863f11469763c280a0672545edaa51c56c0cee7d2ebcf10571b561d71b57f1fd3e8551701" - }, - { - "address": "2Eb79345089cA6F703F3b3C4235315CbeAaD6D3C", - "blsPublicKey": "86ab977d6064531b136aba2fbcc50cfb332b939e0af22c7ff3800c804b33aad998ef69a97204076028a6946d627d2f0086992cba351fd538a91986253884325aba6d66ec1ac10eb8e18330180820cc388c22de7d64f9f0592b74ee5b35745a80", - "blsPop": "c9e3a638c221d41f4568a6f9b781e4314591a769cf21d1aa32370304a5a9c7228c1d44d3a48c489877bf059fc7ddf200" - }, - { - "address": "3Aa1fa695aA89958EBDB5346d6760B72250dC1d7", - "blsPublicKey": 
"fda02a60f25f87c0fc86be0bf52bf1d0be37db80661a2fa35c7f52a4c5f5edccca04768c762caaa2303f8cac2ba93e016dce7c8382e3d002e751e0bf33563eb8f6e7937ab2998300e96f5ed3d857d081ff389c495febaabe836a7cd06012ae80", - "blsPop": "d44c9332df502b385aca6d7a11ed02bce0862a98e3a0943c9c40c014a357ba1097c48b0ca5345bccba65025f169b0c81" - }, - { - "address": "173c75C8F1BE201cE89cF426Fe12c9997D709626", - "blsPublicKey": "9271a103e5971edd694b3c6363a2517b0676849b05660094baf240ea9677e171169a8eebed5ad6000871b4f42d4ca701b74b10f91c0de059d225ced0222e391cf7bdb5934d1677193837d5a1f17bb710f4747956d819db51c817cbba53324b81", - "blsPop": "bb19f1e1e399faec8d291d71a31718b508e782f21111248ad2a9fe2aaccb37df33d1b8647974040649983516a9febd80" - }, - { - "address": "74288dbEFa3a55986c039953b67139A466474fC4", - "blsPublicKey": "719b2de30cbcdee6e62445b7c2eedb63520beb4cb6e4a55f7e121a5bc6a1143e1322d3ffea28ab624d30f6da8962b8003eeb056df8e46a49035c0aaf139a372e954d8ebc3cbd2d0b2e5727d54f3f4fcd0e95dc29bfa9f9ddcda98bbe8107ae00", - "blsPop": "069ec7a9207444613130c1984553081676c3675b41c5846c4719e45f9951e686b0cfe3d5d8323d4d4c261d5fdbb03180" - }, - { - "address": "82f0E7879314516952f7961b15C63FC6B2734DFB", - "blsPublicKey": "b054d545899f55cbb84853125d50ed06211ace3eeb6160a85583de508569e65592f5a221a000cb99997c0dd2738527010f2b1e83eeb7ca41c40ad9295b35f1172004a63567f140c1d2b0e04338161e8f1be88c1543ac75d587e42c3ed7dc1880", - "blsPop": "97860ebe1169d6b1f5446ee5485bb698fae8dba39891852c770edf040cd43b5479532efc28e499f908a4f256b7601181" - }, - { - "address": "59F7b67e6BeAE0223DDC91Eec010b670c553E8e0", - "blsPublicKey": "e8de954d3100b125b2a3c1e4a81e3ac5f4b38f85c2ed91d752c507d4a37c338ca215e474a48511c599407b85c01d37012cd5012c74c5c0c3f88b4480a73bb52d1ff49848ca014f54cde1a891e6efa483b5f64016a9303515806c45d9f79a7e81", - "blsPop": "90dde5eb63a5306fb7ee8520f0810acb960167772f7d964d81b198d80a02f91c4719828ab3732b7c228b3e22d88a8180" - }, - { - "address": "e10A8Cc6c22CDc320c67BD600a1d8a0a46D7f400", - "blsPublicKey": 
"bfce6153e2c2dabc724d9ccf5cad8916753dce1a495a29053ff7e5b74b0eace4889d5fd2d5859400d82b6bf29cecc9006b6e37152cb1f4f72e7d0c0897439f4ef886e26b4345d2f3e73bcf08d47f02c02976730620d7f844313c099894176980", - "blsPop": "c636f7de65c4846b56381933f255bd3bf2df8594c4bb5efde571df36d5fa09eac71992780e7a7f7573b2dd23469a6a00" - }, - - { - "address": "e3020350aceeA29B783e0C947Ae001692B8F6248", - "blsPublicKey": "d56a6034613c0db14cad99e0d2dc3a2482ff8167a032218ac10508bb04fb8ef91d40a6ef0731f1e162761f09be237800363199f518a1bad98f9f2d179d2028137f9a86588c5cd91534a53d4edcc718b40aab61343571636d052eb7b288578c80", - "blsPop": "3988cfe70f34d8c7823dee13e76ac2efdb83602d5d17c4689f076a9373e55f71a80599cc622216461a7a009d2e141500" - }, - { - "address": "b952930a3656a9cbab21df5919f94c61a495bf79", - "blsPublicKey": "25138a99de6619b2ad03707f847ac0f5d79b0fba2c67a02f5430a57c7a651d6cd63988cb91aa331acd8b7221a41fc4007be73ad86365de5029c22186d2c90a864fd3d386cc074b25f33b4e5f17839d2780b62390f1fe63ec690c1022c039f500", - "blsPop": "0d84d1b7adf26c6d30b31e22671e51a11709685e714836cc113dabc26e9d8fe4f5a07990cfd7695115cbede03a84eb00" - }, - { - "address": "2289a63b4b4700eEECA35323Ea51785f366dd705", - "blsPublicKey": "c1e12e9c0b26d73c27dd5f60f08aa15c952afaa395dbc8b4cb99453bbf3f14ede8a9fc579f64500497344cb3637aa900ac8618d237580558d9ab5b042ac411fd121713744378c9f4d897907f09a71af9331bfecb3853e3f0577be558c9019881", - "blsPop": "2a16befec80fee3bf940965ab5670ebe13c564d9b007914bc16b5de89e7787a680627a93b9ecbf927ead2b1cd45f1480" - }, - { - "address": "0223E40d1f93A6Fe5BEf63605992aDA10740E13B", - "blsPublicKey": "c8e6cf45b640fa91385b4f7e891fe68e5ef6847d032d6ee1b52eef0ff577ad796b881537f94adcbd1b52ba3e0e5d7e014e97406029095598a373fd6d9b6c7754aa87b87560e363faee89c11482cae4a063d7cbe65747f2b095cd2f6688a8bd00", - "blsPop": "6c4c6e5bf183fa84815c84d37f804db9c1c51fe9c77aa725d686f2808c31ff51eebf8ba5470609ae46da3a2f0246c100" - }, - { - "address": "0610b8B4E6f5C3241D53eD3374DdcA8969cD053c", - "blsPublicKey": 
"08fb22b9fe04caf62059216dfa1d2274ade1df9e8dc0b3c37c2f04a8b3d30a73f8b7ddd873d08c9b9302b936d303380147ce0e19c4600b780a1814730462147fb8280fbd7fe0aa7290d325a2b7b5092e05c037664a0aaea127b0d95fd13e3d81", - "blsPop": "fee988a71b813aafa1fe0e692e0197df3b9946e3a59dddf46944a178249b1c0a981b1e264e3e7f93ef24b8727ac03181" - }, - { - "address": "63b4b616c5345e3DcC9e21dB69297e2129447f4e", - "blsPublicKey": "68dd5abe43baf4f4817a49b094c0bb19ed4971c49a6f91863163f655b6f312d80b7c7843b4cb7a27f7e54d48c41c7a01c5e22e1b0e38a5dbb03dfee76076a9af56146359add6003410005332417f6b65db718acce693cfa21a8fd92f6ad72800", - "blsPop": "a494765b096110901eb3913b433e865e72766183def25a220d6d4c1e26d296eff94a97fb3c1efd35aa70206814f0bf00" - }, - { - "address": "c9A7781729f95b88239c3BFc91fb52f92B44a116", - "blsPublicKey": "91785b64cfd3ac6025b6b536499ac63868344b8f86660ed73b574b000477807d53dacf1d9da258ee1b113485a01579001c5a510dd3b3c5ff1f417c4d97faf997aecedf7602bd96155f8396f5c9fa53d2c601b0de0241fff49939b8a97ce59f81", - "blsPop": "1457b0a74938bc535d310352dec6fe4459143bacaaa6bc198020859af7edcec9b2f9eade86ea1f27a287023c4647b280" - }, - { - "address": "56259f876eB6a7264D9f1a59952bAAD599fF9640", - "blsPublicKey": "00d869e78871139e8a5ff6d590593d94ea39ea73e87ac8d901f0f524372b93752a488b34e05afe05ab53b1c1b5b9a50040341dc01361110551358c9e982f30755e19ec2ad1c1a40afba9998f763599149b976caaded14c1793fe8c469b17c900", - "blsPop": "fc9193e1d8c07c9d5a25cd483e4397f0ef80a52a78e585f04bfebb2275f60975f02f314a3d81d5e539d7b83ccf2daa80" - }, - { - "address": "F139e74aDEC329E715AB49a68c5548A00e40CBc2", - "blsPublicKey": "e87f2bfc49c076239d7522280098c9c4808524c67f3891608c42a510340d64c7e3c1c074c34050007837c13536ee4001108efc87fb2eb8c3236a738122ec086a7b8e6857e82454ff6284a81bf0362424ba96be762d4c06e6aa1b59f20c870780", - "blsPop": "7ae5fe658b4bb7b3aab3868a1504bd272a0b87689c895bce63d791d03eb0aae2fae14d2814785351bad72500d2320600" - }, - { - "address": "33657019D60A0A41f1B9970bD4B28A3A83DBeafc", - "blsPublicKey": 
"8bd2cb218f465c82702aba73a061227cd83dd8e3115ae592811a4d9a8236e3cb99cfe81b2e22e3620635d42f63b7db00cd546bdf9fad7b49d0b4686962c8e0f4544c8df9f41d30d7d489c86745ef9b61a7dfb498f7554eeaf7e24654e5b2e080", - "blsPop": "41a0094d8e94f622043fa6fd3eed9c7ca83ce3ab2d5cedd48980b7c059a02011eb4c3f2983ac9f42d406fe1a78d23d00" - }, - { - "address": "0F5640bd556B0BE19262D1817B213DDB3424d91d", - "blsPublicKey": "2a5d5145b7a8100575e96cadf2513e7e64dc1354bd891a24f985c2574c9f5f136c6837eab290543d769227785520970133022637837b6e711918f3d2e98291bf3b518247b54e925a0cd080d517731b2725be888fca298ae9ec6f5d2c19146b01", - "blsPop": "ebf73526d3103e85e37316af3b6905ee68f692ccefff0a48d0e17b5a850baa9f444815a4da7a78535c0d4a18c4869b00" - }, - { - "address": "ed8bf82d2e579ff6363aef139f8b147a0105f17e", - "blsPublicKey": "9cedde7d49e2e115896b789ece09a4e8816fae153de891774f0f716a3088e01754498f8ab82d3a4ef5d53dbe17338f0089d1de1eb60bfe3ec2b9701cd882b1c174bdcfaac2225f57481676bd3ee845825be823b9f9317e407b015159b9ce7e80", - "blsPop": "6e1b5dfb7a1ea330a8ece9708bd83219c5d44186f58423991ee1a9951f98c8ef0a8e844169217871cfe6ebe188c2bb80" - }, - { - "address": "d8C68eBeCb6F074ac5C4FB66a690AC0Ad38a5a3c", - "blsPublicKey": "9124b91b223bf9bcb33f75ab8fad0180fdc57536d34349945b80458abc8e59cb06005fabea656b2cd0dc74c0d06160019706965ec3ee0d3b9dd76eb9d39158bd512fe7384bb9e139bc6391d5f7b969397bd85a91bb91670880d861e8fcbd0700", - "blsPop": "589d9703772e103b414b96ad2b39946ebbd671dcb86cffc6cd7eee70991f005ef5dc31f422c5be9f525d940e4fdb5b80" - }, - { - "address": "c6f916aD6E360651Bb95f8e67C1C28805745d084", - "blsPublicKey": "338fa194daae9ba1245931653fa349ce85d37e2f42f871c6b67397401e190b497d15ff23e8f6e9d8637f84fc2dd4640066fee692f5d38208233e7570c3ed49a4ba032d79c7a974883fa5b5c2a113259b924a346bc4edaf717b3d272ceb152980", - "blsPop": "fde95baada162d51e09dde8554ed15ee745759ab1a3c9ec3f6eefdba23b8eaad206283b40a76ea7e9317894bd6641e81" - }, - { - "address": "3ed95D6D4Ce36Ea7B349cD401e324316D956331a", - "blsPublicKey": 
"6150a62f3bae3a35a639d66dc37ffbc979dda85d6f5c278a5e3a563ed865018c86563be9b878261ef8d34edd3c313500a4f170e22f386674d538b9e15930f39216a38c2a0b60380a140e5897f42d37077784cd0f1a93746e4da42a5da8de1081", - "blsPop": "db8082ee5e6675f9a915decc718f69407da0173d9b1e8696a876043e2b237b6788dc9bb1fb2eae3f46ef3d7844751581" - }, - { - "address": "4A03C4c2E101AC4612d89b79f61c9C5BDd51929D", - "blsPublicKey": "5fd1581e41fa22e7ea0d3ee0674e80e2168fdb42c1aaa3eb8aa395fdde11de5ca98ee71c50b365ef6ad9850bacf7ab011eb446da79efba3ae91fe1f7d9274dacda39b0b3bfaec7d7904eac59d72683c52b6ea091ca1374d8b744fd1d595c5700", - "blsPop": "7e7267713730539c9f67e926b4daa7240e3d8a203d6254f9ef256bed9d4afac9dda11123dd1f1ba72cf2ab123d6c4481" - }, - { - "address": "DD0f3F7beb37fe9D4496F8098446B65DDFB1Fa02", - "blsPublicKey": "e04a671c7f9af919093a9c7aaf5c3d04eff31189d65f5a2aef8a08119ed1c150464c40085bfe2fc1b598fc2ebb095101d0103b995e3a762a9063d823ef6686422caa5a390f9cef5ee6c4b911fd2a818f43fa47ab4d21b01cc7b071b517438d81", - "blsPop": "a419e142a4e1f1f699fbf8f8dcc4718afebab936816ace8374c72e996f873d4180fad6528b65a3f007654f18f10b5d00" - }, - { - "address": "642CBE89A58909BA712dD11ed4C4b2359bd8C85d", - "blsPublicKey": "14cfe4d455820e0b9189ccb32b565b90d414ed7d315d3e61141f085b6aaea34b0c90d1f502f544408e7ed0573e1d1701d38063b239cf847018daa41b7ef75b4458d2fa23e459e9e8039668aca6bc70df1051a6466a5a5bc636e68bc8e2380a80", - "blsPop": "52a7e4854eacf7842224530c2d5338fc31c12a3996134c69e81079627d7c952e962827a8d863ec2bb88fa02a41100200" - }, - { - "address": "FFbCF262C1d5c4392ef469BA79f2CD195d2AffDa", - "blsPublicKey": "45bab7cfb2814e1e3e1a40779219bb48c5d51419ce4c27359d0dfea9fbf1d193ca1cf07934ebf3d6699093387d0e0f000dc5fd2058e7ac50e67c26e5172567c2e7ba9fefa6376e44de800c8955cb330df27476d8ba9c8ec06bba46fddeb95e81", - "blsPop": "48ee6eb577379aa76f34570e6d12f225bfa299d297f9631900bf0d3d2d9bf80915c2ed3cb38c5f75f1893e5c7d168801" - }, - { - "address": "68e0104fD2B5A2c93E97C2bA172C4D2A4223f76f", - "blsPublicKey": 
"a9fc64c64c039cbd8b23c10c706e51e0013289201111cf8c50e22781161df397fa9c1b91dbab44969aa4f8819f222e01bab90190bebfaecff7076de325dae8080bfe921fbb4ca474d45126fe5ae699dadea98a5255c048b955ab9ba1de8f4c01", - "blsPop": "3b4374544c02f51c5acbde67f407b17f8b5513ab5286b8e8213ba56adb510557f54626009d6e68f8f0505384c64a7d01" - }, - { - "address": "5cAB520442de9baBC290B25E5e2e6A1194Ec6707", - "blsPublicKey": "aa5729ac27a01f26924e863eea7d7b49e4f8ccb15c758ea5ac4d19691a8df31f70138e7f3011fc1e53d52d7063dd8d01a9b3800a66914956769a13ff8d7960725bf1b9073052f3ff85e57b34165702baede05d4e807bfb565a38c24e3542c780", - "blsPop": "629e6b0901b08231287b3bfd0244f9730567d18e675c02dfd4a6c42083d32050b03c88437802a74d66912089d66e7a01" - }, - { - "address": "Bc6963fc0e2f5547bA949ed39e80b8388321104f", - "blsPublicKey": "14dafe41c0d96b22188f0cf6cb99274e09d8ed0546bb57b28a3590449726aedf809c049663a2fe4e40b91830ac864d006a9c5b41cbddeabeef99cfddf6c298de0d482b6e945993241d8987f7fac004fe159f2206060b69d1f8f7f67f63f26280", - "blsPop": "1eb62a80cd53de8e1103f9f3f809e4ed6b5774be448debad52f04cea649c77a18e795e18f6e5ecc89b1d7d0e949b1e81" - }, - { - "address": "43882141555003B3e71110f567373b59Ac4cb0BD", - "blsPublicKey": "878790bb38b1d31707aa37b5af439d64d4a29ea64418b7c01139a26e4f297da6c61a806361e6fee8e40556500c8a6000183b2a42c54d714a143fe12492081f373e3e27e3921adeb276a8daac8b87a534c0fa1d45195d8a35cbda27f883128680", - "blsPop": "8adbdb18b1996f0d156ba332c7cc5cd7a22f701f92cb6102fe81a1f353c24757d2fa3181fea45d0309a39357bd158601" - }, - { - "address": "48cedc58b10af13d688631bc3cb78a05b8a6e56a", - "blsPublicKey": "7983d25868e1915ca8c20603bb70475b398acb04fef221106475f0fc25a3df25d181050214da347bfcd8f8292e5257014ba77cf67ee5f42202257317808f126930b4eaaf4bec801c4465ced54bd78331f8d3dadfdf2f91de294836c13e946001", - "blsPop": "2b0cbe1b28d6a36828022b21807fe74fb64b6de77325719347cbff51b256ae91eb96ad9339eb68b6a43b522613049200" - }, - { - "address": "65698c9ec5Af10345cd1e39472D60FB6133bad6F", - "blsPublicKey": 
"c700f20fab4ebb8ba2f47adf177e3a9dbbfb8d999acbaf0055cc540758f8c00c4564813d910e7be02e4ceaa97fb88000b3e8e6fb4308fb943fa0a3d8ee32c3a16e67eae0817370de54bec1e13193cc845cedb70e48171e097550475684cc6600", - "blsPop": "3c7dd9c5298690b795c30da0a78435a17717c88b8adea527f74335d2cdafa993153837f4b6a5beaedb4aa25bb20aa700" - }, - { - "address": "D507309fd69635aa37810A65A4dA27Ec47a1BA05", - "blsPublicKey": "3a3857a5e4a709e1891a13797d55c1698f931755492099926314feec955da31fcea8a8c8e52ce2efd74008f32dbc1c00c1b3d36c1f3097d806f1687eff4a979ff967a8233f923a1af6a3104c12d9d44388bece91a248bd86f253e709cea7a380", - "blsPop": "25544d8e28279a43483cd8b45386e4ab071618e352172a60c85c3e65bd5d9efc46bc490eb36707886a7356ac864a1981" - }, - { - "address": "C46Dc0741FF61af883E284daEA062ae7382E7091", - "blsPublicKey": "d8704fbf6968ffd9f39f78173726dfebbaaffebd8a33dfe29a72b9a64204debf2016ca94cbecb1b84cab9d5cc168f70082530b8cfab0225a37c8cc3044c288ab0fea2f484b56a33fe6c92d6c50f254e6b1689ac6bb1429cc36a0a7dfa85cf880", - "blsPop": "d0feceb95fb140f985996bcbfb2db22f18f5ba7b02b5b0f7167d02f6ab517080a4db6c4108f46bca1417d32f50dad100" - }, - { - "address": "198958f0b860AB0E3937F468FE366AAc9EEbaD2e", - "blsPublicKey": "89bc906b5ea07752b1981558fd5d329f6d428bc14dfde022745303eae0077612b9c473cd66e4d91778282ec98bbd74001b61cd71448750df642fb580ce81a62ff2f5ab4378528ccd65acff67ac2175678f598529c768247c77938ca2d529aa01", - "blsPop": "2a3a0e6a10d83a353eb25d6cb56b6693150e0d8645ed2df72e77676bc430ae39e825dd1467025c9f08a1eae142685281" - }, - { - "address": "0ec5a403212d732d8d7ced050e9510f6327453c6", - "blsPublicKey": "fb9de050c059a11175be23daeef1bfc374a688d8fd776e7d2dc81fe301068a3af03ad522bd915a154dc8b5b8e12d38003f17bf9f3001e8bc682e38cfc93c6901417afe1ddbfe11b04f42ca4f321f7ba73fa98f0e8c8d75781b14c0dea847a500", - "blsPop": "39896a46d6bb64549527a34474aba7b852954bc5bfb8b69e3593bfef083974a75d47ca4a92296774935ccb4feaac9a81" - }, - { - "address": "464CCE7999B3D90a8F1fFaf94A71FBDeA1E65435", - "blsPublicKey": 
"bef17561ef747e0f11fd339987a9813fa8ac1cbbd6457ab8fae4841f2217de019de7caf421631372b7eec730d9210d0128e6c560948c19ab6141d6699e0283a5b98abb8266d731f2b6435a07b44360a3f5622cff59103ec95938ab74fe1d8f01", - "blsPop": "02fedb06b8477cdc8b6d4f556f85cae99c483eefeb01eed7808decc05c1d55405e7a42c299e89d6249cfbd3122d66881" - }, - { - "address": "097095B8cF5cabF0c39E548D8DF55DD635D84D07", - "blsPublicKey": "a47fd022dda04d8dd36e0a8127be68736e47e490b946d2d88a9fdcca0a5114a930db2319e7e9f378d1b81da865d38b01a063e9994e620483b08047da39ee2fba35f66df5734e1545d297be0225ddfda34e494c1f09825eed303d166a8b702f01", - "blsPop": "afe49aecae6f431e88469fa5e185f4e34d3c6b02f41487927690ef4f2d877349b6dcdd62dff31c508a291b7099a23880" - }, - { - "address": "AA937dA037E617e868795EAC1dcD43C663014D32", - "blsPublicKey": "5966b41d69c0e94e202f65823338c3ee0357b024a6ef625765bcf3c20e1786258a18221e0b9a34b43380e06984d92f0183d0f86d0dfe0cac237ee38b28876dada3ae9d1a7c1e6627c3b524d0e5d4270bbe159661622821412b3c1162881ead80", - "blsPop": "ff11ad6c1336adb225fb1319d8372cb5408956dd8cd1272f0210ef9ff8120a3ceeca92b21723375c75e5347935225b80" - }, - { - "address": "f11073eb2D259b90A91954caE30D0E6E9ecC7F11", - "blsPublicKey": "8fad3c2111494834f30e6d8a3368f2aa156c991fa4b9d55c31da88310a61248305697ad5e885404bc474419383de4f006c89c84955321716f6ceb4d5db433fcabe843d566a5e9526d29c58b38fda9ff2a13f7841f25cda8ff091c965bdf91780", - "blsPop": "c932f10076512167e286373ac63e4a7aa60793b4a029d68a2972cd65017e145b521d06667e64ed720f041f8a3b0ae680" - }, - { - "address": "5b55452Bdba5971D606F47647bd383f3C3fa7285", - "blsPublicKey": "7f9df4fdb96e20e0a480a62c48896646d96982d7fccccdf70c61f2f52ab4ea2901b5ef468203f39d9d8325843ada0d00e9562c64f79ec16d69b5921ad36306ae9ae2ab97806b354af8cb63722a96aed5fa0542fff58d2e41f6aee3cb641f7781", - "blsPop": "96394cfa698e6f8fdd6b66f47c99becf374c3c4daae50a48ddb94ce15cee803b12f985684d6bc759c2549b700f343600" - }, - { - "address": "439d5E4D7578ECc9eFa52A8Cf1887b11FD0FB900", - "blsPublicKey": 
"9cc39207797e5c37f6336c4444cf275d0afd43a9b257a08086f15867886833744c9b5753fcff76fce3790d7f25a45e004af9831464bf43e39634ecfce8da4641da78ad4ba20e540c007dbc19e391ae69cfb7809b2d20627d553a16d7d71b5a80", - "blsPop": "ca89ff2d6ebe328f9644dfbfc5f1c4c9276db8e5c37336e89d21a576074dfa92d1115d5defa2d7d4af0bf9c03a885181" - }, - { - "address": "92f628b0157D47C992F5C69dBBD038b110E27826", - "blsPublicKey": "e6631bd0b82c41327b582d7b239c9893cd1162cae05f360d8808700b38ee0ee965e9bd9246d79e00c234c87aa7699800c73428f7eccd77337bf79bf6b43f48345be265c38d128d2cd7f37167cadb3565d66e6a7f67b12f4bed63e5d507480100", - "blsPop": "ccdf1a5f80759b6748720ae804df42c44fa6438bc8b8129f47f8bb31078a073d7a5959a59b8fc8392f200862378a5d80" - } -] \ No newline at end of file diff --git a/packages/celotool/package.json b/packages/celotool/package.json deleted file mode 100644 index e39d976451d..00000000000 --- a/packages/celotool/package.json +++ /dev/null @@ -1,71 +0,0 @@ -{ - "name": "@celo/celotool", - "version": "2.0.1", - "description": "Celotool is our hub for all scripts that people need to run within the monorepo", - "main": "index.js", - "author": "Celo", - "license": "Apache-2.0", - "dependencies": { - "@celo/base": "^6.0.0", - "@celo/connect": "^5.1.2", - "@celo/cryptographic-utils": "^5.0.7", - "@celo/contractkit": "^7.0.0", - "@celo/env-tests": "1.0.3", - "@celo/explorer": "^5.0.8", - "@celo/governance": "^5.0.9", - "@celo/network-utils": "^5.0.5", - "@celo/utils": "^5.0.6", - "@ethereumjs/util": "8.0.5", - "@ethereumjs/rlp": "4.0.1", - "@google-cloud/monitoring": "0.7.1", - "@google-cloud/pubsub": "^0.28.1", - "@google-cloud/secret-manager": "3.12.0", - "@google-cloud/storage": "^2.4.3", - "bignumber.js": "^9.0.0", - "bip32": "3.1.0", - "bip39": "https://github.com/bitcoinjs/bip39#d8ea080a18b40f301d4e2219a2991cd2417e83c2", - "bunyan": "1.8.12", - "bunyan-gke-stackdriver": "0.1.2", - "compare-versions": "^6.0.0", - "dotenv": "^16.0.3", - "ecurve": "^1.0.6", - "eth-lib": "^0.2.8", - 
"ethereum-cryptography": "1.2.0", - "generate-password": "^1.5.1", - "rlp": "^2.2.4", - "minimist": "^1.2.5", - "js-yaml": "^3.13.1", - "lodash": "^4.17.21", - "node-fetch": "^2.6.7", - "prompts": "1.2.0", - "read-last-lines": "^1.7.2", - "sleep-promise": "^8.0.1", - "string-hash": "^1.1.3", - "tiny-secp256k1": "2.2.1", - "chai": "^4.3.7", - "mocha": "^10.2.0", - "web3": "1.10.0", - "web3-eth-admin": "1.0.0-beta.55", - "yargs": "17.7.2" - }, - "devDependencies": { - "@tsconfig/recommended": "^1.0.3", - "@celo/dev-utils":"^0.0.3", - "@celo/protocol": "1.0.2", - "@types/bunyan": "1.8.8", - "@types/chai": "^4.1.3", - "@types/dotenv": "^8.2.0", - "@types/mocha": "^10.0.1", - "@types/node-fetch": "^2.5.7", - "@types/prompts": "^1.1.1", - "@types/string-hash": "^1.1.1", - "@types/yargs": "^13.0.2", - "web3-core": "1.10.0" - }, - "scripts": { - "cli": "TS_NODE_FILES=true ts-node -r tsconfig-paths/register src/cli.ts", - "lint": "yarn run --top-level eslint .", - "build": "tsc -b ." - }, - "private": true -} diff --git a/packages/celotool/requirements.txt b/packages/celotool/requirements.txt deleted file mode 100644 index e5ed2a09137..00000000000 --- a/packages/celotool/requirements.txt +++ /dev/null @@ -1 +0,0 @@ -docopt diff --git a/packages/celotool/src/cli.ts b/packages/celotool/src/cli.ts deleted file mode 100755 index 92c6f6071fd..00000000000 --- a/packages/celotool/src/cli.ts +++ /dev/null @@ -1,33 +0,0 @@ -#!/usr/bin/env yarn run ts-node -r tsconfig-paths/register --cwd ../celotool -import yargs from 'yargs' - -// eslint-disable-next-line @typescript-eslint/no-unused-expressions -yargs - .scriptName('celotooljs') - .option('verbose', { - type: 'boolean', - description: - 'Whether to show a bunch of debugging output like stdout and stderr of shell commands', - default: false, - }) - .option('yesreally', { - type: 'boolean', - description: 'Reply "yes" to prompts about changing staging/production (be careful!)', - default: false, - }) - .option('helmdryrun', { - 
type: 'boolean', - description: 'Simulate the Helm deployment. Other deployment operations can be executed', - default: false, - }) - .middleware([ - (argv: any) => { - process.env.CELOTOOL_VERBOSE = argv.verbose - process.env.CELOTOOL_CONFIRMED = argv.yesreally - process.env.CELOTOOL_HELM_DRY_RUN = argv.helmdryrun - }, - ]) - .commandDir('cmds', { extensions: ['ts'] }) - .demandCommand() - .help() - .wrap(yargs.terminalWidth()).argv diff --git a/packages/celotool/src/cmds/account.ts b/packages/celotool/src/cmds/account.ts deleted file mode 100644 index b77e76a3058..00000000000 --- a/packages/celotool/src/cmds/account.ts +++ /dev/null @@ -1,15 +0,0 @@ -import { addCeloEnvMiddleware, CeloEnvArgv } from 'src/lib/env-utils' -import yargs from 'yargs' - -export const command = 'account <accountCommand>' - -export const describe = 'commands for inviting, fauceting, looking up accounts and users' - -export type AccountArgv = CeloEnvArgv - -export const builder = (argv: yargs.Argv) => { - return addCeloEnvMiddleware(argv).commandDir('account', { extensions: ['ts'] }) -} -export const handler = () => { - // empty -} diff --git a/packages/celotool/src/cmds/account/faucet.ts b/packages/celotool/src/cmds/account/faucet.ts deleted file mode 100644 index 6422fa022c4..00000000000 --- a/packages/celotool/src/cmds/account/faucet.ts +++ /dev/null @@ -1,189 +0,0 @@ -import { newKitFromWeb3 } from '@celo/contractkit' -import { celoTokenInfos, CeloTokenType, Token } from '@celo/contractkit/lib/celo-tokens' -import { concurrentMap, sleep } from '@celo/utils/lib/async' -import { switchToClusterFromEnv } from 'src/lib/cluster' -import { execCmd } from 'src/lib/cmd-utils' -import { convertToContractDecimals } from 'src/lib/contract-utils' -import { getBlockscoutUrl } from 'src/lib/endpoints' -import { envVar, fetchEnv } from 'src/lib/env-utils' -import { portForwardAnd } from 'src/lib/port_forward' -import { validateAccountAddress } from 'src/lib/utils' -import Web3 from 'web3' -import yargs from 'yargs'
-import { AccountArgv } from '../account' - -export const command = 'faucet' - -export const describe = 'command for fauceting an address with gold and/or dollars' - -interface FaucetArgv extends AccountArgv { - account: string - tokenParams: TokenParams[] - checkZero: boolean - checkDeployed: boolean - blockscout: boolean -} -interface TokenParams { - token: CeloTokenType - amount: number -} - -export const builder = (argv: yargs.Argv) => { - return argv - .option('account', { - type: 'string', - description: 'Account(s) to faucet', - demand: 'Please specify comma-separated accounts to faucet', - coerce: (addresses) => { - return addresses.split(',').map((a: string) => { - if (!a.startsWith('0x')) { - a = `0x${a}` - } - if (!validateAccountAddress(a)) { - throw Error(`Receiver Address is invalid: "${a}"`) - } - return a - }) - }, - }) - .array('tokenParams') - .option('tokenParams', { - type: 'string', - description: '<token>,<amount> pair to faucet', - demand: - 'Please specify stableToken,amount pairs to faucet (ex: --tokenParams CELO,3 cUSD,10 cEUR,5)', - coerce: (pairs) => { - // Ensure that pairs are formatted properly and use possible tokenParams - const validCeloTokens = Object.values(celoTokenInfos).map((tokenInfo) => { - return tokenInfo.symbol - }) - return pairs.map((pair: string) => { - const [token, amount] = pair.split(',') - if (token === undefined || amount === undefined) { - throw Error(`Format of tokenParams should be: --tokenParams tokenName,amount`) - } - // Note: this does not check if token has been deployed on network - if (!validCeloTokens.includes(token as CeloTokenType)) { - throw Error(`Invalid token '${token}', must be one of: ${validCeloTokens.join('|')}.`) - } - if (!(amount && /^\d+$/.test(amount))) { - throw Error(`Invalid amount '${amount}', must consist of only numbers.`) - } - return { - token: token as CeloTokenType, - amount: Number(amount), - } - }) - }, - }) - .option('checkZero', { - type: 'boolean', - description: 'Check that the balance
is zero before fauceting', - default: false, - }) - .option('checkDeployed', { - type: 'boolean', - description: 'Check that token is deployed on current network', - default: false, - }) - .option('blockscout', { - type: 'boolean', - description: 'Open in blockscout afterwards', - default: false, - }) -} - -export const handler = async (argv: FaucetArgv) => { - await switchToClusterFromEnv(argv.celoEnv) - const addresses = argv.account - - const cb = async () => { - const web3 = new Web3('http://localhost:8545') - const kit = newKitFromWeb3(web3) - const account = (await kit.connection.getAccounts())[0] - console.info(`Using account: ${account}`) - kit.connection.defaultAccount = account - - // Check that input token has been deployed to this network - if (argv.checkDeployed) { - const deployedCeloTokens = Object.values(await kit.celoTokens.validCeloTokenInfos()).map( - (tokenInfo) => { - return tokenInfo.symbol - } - ) - argv.tokenParams.map((tokenParam) => { - if (!deployedCeloTokens.includes(tokenParam.token)) { - throw Error( - `Invalid token '${tokenParam.token}' (or not yet deployed on ${ - argv.celoEnv - }) must be one of: ${deployedCeloTokens.join('|')}.` - ) - } - }) - } - - const faucetToken = async (tokenParams: TokenParams) => { - if (!tokenParams.amount) { - return - } - - const tokenWrapper = await kit.celoTokens.getWrapper(tokenParams.token as any) - for (const address of addresses) { - if (argv.checkZero) { - // Throw error if address account balance of this token is not zero - if (!(await tokenWrapper.balanceOf(address)).isZero()) { - throw Error( - `Unable to faucet ${tokenParams.token} to ${address} on ${argv.celoEnv}: --checkZero specified, but balance is non-zero` - ) - } - } - const tokenAmount = await convertToContractDecimals(tokenParams.amount, tokenWrapper) - console.info(`Fauceting ${tokenAmount.toFixed()} of ${tokenParams.token} to ${address}`) - - if (tokenParams.token === Token.CELO) { - // Special handling for reserve transfer - const 
reserve = await kit.contracts.getReserve() - if (await reserve.isSpender(account)) { - await reserve.transferGold(address, tokenAmount.toFixed()).sendAndWaitForReceipt() - return - } - } - await tokenWrapper.transfer(address, tokenAmount.toFixed()).sendAndWaitForReceipt() - console.info(`Successfully fauceted ${tokenParams.token}`) - } - } - // Ensure all faucets attempts are independent of failures and report failures. - const failures = ( - await concurrentMap( - Math.min(argv.tokenParams.length, 10), - argv.tokenParams, - async (tokenParams) => { - return faucetToken(tokenParams) - .then(() => null) - .catch((err) => `Token ${tokenParams.token}: (${err})`) - } - ) - ).filter((x) => x != null) - if (failures.length) { - console.error(`Error(s) fauceting: \n${failures.join('\n')}`) - return - } - - if (argv.blockscout) { - // Open addresses in blockscout - await sleep(1 + parseInt(fetchEnv(envVar.BLOCK_TIME), 10) * 1000) - const blockscoutUrl = getBlockscoutUrl(argv.celoEnv) - for (const address of addresses) { - await execCmd(`open ${blockscoutUrl}/address/${address}`) - } - } - } - - try { - await portForwardAnd(argv.celoEnv, cb) - } catch (error) { - console.error(`Unable to faucet ${argv.account} on ${argv.celoEnv}`) - console.error(error) - process.exit(1) - } -} diff --git a/packages/celotool/src/cmds/account/revoke.ts b/packages/celotool/src/cmds/account/revoke.ts deleted file mode 100644 index 773299b914b..00000000000 --- a/packages/celotool/src/cmds/account/revoke.ts +++ /dev/null @@ -1,37 +0,0 @@ -import { downloadArtifacts } from 'src/lib/artifacts' -import { switchToClusterFromEnv } from 'src/lib/cluster' -import { execCmd } from 'src/lib/cmd-utils' -import { portForwardAnd } from 'src/lib/port_forward' -import yargs from 'yargs' -import { AccountArgv } from '../account' -export const command = 'revoke' - -export const describe = 'command for revoking verification of a phone number' - -interface RevokeArgv extends AccountArgv { - phone: string -} - 
-export const builder = (argv: yargs.Argv) => { - return argv.option('phone', { - type: 'string', - description: 'Phone number to revoke verification', - demand: 'Please specify phone number to revoke verification', - }) -} - -export const handler = async (argv: RevokeArgv) => { - await switchToClusterFromEnv(argv.celoEnv, true, false) - console.info(`Sending invitation code to ${argv.phone}`) - const cb = async () => { - await execCmd(`yarn --cwd ../protocol run revoke -n ${argv.celoEnv} -p ${argv.phone}`) - } - try { - await downloadArtifacts(argv.celoEnv) - await portForwardAnd(argv.celoEnv, cb) - } catch (error) { - console.error(`Unable to revoke verification for ${argv.phone}`) - console.error(error) - process.exit(1) - } -} diff --git a/packages/celotool/src/cmds/account/scripts/faucet-multiple.sh b/packages/celotool/src/cmds/account/scripts/faucet-multiple.sh deleted file mode 100755 index afd548ebf8d..00000000000 --- a/packages/celotool/src/cmds/account/scripts/faucet-multiple.sh +++ /dev/null @@ -1,13 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -CELO="$(dirname "$0")/../../../../../../.." 
- -cd $CELO/celo-monorepo && yarn build-sdk $1; - -$CELO/celo-monorepo/packages/celotool/bin/celotooljs.sh port-forward -e $1 & -sleep 5; - -$CELO/celo-monorepo/packages/celotool/bin/celotooljs.sh account faucet-multiple-helper -e $1 --accounts $2 - -killall -9 kubectl; diff --git a/packages/celotool/src/cmds/backup.ts b/packages/celotool/src/cmds/backup.ts deleted file mode 100644 index 1813590ebf1..00000000000 --- a/packages/celotool/src/cmds/backup.ts +++ /dev/null @@ -1,43 +0,0 @@ -import { switchToClusterFromEnv } from 'src/lib/cluster' -import { execCmdWithExitOnFailure } from 'src/lib/cmd-utils' -import { addCeloEnvMiddleware, CeloEnvArgv } from 'src/lib/env-utils' -import yargs from 'yargs' - -export const command = 'backup' - -export const describe = "command for backing up a miner's persistent volume (PVC)" - -interface BackupArgv extends CeloEnvArgv { - minername: string -} - -export const builder = (args: yargs.Argv) => { - return addCeloEnvMiddleware(args).option('minername', { - type: 'string', - description: 'Name of the miner node', - demand: 'Please specify the miner node to backup, eg. gethminer1', - }) -} - -export const handler = async (argv: BackupArgv) => { - await switchToClusterFromEnv(argv.celoEnv, false, true) - - const minerName = argv.minername - // In the future, we can make this configurable. 
- const zone = 'us-west1-a' - const namespace = argv.celoEnv - const pvc = `${namespace}-${minerName}-pvc` - - const getPVCNameCommand = `kubectl get persistentvolumeclaim ${pvc} --namespace ${namespace} -o=jsonpath={.spec.volumeName}` - const pvcId = (await execCmdWithExitOnFailure(getPVCNameCommand))[0] - console.debug(`Persistent Volume Claim is ${pvcId}`) - const getPDNameCommand = `kubectl get persistentvolume ${pvcId} -o=jsonpath={.spec.gcePersistentDisk.pdName}` - const pdId = (await execCmdWithExitOnFailure(getPDNameCommand))[0] - console.debug(`Persistent Disk is ${pdId}`) - - const snapshotName = `snapshot-${namespace}-${minerName}-pvc-${Date.now()}` - const createSnapshotCommand = `gcloud compute disks snapshot ${pdId} --zone ${zone} --snapshot-names ${snapshotName}` - await execCmdWithExitOnFailure(createSnapshotCommand) - const gcloudSnapshotsUrl = 'https://console.cloud.google.com/compute/snapshots' - console.info(`Snapshot "${snapshotName}" can be seen at ${gcloudSnapshotsUrl}`) -} diff --git a/packages/celotool/src/cmds/bots.ts b/packages/celotool/src/cmds/bots.ts deleted file mode 100644 index a6855752fe1..00000000000 --- a/packages/celotool/src/cmds/bots.ts +++ /dev/null @@ -1,15 +0,0 @@ -import yargs from 'yargs' - -export const command = 'bots <command>' - -export const describe = 'various bots we have' - -export type BotsArgv = yargs.Argv - -export const builder = (argv: yargs.Argv) => { - return argv.commandDir('bots', { extensions: ['ts'] }) -} - -export const handler = () => { - // empty -} diff --git a/packages/celotool/src/cmds/bots/auto-vote.ts b/packages/celotool/src/cmds/bots/auto-vote.ts deleted file mode 100644 index e5e37787294..00000000000 --- a/packages/celotool/src/cmds/bots/auto-vote.ts +++ /dev/null @@ -1,326 +0,0 @@ -// The purpose of the Voting bot in a testnet is to add incentives for good -// behavior by validators.
This introduces some non-validator stakers into the -// network that will judge the validator groups, and vote accordingly. -import { Address, ContractKit, newKitFromWeb3 } from '@celo/contractkit' -import { - ensureLeading0x, - eqAddress, - normalizeAddressWith0x, - NULL_ADDRESS, - privateKeyToAddress, -} from '@celo/utils/lib/address' -import { concurrentMap } from '@celo/utils/lib/async' -import BigNumber from 'bignumber.js' -import { groupBy, mapValues } from 'lodash' -import { envVar, fetchEnv } from 'src/lib/env-utils' -import { AccountType, getPrivateKeysFor } from 'src/lib/generate_utils' -import Web3 from 'web3' -import { Argv } from 'yargs' - -export const command = 'auto-vote' - -export const describe = 'for each of the voting bot accounts, vote for the best groups available' - -interface SimulateVotingArgv { - celoProvider: string - excludedGroups?: string[] -} - -export const builder = (yargs: Argv) => { - return yargs - .option('celoProvider', { - type: 'string', - description: 'The node to use', - default: 'http://localhost:8545', - }) - .option('excludedGroups', { - type: 'string', - description: 'A comma separated list of groups to exclude from voting eligibility', - coerce: (addresses: string) => { - return addresses - .split(',') - .filter((a) => a.length > 0) - .map(normalizeAddressWith0x) - }, - }) -} - -export const handler = async function simulateVoting(argv: SimulateVotingArgv) { - try { - const mnemonic = fetchEnv(envVar.MNEMONIC) - const numBotAccounts = parseInt(fetchEnv(envVar.VOTING_BOTS), 10) - - const excludedGroups: string[] = argv.excludedGroups || [] - - const kit = newKitFromWeb3(new Web3(argv.celoProvider)) - const election = await kit.contracts.getElection() - - const wakeProbability = new BigNumber(fetchEnv(envVar.VOTING_BOT_WAKE_PROBABILITY)) - const baseChangeProbability = new BigNumber(fetchEnv(envVar.VOTING_BOT_CHANGE_BASELINE)) - const exploreProbability = new BigNumber(fetchEnv(envVar.VOTING_BOT_EXPLORE_PROBABILITY)) - 
const scoreSensitivity = new BigNumber(fetchEnv(envVar.VOTING_BOT_SCORE_SENSITIVITY)) - - const allBotKeys = getPrivateKeysFor(AccountType.VOTING_BOT, mnemonic, numBotAccounts) - await activatePendingVotes(kit, allBotKeys) - - const botKeysVotingThisRound = allBotKeys.filter((_) => - wakeProbability.isGreaterThan(Math.random()) - ) - console.info(`Participating this time: ${botKeysVotingThisRound.length} of ${numBotAccounts}`) - - // If no bots are participating, return early - if (botKeysVotingThisRound.length === 0) { - return - } - - const groupCapacities = await calculateInitialGroupCapacities(kit) - const groupScores = await calculateGroupScores(kit) - const groupWeights = calculateGroupWeights(groupScores, scoreSensitivity) - - const unelectedGroups = Object.keys(groupCapacities).filter((k) => !groupScores.has(k)) - - for (const key of botKeysVotingThisRound) { - const botAccount = ensureLeading0x(privateKeyToAddress(key)) - - kit.connection.addAccount(key) - kit.connection.defaultAccount = botAccount - - console.info(`Voting as: ${botAccount}.`) - try { - // Get current vote for this bot. - // Note: though this returns an array, the bot process only ever chooses one group, - // so this takes a shortcut and only looks at the first in the response - const currentVote = (await election.getVoter(botAccount)).votes[0] - const currentGroup = currentVote ? 
normalizeAddressWith0x(currentVote.group) : undefined - - // Handle the case where the group the bot is currently voting for does not have a score - if ( - !currentGroup || - shouldChangeVote( - groupScores.get(currentGroup) || new BigNumber(0), - scoreSensitivity, - baseChangeProbability - ) - ) { - // Decide which method of picking a new group, and pick one if there are unelected - // groups with capacity - let randomlySelectedGroup: string = NULL_ADDRESS - if (exploreProbability.isGreaterThan(Math.random())) { - console.info('Vote Method: unweighted random choice of unelected') - randomlySelectedGroup = getUnweightedRandomChoice( - unelectedGroups.filter((k) => - shouldBeConsidered(k, currentGroup, excludedGroups, groupCapacities) - ) - ) - if (randomlySelectedGroup === NULL_ADDRESS) { - console.info('No unelected groups available, falling back to weighted-by-score') - } - } - - // This catches 2 cases in which randomlySelectedGroup is undefined: - // 1. it tried to pick an unelected group, but none were available - // 2. it is not using the "explore" strategy - if (randomlySelectedGroup === NULL_ADDRESS) { - console.info('Vote Method: weighted random choice among those with scores') - randomlySelectedGroup = getWeightedRandomChoice( - groupWeights, - [...groupCapacities.keys()].filter((k) => - shouldBeConsidered(k, currentGroup, excludedGroups, groupCapacities) - ) - ) - } - - if (randomlySelectedGroup === NULL_ADDRESS) { - console.info('Was unable to find an available group to vote for. 
Skipping this time.') - } else { - await castVote(kit, botAccount, randomlySelectedGroup, groupCapacities) - } - } else { - console.info(`${botAccount} has decided to keep their existing vote`) - } - } catch (error) { - console.error(`Failed to vote as ${botAccount}`) - console.info(error) - } - } - } catch (error) { - console.error(error) - process.exit(1) - } finally { - process.exit(0) - } -} - -async function castVote( - kit: ContractKit, - botAccount: string, - voteForGroup: Address, - groupCapacities: Map<string, BigNumber> -) { - const lockedGold = await kit.contracts.getLockedGold() - const election = await kit.contracts.getElection() - - const lockedGoldAmount = await lockedGold.getAccountTotalLockedGold(botAccount) - if (lockedGoldAmount.isZero()) { - console.info(`No locked gold exists for ${botAccount}, skipping...`) - return - } - - const currentVotes = (await election.getVoter(botAccount)).votes - - // Revoke existing vote(s) if any and update capacity of the group - for (const vote of currentVotes) { - const revokeTxs = await election.revoke(botAccount, vote.group, vote.pending.plus(vote.active)) - await concurrentMap(10, revokeTxs, (tx) => { - return tx.sendAndWaitForReceipt({ from: botAccount }) - }) - const group = normalizeAddressWith0x(vote.group) - const oldCapacity = groupCapacities.get(group) - groupCapacities.set(group, oldCapacity.plus(vote.pending.plus(vote.active))) - } - - const groupCapacity = groupCapacities.get(voteForGroup) - const voteAmount = BigNumber.minimum(lockedGoldAmount, groupCapacity) - const voteTx = await election.vote(voteForGroup, BigNumber.minimum(voteAmount)) - await voteTx.sendAndWaitForReceipt({ from: botAccount }) - console.info(`Completed voting as ${botAccount}`) - - groupCapacities.set(voteForGroup, groupCapacity.minus(voteAmount)) -} - -async function calculateInitialGroupCapacities(kit: ContractKit): Promise<Map<string, BigNumber>> { - console.info('Determining which groups have capacity for more votes') - - const validators = await
kit.contracts.getValidators() - const election = await kit.contracts.getElection() - - const groupCapacities = new Map<string, BigNumber>() - for (const groupAddress of await validators.getRegisteredValidatorGroupsAddresses()) { - const vgv = await election.getValidatorGroupVotes(groupAddress) - if (vgv.eligible) { - groupCapacities.set(normalizeAddressWith0x(groupAddress), vgv.capacity) - } - } - - return groupCapacities -} - -async function calculateGroupScores(kit: ContractKit): Promise<Map<string, BigNumber>> { - console.info('Calculating weights of groups based on the scores of elected validators') - const election = await kit.contracts.getElection() - const validators = await kit.contracts.getValidators() - - const validatorSigners = await election.getCurrentValidatorSigners() - const validatorAccounts = ( - await concurrentMap(10, validatorSigners, (acc) => { - return validators.getValidatorFromSigner(acc) - }) - ).filter((v) => !!v.affiliation) // Skip unaffiliated - - const validatorsByGroup = groupBy(validatorAccounts, (validator) => - normalizeAddressWith0x(validator.affiliation) - ) - - const validatorGroupScores = mapValues(validatorsByGroup, (vals) => { - const scoreSum = vals.reduce((a, b) => a.plus(b.score), new BigNumber(0)) - return scoreSum.dividedBy(vals.length) - }) - - return new Map(Object.entries(validatorGroupScores)) -} - -function calculateGroupWeights( - groupScores: Map<string, BigNumber>, - scoreSensitivity: BigNumber -): Map<string, BigNumber> { - const groupWeights = new Map<string, BigNumber>() - for (const group of groupScores.keys()) { - const score = groupScores.get(group) - if (score && score.isGreaterThan(0)) { - groupWeights.set(group, score.pow(scoreSensitivity)) - } else { - groupWeights.set(group, new BigNumber(0)) - } - } - return groupWeights -} - -function getUnweightedRandomChoice(groupsToConsider: string[]): string { - const randomIndex = Math.floor(groupsToConsider.length * Math.random()) - return groupsToConsider[randomIndex] || NULL_ADDRESS -} - -function getWeightedRandomChoice( - groupWeights: Map<string, BigNumber>,
groupsToConsider: string[] -): string { - // Filter to groups open to consideration, and sort from highest probability to lowest - const sortedGroupKeys = [...groupWeights.keys()] - .filter((k) => groupsToConsider.includes(k)) - .sort((a, b) => { - return groupWeights.get(b).comparedTo(groupWeights.get(a)) - }) - - let weightTotal = new BigNumber(0) - for (const key of sortedGroupKeys) { - weightTotal = weightTotal.plus(groupWeights.get(key) || 0) - } - - const choice = weightTotal.multipliedBy(Math.random()) - let totalSoFar = new BigNumber(0) - - for (const key of sortedGroupKeys) { - totalSoFar = totalSoFar.plus(groupWeights.get(key)) - if (totalSoFar.isGreaterThanOrEqualTo(choice)) { - return key - } - } - - // If this happens, it means no groups were available - return NULL_ADDRESS -} - -async function activatePendingVotes(kit: ContractKit, botKeys: string[]): Promise<void> { - const election = await kit.contracts.getElection() - - await concurrentMap(10, botKeys, async (key) => { - kit.connection.addAccount(key) - const account = ensureLeading0x(privateKeyToAddress(key)) - if (!(await election.hasActivatablePendingVotes(account))) { - try { - const activateTxs = await election.activate(account) - await concurrentMap(10, activateTxs, (tx) => tx.sendAndWaitForReceipt({ from: account })) - } catch (error) { - console.error(`Failed to activate pending votes for ${account}`) - } - } - }) -} - -function shouldChangeVote( - score: BigNumber, - scoreSensitivity: BigNumber, - baseChangeProbability: BigNumber -): boolean { - const scoreBasedProbability = score.pow(scoreSensitivity).negated().plus(1) - const scaledProbability = scoreBasedProbability.times(baseChangeProbability.negated().plus(1)) - const totalProbability = scaledProbability.plus(baseChangeProbability) - - return totalProbability.isGreaterThan(Math.random()) -} - -function shouldBeConsidered( - groupAddress: string, - currentGroup: string | undefined, - excludedGroups: string[], - groupCapacities: Map<string, BigNumber> -):
boolean { - const normalizedGroupAddress = normalizeAddressWith0x(groupAddress) - const capacity = groupCapacities.get(normalizedGroupAddress) - return !!( - !excludedGroups.includes(normalizedGroupAddress) && - capacity && - capacity.isGreaterThan(0) && - (!currentGroup || !eqAddress(currentGroup, normalizedGroupAddress)) - ) -} diff --git a/packages/celotool/src/cmds/contract_addresses.ts b/packages/celotool/src/cmds/contract_addresses.ts deleted file mode 100644 index db3e000867f..00000000000 --- a/packages/celotool/src/cmds/contract_addresses.ts +++ /dev/null @@ -1,41 +0,0 @@ -import * as fs from 'fs' -import { CONTRACTS_TO_COPY, downloadArtifacts, getContractAddresses } from 'src/lib/artifacts' -import { CeloEnvArgv, addCeloEnvMiddleware } from 'src/lib/env-utils' -import yargs from 'yargs' - -export const command = 'contract-addresses' - -export const describe = 'command for obtaining the contract addresses map' - -interface CopyContractArtifactsArgs extends CeloEnvArgv { - contracts: string - outputPath: string -} - -export const builder = (argv: yargs.Argv) => { - return addCeloEnvMiddleware(argv) - .option('contracts', { - default: CONTRACTS_TO_COPY.join(','), - type: 'string', - description: 'the names of the contracts separated by commas', - }) - .option('output-path', { - alias: 'o', - type: 'string', - description: 'the absolute output folder path', - }) -} - -export const handler = async (argv: CopyContractArtifactsArgs) => { - await downloadArtifacts(argv.celoEnv) - - const contractList = argv.contracts.split(',') - - const addressMap = await getContractAddresses(argv.celoEnv, contractList) - - if (argv.outputPath) { - fs.writeFileSync(argv.outputPath, JSON.stringify(addressMap, null, 2)) - } else { - console.info(addressMap) - } -} diff --git a/packages/celotool/src/cmds/copy_contract_artifacts.ts b/packages/celotool/src/cmds/copy_contract_artifacts.ts deleted file mode 100644 index ed3ceff74d0..00000000000 ---
a/packages/celotool/src/cmds/copy_contract_artifacts.ts +++ /dev/null @@ -1,36 +0,0 @@ -import { CONTRACTS_TO_COPY, copyContractArtifacts, downloadArtifacts } from 'src/lib/artifacts' -import { addCeloEnvMiddleware, CeloEnvArgv } from 'src/lib/env-utils' -import yargs from 'yargs' - -export const command = 'copy-contract-artifacts' - -export const describe = - 'command for copying contract artifacts in a format to be easily consumed by other (typescript) packages. It will use the ABI of a particular contract and swap the address for the address of the Proxy.' - -interface CopyContractArtifactsArgs extends CeloEnvArgv { - contracts: string - outputPath: string -} - -export const builder = (argv: yargs.Argv) => { - return addCeloEnvMiddleware(argv) - .option('contracts', { - default: CONTRACTS_TO_COPY.join(','), - type: 'string', - description: 'the names of the contracts separated by commas', - }) - .option('output-path', { - required: true, - alias: 'o', - type: 'string', - description: 'the absolute output folder path', - }) -} - -export const handler = async (argv: CopyContractArtifactsArgs) => { - await downloadArtifacts(argv.celoEnv) - - const contractList = argv.contracts.split(',') - - await copyContractArtifacts(argv.celoEnv, argv.outputPath, contractList) -} diff --git a/packages/celotool/src/cmds/deploy.ts b/packages/celotool/src/cmds/deploy.ts deleted file mode 100644 index 7abad88cb6e..00000000000 --- a/packages/celotool/src/cmds/deploy.ts +++ /dev/null @@ -1,15 +0,0 @@ -import { addCeloEnvMiddleware, CeloEnvArgv } from 'src/lib/env-utils' -import yargs from 'yargs' - -export const command = 'deploy <deployMethod> <deployPackage>' - -export const describe = 'commands for deployment of various packages in the monorepo' - -export type DeployArgv = CeloEnvArgv - -export const builder = (argv: yargs.Argv) => { - return addCeloEnvMiddleware(argv).commandDir('deploy', { extensions: ['ts'] }) -} -export const handler = () => { - // empty -} diff --git
a/packages/celotool/src/cmds/deploy/clone.ts b/packages/celotool/src/cmds/deploy/clone.ts deleted file mode 100644 index 828333e632f..00000000000 --- a/packages/celotool/src/cmds/deploy/clone.ts +++ /dev/null @@ -1,15 +0,0 @@ -import yargs from 'yargs' -import { DeployArgv } from '../deploy' -export const command = 'clone <deployPackage>' - -export const describe = 'clone the initial deploy of a package in the monorepo' - -export type CloneArgv = DeployArgv - -export const builder = (argv: yargs.Argv) => { - return argv.commandDir('clone', { extensions: ['ts'] }) -} - -export const handler = () => { - // empty -} diff --git a/packages/celotool/src/cmds/deploy/destroy.ts b/packages/celotool/src/cmds/deploy/destroy.ts deleted file mode 100644 index d771e8eed03..00000000000 --- a/packages/celotool/src/cmds/deploy/destroy.ts +++ /dev/null @@ -1,15 +0,0 @@ -import yargs from 'yargs' -import { DeployArgv } from '../deploy' -export const command = 'destroy <deployPackage>' - -export const describe = 'destroy an existing deploy' - -export type DestroyArgv = DeployArgv - -export const builder = (argv: yargs.Argv) => { - return argv.commandDir('destroy', { extensions: ['ts'] }) -} - -export const handler = () => { - // empty -} diff --git a/packages/celotool/src/cmds/deploy/destroy/celostats.ts b/packages/celotool/src/cmds/deploy/destroy/celostats.ts deleted file mode 100644 index 80d1640a22a..00000000000 --- a/packages/celotool/src/cmds/deploy/destroy/celostats.ts +++ /dev/null @@ -1,17 +0,0 @@ -import { removeHelmRelease } from 'src/lib/celostats' -import { switchToClusterFromEnv } from 'src/lib/cluster' -import { exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy' -import { DestroyArgv } from '../destroy' - -export const command = 'celostats' - -export const describe = 'destroy the celostats package' - -export const builder = {} - -export const handler = async (argv: DestroyArgv) => { - exitIfCelotoolHelmDryRun() - await switchToClusterFromEnv(argv.celoEnv) - - await removeHelmRelease(argv.celoEnv)
-}
diff --git a/packages/celotool/src/cmds/deploy/destroy/chaoskube.ts b/packages/celotool/src/cmds/deploy/destroy/chaoskube.ts
deleted file mode 100644
index 3958eae5411..00000000000
--- a/packages/celotool/src/cmds/deploy/destroy/chaoskube.ts
+++ /dev/null
@@ -1,16 +0,0 @@
-import { helmReleaseName } from 'src/lib/chaoskube'
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { exitIfCelotoolHelmDryRun, removeGenericHelmChart } from 'src/lib/helm_deploy'
-import { DestroyArgv } from '../../deploy/destroy'
-
-export const command = 'chaoskube'
-
-export const describe = 'deploy the chaoskube package'
-
-export const builder = {}
-
-export const handler = async (argv: DestroyArgv) => {
-  exitIfCelotoolHelmDryRun()
-  await switchToClusterFromEnv(argv.celoEnv)
-  await removeGenericHelmChart(helmReleaseName(argv.celoEnv), argv.celoEnv)
-}
diff --git a/packages/celotool/src/cmds/deploy/destroy/cluster.ts b/packages/celotool/src/cmds/deploy/destroy/cluster.ts
deleted file mode 100644
index 9173803453a..00000000000
--- a/packages/celotool/src/cmds/deploy/destroy/cluster.ts
+++ /dev/null
@@ -1,30 +0,0 @@
-import { printReleases } from 'src/cmds/deploy/list'
-import { deleteCluster, getNonSystemHelmReleases, switchToClusterFromEnv } from 'src/lib/cluster'
-import { envTypes, envVar, fetchEnv } from 'src/lib/env-utils'
-import { exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { DestroyArgv } from '../../deploy/destroy'
-
-export const command = 'cluster'
-
-export const describe = 'deletes the cluster for the given environment'
-
-export const builder = {}
-
-export const handler = async (argv: DestroyArgv) => {
-  exitIfCelotoolHelmDryRun()
-  const envType = fetchEnv(envVar.ENV_TYPE) as envTypes
-  if (envType !== envTypes.DEVELOPMENT) {
-    console.error('You can only delete dev clusters')
-    process.exit(1)
-  }
-
-  await switchToClusterFromEnv(argv.celoEnv)
-  const releases = await getNonSystemHelmReleases()
-  if (releases.length > 0) {
-    console.error('Cannot delete cluster, contains deployed packages that should be removed first')
-    printReleases(releases)
-    process.exit(1)
-  }
-
-  await deleteCluster()
-}
diff --git a/packages/celotool/src/cmds/deploy/destroy/fullnodes.ts b/packages/celotool/src/cmds/deploy/destroy/fullnodes.ts
deleted file mode 100644
index 107462eb9ec..00000000000
--- a/packages/celotool/src/cmds/deploy/destroy/fullnodes.ts
+++ /dev/null
@@ -1,21 +0,0 @@
-import { DestroyArgv } from 'src/cmds/deploy/destroy'
-import { addContextMiddleware, ContextArgv, switchToContextCluster } from 'src/lib/context-utils'
-import { removeFullNodeChart } from 'src/lib/fullnodes'
-import { delinkSAForWorkloadIdentity, removeKubectlAnnotateKSA } from 'src/lib/gcloud_utils'
-import { exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-
-export const command = 'fullnodes'
-
-export const describe = 'deploy full-nodes in a particular context'
-
-type FullNodeDestroyArgv = DestroyArgv & ContextArgv
-
-export const builder = addContextMiddleware
-
-export const handler = async (argv: FullNodeDestroyArgv) => {
-  exitIfCelotoolHelmDryRun()
-  await switchToContextCluster(argv.celoEnv, argv.context)
-  await removeFullNodeChart(argv.celoEnv, argv.context)
-  await removeKubectlAnnotateKSA(argv.celoEnv, argv.context)
-  await delinkSAForWorkloadIdentity(argv.celoEnv, argv.context)
-}
diff --git a/packages/celotool/src/cmds/deploy/destroy/leaderboard.ts b/packages/celotool/src/cmds/deploy/destroy/leaderboard.ts
deleted file mode 100644
index 7bb0f050643..00000000000
--- a/packages/celotool/src/cmds/deploy/destroy/leaderboard.ts
+++ /dev/null
@@ -1,18 +0,0 @@
-import { createClusterIfNotExists, switchToClusterFromEnv } from 'src/lib/cluster'
-import { exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { removeHelmRelease } from 'src/lib/leaderboard'
-import { DestroyArgv } from '../../deploy/destroy'
-
-export const command = 'leaderboard'
-
-export const describe = 'destroy the leaderboard package'
-
-export const builder = {}
-
-export const handler = async (argv: DestroyArgv) => {
-  exitIfCelotoolHelmDryRun()
-  await createClusterIfNotExists()
-  await switchToClusterFromEnv(argv.celoEnv)
-
-  await removeHelmRelease(argv.celoEnv)
-}
diff --git a/packages/celotool/src/cmds/deploy/destroy/load-test.ts b/packages/celotool/src/cmds/deploy/destroy/load-test.ts
deleted file mode 100644
index 062469d443e..00000000000
--- a/packages/celotool/src/cmds/deploy/destroy/load-test.ts
+++ /dev/null
@@ -1,16 +0,0 @@
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { removeHelmRelease } from 'src/lib/load-test'
-import { DestroyArgv } from '../../deploy/destroy'
-
-export const command = 'load-test'
-
-export const describe = 'destroy load-test deployment'
-
-export const builder = {}
-
-export const handler = async (argv: DestroyArgv) => {
-  exitIfCelotoolHelmDryRun()
-  await switchToClusterFromEnv(argv.celoEnv)
-  await removeHelmRelease(argv.celoEnv)
-}
diff --git a/packages/celotool/src/cmds/deploy/destroy/mock-oracle.ts b/packages/celotool/src/cmds/deploy/destroy/mock-oracle.ts
deleted file mode 100644
index 090590fc178..00000000000
--- a/packages/celotool/src/cmds/deploy/destroy/mock-oracle.ts
+++ /dev/null
@@ -1,17 +0,0 @@
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { removeHelmRelease } from 'src/lib/mock-oracle'
-import { DestroyArgv } from '../../deploy/destroy'
-
-export const command = 'mock-oracle'
-
-export const describe = 'destroy the mock oracle package'
-
-export const builder = {}
-
-export const handler = async (argv: DestroyArgv) => {
-  exitIfCelotoolHelmDryRun()
-  await switchToClusterFromEnv(argv.celoEnv)
-
-  await removeHelmRelease(argv.celoEnv)
-}
diff --git a/packages/celotool/src/cmds/deploy/destroy/odis.ts b/packages/celotool/src/cmds/deploy/destroy/odis.ts
deleted file mode 100644
index 3eebf1d1705..00000000000
--- a/packages/celotool/src/cmds/deploy/destroy/odis.ts
+++ /dev/null
@@ -1,16 +0,0 @@
-import { addContextMiddleware, ContextArgv, switchToContextCluster } from 'src/lib/context-utils'
-import { removeHelmRelease } from 'src/lib/odis'
-import { DestroyArgv } from '../destroy'
-
-export const command = 'odis'
-
-export const describe = 'destroy the odis package'
-
-type ODISDestroyArgv = DestroyArgv & ContextArgv
-
-export const builder = addContextMiddleware
-
-export const handler = async (argv: ODISDestroyArgv) => {
-  await switchToContextCluster(argv.celoEnv, argv.context)
-  await removeHelmRelease(argv.celoEnv, argv.context)
-}
diff --git a/packages/celotool/src/cmds/deploy/destroy/oracle.ts b/packages/celotool/src/cmds/deploy/destroy/oracle.ts
deleted file mode 100644
index 3b2bc3654cd..00000000000
--- a/packages/celotool/src/cmds/deploy/destroy/oracle.ts
+++ /dev/null
@@ -1,33 +0,0 @@
-import { flow } from 'lodash'
-import { addContextMiddleware, ContextArgv, switchToContextCluster } from 'src/lib/context-utils'
-import { exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { CurrencyPair } from 'src/lib/k8s-oracle/base'
-import { addCurrencyPairMiddleware, getOracleDeployerForContext } from 'src/lib/oracle'
-import yargs from 'yargs'
-import { DestroyArgv } from '../../deploy/destroy'
-
-export const command = 'oracle'
-
-export const describe = 'destroy the oracle package'
-
-type OracleDestroyArgv = DestroyArgv &
-  ContextArgv & {
-    currencyPair: CurrencyPair
-  }
-
-export const builder = (argv: yargs.Argv) => {
-  return flow([addContextMiddleware, addCurrencyPairMiddleware])(argv)
-}
-
-export const handler = async (argv: OracleDestroyArgv) => {
-  exitIfCelotoolHelmDryRun()
-  const clusterManager = await switchToContextCluster(argv.celoEnv, argv.context)
-  const deployer = getOracleDeployerForContext(
-    argv.celoEnv,
-    argv.context,
-    argv.currencyPair,
-    false, // doesn't matter if we are using forno as we are just going to remove the chart
-    clusterManager
-  )
-  await deployer.removeChart()
-}
diff --git a/packages/celotool/src/cmds/deploy/destroy/prometheus.ts b/packages/celotool/src/cmds/deploy/destroy/prometheus.ts
deleted file mode 100644
index 4af0a3d8154..00000000000
--- a/packages/celotool/src/cmds/deploy/destroy/prometheus.ts
+++ /dev/null
@@ -1,23 +0,0 @@
-import { DestroyArgv } from 'src/cmds/deploy/destroy'
-import { switchToClusterFromEnvOrContext } from 'src/lib/cluster'
-import { addContextMiddleware, ContextArgv } from 'src/lib/context-utils'
-import { exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { removeGrafanaHelmRelease, removePrometheus } from 'src/lib/prometheus'
-
-export const command = 'prometheus'
-
-export const describe = 'destroy prometheus chart release on a kubernetes cluster using Helm'
-
-export type PrometheusDestroyArgv = DestroyArgv & ContextArgv
-
-export const builder = (argv: PrometheusDestroyArgv) => {
-  return addContextMiddleware(argv)
-}
-
-export const handler = async (argv: PrometheusDestroyArgv) => {
-  exitIfCelotoolHelmDryRun()
-  await switchToClusterFromEnvOrContext(argv, true)
-
-  await removeGrafanaHelmRelease()
-  await removePrometheus()
-}
diff --git a/packages/celotool/src/cmds/deploy/destroy/promtail.ts b/packages/celotool/src/cmds/deploy/destroy/promtail.ts
deleted file mode 100644
index 7b85886b701..00000000000
--- a/packages/celotool/src/cmds/deploy/destroy/promtail.ts
+++ /dev/null
@@ -1,23 +0,0 @@
-import { DestroyArgv } from 'src/cmds/deploy/destroy'
-import { switchToClusterFromEnvOrContext } from 'src/lib/cluster'
-import { addContextMiddleware, ContextArgv } from 'src/lib/context-utils'
-import { exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { removePromtail } from 'src/lib/promtail'
-
-export const command = 'promtail'
-
-export const describe = 'destroy promtail chart release on a kubernetes cluster using Helm'
-
-export type PrometailDestroyArgv = DestroyArgv & ContextArgv
-
-export const builder = (argv: PrometailDestroyArgv) => {
-  return addContextMiddleware(argv)
-}
-
-export const handler = async (argv: PrometailDestroyArgv) => {
-  exitIfCelotoolHelmDryRun()
-
-  await switchToClusterFromEnvOrContext(argv, true)
-
-  await removePromtail()
-}
diff --git a/packages/celotool/src/cmds/deploy/destroy/pumba.ts b/packages/celotool/src/cmds/deploy/destroy/pumba.ts
deleted file mode 100644
index da90709aa4b..00000000000
--- a/packages/celotool/src/cmds/deploy/destroy/pumba.ts
+++ /dev/null
@@ -1,16 +0,0 @@
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { exitIfCelotoolHelmDryRun, removeGenericHelmChart } from 'src/lib/helm_deploy'
-import { helmReleaseName } from 'src/lib/pumba'
-import { DestroyArgv } from '../../deploy/destroy'
-
-export const command = 'pumba'
-
-export const describe = 'deploy the pumba package'
-
-export const builder = {}
-
-export const handler = async (argv: DestroyArgv) => {
-  exitIfCelotoolHelmDryRun()
-  await switchToClusterFromEnv(argv.celoEnv)
-  await removeGenericHelmChart(helmReleaseName(argv.celoEnv), argv.celoEnv)
-}
diff --git a/packages/celotool/src/cmds/deploy/destroy/testnet.ts b/packages/celotool/src/cmds/deploy/destroy/testnet.ts
deleted file mode 100644
index 7d531742a53..00000000000
--- a/packages/celotool/src/cmds/deploy/destroy/testnet.ts
+++ /dev/null
@@ -1,17 +0,0 @@
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { deleteFromCluster, deleteStaticIPs, exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { DestroyArgv } from '../../deploy/destroy'
-
-export const command = 'testnet'
-export const describe = 'destroy an existing deploy of the testnet package'
-
-export const builder = {}
-
-export const handler = async (argv: DestroyArgv) => {
-  exitIfCelotoolHelmDryRun()
-
-  await switchToClusterFromEnv(argv.celoEnv)
-
-  await deleteFromCluster(argv.celoEnv)
-  await deleteStaticIPs(argv.celoEnv)
-}
diff --git a/packages/celotool/src/cmds/deploy/destroy/tracer-tool.ts b/packages/celotool/src/cmds/deploy/destroy/tracer-tool.ts
deleted file mode 100644
index 81780658954..00000000000
--- a/packages/celotool/src/cmds/deploy/destroy/tracer-tool.ts
+++ /dev/null
@@ -1,18 +0,0 @@
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { removeHelmRelease } from 'src/lib/tracer-tool'
-import { DestroyArgv } from '../../deploy/destroy'
-
-export const command = 'tracer-tool'
-
-export const describe = 'destroy tracer-tool deployment'
-
-type TracerToolArgv = DestroyArgv
-
-export const builder = {}
-
-export const handler = async (argv: TracerToolArgv) => {
-  exitIfCelotoolHelmDryRun()
-  await switchToClusterFromEnv(argv.celoEnv)
-  await removeHelmRelease(argv.celoEnv)
-}
diff --git a/packages/celotool/src/cmds/deploy/destroy/transaction-metrics-exporter.ts b/packages/celotool/src/cmds/deploy/destroy/transaction-metrics-exporter.ts
deleted file mode 100644
index ba936f744bd..00000000000
--- a/packages/celotool/src/cmds/deploy/destroy/transaction-metrics-exporter.ts
+++ /dev/null
@@ -1,16 +0,0 @@
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { removeHelmRelease } from 'src/lib/transaction-metrics-exporter'
-import { DestroyArgv } from '../../deploy/destroy'
-
-export const command = 'transaction-metrics-exporter'
-
-export const describe = 'destroy the transaction metrics exporter deploy'
-
-export const builder = {}
-
-export const handler = async (argv: DestroyArgv) => {
-  exitIfCelotoolHelmDryRun()
-  await switchToClusterFromEnv(argv.celoEnv)
-  await removeHelmRelease(argv.celoEnv)
-}
diff --git a/packages/celotool/src/cmds/deploy/destroy/voting-bot.ts b/packages/celotool/src/cmds/deploy/destroy/voting-bot.ts
deleted file mode 100644
index 8e440089f9c..00000000000
--- a/packages/celotool/src/cmds/deploy/destroy/voting-bot.ts
+++ /dev/null
@@ -1,16 +0,0 @@
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { removeHelmRelease } from 'src/lib/voting-bot'
-import { DestroyArgv } from '../../deploy/destroy'
-
-export const command = 'voting-bot'
-
-export const describe = 'destroy the voting bot package'
-
-export const builder = {}
-
-export const handler = async (argv: DestroyArgv) => {
-  exitIfCelotoolHelmDryRun()
-  await switchToClusterFromEnv(argv.celoEnv)
-  await removeHelmRelease(argv.celoEnv)
-}
diff --git a/packages/celotool/src/cmds/deploy/destroy/wallet-connect.ts b/packages/celotool/src/cmds/deploy/destroy/wallet-connect.ts
deleted file mode 100644
index 46046390eb4..00000000000
--- a/packages/celotool/src/cmds/deploy/destroy/wallet-connect.ts
+++ /dev/null
@@ -1,16 +0,0 @@
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { removeWalletConnect } from 'src/lib/wallet-connect'
-import { DestroyArgv } from '../destroy'
-
-export const command = 'walletconnect'
-
-export const describe = 'deploy the walletconnect package'
-
-export const builder = {}
-
-export const handler = async (argv: DestroyArgv) => {
-  exitIfCelotoolHelmDryRun()
-  await switchToClusterFromEnv(argv.celoEnv)
-  await removeWalletConnect()
-}
diff --git a/packages/celotool/src/cmds/deploy/initial.ts b/packages/celotool/src/cmds/deploy/initial.ts
deleted file mode 100644
index cd686443c34..00000000000
--- a/packages/celotool/src/cmds/deploy/initial.ts
+++ /dev/null
@@ -1,15 +0,0 @@
-import yargs from 'yargs'
-import { DeployArgv } from '../deploy'
-export const command = 'initial '
-
-export const describe = 'create the initial deploy of a package in the monorepo'
-
-export type InitialArgv = DeployArgv
-
-export const builder = (argv: yargs.Argv) => {
-  return argv.commandDir('initial', { extensions: ['ts'] })
-}
-
-export const handler = () => {
-  // empty
-}
diff --git a/packages/celotool/src/cmds/deploy/initial/blockchain-api.ts b/packages/celotool/src/cmds/deploy/initial/blockchain-api.ts
deleted file mode 100644
index e3a1e9d6850..00000000000
--- a/packages/celotool/src/cmds/deploy/initial/blockchain-api.ts
+++ /dev/null
@@ -1,29 +0,0 @@
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { execCmd } from 'src/lib/cmd-utils'
-import { AccountType, getAddressFromEnv } from 'src/lib/generate_utils'
-import { exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { UpgradeArgv } from '../../deploy/upgrade'
-
-export const command = 'blockchain-api'
-
-export const describe = 'command for upgrading blockchain-api'
-
-// Can't extend because yargs.Argv already has a `config` property
-type BlockchainApiArgv = UpgradeArgv
-
-export const handler = async (argv: BlockchainApiArgv) => {
-  exitIfCelotoolHelmDryRun()
-  await switchToClusterFromEnv(argv.celoEnv)
-  const newFaucetAddress = getAddressFromEnv(AccountType.VALIDATOR, 0) // We use the 0th validator as the faucet
-  console.info(`updating blockchain-api yaml file for env ${argv.celoEnv}`)
-  await execCmd(
-    // eslint-disable-next-line no-useless-escape
-    `sed -i.bak 's/FAUCET_ADDRESS: .*$/FAUCET_ADDRESS: \"${newFaucetAddress}\"/g' ../blockchain-api/app.${argv.celoEnv}.yaml`
-  )
-  await execCmd(`rm ../blockchain-api/app.${argv.celoEnv}.yaml.bak`) // Removing temporary bak file
-
-  // eslint-disable-next-line @typescript-eslint/unbound-method, @typescript-eslint/restrict-template-expressions
-  console.info(`deploying blockchain-api for env ${argv.config}`)
-  await execCmd(`yarn --cwd ../blockchain-api run deploy -n ${argv.celoEnv}`)
-  console.info(`blockchain-api deploy complete`)
-}
diff --git a/packages/celotool/src/cmds/deploy/initial/celostats.ts b/packages/celotool/src/cmds/deploy/initial/celostats.ts
deleted file mode 100644
index 054a3314790..00000000000
--- a/packages/celotool/src/cmds/deploy/initial/celostats.ts
+++ /dev/null
@@ -1,12 +0,0 @@
-import { installHelmChart } from 'src/lib/celostats'
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { InitialArgv } from '../initial'
-
-export const command = 'celostats'
-
-export const describe = 'deploy the celostats package'
-
-export const handler = async (argv: InitialArgv) => {
-  await switchToClusterFromEnv(argv.celoEnv)
-  await installHelmChart(argv.celoEnv)
-}
diff --git a/packages/celotool/src/cmds/deploy/initial/chaoskube.ts b/packages/celotool/src/cmds/deploy/initial/chaoskube.ts
deleted file mode 100644
index 9068607deda..00000000000
--- a/packages/celotool/src/cmds/deploy/initial/chaoskube.ts
+++ /dev/null
@@ -1,20 +0,0 @@
-import { helmChartDir, helmParameters, helmReleaseName } from 'src/lib/chaoskube'
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { installGenericHelmChart } from 'src/lib/helm_deploy'
-import { InitialArgv } from '../../deploy/initial'
-
-export const command = 'chaoskube'
-
-export const describe = 'deploy the chaoskube package'
-
-export const builder = {}
-
-export const handler = async (argv: InitialArgv) => {
-  await switchToClusterFromEnv(argv.celoEnv)
-  await installGenericHelmChart({
-    namespace: argv.celoEnv,
-    releaseName: helmReleaseName(argv.celoEnv),
-    chartDir: helmChartDir,
-    parameters: helmParameters(argv.celoEnv),
-  })
-}
diff --git a/packages/celotool/src/cmds/deploy/initial/contracts.ts b/packages/celotool/src/cmds/deploy/initial/contracts.ts
deleted file mode 100644
index 5c3ba4dd4c4..00000000000
--- a/packages/celotool/src/cmds/deploy/initial/contracts.ts
+++ /dev/null
@@ -1,102 +0,0 @@
-/* tslint:disable no-console */
-import { ContractKit, IdentityMetadataWrapper, newKitFromWeb3 } from '@celo/contractkit'
-import { createNameClaim } from '@celo/contractkit/lib/identity/claims/claim'
-import { concurrentMap } from '@celo/utils/lib/async'
-import { LocalSigner } from '@celo/utils/lib/signatureUtils'
-import { writeFileSync } from 'fs'
-import { uploadArtifacts } from 'src/lib/artifacts'
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { execCmd } from 'src/lib/cmd-utils'
-import { privateKeyToAddress } from 'src/lib/generate_utils'
-import { exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { migrationOverrides, truffleOverrides, validatorKeys } from 'src/lib/migration-utils'
-import { portForwardAnd } from 'src/lib/port_forward'
-import { uploadFileToGoogleStorage } from 'src/lib/testnet-utils'
-import Web3 from 'web3'
-import yargs from 'yargs'
-import { InitialArgv } from '../../deploy/initial'
-
-export const command = 'contracts'
-
-export const describe = 'deploy the celo smart contracts'
-
-type ContractsArgv = InitialArgv & {
-  skipFaucetting: boolean
-}
-
-export const builder = (argv: yargs.Argv) => {
-  return argv.option('skipFaucetting', {
-    describe: 'skips allocation of cUSD to any oracle or bot accounts',
-    default: false,
-    type: 'boolean',
-  })
-}
-
-export const CLABS_VALIDATOR_METADATA_BUCKET = 'clabs_validator_metadata'
-
-function metadataURLForCLabsValidator(testnet: string, address: string) {
-  return `https://storage.googleapis.com/${CLABS_VALIDATOR_METADATA_BUCKET}/${testnet}/validator-${testnet}-${address}-metadata.json`
-}
-
-async function makeMetadata(testnet: string, address: string, index: number, privateKey: string) {
-  const nameClaim = createNameClaim(`Validator ${index} on ${testnet}: ${address}`)
-
-  const fileName = `validator-${testnet}-${address}-metadata.json`
-  const filePath = `/tmp/${fileName}`
-
-  const metadata = IdentityMetadataWrapper.fromEmpty(address)
-  await metadata.addClaim(nameClaim, LocalSigner(privateKey))
-  writeFileSync(filePath, metadata.toString())
-
-  await uploadFileToGoogleStorage(
-    filePath,
-    CLABS_VALIDATOR_METADATA_BUCKET,
-    `${testnet}/${fileName}`,
-    false,
-    'application/json'
-  )
-}
-
-export async function registerMetadata(testnet: string, privateKey: string, index: number) {
-  const address = privateKeyToAddress(privateKey)
-  await makeMetadata(testnet, address, index, privateKey)
-
-  const web3: Web3 = new Web3('http://localhost:8545')
-  const kit: ContractKit = newKitFromWeb3(web3)
-  kit.connection.addAccount(privateKey)
-  kit.connection.defaultAccount = address
-
-  const accounts = await kit.contracts.getAccounts()
-  return accounts
-    .setMetadataURL(metadataURLForCLabsValidator(testnet, address))
-    .sendAndWaitForReceipt()
-}
-
-export const handler = async (argv: ContractsArgv) => {
-  exitIfCelotoolHelmDryRun()
-  await switchToClusterFromEnv(argv.celoEnv)
-
-  console.info(`Deploying smart contracts to ${argv.celoEnv}`)
-  const cb = async () => {
-    await execCmd(
-      `yarn --cwd ../protocol run init-network -n ${argv.celoEnv} -c '${JSON.stringify(
-        truffleOverrides()
-      )}' -m '${JSON.stringify(await migrationOverrides(!argv.skipFaucetting))}'`
-    )
-
-    console.info('Register Metadata for Clabs validators')
-    await concurrentMap(5, validatorKeys(), (privateKey, index) =>
-      registerMetadata(argv.celoEnv, privateKey, index)
-    )
-  }
-
-  try {
-    await portForwardAnd(argv.celoEnv, cb)
-    await uploadArtifacts(argv.celoEnv)
-    return
-  } catch (error) {
-    console.error(`Unable to deploy smart contracts to ${argv.celoEnv}`)
-    console.error(error)
-    process.exit(1)
-  }
-}
diff --git a/packages/celotool/src/cmds/deploy/initial/fullnodes.ts b/packages/celotool/src/cmds/deploy/initial/fullnodes.ts
deleted file mode 100644
index 9e499c71207..00000000000
--- a/packages/celotool/src/cmds/deploy/initial/fullnodes.ts
+++ /dev/null
@@ -1,45 +0,0 @@
-import { InitialArgv } from 'src/cmds/deploy/initial'
-import { addContextMiddleware, ContextArgv, switchToContextCluster } from 'src/lib/context-utils'
-import { installFullNodeChart } from 'src/lib/fullnodes'
-import { kubectlAnnotateKSA, linkSAForWorkloadIdentity } from 'src/lib/gcloud_utils'
-import { isCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import yargs from 'yargs'
-
-export const command = 'fullnodes'
-
-export const describe = 'deploy full-nodes in a particular context'
-
-type FullNodeInitialArgv = InitialArgv &
-  ContextArgv & {
-    createNEG: boolean
-    staticNodes: boolean
-  }
-
-export const builder = (argv: yargs.Argv) => {
-  return addContextMiddleware(argv)
-    .option('createNEG', {
-      type: 'boolean',
-      description:
-        'When enabled, will create a network endpoint group for the full node http & ws ports. Only works for GCP.',
-      default: false,
-    })
-    .option('staticNodes', {
-      type: 'boolean',
-      description:
-        'when enabled, generates node keys deterministically using the mnemonic and context, and uploads the enodes to GCS',
-      default: false,
-    })
-}
-
-export const handler = async (argv: FullNodeInitialArgv) => {
-  await switchToContextCluster(argv.celoEnv, argv.context)
-  if (!isCelotoolHelmDryRun()) {
-    await linkSAForWorkloadIdentity(argv.celoEnv, argv.context)
-  } else {
-    console.info(`Skipping Workload Identity Service account setup due to --helmdryrun.`)
-  }
-  await installFullNodeChart(argv.celoEnv, argv.context, argv.staticNodes, argv.createNEG)
-  if (!isCelotoolHelmDryRun()) {
-    await kubectlAnnotateKSA(argv.celoEnv, argv.context)
-  }
-}
diff --git a/packages/celotool/src/cmds/deploy/initial/leaderboard.ts b/packages/celotool/src/cmds/deploy/initial/leaderboard.ts
deleted file mode 100644
index ae959804519..00000000000
--- a/packages/celotool/src/cmds/deploy/initial/leaderboard.ts
+++ /dev/null
@@ -1,17 +0,0 @@
-import { InitialArgv } from 'src/cmds/deploy/initial'
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { installHelmChart } from 'src/lib/leaderboard'
-import yargs from 'yargs'
-
-export const command = 'leaderboard'
-
-export const describe = 'deploy the leaderboard for the specified network'
-
-export const builder = (argv: yargs.Argv) => {
-  return argv
-}
-
-export const handler = async (argv: InitialArgv) => {
-  await switchToClusterFromEnv(argv.celoEnv)
-  await installHelmChart(argv.celoEnv)
-}
diff --git a/packages/celotool/src/cmds/deploy/initial/load-test.ts b/packages/celotool/src/cmds/deploy/initial/load-test.ts
deleted file mode 100644
index 44b340ed96b..00000000000
--- a/packages/celotool/src/cmds/deploy/initial/load-test.ts
+++ /dev/null
@@ -1,56 +0,0 @@
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { CeloEnvArgv } from 'src/lib/env-utils'
-import { installHelmChart, setArgvDefaults } from 'src/lib/load-test'
-import yargs from 'yargs'
-
-export const command = 'load-test'
-
-export const describe = 'deploy load-test'
-
-export interface LoadTestArgv extends CeloEnvArgv {
-  blockscoutMeasurePercent: number
-  delay: number
-  replicas: number
-  threads: number
-}
-
-export const builder = (argv: yargs.Argv) => {
-  return argv
-    .option('blockscout-measure-percent', {
-      type: 'number',
-      description:
-        'Percent of transactions to measure the time it takes for blockscout to process a transaction. Should be in the range of [0, 100]',
-      default: 30,
-    })
-    .option('delay', {
-      type: 'number',
-      description:
-        'Number of ms a client waits between each transaction, defaults to LOAD_TEST_TX_DELAY_MS in the .env file',
-      default: -1,
-    })
-    .option('replicas', {
-      type: 'number',
-      description:
-        'Number of load test clients to create, defaults to LOAD_TEST_CLIENTS in the .env file',
-      default: -1,
-    })
-    .option('threads', {
-      type: 'number',
-      description:
-        'Number of threads for each client, defaults to LOAD_TEST_THREADS in the .env file',
-      default: -1,
-    })
-}
-
-export const handler = async (argv: LoadTestArgv) => {
-  await switchToClusterFromEnv(argv.celoEnv)
-  setArgvDefaults(argv)
-
-  await installHelmChart(
-    argv.celoEnv,
-    argv.blockscoutMeasurePercent,
-    argv.delay,
-    argv.replicas,
-    argv.threads
-  )
-}
diff --git a/packages/celotool/src/cmds/deploy/initial/mock-oracle.ts b/packages/celotool/src/cmds/deploy/initial/mock-oracle.ts
deleted file mode 100644
index 972378b4255..00000000000
--- a/packages/celotool/src/cmds/deploy/initial/mock-oracle.ts
+++ /dev/null
@@ -1,17 +0,0 @@
-import { InitialArgv } from 'src/cmds/deploy/initial'
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { installHelmChart } from 'src/lib/mock-oracle'
-import yargs from 'yargs'
-
-export const command = 'mock-oracle'
-
-export const describe = 'deploy the mock oracle for the specified network'
-
-export const builder = (argv: yargs.Argv) => {
-  return argv
-}
-
-export const handler = async (argv: InitialArgv) => {
-  await switchToClusterFromEnv(argv.celoEnv)
-  await installHelmChart(argv.celoEnv)
-}
diff --git a/packages/celotool/src/cmds/deploy/initial/notification-service.ts b/packages/celotool/src/cmds/deploy/initial/notification-service.ts
deleted file mode 100644
index 2fbea7f59e2..00000000000
--- a/packages/celotool/src/cmds/deploy/initial/notification-service.ts
+++ /dev/null
@@ -1,13 +0,0 @@
-import { execCmd } from 'src/lib/cmd-utils'
-import { exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { InitialArgv } from '../../deploy/initial'
-
-export const command = 'notification-service'
-export const describe = 'command for deploying notification-service'
-
-export const handler = async (argv: InitialArgv) => {
-  exitIfCelotoolHelmDryRun()
-  console.info(`deploying notification-service for env ${argv.celoEnv}`)
-  await execCmd(`yarn --cwd ../notification-service run deploy -n ${argv.celoEnv}`)
-  console.info(`notification-service deploy complete`)
-}
diff --git a/packages/celotool/src/cmds/deploy/initial/odis.ts b/packages/celotool/src/cmds/deploy/initial/odis.ts
deleted file mode 100644
index dd217a6801c..00000000000
--- a/packages/celotool/src/cmds/deploy/initial/odis.ts
+++ /dev/null
@@ -1,19 +0,0 @@
-import { InitialArgv } from 'src/cmds/deploy/initial'
-import { addContextMiddleware, ContextArgv, switchToContextCluster } from 'src/lib/context-utils'
-import { installODISHelmChart } from 'src/lib/odis'
-import yargs from 'yargs'
-
-export const command = 'odis'
-
-export const describe = 'deploy the odis signers for the specified network'
-
-type OdisInitialArgv = InitialArgv & ContextArgv
-
-export const builder = (argv: yargs.Argv) => {
-  return addContextMiddleware(argv)
-}
-
-export const handler = async (argv: OdisInitialArgv) => {
-  await switchToContextCluster(argv.celoEnv, argv.context)
-  await installODISHelmChart(argv.celoEnv, argv.context)
-}
diff --git a/packages/celotool/src/cmds/deploy/initial/oracle.ts b/packages/celotool/src/cmds/deploy/initial/oracle.ts
deleted file mode 100644
index 51b589aa06d..00000000000
--- a/packages/celotool/src/cmds/deploy/initial/oracle.ts
+++ /dev/null
@@ -1,36 +0,0 @@
-import { flow } from 'lodash'
-import { InitialArgv } from 'src/cmds/deploy/initial'
-import { addContextMiddleware, ContextArgv, switchToContextCluster } from 'src/lib/context-utils'
-import { CurrencyPair } from 'src/lib/k8s-oracle/base'
-import {
-  addCurrencyPairMiddleware,
-  addUseFornoMiddleware,
-  getOracleDeployerForContext,
-} from 'src/lib/oracle'
-import yargs from 'yargs'
-
-export const command = 'oracle'
-
-export const describe = 'deploy the oracle for the specified network'
-
-type OracleInitialArgv = InitialArgv &
-  ContextArgv & {
-    useForno: boolean
-    currencyPair: CurrencyPair
-  }
-
-export const builder = (argv: yargs.Argv) => {
-  return flow([addContextMiddleware, addCurrencyPairMiddleware, addUseFornoMiddleware])(argv)
-}
-
-export const handler = async (argv: OracleInitialArgv) => {
-  const clusterManager = await switchToContextCluster(argv.celoEnv, argv.context)
-  const deployer = getOracleDeployerForContext(
-    argv.celoEnv,
-    argv.context,
-    argv.currencyPair,
-    argv.useForno,
-    clusterManager
-  )
-  await deployer.installChart()
-}
diff --git a/packages/celotool/src/cmds/deploy/initial/prometheus.ts b/packages/celotool/src/cmds/deploy/initial/prometheus.ts
deleted file mode 100644
index 21ea49d85ad..00000000000
--- a/packages/celotool/src/cmds/deploy/initial/prometheus.ts
+++ /dev/null
@@ -1,37 +0,0 @@
-import { InitialArgv } from 'src/cmds/deploy/initial'
-import { switchToClusterFromEnvOrContext } from 'src/lib/cluster'
-import { addContextMiddleware, ContextArgv } from 'src/lib/context-utils'
-import { installGrafanaIfNotExists, installPrometheusIfNotExists } from 'src/lib/prometheus'
-
-export const command = 'prometheus'
-
-export const describe = 'deploy prometheus to a kubernetes cluster using Helm'
-
-export type PrometheusInitialArgv = InitialArgv &
-  ContextArgv & {
-    deployGrafana: boolean
-    skipClusterSetup: boolean
-  }
-
-export const builder = (argv: PrometheusInitialArgv) => {
-  return addContextMiddleware(argv)
-    .option('deploy-grafana', {
-      type: 'boolean',
-      description: 'Include the deployment of grafana helm chart',
-      default: false,
-    })
-    .option('skipClusterSetup', {
-      type: 'boolean',
-      description: 'If you know that you can skip the cluster setup',
-      default: false,
-    })
-}
-
-export const handler = async (argv: PrometheusInitialArgv) => {
-  const clusterConfig = await switchToClusterFromEnvOrContext(argv, argv.skipClusterSetup)
-
-  await installPrometheusIfNotExists(argv.context, clusterConfig)
-  if (argv.deployGrafana) {
-    await installGrafanaIfNotExists(argv.context, clusterConfig)
-  }
-}
diff --git a/packages/celotool/src/cmds/deploy/initial/promtail.ts b/packages/celotool/src/cmds/deploy/initial/promtail.ts
deleted file mode 100644
index d60ccc4b904..00000000000
--- a/packages/celotool/src/cmds/deploy/initial/promtail.ts
+++ /dev/null
@@ -1,21 +0,0 @@
-import { InitialArgv } from 'src/cmds/deploy/initial'
-import { switchToClusterFromEnvOrContext } from 'src/lib/cluster'
-import { addContextMiddleware, ContextArgv } from 'src/lib/context-utils'
-import { installPromtailIfNotExists } from 'src/lib/promtail'
-
-export const command = 'promtail'
-
-export const describe = 'deploy Promtail to a kubernetes cluster using Helm'
-
-export type PromtailInitialArgv = InitialArgv & ContextArgv
-
-export const builder = (argv: PromtailInitialArgv) => {
-  return addContextMiddleware(argv)
-}
-
-export const handler = async (argv: PromtailInitialArgv) => {
-  // always skip cluster setup
-  const clusterConfig = await switchToClusterFromEnvOrContext(argv, true)
-
-  await installPromtailIfNotExists(clusterConfig)
-}
diff --git a/packages/celotool/src/cmds/deploy/initial/pumba.ts b/packages/celotool/src/cmds/deploy/initial/pumba.ts
deleted file mode 100644
index 3b9779d0c82..00000000000
--- a/packages/celotool/src/cmds/deploy/initial/pumba.ts
+++ /dev/null
@@ -1,20 +0,0 @@
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { installGenericHelmChart } from 'src/lib/helm_deploy'
-import { helmChartDir, helmParameters, helmReleaseName } from 'src/lib/pumba'
-import { InitialArgv } from '../../deploy/initial'
-
-export const command = 'pumba'
-
-export const describe = 'deploy the pumba package'
-
-export const builder = {}
-
-export const handler = async (argv: InitialArgv) => {
-  await switchToClusterFromEnv(argv.celoEnv)
-  await installGenericHelmChart({
-    namespace: argv.celoEnv,
-    releaseName: helmReleaseName(argv.celoEnv),
-    chartDir: helmChartDir,
-    parameters: helmParameters(),
-  })
-}
diff --git a/packages/celotool/src/cmds/deploy/initial/testnet.ts b/packages/celotool/src/cmds/deploy/initial/testnet.ts
deleted file mode 100644
index 5374986ea93..00000000000
--- a/packages/celotool/src/cmds/deploy/initial/testnet.ts
+++ /dev/null
@@ -1,51 +0,0 @@
-import { createClusterIfNotExists, setupCluster, switchToClusterFromEnv } from 'src/lib/cluster'
-import {
-  createStaticIPs,
-  installHelmChart,
-  isCelotoolHelmDryRun,
-  pollForBootnodeLoadBalancer,
-} from 'src/lib/helm_deploy'
-import { uploadTestnetInfoToGoogleStorage } from 'src/lib/testnet-utils'
-import yargs from 'yargs'
-import { InitialArgv } from '../../deploy/initial'
-
-export const command = 'testnet'
-
-export const describe = 'deploy the testnet package'
-
-type TestnetInitialArgv = InitialArgv & {
-  skipClusterSetup: boolean
-  useExistingGenesis: boolean
-}
-
-export const builder = (argv: yargs.Argv) => {
-  return argv
-    .option('skipClusterSetup', {
-      type: 'boolean',
-      description: 'If you know that you can skip the cluster setup',
-      default: false,
-    })
-    .option('useExistingGenesis', {
-      type: 'boolean',
-      description: 'Instead of generating a new genesis, use an existing genesis in GCS',
-      default: false,
-    })
-}
-
-export const handler = async (argv: TestnetInitialArgv) => {
-  const createdCluster = await createClusterIfNotExists()
-  await switchToClusterFromEnv(argv.celoEnv)
-
-  if (!argv.skipClusterSetup) {
-    await setupCluster(argv.celoEnv, createdCluster)
-  }
-
-  await createStaticIPs(argv.celoEnv)
-
-  await installHelmChart(argv.celoEnv, argv.useExistingGenesis)
-  if (!isCelotoolHelmDryRun()) {
-    // When using an external bootnode, we have to await the bootnode's LB to be up first
-    await pollForBootnodeLoadBalancer(argv.celoEnv)
-    await uploadTestnetInfoToGoogleStorage(argv.celoEnv)
-  }
-}
diff --git a/packages/celotool/src/cmds/deploy/initial/tracer-tool.ts b/packages/celotool/src/cmds/deploy/initial/tracer-tool.ts
deleted file mode 100644
index 53637ca0801..00000000000
--- a/packages/celotool/src/cmds/deploy/initial/tracer-tool.ts
+++ /dev/null
@@ -1,46 +0,0 @@
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { execCmdWithExitOnFailure } from 'src/lib/cmd-utils'
-import { isCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { installHelmChart } from 'src/lib/tracer-tool'
-import yargs from 'yargs'
-import { InitialArgv } from '../../deploy/initial'
-export const command = 'tracer-tool'
-
-export const describe = 'deploy tracer-tool'
-
-interface TracerToolArgv extends InitialArgv {
-  faucet: boolean
-}
-
-export const builder = (argv: yargs.Argv) => {
-  return argv.option('faucet', {
-    type: 'boolean',
-    description: 'Whether to faucet test accounts before deployment or no',
-    default: false,
-  })
-}
-
-const FIRST_ACCOUNT = '0x4da58d267cd465b9313fdb19b120ec591d957ad2'
-const SECOND_ACCOUNT = '0xc70947239385c2422866e20b6cafffa29157e4b3'
-
-export const handler = async (argv: TracerToolArgv) => {
-  await switchToClusterFromEnv(argv.celoEnv)
-
-  if (!argv.faucet || isCelotoolHelmDryRun()) {
-    console.info(`Skipping fauceting test accounts...`)
-  } else {
-    console.info(`Fauceting test accounts...`)
-    await execCmdWithExitOnFailure(
-      `yarn --cwd ${process.cwd()} run cli account faucet -e ${
-        argv.celoEnv
-      } --account ${FIRST_ACCOUNT}`
-    )
-    await execCmdWithExitOnFailure(
-      `yarn --cwd ${process.cwd()} run cli account faucet -e ${
-        argv.celoEnv
-      } --account ${SECOND_ACCOUNT}`
-    )
-  }
-
-  await installHelmChart(argv.celoEnv)
-}
diff --git a/packages/celotool/src/cmds/deploy/initial/transaction-metrics-exporter.ts b/packages/celotool/src/cmds/deploy/initial/transaction-metrics-exporter.ts
deleted file mode 100644
index d5fdc8342a1..00000000000
--- a/packages/celotool/src/cmds/deploy/initial/transaction-metrics-exporter.ts
+++ /dev/null
@@ -1,14 +0,0 @@
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { installHelmChart } from 'src/lib/transaction-metrics-exporter'
-import { InitialArgv } from '../../deploy/initial'
-
-export const command = 'transaction-metrics-exporter'
-
-export const describe = 'deploy the transaction metrics exporter'
-
-export const builder = {}
-
-export const handler = async (argv: InitialArgv) => {
-  await switchToClusterFromEnv(argv.celoEnv)
-  await installHelmChart(argv.celoEnv)
-}
diff --git a/packages/celotool/src/cmds/deploy/initial/voting-bot.ts b/packages/celotool/src/cmds/deploy/initial/voting-bot.ts
deleted file mode 100644
index 17b6f13ce5e..00000000000
--- a/packages/celotool/src/cmds/deploy/initial/voting-bot.ts
+++ /dev/null
@@ -1,31 +0,0 @@
-import { InitialArgv } from 'src/cmds/deploy/initial'
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { isCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { ensure0x } from 'src/lib/utils'
-import { installHelmChart, setupVotingBotAccounts } from 'src/lib/voting-bot'
-import yargs from 'yargs'
-
-export const command = 'voting-bot'
-export const describe = 'deploy voting-bot'
-
-interface VotingBotArgv extends InitialArgv {
-  excludedGroups?: string[]
-}
-
-export const builder = (argv: yargs.Argv) => {
-  return argv.option('excludedGroups', {
-    type: 'string',
-    description: 'Addresses of Validator Group(s) that the bot should not vote for.',
-    coerce: (addresses) => {
-      return addresses.split(',').map(ensure0x)
-    },
-  })
-}
-
-export const handler = async (argv: VotingBotArgv) => {
-  await switchToClusterFromEnv(argv.celoEnv)
-  if (!isCelotoolHelmDryRun()) {
-    await setupVotingBotAccounts(argv.celoEnv)
-  }
-  await installHelmChart(argv.celoEnv, argv.excludedGroups)
-}
diff --git a/packages/celotool/src/cmds/deploy/initial/wallet-connect.ts
b/packages/celotool/src/cmds/deploy/initial/wallet-connect.ts deleted file mode 100644 index 993a3c45d85..00000000000 --- a/packages/celotool/src/cmds/deploy/initial/wallet-connect.ts +++ /dev/null @@ -1,14 +0,0 @@ -import { switchToClusterFromEnv } from 'src/lib/cluster' -import { installWalletConnect } from 'src/lib/wallet-connect' -import { InitialArgv } from '../../deploy/initial' - -export const command = 'walletconnect' - -export const describe = 'deploy the walletconnect package' - -export const builder = {} - -export const handler = async (argv: InitialArgv) => { - await switchToClusterFromEnv(argv.celoEnv) - await installWalletConnect() -} diff --git a/packages/celotool/src/cmds/deploy/list.ts b/packages/celotool/src/cmds/deploy/list.ts deleted file mode 100644 index f56eae663ac..00000000000 --- a/packages/celotool/src/cmds/deploy/list.ts +++ /dev/null @@ -1,39 +0,0 @@ -import { forEach, groupBy } from 'lodash' -import { - getNonSystemHelmReleases, - getPackageName, - HelmRelease, - switchToClusterFromEnv, -} from 'src/lib/cluster' -import { DeployArgv } from '../deploy' - -export const command = 'list' - -export const describe = 'list the deploys on the cluster given an env' - -export type ListArgv = DeployArgv - -export const builder = {} - -export const handler = async (argv: ListArgv) => { - await switchToClusterFromEnv(argv.celoEnv) - const releases = await getNonSystemHelmReleases() - printReleases(releases) -} - -export function printReleases(releases: HelmRelease[]) { - const releasesByEnv = groupBy(releases, (release) => release.Namespace) - - forEach(releasesByEnv, (envReleases, key) => { - console.info(`Environment: ${key}, Releases:\n`) - - envReleases.forEach((release) => - console.info( - ` - ${getPackageName(release.Chart)} (${release.Status}), last updated at: ${ - release.Updated - }` - ) - ) - console.info(`\n`) - }) -} diff --git a/packages/celotool/src/cmds/deploy/migrate.ts b/packages/celotool/src/cmds/deploy/migrate.ts deleted file 
mode 100644 index c36cef8d85a..00000000000 --- a/packages/celotool/src/cmds/deploy/migrate.ts +++ /dev/null @@ -1,15 +0,0 @@ -import yargs from 'yargs' -import { DeployArgv } from '../deploy' -export const command = 'migrate ' - -export const describe = 'migrate an existing deploy' - -export type MigrateArgv = DeployArgv - -export const builder = (argv: yargs.Argv) => { - return argv.commandDir('migrate', { extensions: ['ts'] }) -} - -export const handler = () => { - // empty -} diff --git a/packages/celotool/src/cmds/deploy/switch.ts b/packages/celotool/src/cmds/deploy/switch.ts deleted file mode 100644 index 8fc87e7e803..00000000000 --- a/packages/celotool/src/cmds/deploy/switch.ts +++ /dev/null @@ -1,15 +0,0 @@ -import { DeployArgv } from 'src/cmds/deploy' -import yargs from 'yargs' -export const command = 'switch ' - -export const describe = 'switch the exposed deployed service' - -export type SwitchArgv = DeployArgv - -export const builder = (argv: yargs.Argv) => { - return argv.commandDir('switch', { extensions: ['ts'] }) -} - -export const handler = () => { - // empty -} diff --git a/packages/celotool/src/cmds/deploy/upgrade.ts b/packages/celotool/src/cmds/deploy/upgrade.ts deleted file mode 100644 index 5b86da9f6cc..00000000000 --- a/packages/celotool/src/cmds/deploy/upgrade.ts +++ /dev/null @@ -1,15 +0,0 @@ -import yargs from 'yargs' -import { DeployArgv } from '../deploy' -export const command = 'upgrade ' - -export const describe = 'upgrade an existing deploy' - -export type UpgradeArgv = DeployArgv - -export const builder = (argv: yargs.Argv) => { - return argv.commandDir('upgrade', { extensions: ['ts'] }) -} - -export const handler = () => { - // empty -} diff --git a/packages/celotool/src/cmds/deploy/upgrade/all.ts b/packages/celotool/src/cmds/deploy/upgrade/all.ts deleted file mode 100644 index 678fba6b0b3..00000000000 --- a/packages/celotool/src/cmds/deploy/upgrade/all.ts +++ /dev/null @@ -1,63 +0,0 @@ -import { sleep } from '@celo/utils/lib/async' 
-import { exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import yargs from 'yargs'
-import { UpgradeArgv } from '../../deploy/upgrade'
-import { handler as contractsHandler } from '../initial/contracts'
-import { handler as celostatsHandler } from './celostats'
-import { handler as testnetHandler } from './testnet'
-
-export const command = 'all'
-
-export const describe = 'upgrades a typical deploy'
-
-type AllArgv = UpgradeArgv & {
-  reset: boolean
-  useExistingGenesis: boolean
-  skipFaucetting: boolean
-  tag: string
-  suffix: string
-}
-
-export const builder = (argv: yargs.Argv) => {
-  return argv
-    .option('reset', {
-      describe: 'indicates a reset',
-      default: false,
-      type: 'boolean',
-    })
-    .option('useExistingGenesis', {
-      type: 'boolean',
-      description: 'Instead of generating a new genesis, use an existing genesis in GCS',
-      default: false,
-    })
-    .option('skipFaucetting', {
-      describe: 'skips allocation of cUSD to any oracle or bot accounts',
-      default: false,
-      type: 'boolean',
-    })
-    .option('tag', {
-      type: 'string',
-      description: 'Docker image tag to deploy',
-      default: '',
-    })
-    .option('suffix', {
-      type: 'string',
-      description: 'Instance suffix',
-      default: '',
-    })
-}
-
-export const handler = async (argv: AllArgv) => {
-  exitIfCelotoolHelmDryRun()
-  console.info('Deploy the testnet')
-  await testnetHandler(argv)
-  console.info('Deploy celostats')
-  await celostatsHandler(argv)
-
-  if (argv.reset === true) {
-    console.info('Sleeping for 5 minutes to let pods come up')
-    await sleep(300000)
-    console.info('Deploy contracts')
-    await contractsHandler(argv)
-  }
-}
diff --git a/packages/celotool/src/cmds/deploy/upgrade/blockchain-api.ts b/packages/celotool/src/cmds/deploy/upgrade/blockchain-api.ts
deleted file mode 100644
index 58ef25f9707..00000000000
--- a/packages/celotool/src/cmds/deploy/upgrade/blockchain-api.ts
+++ /dev/null
@@ -1,15 +0,0 @@
-import { exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { handler as deployInitialBlockchainApiHandler } from '../../deploy/initial/blockchain-api'
-import { UpgradeArgv } from '../../deploy/upgrade'
-
-export const command = 'blockchain-api'
-
-export const describe = 'command for upgrading blockchain-api'
-
-// Can't extend because yargs.Argv already has a `config` property
-type BlockchainApiArgv = UpgradeArgv
-
-export const handler = async (argv: BlockchainApiArgv) => {
-  exitIfCelotoolHelmDryRun()
-  await deployInitialBlockchainApiHandler(argv)
-}
diff --git a/packages/celotool/src/cmds/deploy/upgrade/celostats.ts b/packages/celotool/src/cmds/deploy/upgrade/celostats.ts
deleted file mode 100644
index 3c969d96136..00000000000
--- a/packages/celotool/src/cmds/deploy/upgrade/celostats.ts
+++ /dev/null
@@ -1,31 +0,0 @@
-import { installHelmChart, removeHelmRelease, upgradeHelmChart } from 'src/lib/celostats'
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import yargs from 'yargs'
-import { UpgradeArgv } from '../upgrade'
-
-export const command = 'celostats'
-
-export const describe = 'upgrade the celostats package'
-
-type CelostatsArgv = UpgradeArgv & {
-  reset: boolean
-}
-
-export const builder = (argv: yargs.Argv) => {
-  return argv.option('reset', {
-    description: 'Destroy & redeploy the celostats package',
-    default: false,
-    type: 'boolean',
-  })
-}
-
-export const handler = async (argv: CelostatsArgv) => {
-  await switchToClusterFromEnv(argv.celoEnv)
-
-  if (argv.reset === true) {
-    await removeHelmRelease(argv.celoEnv)
-    await installHelmChart(argv.celoEnv)
-  } else {
-    await upgradeHelmChart(argv.celoEnv)
-  }
-}
diff --git a/packages/celotool/src/cmds/deploy/upgrade/chaoskube.ts b/packages/celotool/src/cmds/deploy/upgrade/chaoskube.ts
deleted file mode 100644
index 93217c28add..00000000000
--- a/packages/celotool/src/cmds/deploy/upgrade/chaoskube.ts
+++ /dev/null
@@ -1,20 +0,0 @@
-import { helmChartDir, helmParameters, helmReleaseName } from 'src/lib/chaoskube'
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { upgradeGenericHelmChart } from 'src/lib/helm_deploy'
-import { InitialArgv } from '../../deploy/initial'
-
-export const command = 'chaoskube'
-
-export const describe = 'deploy the chaoskube package'
-
-export const builder = {}
-
-export const handler = async (argv: InitialArgv) => {
-  await switchToClusterFromEnv(argv.celoEnv)
-  await upgradeGenericHelmChart({
-    namespace: argv.celoEnv,
-    releaseName: helmReleaseName(argv.celoEnv),
-    chartDir: helmChartDir,
-    parameters: helmParameters(argv.celoEnv),
-  })
-}
diff --git a/packages/celotool/src/cmds/deploy/upgrade/contracts.ts b/packages/celotool/src/cmds/deploy/upgrade/contracts.ts
deleted file mode 100644
index 7ed31b02c74..00000000000
--- a/packages/celotool/src/cmds/deploy/upgrade/contracts.ts
+++ /dev/null
@@ -1,49 +0,0 @@
-import { downloadArtifacts, uploadArtifacts } from 'src/lib/artifacts'
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { execCmd } from 'src/lib/cmd-utils'
-import { exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { migrationOverrides, truffleOverrides } from 'src/lib/migration-utils'
-import { portForwardAnd } from 'src/lib/port_forward'
-import yargs from 'yargs'
-import { UpgradeArgv } from '../../deploy/upgrade'
-
-export const command = 'contracts'
-
-export const describe = 'upgrade the celo smart contracts'
-
-type ContractsArgv = UpgradeArgv & {
-  skipFaucetting: boolean
-}
-
-export const builder = (argv: yargs.Argv) => {
-  return argv.option('skipFaucetting', {
-    describe: 'skips allocation of cUSD to any oracle or bot accounts',
-    default: false,
-    type: 'boolean',
-  })
-}
-
-export const handler = async (argv: ContractsArgv) => {
-  exitIfCelotoolHelmDryRun()
-  await switchToClusterFromEnv(argv.celoEnv)
-
-  console.info(`Upgrading smart contracts on ${argv.celoEnv}`)
-  const cb = async () => {
-    await execCmd(
-      `yarn --cwd ../protocol run migrate -n ${argv.celoEnv} -c '${JSON.stringify(
-        truffleOverrides()
-      )}' -m '${JSON.stringify(await migrationOverrides(!argv.skipFaucetting))}'`
-    )
-  }
-
-  try {
-    await downloadArtifacts(argv.celoEnv)
-    await portForwardAnd(argv.celoEnv, cb)
-    await uploadArtifacts(argv.celoEnv)
-    process.exit(0)
-  } catch (error) {
-    console.error(`Unable to upgrade smart contracts on ${argv.celoEnv}`)
-    console.error(error)
-    process.exit(1)
-  }
-}
diff --git a/packages/celotool/src/cmds/deploy/upgrade/faucet.ts b/packages/celotool/src/cmds/deploy/upgrade/faucet.ts
deleted file mode 100644
index 679bab23859..00000000000
--- a/packages/celotool/src/cmds/deploy/upgrade/faucet.ts
+++ /dev/null
@@ -1,104 +0,0 @@
-import { execSync } from 'child_process'
-import { config } from 'dotenv'
-import { downloadArtifacts, getContractAddresses } from 'src/lib/artifacts'
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { execCmd } from 'src/lib/cmd-utils'
-import { addCeloEnvMiddleware, getEnvFile } from 'src/lib/env-utils'
-import {
-  coerceMnemonicAccountType,
-  generatePrivateKey,
-  privateKeyToAddress,
-} from 'src/lib/generate_utils'
-import { exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { portForwardAnd } from 'src/lib/port_forward'
-import yargs from 'yargs'
-import { UpgradeArgv } from '../../deploy/upgrade'
-
-export const command = 'faucet'
-
-export const describe = 'upgrade the faucet (requires firebase login permissions)'
-
-interface UpgradeFaucetArgs extends UpgradeArgv {
-  firebaseProject: string
-}
-
-export const builder = (argv: yargs.Argv) => {
-  return addCeloEnvMiddleware(argv).option('firebaseProject', {
-    type: 'string',
-    demand: 'Should be one of celo-faucet or celo-faucet-staging',
-    description: 'the name of the firebase project to use (celo-faucet or celo-faucet-staging)',
-  })
-}
-
-function getEnvMnemonic(env: string): string {
-  const envMemonicResult = config({ path: getEnvFile(env, '.mnemonic') })
-
-  if (envMemonicResult.error) {
-    throw envMemonicResult.error
-  } else if (envMemonicResult.parsed) {
-    return envMemonicResult.parsed.MNEMONIC
-  }
-  throw new Error('Could not get mnemonic')
-}
-
-export const handler = async (argv: UpgradeFaucetArgs) => {
-  exitIfCelotoolHelmDryRun()
-  await switchToClusterFromEnv(argv.celoEnv)
-  console.info(`Upgrading faucet for network ${argv.celoEnv} on project ${argv.firebaseProject}`)
-
-  try {
-    const mnemonic = getEnvMnemonic(argv.celoEnv)
-    const privateKey = generatePrivateKey(mnemonic, coerceMnemonicAccountType('faucet'), 0)
-    const address = privateKeyToAddress(privateKey)
-
-    const fundFaucetAccounts = async () => {
-      await execCmd(
-        // TODO(joshua): Don't copy this from account/faucet in celotool
-        // TODO(yerdua): reimplement the protocol transfer script here, using
-        // the SDK + Web3 when the SDK can be built for multiple environments
-        `yarn --cwd ../protocol run transfer -n ${argv.celoEnv} -a ${address} -d 20000 -g 20000`
-      )
-    }
-
-    await downloadArtifacts(argv.celoEnv)
-    const addressMap = await getContractAddresses(argv.celoEnv, [
-      'escrow',
-      'goldToken',
-      'stableToken',
-    ])
-
-    console.info(`Switching to firebase project ${argv.firebaseProject}`)
-    await execCmd(`yarn --cwd ../faucet firebase use ${argv.firebaseProject}`)
-
-    console.info(`Updating contract addresses for ${argv.celoEnv} on ${argv.firebaseProject}`)
-    await execCmd(
-      `yarn --cwd ../faucet cli config:set --net ${argv.celoEnv} --escrowAddress ${addressMap.escrow} --goldTokenAddress ${addressMap.goldToken} --stableTokenAddress ${addressMap.stableToken}`
-    )
-    console.info(`Redeploying functions (needed for config changes to take place)`)
-    await execCmd('yarn --cwd ../faucet cli deploy:functions')
-
-    // Need to clear because we generate the same account each time here.
-    console.info(`Clearing accounts for network ${argv.celoEnv} on ${argv.firebaseProject}`)
-    execSync(`yarn --cwd ../faucet cli accounts:clear --net ${argv.celoEnv}`, {
-      stdio: 'inherit',
-    })
-
-    console.info(`Adding one faucet account for network ${argv.celoEnv} on ${argv.firebaseProject}`)
-    await execCmd(
-      `yarn --cwd ../faucet cli accounts:add ${privateKey} ${address} --net ${argv.celoEnv}`
-    )
-
-    console.info(`Funding account ${address} on ${argv.celoEnv}`)
-    await portForwardAnd(argv.celoEnv, fundFaucetAccounts)
-
-    console.info(
-      `Done updating contract addresses and funding the faucet account for network ${argv.celoEnv} in ${argv.firebaseProject}`
-    )
-    console.info('Please double check the TX node IP address to ensure it did not change.')
-    process.exit(0)
-  } catch (error) {
-    console.error(`Unable to upgrade faucet on ${argv.celoEnv}`)
-    console.error(error)
-    process.exit(1)
-  }
-}
diff --git a/packages/celotool/src/cmds/deploy/upgrade/fullnodes.ts b/packages/celotool/src/cmds/deploy/upgrade/fullnodes.ts
deleted file mode 100644
index 2fee4aa51f2..00000000000
--- a/packages/celotool/src/cmds/deploy/upgrade/fullnodes.ts
+++ /dev/null
@@ -1,55 +0,0 @@
-import { UpgradeArgv } from 'src/cmds/deploy/upgrade'
-import { addContextMiddleware, ContextArgv, switchToContextCluster } from 'src/lib/context-utils'
-import { upgradeFullNodeChart } from 'src/lib/fullnodes'
-import { kubectlAnnotateKSA, linkSAForWorkloadIdentity } from 'src/lib/gcloud_utils'
-import { isCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import yargs from 'yargs'
-
-export const command = 'fullnodes'
-
-export const describe = 'deploy full nodes in a particular context'
-
-type FullNodeUpgradeArgv = UpgradeArgv &
-  ContextArgv & {
-    createNEG: boolean
-    reset: boolean
-    staticNodes: boolean
-  }
-
-export const builder = (argv: yargs.Argv) => {
-  return addContextMiddleware(argv)
-    .option('createNEG', {
-      type: 'boolean',
-      description:
-        'When enabled, will create a network endpoint group for the full node http & ws ports. Only works for GCP.',
-      default: false,
-    })
-    .option('reset', {
-      type: 'boolean',
-      description: 'when enabled, deletes the data volumes and redeploys the helm chart.',
-      default: false,
-    })
-    .option('staticNodes', {
-      type: 'boolean',
-      description:
-        'when enabled, generates node keys deterministically using the mnemonic and context, and uploads the enodes to GCS',
-      default: false,
-    })
-}
-
-export const handler = async (argv: FullNodeUpgradeArgv) => {
-  await switchToContextCluster(argv.celoEnv, argv.context)
-  if (!isCelotoolHelmDryRun()) {
-    await linkSAForWorkloadIdentity(argv.celoEnv, argv.context)
-  }
-  await upgradeFullNodeChart(
-    argv.celoEnv,
-    argv.context,
-    argv.reset,
-    argv.staticNodes,
-    argv.createNEG
-  )
-  if (!isCelotoolHelmDryRun()) {
-    await kubectlAnnotateKSA(argv.celoEnv, argv.context)
-  }
-}
diff --git a/packages/celotool/src/cmds/deploy/upgrade/hotfix.ts b/packages/celotool/src/cmds/deploy/upgrade/hotfix.ts
deleted file mode 100644
index 0554873625d..00000000000
--- a/packages/celotool/src/cmds/deploy/upgrade/hotfix.ts
+++ /dev/null
@@ -1,164 +0,0 @@
-// This is a more unusual Celotool command. It basically helps you to execute Hotfixes on testnets. Because constructing proposals is difficult to do via a CLI, you should define them here in code. There are two examples below that you can start from.
-
-import { newKitFromWeb3 } from '@celo/contractkit'
-import { hotfixToHash, ProposalBuilder, proposalToJSON } from '@celo/governance'
-import { privateKeyToAddress } from '@celo/utils/lib/address'
-import { concurrentMap } from '@celo/utils/lib/async'
-import { randomBytes } from 'crypto'
-import { getFornoUrl } from 'src/lib/endpoints'
-import { envVar, fetchEnv } from 'src/lib/env-utils'
-import { AccountType, getPrivateKeysFor } from 'src/lib/generate_utils'
-import { exitIfCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import Web3 from 'web3'
-import yargs from 'yargs'
-import { UpgradeArgv } from '../../deploy/upgrade'
-
-export const command = 'hotfix'
-
-export const describe = 'runs a hotfix'
-
-type EthstatsArgv = UpgradeArgv
-
-export const builder = (argv: yargs.Argv) => {
-  return argv
-}
-
-export const handler = async (argv: EthstatsArgv) => {
-  exitIfCelotoolHelmDryRun()
-  try {
-    const kit = newKitFromWeb3(new Web3(getFornoUrl(argv.celoEnv)))
-    const governance = await kit.contracts.getGovernance()
-    const keys = getPrivateKeysFor(
-      AccountType.VALIDATOR,
-      fetchEnv(envVar.MNEMONIC),
-      parseInt(fetchEnv(envVar.VALIDATORS), 10)
-    )
-    const addresses = keys.map(privateKeyToAddress)
-
-    console.info('Add keys to ContractKit')
-    for (const key of keys) {
-      kit.connection.addAccount(key)
-    }
-
-    // Here you'll want to assert the current state
-    // Example A: Update a var on a Celo Core Contract
-    // const attestations = await kit.contracts.getAttestations()
-    // const currentNumber = await attestations.attestationExpiryBlocks()
-    // if (currentNumber !== 727) {
-    //   throw new Error(`Expected current number to be 727, but was ${currentNumber}`)
-    // }
-
-    // Example B: Repoint a Celo Core Contract proxy
-    // const validatorsProxyAddress = await kit.registry.addressFor(CeloContract.Validators)
-    // const currentValidatorsImplementationAddress = await getImplementationOfProxy(
-    //   kit.web3,
-    //   validatorsProxyAddress
-    // )
-    // const desiredImplementationAddress = '0xd18620a5eBE0235023602bB4d490E1e96703EddD'
-    // console.info('Current Implementation Address: ', currentValidatorsImplementationAddress)
-
-    // console.info('\nBuild Proposal')
-
-    const proposalBuilder = new ProposalBuilder(kit)
-
-    // Example A
-    // proposalBuilder.addJsonTx({
-    //   contract: CeloContract.Attestations,
-    //   function: 'setAttestationExpiryBlocks',
-    //   // @ts-ignore
-    //   args: [728],
-    //   value: '0',
-    // })
-
-    // Example B
-    // proposalBuilder.addProxyRepointingTx(validatorsProxyAddress, desiredImplementationAddress)
-
-    const proposal = await proposalBuilder.build()
-    if (proposal.length === 0) {
-      console.error('\nPlease see examples in hotfix.ts and add transactions')
-      process.exit(1)
-    }
-    // If your proposal is just made of Celo Registry contract methods, you can print it out
-    console.info('Proposal: ', await proposalToJSON(kit, proposal))
-
-    const salt = randomBytes(32)
-    console.info(`Salt: ${salt.toString('hex')}`)
-
-    const proposalHash = hotfixToHash(kit, proposal, salt)
-    console.info(`Proposal Hash: ${proposalHash.toString('hex')}`)
-
-    console.info('\nWhitelist the hotfix')
-    await concurrentMap(25, addresses, async (address, index) => {
-      try {
-        await governance.whitelistHotfix(proposalHash).sendAndWaitForReceipt({ from: address })
-      } catch (error) {
-        console.error(
-          `Error whitelisting for validator ${index} (${address}): ${
-            error instanceof Error ? JSON.stringify(error) : error?.toString()
-          }`
-        )
-      }
-    })
-
-    let hotfixRecord = await governance.getHotfixRecord(proposalHash)
-    console.info('Hotfix Record: ', hotfixRecord)
-
-    console.info('\nApprove the hotfix')
-    await governance.approveHotfix(proposalHash).sendAndWaitForReceipt({ from: addresses[0] })
-    hotfixRecord = await governance.getHotfixRecord(proposalHash)
-    console.info('Hotfix Record: ', hotfixRecord)
-
-    // This is on master, but not on baklava yet
-    const canPass = await governance.isHotfixPassing(proposalHash)
-    const tally = await governance.hotfixWhitelistValidatorTally(proposalHash)
-
-    if (!canPass) {
-      throw new Error(`Hotfix cannot pass. Currently tally is ${tally}`)
-    }
-
-    console.info('\nPrepare the hotfix')
-    await governance.prepareHotfix(proposalHash).sendAndWaitForReceipt({ from: addresses[0] })
-    hotfixRecord = await governance.getHotfixRecord(proposalHash)
-    console.info('\nHotfix Record: ', hotfixRecord)
-
-    if (hotfixRecord.preparedEpoch.toNumber() === 0) {
-      console.error('Hotfix could not be prepared')
-      throw new Error()
-    }
-    console.info('\nExecute the hotfix')
-    await governance.executeHotfix(proposal, salt).sendAndWaitForReceipt({ from: addresses[0] })
-
-    hotfixRecord = await governance.getHotfixRecord(proposalHash)
-    console.info('\nHotfix Record: ', hotfixRecord)
-
-    if (!hotfixRecord.executed) {
-      console.error('Hotfix could somehow not be executed')
-      throw new Error()
-    }
-
-    // Assert any state to be sure it worked
-
-    // Example A
-    // const newNumber = await attestations.attestationExpiryBlocks()
-    // if (newNumber !== 728) {
-    //   throw new Error(`Expected current number to be 728, but was ${newNumber}`)
-    // }
-
-    // Example B
-    // const newValidatorsImplementationAddress = await getImplementationOfProxy(
-    //   kit.web3,
-    //   validatorsProxyAddress
-    // )
-    // if (!eqAddress(newValidatorsImplementationAddress, desiredImplementationAddress)) {
-    //   throw new Error(
-    //     `Expected new implementation address to be ${desiredImplementationAddress}, but was ${newValidatorsImplementationAddress}`
-    //   )
-    // }
-
-    console.info('Hotfix successfully executed!')
-    process.exit(0)
-  } catch (error) {
-    console.error(error)
-    process.exit(1)
-  }
-}
diff --git a/packages/celotool/src/cmds/deploy/upgrade/leaderboard.ts b/packages/celotool/src/cmds/deploy/upgrade/leaderboard.ts
deleted file mode 100644
index f8cccceb64a..00000000000
--- a/packages/celotool/src/cmds/deploy/upgrade/leaderboard.ts
+++ /dev/null
@@ -1,35 +0,0 @@
-import { UpgradeArgv } from 'src/cmds/deploy/upgrade'
-import { createClusterIfNotExists, switchToClusterFromEnv } from 'src/lib/cluster'
-import { isCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { installHelmChart, removeHelmRelease, upgradeHelmChart } from 'src/lib/leaderboard'
-import yargs from 'yargs'
-
-export const command = 'leaderboard'
-
-export const describe = 'upgrade the leaderboard package'
-
-type LeaderboardArgv = UpgradeArgv & {
-  reset: boolean
-}
-
-export const builder = (argv: yargs.Argv) => {
-  return argv.option('reset', {
-    description: 'Destroy & redeploy the leaderboard package',
-    default: false,
-    type: 'boolean',
-  })
-}
-
-export const handler = async (argv: LeaderboardArgv) => {
-  if (!isCelotoolHelmDryRun()) {
-    await createClusterIfNotExists()
-  }
-  await switchToClusterFromEnv(argv.celoEnv)
-
-  if (argv.reset === true && !isCelotoolHelmDryRun()) {
-    await removeHelmRelease(argv.celoEnv)
-    await installHelmChart(argv.celoEnv)
-  } else {
-    await upgradeHelmChart(argv.celoEnv)
-  }
-}
diff --git a/packages/celotool/src/cmds/deploy/upgrade/load-test.ts b/packages/celotool/src/cmds/deploy/upgrade/load-test.ts
deleted file mode 100644
index 7ff50139dce..00000000000
--- a/packages/celotool/src/cmds/deploy/upgrade/load-test.ts
+++ /dev/null
@@ -1,44 +0,0 @@
-import { builder as initialBuilder, LoadTestArgv } from 'src/cmds/deploy/initial/load-test'
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { isCelotoolHelmDryRun } from 'src/lib/helm_deploy'
-import { resetAndUpgrade, setArgvDefaults, upgradeHelmChart } from 'src/lib/load-test'
-import yargs from 'yargs'
-
-export const command = 'load-test'
-
-export const describe = 'deploy load-test'
-
-type LoadTestUpgradeArgv = LoadTestArgv & {
-  reset: boolean
-}
-
-export const builder = (argv: yargs.Argv) => {
-  initialBuilder(argv).option('reset', {
-    description: 'Scale down all load-test clients, upgrade, and scale back up',
-    default: false,
-    type: 'boolean',
-  })
-}
-
-export const handler = async (argv: LoadTestUpgradeArgv) => {
-  await switchToClusterFromEnv(argv.celoEnv)
-  setArgvDefaults(argv)
-
-  if (argv.reset === true && !isCelotoolHelmDryRun()) {
-    await resetAndUpgrade(
-      argv.celoEnv,
-      argv.blockscoutMeasurePercent,
-      argv.delay,
-      argv.replicas,
-      argv.threads
-    )
-  } else {
-    await upgradeHelmChart(
-      argv.celoEnv,
-      argv.blockscoutMeasurePercent,
-      argv.delay,
-      argv.replicas,
-      argv.threads
-    )
-  }
-}
diff --git a/packages/celotool/src/cmds/deploy/upgrade/odis.ts b/packages/celotool/src/cmds/deploy/upgrade/odis.ts
deleted file mode 100644
index 62e3dcb398e..00000000000
--- a/packages/celotool/src/cmds/deploy/upgrade/odis.ts
+++ /dev/null
@@ -1,19 +0,0 @@
-import { UpgradeArgv } from 'src/cmds/deploy/upgrade'
-import { addContextMiddleware, ContextArgv, switchToContextCluster } from 'src/lib/context-utils'
-import { upgradeODISChart } from 'src/lib/odis'
-import yargs from 'yargs'
-
-export const command = 'odis'
-
-export const describe = 'upgrade odis on an AKS cluster'
-
-type OdisUpgradeArgv = UpgradeArgv & ContextArgv
-
-export const builder = (argv: yargs.Argv) => {
-  return addContextMiddleware(argv)
-}
-
-export const handler = async (argv: OdisUpgradeArgv) => {
-  await switchToContextCluster(argv.celoEnv, argv.context)
-  await upgradeODISChart(argv.celoEnv, argv.context)
-}
diff --git a/packages/celotool/src/cmds/deploy/upgrade/oracle.ts b/packages/celotool/src/cmds/deploy/upgrade/oracle.ts
deleted file mode 100644
index 43ac58cfef0..00000000000
--- a/packages/celotool/src/cmds/deploy/upgrade/oracle.ts
+++ /dev/null
@@ -1,36 +0,0 @@
-import { flow } from 'lodash'
-import { UpgradeArgv } from 'src/cmds/deploy/upgrade'
-import { addContextMiddleware, ContextArgv, switchToContextCluster } from 'src/lib/context-utils'
-import { CurrencyPair } from 'src/lib/k8s-oracle/base'
-import {
-  addCurrencyPairMiddleware,
-  addUseFornoMiddleware,
-  getOracleDeployerForContext,
-} from 'src/lib/oracle'
-import yargs from 'yargs'
-
-export const command = 'oracle'
-
-export const describe = 'upgrade the oracle(s) on an AKS cluster'
-
-type OracleUpgradeArgv = UpgradeArgv &
-  ContextArgv & {
-    useForno: boolean
-    currencyPair: CurrencyPair
-  }
-
-export const builder = (argv: yargs.Argv) => {
-  return flow([addContextMiddleware, addCurrencyPairMiddleware, addUseFornoMiddleware])(argv)
-}
-
-export const handler = async (argv: OracleUpgradeArgv) => {
-  const clusterManager = await switchToContextCluster(argv.celoEnv, argv.context)
-  const deployer = getOracleDeployerForContext(
-    argv.celoEnv,
-    argv.context,
-    argv.currencyPair,
-    argv.useForno,
-    clusterManager
-  )
-  await deployer.upgradeChart()
-}
diff --git a/packages/celotool/src/cmds/deploy/upgrade/prometheus.ts b/packages/celotool/src/cmds/deploy/upgrade/prometheus.ts
deleted file mode 100644
index 48e7a9c3a36..00000000000
--- a/packages/celotool/src/cmds/deploy/upgrade/prometheus.ts
+++ /dev/null
@@ -1,31 +0,0 @@
-import { UpgradeArgv } from 'src/cmds/deploy/upgrade'
-import { switchToClusterFromEnvOrContext } from 'src/lib/cluster'
-import { addContextMiddleware, ContextArgv } from 'src/lib/context-utils'
-import { upgradeGrafana, upgradePrometheus } from 'src/lib/prometheus'
-
-export const command = 'prometheus'
-
-export const describe = 'upgrade prometheus to a kubernetes cluster using Helm'
-
-export type PrometheusUpgradeArgv = UpgradeArgv &
-  ContextArgv & {
-    deployGrafana: boolean
-  }
-
-export const builder = (argv: PrometheusUpgradeArgv) => {
-  return addContextMiddleware(argv).option('deploy-grafana', {
-    type: 'boolean',
-    description: 'Include the deployment of grafana helm chart',
-    default: false,
-  })
-}
-
-export const handler = async (argv: PrometheusUpgradeArgv) => {
-  const clusterConfig = await switchToClusterFromEnvOrContext(argv, true)
-
-  await upgradePrometheus(argv.context, clusterConfig)
-
-  if (argv.deployGrafana) {
-    await upgradeGrafana(argv.context, clusterConfig)
-  }
-}
diff --git a/packages/celotool/src/cmds/deploy/upgrade/promtail.ts b/packages/celotool/src/cmds/deploy/upgrade/promtail.ts
deleted file mode 100644
index df9a94d914b..00000000000
--- a/packages/celotool/src/cmds/deploy/upgrade/promtail.ts
+++ /dev/null
@@ -1,21 +0,0 @@
-import { UpgradeArgv } from 'src/cmds/deploy/upgrade'
-import { switchToClusterFromEnvOrContext } from 'src/lib/cluster'
-import { addContextMiddleware, ContextArgv } from 'src/lib/context-utils'
-import { upgradePromtail } from 'src/lib/promtail'
-
-export const command = 'promtail'
-
-export const describe = 'upgrade Promtail to a kubernetes cluster using Helm'
-
-export type PromtailUpgradeArgv = UpgradeArgv & ContextArgv
-
-export const builder = (argv: PromtailUpgradeArgv) => {
-  return addContextMiddleware(argv)
-}
-
-export const handler = async (argv: PromtailUpgradeArgv) => {
-  // always skip cluster setup
-  const clusterConfig = await switchToClusterFromEnvOrContext(argv, true)
-
-  await upgradePromtail(clusterConfig)
-}
diff --git a/packages/celotool/src/cmds/deploy/upgrade/pumba.ts b/packages/celotool/src/cmds/deploy/upgrade/pumba.ts
deleted file mode 100644
index 1c251007182..00000000000
--- a/packages/celotool/src/cmds/deploy/upgrade/pumba.ts
+++ /dev/null
@@ -1,20 +0,0 @@
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import { upgradeGenericHelmChart } from 'src/lib/helm_deploy'
-import { helmChartDir, helmParameters, helmReleaseName } from 'src/lib/pumba'
-import { InitialArgv } from '../../deploy/initial'
-
-export const command = 'pumba'
-
-export const describe = 'deploy the pumba package'
-
-export const builder = {}
-
-export const handler = async (argv: InitialArgv) => {
-  await switchToClusterFromEnv(argv.celoEnv)
-  await upgradeGenericHelmChart({
-    namespace: argv.celoEnv,
-    releaseName: helmReleaseName(argv.celoEnv),
-    chartDir: helmChartDir,
-    parameters: helmParameters(),
-  })
-}
diff --git a/packages/celotool/src/cmds/deploy/upgrade/testnet.ts b/packages/celotool/src/cmds/deploy/upgrade/testnet.ts
deleted file mode 100644
index 213d22aeaeb..00000000000
--- a/packages/celotool/src/cmds/deploy/upgrade/testnet.ts
+++ /dev/null
@@ -1,51 +0,0 @@
-import { switchToClusterFromEnv } from 'src/lib/cluster'
-import {
-  isCelotoolHelmDryRun,
-  resetAndUpgradeHelmChart,
-  upgradeHelmChart,
-  upgradeStaticIPs,
-} from 'src/lib/helm_deploy'
-import {
-  uploadEnvFileToGoogleStorage,
-  uploadTestnetStaticNodesToGoogleStorage,
-} from 'src/lib/testnet-utils'
-import yargs from 'yargs'
-import { UpgradeArgv } from '../../deploy/upgrade'
-
-export const command = 'testnet'
-export const describe = 'upgrade an existing deploy of the testnet package'
-
-type TestnetArgv = UpgradeArgv & {
-  reset: boolean
-  useExistingGenesis: boolean
-}
-
-export const builder = (argv: yargs.Argv) => {
-  return argv
-    .option('reset', {
-      describe: 'deletes any chain data in persistent volume claims',
-      default: false,
-      type: 'boolean',
-    })
-    .option('useExistingGenesis', {
-      type: 'boolean',
-      description: 'Instead of generating a new genesis, use an existing genesis in GCS',
-      default: false,
-    })
-}
-
-export const handler = async (argv: TestnetArgv) => {
-  await switchToClusterFromEnv(argv.celoEnv)
-
-  await upgradeStaticIPs(argv.celoEnv)
-
-  if (argv.reset === true) {
-    await resetAndUpgradeHelmChart(argv.celoEnv, argv.useExistingGenesis)
-  } else {
-    await upgradeHelmChart(argv.celoEnv,
argv.useExistingGenesis) - } - if (!isCelotoolHelmDryRun()) { - await uploadTestnetStaticNodesToGoogleStorage(argv.celoEnv) - await uploadEnvFileToGoogleStorage(argv.celoEnv) - } -} diff --git a/packages/celotool/src/cmds/deploy/upgrade/tracer-tool.ts b/packages/celotool/src/cmds/deploy/upgrade/tracer-tool.ts deleted file mode 100644 index ae0707ffa58..00000000000 --- a/packages/celotool/src/cmds/deploy/upgrade/tracer-tool.ts +++ /dev/null @@ -1,14 +0,0 @@ -import { switchToClusterFromEnv } from 'src/lib/cluster' -import { upgradeHelmChart } from 'src/lib/tracer-tool' -import { UpgradeArgv } from '../../deploy/upgrade' - -export const command = 'tracer-tool' - -export const describe = 'upgrade the tracer-tool deployment' - -export const builder = {} - -export const handler = async (argv: UpgradeArgv) => { - await switchToClusterFromEnv(argv.celoEnv) - await upgradeHelmChart(argv.celoEnv) -} diff --git a/packages/celotool/src/cmds/deploy/upgrade/transaction-metrics-exporter.ts b/packages/celotool/src/cmds/deploy/upgrade/transaction-metrics-exporter.ts deleted file mode 100644 index 2c9d3ed6d35..00000000000 --- a/packages/celotool/src/cmds/deploy/upgrade/transaction-metrics-exporter.ts +++ /dev/null @@ -1,14 +0,0 @@ -import { switchToClusterFromEnv } from 'src/lib/cluster' -import { upgradeHelmChart } from 'src/lib/transaction-metrics-exporter' -import { UpgradeArgv } from '../../deploy/upgrade' - -export const command = 'transaction-metrics-exporter' - -export const describe = 'upgrade the transaction metrics exporter deploy' - -export const builder = {} - -export const handler = async (argv: UpgradeArgv) => { - await switchToClusterFromEnv(argv.celoEnv) - await upgradeHelmChart(argv.celoEnv) -} diff --git a/packages/celotool/src/cmds/deploy/upgrade/wallet-connect.ts b/packages/celotool/src/cmds/deploy/upgrade/wallet-connect.ts deleted file mode 100644 index ab5ad2c02c3..00000000000 --- a/packages/celotool/src/cmds/deploy/upgrade/wallet-connect.ts +++ /dev/null @@ 
-1,14 +0,0 @@ -import { switchToClusterFromEnv } from 'src/lib/cluster' -import { upgradeWalletConnect } from 'src/lib/wallet-connect' -import { InitialArgv } from '../initial' - -export const command = 'walletconnect' - -export const describe = 'deploy the walletconnect package' - -export const builder = {} - -export const handler = async (argv: InitialArgv) => { - await switchToClusterFromEnv(argv.celoEnv) - await upgradeWalletConnect() -} diff --git a/packages/celotool/src/cmds/fork_env.ts b/packages/celotool/src/cmds/fork_env.ts deleted file mode 100644 index fbbd2f96187..00000000000 --- a/packages/celotool/src/cmds/fork_env.ts +++ /dev/null @@ -1,44 +0,0 @@ -import { parse } from 'dotenv' -import { readFileSync, writeFileSync } from 'fs' -import { map, merge, reduce } from 'lodash' -import path from 'path' -import { CeloEnvArgv, genericEnvFilePath, isValidCeloEnv, monorepoRoot } from 'src/lib/env-utils' -import yargs from 'yargs' -export const command = 'fork-env ' - -export const describe = 'command for forking an environment off the default .env file' - -interface ForkEnvArgs extends CeloEnvArgv { - envVars: string - newEnvName: string -} - -export const builder = (args: yargs.Argv) => { - return args - .positional('newEnvName', { - coerce: (newEnvName: string) => { - if (isValidCeloEnv(newEnvName)) { - return newEnvName - } - - throw new Error(`Invalid new env name: ${newEnvName}`) - }, - }) - .option('envVars', { - type: 'array', - description: 'environment variables you want to override, with ENV_NAME=value', - default: [], - alias: 'e', - }) -} - -export const handler = (argv: ForkEnvArgs) => { - const genericEnvFile = readFileSync(genericEnvFilePath) - const defaultEnvVars = parse(genericEnvFile) - - const envVarsToOverride = reduce(map(argv.envVars, parse), merge, {}) - - const mergedEnvVars = { ...defaultEnvVars, ...envVarsToOverride } - const newEnvFile = map(mergedEnvVars, (value, key) => `${key}="${value}"`).join('\n') - 
writeFileSync(path.resolve(monorepoRoot, `.env.${argv.newEnvName}`), newEnvFile) -} diff --git a/packages/celotool/src/cmds/gcp/remove-leaked-forwarding-rules.ts b/packages/celotool/src/cmds/gcp/remove-leaked-forwarding-rules.ts deleted file mode 100644 index 3ec996041a6..00000000000 --- a/packages/celotool/src/cmds/gcp/remove-leaked-forwarding-rules.ts +++ /dev/null @@ -1,124 +0,0 @@ -import { zip } from 'lodash' -import { execCmd, execCmdWithExitOnFailure } from 'src/lib/cmd-utils' -import yargs from 'yargs' - -export const command = 'remove-leaked-forwarding-rules' - -export const describe = 'Removes leaked forwarding rules that Kubernetes did not garbage collect' - -interface Argv extends yargs.Argv { - keywords: string - project: string -} - -export const builder = (argv: yargs.Argv) => { - return argv - .option('keywords', { - required: false, - default: '', - type: 'string', - description: 'comma-separated list of keywords when matched with the rule, should be deleted', - }) - .option('project', { - alias: 'p', - required: true, - type: 'string', - description: 'GCP project within which to run this command', - }) -} - -export const handler = async (argv: Argv) => { - console.info('Fetching forwarding-rules') - let rules: any[] = await execCmdWithExitOnFailure( - `gcloud compute forwarding-rules list --format=json --project=${argv.project}` - ).then(([body]) => JSON.parse(body)) - - const candidates = rules.filter((rule) => rule.target && rule.target.includes('targetPools')) - - console.info('Determining health of rules') - const shouldDelete = await Promise.all( - candidates.map(async (rule) => { - const targetComponents = rule.target.split('/') - const zone = targetComponents[8] - const target = targetComponents[10] - - try { - await execCmd( - `gcloud compute target-pools get-health ${target} --region=${zone} --format=json --project=${argv.project}`, - {}, - true - ) - return false - } catch ([error, stdout, stderr]) { - if (typeof stdout === 'string') { - 
const healthyInstances = JSON.parse(stdout).length - return healthyInstances === 0 - } - } - }) - ) - - const candidatesToDelete = zip(candidates, shouldDelete).filter(([, x]) => x) - console.info( - `Should delete ${candidatesToDelete.length} forwarding-rules that don't have any targets` - ) - - await Promise.all( - candidatesToDelete.map(async ([candidate]) => { - const targetComponents = candidate.target.split('/') - const zone = targetComponents[8] - const target = targetComponents[10] - - console.info(`Deleting forwarding-rule ${candidate.name}`) - await execCmdWithExitOnFailure( - `gcloud compute forwarding-rules delete ${candidate.name} ${getRegionFlag( - candidate.selfLink - )} -q --project=${argv.project}` - ) - console.info(`Deleted forwarding-rule ${candidate.name}`) - - console.info(`Deleting target-pool ${target}`) - await execCmdWithExitOnFailure( - `gcloud compute target-pools delete ${target} --region=${zone} -q --project=${argv.project}` - ) - console.info(`Deleted target-pool ${target}`) - }) - ) - - if (argv.keywords.length === 0) { - console.info(`No keywords given`) - return - } - - const keywordsToMatch = argv.keywords.split(',') - - rules = await execCmdWithExitOnFailure( - `gcloud compute forwarding-rules list --format=json --project=${argv.project}` - ).then(([body]) => JSON.parse(body)) - - const matchingRules = rules.filter((lb) => - keywordsToMatch.some( - (keyword) => lb.description.includes(keyword) || lb.target.includes(keyword) - ) - ) - - await Promise.all( - matchingRules.map(async (rule) => { - console.info(`Deleting forwarding-rule ${rule.name}`) - await execCmdWithExitOnFailure( - `gcloud compute forwarding-rules delete ${rule.name} ${getRegionFlag( - rule.selfLink - )} -q --project=${argv.project}` - ) - console.info(`Deleted forwarding-rule ${rule.name}`) - }) - ) - - return -} - -function getRegionFlag(name: string) { - const parts = name.split('/') - const regionIndicator = parts[7] - return regionIndicator === 'global' ? 
'--global' : `--region=${parts[8]}` -} diff --git a/packages/celotool/src/cmds/generate.ts b/packages/celotool/src/cmds/generate.ts deleted file mode 100644 index 74d75f19768..00000000000 --- a/packages/celotool/src/cmds/generate.ts +++ /dev/null @@ -1,10 +0,0 @@ -import yargs from 'yargs' - -export const command = 'generate ' -export const describe = 'commands for generating network parameters' - -export const builder = (argv: yargs.Argv) => argv.commandDir('generate', { extensions: ['ts'] }) - -export const handler = () => { - // empty -} diff --git a/packages/celotool/src/cmds/generate/account-address.ts b/packages/celotool/src/cmds/generate/account-address.ts deleted file mode 100644 index aff7d7f1450..00000000000 --- a/packages/celotool/src/cmds/generate/account-address.ts +++ /dev/null @@ -1,57 +0,0 @@ -import { - coerceMnemonicAccountType, - generateAddress, - MNEMONIC_ACCOUNT_TYPE_CHOICES, - privateKeyToAddress, -} from 'src/lib/generate_utils' -import yargs from 'yargs' - -interface AccountAddressArgv { - privateKey: string - mnemonic: string - accountType: string - index: number -} - -export const command = 'account-address' - -export const describe = 'command for generating account address from private key' - -export const builder = (argv: yargs.Argv) => { - return argv - .option('private-key', { - type: 'string', - description: 'private key', - required: false, - }) - .option('mnemonic', { - type: 'string', - description: 'BIP-39 mnemonic', - alias: 'm', - required: false, - }) - .option('accountType', { - alias: 'a', - type: 'string', - choices: MNEMONIC_ACCOUNT_TYPE_CHOICES, - required: false, - }) - .option('index', { - type: 'number', - description: 'Index of key to generate', - alias: 'i', - required: false, - }) -} - -export const handler = (argv: AccountAddressArgv) => { - if (argv.privateKey) { - console.info(privateKeyToAddress(argv.privateKey)) - } else if (argv.mnemonic && argv.accountType && argv.index != null) { - console.info( - 
generateAddress(argv.mnemonic, coerceMnemonicAccountType(argv.accountType), argv.index) - ) - } else { - console.error('The --private-key or --mnemonic, --accountType and --index must be provided') - } -} diff --git a/packages/celotool/src/cmds/generate/address-from-env.ts b/packages/celotool/src/cmds/generate/address-from-env.ts deleted file mode 100644 index e292a3e11b6..00000000000 --- a/packages/celotool/src/cmds/generate/address-from-env.ts +++ /dev/null @@ -1,44 +0,0 @@ -import { addCeloEnvMiddleware, CeloEnvArgv } from 'src/lib/env-utils' -import { - coerceMnemonicAccountType, - getAddressFromEnv, - MNEMONIC_ACCOUNT_TYPE_CHOICES, -} from 'src/lib/generate_utils' -import yargs from 'yargs' - -export const command = 'address-from-env' - -export const describe = 'command for fetching addresses as specified by the current environment' - -interface AccountAddressArgv extends CeloEnvArgv { - index: number - accountType: string -} - -export const builder = (argv: yargs.Argv) => { - return addCeloEnvMiddleware( - argv - .option('index', { - alias: 'i', - type: 'number', - description: 'account index', - demand: 'Please specifiy account index', - }) - .option('accountType', { - alias: 'a', - type: 'string', - choices: MNEMONIC_ACCOUNT_TYPE_CHOICES, - description: 'account type', - demand: 'Please specifiy account type', - required: true, - }) - ) -} - -export const handler = (argv: CeloEnvArgv & AccountAddressArgv) => { - const validatorAddress = getAddressFromEnv( - coerceMnemonicAccountType(argv.accountType), - argv.index - ) - console.info(validatorAddress) -} diff --git a/packages/celotool/src/cmds/generate/bip32.ts b/packages/celotool/src/cmds/generate/bip32.ts deleted file mode 100644 index eb95aaf3670..00000000000 --- a/packages/celotool/src/cmds/generate/bip32.ts +++ /dev/null @@ -1,50 +0,0 @@ -import { - coerceMnemonicAccountType, - generatePrivateKey, - MNEMONIC_ACCOUNT_TYPE_CHOICES, -} from 'src/lib/generate_utils' -import yargs from 'yargs' - -interface 
Bip32Argv { - mnemonic: string - accountType: string - index: number -} - -export const command = 'bip32' - -export const describe = 'command for generating a private key using the bip32 standard' - -export const builder = (argv: yargs.Argv) => { - return argv - .option('mnemonic', { - type: 'string', - description: 'BIP-39 mnemonic', - demandOption: 'Please specify a mnemonic from which to derive a private key', - alias: 'm', - }) - .option('accountType', { - alias: 'a', - type: 'string', - choices: MNEMONIC_ACCOUNT_TYPE_CHOICES, - required: true, - }) - .option('index', { - type: 'number', - description: 'Index of key to generate', - demandOption: 'Please specify a key index', - alias: 'i', - }) -} - -/* - * Given a BIP-39 mnemonic, we generate a level 1 child private key using the - * BIP-32 standard. - * https://github.com/bitcoin/bips/blob/master/bip-0032.mediawiki - * https://github.com/bitcoin/bips/blob/master/bip-0039.mediawiki - */ -export const handler = (argv: Bip32Argv) => { - console.info( - generatePrivateKey(argv.mnemonic, coerceMnemonicAccountType(argv.accountType), argv.index) - ) -} diff --git a/packages/celotool/src/cmds/generate/bootnode-enode.ts b/packages/celotool/src/cmds/generate/bootnode-enode.ts deleted file mode 100644 index d58910d5359..00000000000 --- a/packages/celotool/src/cmds/generate/bootnode-enode.ts +++ /dev/null @@ -1,14 +0,0 @@ -/* tslint:disable no-console */ -import { addCeloEnvMiddleware, CeloEnvArgv } from 'src/lib/env-utils' -import { getBootnodeEnode } from 'src/lib/geth' -import yargs from 'yargs' - -export const command = 'bootnode-enode' - -export const describe = 'command for the bootnode enode address for an environment' - -export const builder = (argv: yargs.Argv) => addCeloEnvMiddleware(argv) - -export const handler = async (argv: CeloEnvArgv) => { - console.info(await getBootnodeEnode(argv.celoEnv)) -} diff --git a/packages/celotool/src/cmds/generate/faucet-load-test.ts 
b/packages/celotool/src/cmds/generate/faucet-load-test.ts deleted file mode 100644 index 2fae45c3eb9..00000000000 --- a/packages/celotool/src/cmds/generate/faucet-load-test.ts +++ /dev/null @@ -1,101 +0,0 @@ -import { newKit } from '@celo/contractkit' -import { switchToClusterFromEnv } from 'src/lib/cluster' -import { convertToContractDecimals } from 'src/lib/contract-utils' -import { addCeloEnvMiddleware, CeloEnvArgv, envVar, fetchEnv } from 'src/lib/env-utils' -import { AccountType, generateAddress } from 'src/lib/generate_utils' -import { getIndexForLoadTestThread } from 'src/lib/geth' -import { portForwardAnd } from 'src/lib/port_forward' -import yargs from 'yargs' - -interface FaucetLoadTest extends CeloEnvArgv { - gold: number - dollars: number - replica_from: number - replica_to: number - threads_from: number - threads_to: number -} - -export const command = 'faucet-load-test' - -export const describe = 'command for fauceting the addresses used for load testing' - -export const builder = (argv: yargs.Argv) => { - return addCeloEnvMiddleware( - argv - .option('gold', { - type: 'number', - description: 'Celo Gold amount to transfer', - default: 10, - }) - .option('dollars', { - type: 'number', - description: 'Celo Dollars amount to transfer', - default: 10, - }) - .option('replica_from', { - type: 'number', - description: 'Index count from', - demandOption: 'Please specify a key index', - }) - .option('replica_to', { - type: 'number', - description: 'Index count to', - demandOption: 'Please specify a key index', - }) - .option('threads_from', { - type: 'number', - description: 'Index of key to generate', - demandOption: 'Please specify a key threads_from', - }) - .option('threads_to', { - type: 'number', - description: 'Index of key to generate', - demandOption: 'Please specify a key threads_to', - }) - ) -} - -export const handler = async (argv: CeloEnvArgv & FaucetLoadTest) => { - await switchToClusterFromEnv(argv.celoEnv) - const accountType = 
AccountType.LOAD_TESTING_ACCOUNT - const mnemonic = fetchEnv(envVar.MNEMONIC) - - const cb = async () => { - const kit = newKit('http://localhost:8545') - const account = (await kit.web3.eth.getAccounts())[0] - console.info(`Using account: ${account}`) - kit.defaultAccount = account - - const [goldToken, stableToken] = await Promise.all([ - kit.contracts.getGoldToken(), - kit.contracts.getStableToken(), - ]) - - const [goldAmount, stableTokenAmount] = await Promise.all([ - convertToContractDecimals(argv.gold, goldToken), - convertToContractDecimals(argv.dollars, stableToken), - ]) - - for (let podIndex = argv.replica_from; podIndex <= argv.replica_to; podIndex++) { - for (let threadIndex = argv.threads_from; threadIndex <= argv.threads_to; threadIndex++) { - const index = getIndexForLoadTestThread(podIndex, threadIndex) - const address = generateAddress(mnemonic, accountType, index) - console.info(`${index} --> Fauceting ${goldAmount.toFixed()} Gold to ${address}`) - await goldToken.transfer(address, goldAmount.toFixed()).send() - console.info(`${index} --> Fauceting ${stableTokenAmount.toFixed()} Dollars to ${address}`) - await stableToken.transfer(address, stableTokenAmount.toFixed()).send() - } - } - } - - try { - await portForwardAnd(argv.celoEnv, cb) - // note this wasnt called before - await cb() - } catch (error) { - console.error(`Unable to faucet load-test accounts on ${argv.celoEnv}`) - console.error(error) - process.exit(1) - } -} diff --git a/packages/celotool/src/cmds/generate/genesis-file.ts b/packages/celotool/src/cmds/generate/genesis-file.ts deleted file mode 100644 index 74b9fb24db0..00000000000 --- a/packages/celotool/src/cmds/generate/genesis-file.ts +++ /dev/null @@ -1,18 +0,0 @@ -import { addCeloEnvMiddleware, CeloEnvArgv } from 'src/lib/env-utils' -import { generateGenesisFromEnv } from 'src/lib/generate_utils' -import yargs from 'yargs' - -export const command = 'genesis-file' - -export const describe = 'command for creating the genesis file 
by the current environment' - -type GenesisFileArgv = CeloEnvArgv - -export const builder = (argv: yargs.Argv) => { - return addCeloEnvMiddleware(argv) -} - -export const handler = (_argv: GenesisFileArgv) => { - const genesisFile = generateGenesisFromEnv() - console.info(genesisFile) -} diff --git a/packages/celotool/src/cmds/generate/istanbul-extra.ts b/packages/celotool/src/cmds/generate/istanbul-extra.ts deleted file mode 100644 index 28e4fd8d691..00000000000 --- a/packages/celotool/src/cmds/generate/istanbul-extra.ts +++ /dev/null @@ -1,42 +0,0 @@ -import { readFileSync } from 'fs' -import { addCeloEnvMiddleware, CeloEnvArgv, envVar, fetchEnv } from 'src/lib/env-utils' -import { - generateIstanbulExtraData, - getValidatorsInformation, - Validator, -} from 'src/lib/generate_utils' -import yargs from 'yargs' - -export const command = 'istanbul-extra' - -export const describe = - 'command to compile the istanbul extra data to include in a custom genesis file' - -interface IstanbulExtraArgv extends CeloEnvArgv { - validators: string -} - -export const builder = (argv: yargs.Argv) => { - return addCeloEnvMiddleware( - argv.option('validators', { - type: 'string', - description: 'path to a validators JSON file or the keywod "env"', - demand: 'Please specify the valdiators to include', - required: true, - }) - ) -} - -export const handler = (argv: IstanbulExtraArgv) => { - const validators: Validator[] = - argv.validators === 'env' - ? 
getValidatorsInformation( - fetchEnv(envVar.MNEMONIC), - parseInt(fetchEnv(envVar.VALIDATORS), 10) - ) - : JSON.parse(readFileSync(argv.validators).toString()) - console.info(validators) - console.info('\nIstanbul extra data:') - const extra = generateIstanbulExtraData(validators) - console.info(extra) -} diff --git a/packages/celotool/src/cmds/generate/prepare-load-test-client.ts b/packages/celotool/src/cmds/generate/prepare-load-test-client.ts deleted file mode 100644 index e9703228be9..00000000000 --- a/packages/celotool/src/cmds/generate/prepare-load-test-client.ts +++ /dev/null @@ -1,55 +0,0 @@ -/* tslint:disable no-console */ -import * as fs from 'fs' -import { AccountType, generatePrivateKey, privateKeyToAddress } from 'src/lib/generate_utils' -import { getIndexForLoadTestThread } from 'src/lib/geth' -import yargs from 'yargs' - -interface Bip32Argv { - mnemonic: string - index: number - threads: number -} - -export const command = 'prepare-load-test' - -export const describe = - 'command for generating public and private keys for a load test instance. 
Expected to run inside the loadtest pod' - -export const builder = (argv: yargs.Argv) => { - return argv - .option('mnemonic', { - type: 'string', - description: 'BIP-39 mnemonic', - demandOption: 'Please specify a mnemonic from which to derive a private key', - alias: 'm', - }) - .option('index', { - type: 'number', - description: 'Index of key to generate', - demandOption: 'Please specify a key index', - alias: 'i', - }) - .option('threads', { - type: 'number', - description: 'The number of threads', - demandOption: 'Please specify the number of threads of this node', - alias: 't', - }) -} - -export const handler = (argv: Bip32Argv) => { - const accountType = AccountType.LOAD_TESTING_ACCOUNT - // Empty address file if there is any address (i.e.: Used a snapshot with addresses already generated) - fs.writeFileSync(`/root/.celo/address`, ``) - // Generate private keys and addresses for each thread - for (let t = 0; t < argv.threads; t++) { - const index = getIndexForLoadTestThread(argv.index, t) - console.info(`Index for thread ${t} --> ${index}`) - - const privateKey = generatePrivateKey(argv.mnemonic, accountType, index) - const address = privateKeyToAddress(privateKey) - fs.writeFileSync(`/root/.celo/pkey${t}`, `${privateKey}\n`) - fs.appendFileSync(`/root/.celo/address`, `${address}\n`) - console.info(`Address for index ${argv.index} and thread ${t} --> ${address}`) - } -} diff --git a/packages/celotool/src/cmds/generate/public-key.ts b/packages/celotool/src/cmds/generate/public-key.ts deleted file mode 100644 index 6f43c3873e5..00000000000 --- a/packages/celotool/src/cmds/generate/public-key.ts +++ /dev/null @@ -1,54 +0,0 @@ -/* tslint:disable no-console */ -import { - coerceMnemonicAccountType, - generatePrivateKey, - MNEMONIC_ACCOUNT_TYPE_CHOICES, - privateKeyToPublicKey, -} from 'src/lib/generate_utils' -import yargs from 'yargs' - -interface Bip32Argv { - mnemonic: string - accountType: string - index: number -} - -export const command = 'public-key' - 
-export const describe = 'command for generating the public key using the bip32 standard' - -export const builder = (argv: yargs.Argv) => { - return argv - .option('mnemonic', { - type: 'string', - description: 'BIP-39 mnemonic', - demandOption: 'Please specify a mnemonic from which to derive a public key', - alias: 'm', - }) - .option('accountType', { - alias: 'a', - type: 'string', - choices: MNEMONIC_ACCOUNT_TYPE_CHOICES, - required: true, - }) - .option('index', { - type: 'number', - description: 'Index of key to generate', - demandOption: 'Please specify a key index', - alias: 'i', - }) -} - -/* - * Given a BIP-39 mnemonic, we generate a level 2 child public key from the private key using the - * BIP-32 standard. - * https://github.com/bitcoin/bips/blob/master/bip-0032.mediawiki - * https://github.com/bitcoin/bips/blob/master/bip-0039.mediawiki - */ -export const handler = (argv: Bip32Argv) => { - console.info( - privateKeyToPublicKey( - generatePrivateKey(argv.mnemonic, coerceMnemonicAccountType(argv.accountType), argv.index) - ) - ) -} diff --git a/packages/celotool/src/cmds/geth.ts b/packages/celotool/src/cmds/geth.ts deleted file mode 100644 index 4198222c947..00000000000 --- a/packages/celotool/src/cmds/geth.ts +++ /dev/null @@ -1,18 +0,0 @@ -import yargs from 'yargs' - -export const command = 'geth ' - -export const describe = 'commands for geth' - -export interface GethArgv extends yargs.Argv { - gethDir: string - dataDir: string -} - -export const builder = (argv: yargs.Argv) => { - return argv.commandDir('geth', { extensions: ['ts'] }) -} - -export const handler = () => { - // empty -} diff --git a/packages/celotool/src/cmds/geth/build.ts b/packages/celotool/src/cmds/geth/build.ts deleted file mode 100644 index aac4b0b683b..00000000000 --- a/packages/celotool/src/cmds/geth/build.ts +++ /dev/null @@ -1,33 +0,0 @@ -import { execCmdWithExitOnFailure } from 'src/lib/cmd-utils' -import yargs from 'yargs' -import { GethArgv } from '../geth' - -export const 
command = 'build' - -export const describe = 'command for building geth' - -interface BuildArgv extends GethArgv { - clean: boolean -} - -export const builder = (argv: yargs.Argv) => { - return argv - .option('geth-dir', { - type: 'string', - description: 'path to geth repository', - demand: 'Please, specify the path to geth directory, where the binary could be found', - }) - .option('clean', { - type: 'boolean', - alias: 'c', - description: 'whether to clean before make', - default: false, - }) -} - -export const handler = async (argv: BuildArgv) => { - const cmd = argv.clean ? `make clean && make -j` : `make -j` - await execCmdWithExitOnFailure(cmd, { cwd: argv.gethDir }) - - console.info(`Geth has been built successfully!`) -} diff --git a/packages/celotool/src/cmds/geth/create_account.ts b/packages/celotool/src/cmds/geth/create_account.ts deleted file mode 100644 index 6a2f7bcaf49..00000000000 --- a/packages/celotool/src/cmds/geth/create_account.ts +++ /dev/null @@ -1,94 +0,0 @@ -/* tslint:disable no-console */ -import fs from 'fs' -import path from 'path' -import { switchToClusterFromEnv } from 'src/lib/cluster' -import { execCmd, execCmdWithExitOnFailure } from 'src/lib/cmd-utils' -import { CeloEnvArgv, addCeloEnvMiddleware } from 'src/lib/env-utils' -import { fetchPassword } from 'src/lib/geth' -import { addCeloGethMiddleware } from 'src/lib/utils' -import yargs from 'yargs' -import { GethArgv } from '../geth' - -export const command = 'create-account' - -export const describe = 'command for creating account and fauceting it' - -interface CreateAccountArgv extends CeloEnvArgv, GethArgv { - faucet: boolean - password: string - passwordFile: string | null -} - -export const builder = (argv: yargs.Argv) => { - return addCeloGethMiddleware(addCeloEnvMiddleware(argv)) - .option('faucet', { - type: 'boolean', - alias: 'f', - default: false, - description: - 'whether to faucet created account with 100 celo dollars and 10 celo gold or not', - }) - 
.option('password', { - type: 'string', - description: 'account password', - default: '', - }) - .option('password-file', { - type: 'string', - description: 'path to file with account password', - default: null, - }) -} - -export const handler = async (argv: CreateAccountArgv) => { - await switchToClusterFromEnv(argv.celoEnv, false) - - const env = argv.celoEnv - let password = argv.password - const datadir = argv.dataDir - const passwordFile = argv.passwordFile - const needFaucet = argv.faucet - const gethBinary = `${argv.gethDir}/build/bin/geth` - - if (!fs.existsSync(path.resolve(datadir, 'keystore'))) { - console.error(`Error: keystore was not found in datadir ${datadir}`) - console.info(`Try to running "celotooljs geth init"`) - process.exit(1) - } - - if (password.length > 0 && passwordFile !== null) { - console.error(`Please, specify either "password" or "password-file" but not both`) - process.exit(1) - } - - if (passwordFile !== null) { - password = fetchPassword(passwordFile) - } - - const passwordFilePath = path.resolve(__dirname, '__password_tmp') - fs.writeFileSync(passwordFilePath, password) - - const [stdout, stderr] = await execCmd( - `${gethBinary} --datadir=${datadir} account new --password ${passwordFilePath}` - ) - - fs.unlinkSync(passwordFilePath) - - const addressRegex = /Address:.*{([a-zA-Z0-9]+)}/ - const matches = addressRegex.exec(stdout) - if (matches && matches.length === 2) { - const address = matches[1] - console.info(`Created account address: 0x${address}`) - - if (needFaucet) { - console.info(`Fauceting 0x${address} on ${env}`) - await execCmdWithExitOnFailure( - `yarn --cwd ${process.cwd()} run cli account faucet -e ${env} --account 0x${address}` - ) - console.info(`Fauceting completed successfully! 
💰💰💰`) - } - } else { - console.error('Error occurred while creating account') - console.error(`stderr: ${stderr}`) - } -} diff --git a/packages/celotool/src/cmds/geth/genesis_default.json b/packages/celotool/src/cmds/geth/genesis_default.json deleted file mode 100644 index bdbcb5cc9db..00000000000 --- a/packages/celotool/src/cmds/geth/genesis_default.json +++ /dev/null @@ -1,58 +0,0 @@ -{ - "config": { - "chainId": 1101, - "homesteadBlock": 1, - "eip150Block": 2, - "eip150Hash": "0x0000000000000000000000000000000000000000000000000000000000000000", - "eip155Block": 3, - "eip158Block": 3, - "byzantiumBlock": 4, - "clique": { - "period": 5, - "epoch": 30000 - } - }, - "nonce": "0x0", - "timestamp": "0x5b843511", - "extraData": - "0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000feE1a22F43BeeCB912B5a4912ba87527682ef0fC889F21CE69dcc25a4594f73230A55896d67038065372d2bbBaBaAf1495182E31cF13dB0d18463B0EF71690ea7E0c67827d8968882FAC0c4cBBD65BCE0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000", - "gasLimit": "0x8000000", - "difficulty": "0x0400", - "mixHash": "0x0000000000000000000000000000000000000000000000000000000000000000", - "coinbase": "0x0000000000000000000000000000000000000000", - "alloc": { - "feE1a22F43BeeCB912B5a4912ba87527682ef0fC": { - "balance": "1000000000000000000000000" - }, - "889F21CE69dcc25a4594f73230A55896d6703806": { - "balance": "1000000000000000000000000" - }, - "5372d2bbBaBaAf1495182E31cF13dB0d18463B0E": { - "balance": "1000000000000000000000000" - }, - "F71690ea7E0c67827d8968882FAC0c4cBBD65BCE": { - "balance": "1000000000000000000000000" - }, - "0000000000000000000000000000000000000abe": { - "code": -
"0x60806040526004361061006d576000357c0100000000000000000000000000000000000000000000000000000000900463ffffffff16806303386ba3146100df57806342404e071461012d578063bb913f4114610184578063d29d44ee146101c7578063f7e6af801461020a575b600060405180807f6f72672e63656c6f2e696d706c656d656e746174696f6e000000000000000000815250601701905060405180910390209050805460405136810160405236600082376000803683855af43d604051818101604052816000823e82600081146100db578282f35b8282fd5b61012b600480360381019080803573ffffffffffffffffffffffffffffffffffffffff169060200190929190803590602001908201803590602001919091929391929390505050610261565b005b34801561013957600080fd5b5061014261031d565b604051808273ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff16815260200191505060405180910390f35b34801561019057600080fd5b506101c5600480360381019080803573ffffffffffffffffffffffffffffffffffffffff169060200190929190505050610360565b005b3480156101d357600080fd5b50610208600480360381019080803573ffffffffffffffffffffffffffffffffffffffff16906020019092919050505061044b565b005b34801561021657600080fd5b5061021f6104bf565b604051808273ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff16815260200191505060405180910390f35b600061026b6104bf565b905060008173ffffffffffffffffffffffffffffffffffffffff1614806102bd57508073ffffffffffffffffffffffffffffffffffffffff163373ffffffffffffffffffffffffffffffffffffffff16145b15156102c857600080fd5b6102d184610360565b8373ffffffffffffffffffffffffffffffffffffffff16838360405180838380828437820191505092505050600060405180830381855af4915050151561031757600080fd5b50505050565b60008060405180807f6f72672e63656c6f2e696d706c656d656e746174696f6e000000000000000000815250601701905060405180910390209050805491505090565b60008061036b6104bf565b905060008173ffffffffffffffffffffffffffffffffffffffff1614806103bd57508073ffffffffffffffffffffffffffffffffffffffff163373ffffffffffffffffffffffffffffffffffffffff16145b15156103c857600080fd5b60405180807f6f72672e63656c6f2e696d706c656d656e746174696f6e0
000000000000000008152506017019050604051809103902091508282558273ffffffffffffffffffffffffffffffffffffffff167fab64f92ab780ecbf4f3866f57cee465ff36c89450dcce20237ca7a8d81fb7d1360405160405180910390a2505050565b60006104556104bf565b905060008173ffffffffffffffffffffffffffffffffffffffff1614806104a757508073ffffffffffffffffffffffffffffffffffffffff163373ffffffffffffffffffffffffffffffffffffffff16145b15156104b257600080fd5b6104bb82610502565b5050565b60008060405180807f6f72672e63656c6f2e6f776e6572000000000000000000000000000000000000815250600e01905060405180910390209050805491505090565b600060405180807f6f72672e63656c6f2e6f776e6572000000000000000000000000000000000000815250600e019050604051809103902090508181558173ffffffffffffffffffffffffffffffffffffffff167f50146d0e3c60aa1d17a70635b05494f864e86144a2201275021014fbf08bafe260405160405180910390a250505600a165627a7a723058202e143bd88c74e9d3753b7bec1719bf5907ae012bc140b9b2b6edf2f9b686ff5d0029", - "storage": { - "0x34dc5a2556b2030988481969696f29fed38d45813d8003f6c70e5c16ac92ae0f": - "feE1a22F43BeeCB912B5a4912ba87527682ef0fC" - }, - "balance": "0" - }, - "000000000000000000000000000000000000ce10": { - "code": - 
"0x60806040526004361061006d576000357c0100000000000000000000000000000000000000000000000000000000900463ffffffff16806303386ba3146100df57806342404e071461012d578063bb913f4114610184578063d29d44ee146101c7578063f7e6af801461020a575b600060405180807f6f72672e63656c6f2e696d706c656d656e746174696f6e000000000000000000815250601701905060405180910390209050805460405136810160405236600082376000803683855af43d604051818101604052816000823e82600081146100db578282f35b8282fd5b61012b600480360381019080803573ffffffffffffffffffffffffffffffffffffffff169060200190929190803590602001908201803590602001919091929391929390505050610261565b005b34801561013957600080fd5b5061014261031d565b604051808273ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff16815260200191505060405180910390f35b34801561019057600080fd5b506101c5600480360381019080803573ffffffffffffffffffffffffffffffffffffffff169060200190929190505050610360565b005b3480156101d357600080fd5b50610208600480360381019080803573ffffffffffffffffffffffffffffffffffffffff16906020019092919050505061044b565b005b34801561021657600080fd5b5061021f6104bf565b604051808273ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff16815260200191505060405180910390f35b600061026b6104bf565b905060008173ffffffffffffffffffffffffffffffffffffffff1614806102bd57508073ffffffffffffffffffffffffffffffffffffffff163373ffffffffffffffffffffffffffffffffffffffff16145b15156102c857600080fd5b6102d184610360565b8373ffffffffffffffffffffffffffffffffffffffff16838360405180838380828437820191505092505050600060405180830381855af4915050151561031757600080fd5b50505050565b60008060405180807f6f72672e63656c6f2e696d706c656d656e746174696f6e000000000000000000815250601701905060405180910390209050805491505090565b60008061036b6104bf565b905060008173ffffffffffffffffffffffffffffffffffffffff1614806103bd57508073ffffffffffffffffffffffffffffffffffffffff163373ffffffffffffffffffffffffffffffffffffffff16145b15156103c857600080fd5b60405180807f6f72672e63656c6f2e696d706c656d656e746174696f6e0
000000000000000008152506017019050604051809103902091508282558273ffffffffffffffffffffffffffffffffffffffff167fab64f92ab780ecbf4f3866f57cee465ff36c89450dcce20237ca7a8d81fb7d1360405160405180910390a2505050565b60006104556104bf565b905060008173ffffffffffffffffffffffffffffffffffffffff1614806104a757508073ffffffffffffffffffffffffffffffffffffffff163373ffffffffffffffffffffffffffffffffffffffff16145b15156104b257600080fd5b6104bb82610502565b5050565b60008060405180807f6f72672e63656c6f2e6f776e6572000000000000000000000000000000000000815250600e01905060405180910390209050805491505090565b600060405180807f6f72672e63656c6f2e6f776e6572000000000000000000000000000000000000815250600e019050604051809103902090508181558173ffffffffffffffffffffffffffffffffffffffff167f50146d0e3c60aa1d17a70635b05494f864e86144a2201275021014fbf08bafe260405160405180910390a250505600a165627a7a723058202e143bd88c74e9d3753b7bec1719bf5907ae012bc140b9b2b6edf2f9b686ff5d0029", - "storage": { - "0x34dc5a2556b2030988481969696f29fed38d45813d8003f6c70e5c16ac92ae0f": - "feE1a22F43BeeCB912B5a4912ba87527682ef0fC" - }, - "balance": "0" - } - }, - "number": "0x0", - "gasUsed": "0x0", - "parentHash": "0x0000000000000000000000000000000000000000000000000000000000000000" -} diff --git a/packages/celotool/src/cmds/geth/get_gold_balance.ts b/packages/celotool/src/cmds/geth/get_gold_balance.ts deleted file mode 100644 index aeb2ec4798d..00000000000 --- a/packages/celotool/src/cmds/geth/get_gold_balance.ts +++ /dev/null @@ -1,50 +0,0 @@ -import { execCmdWithExitOnFailure } from 'src/lib/cmd-utils' -import { addCeloEnvMiddleware } from 'src/lib/env-utils' -import { addCeloGethMiddleware, ensure0x } from 'src/lib/utils' -import yargs from 'yargs' -import { GethArgv } from '../geth' - -export const command = 'get gold balance' - -export const describe = 'command for initializing geth' - -interface GetGoldBalanceArgv extends GethArgv { - account: string -} - -export const builder = (argv: yargs.Argv) => { - return 
addCeloGethMiddleware(addCeloEnvMiddleware(argv)).option('account', { - type: 'string', - description: 'Account to get balance for', - default: null, - }) -} - -const invalidArgumentExit = (argumentName?: string, errorMessage?: string) => { - console.error(`Invalid argument ${argumentName}: ${errorMessage}`) - process.exit(1) -} - -export const handler = async (argv: GetGoldBalanceArgv) => { - const gethBinary = `${argv.gethDir}/build/bin/geth` - const datadir = argv.dataDir - let account = argv.account - - if (account === null || account.length === 0) { - invalidArgumentExit(account, 'Account must be provided') - // This return is required to prevent false lint errors in the code following this line - return - } - account = ensure0x(account) - if (account.length !== 42) { - invalidArgumentExit(account, 'Account must be 40 hex-chars') - } - // eslint-disable-next-line no-useless-escape - const jsCmd = `eth.getBalance\('${account}'\)` - const returnValues = await execGethJsCmd(gethBinary, datadir, jsCmd) - console.info('Gold balance: ' + returnValues[0]) -} - -export const execGethJsCmd = (gethBinary: string, datadir: string, jsCmd: string) => { - return execCmdWithExitOnFailure(`${gethBinary} -datadir "${datadir}" attach --exec "${jsCmd}"`) -} diff --git a/packages/celotool/src/cmds/geth/init.ts b/packages/celotool/src/cmds/geth/init.ts deleted file mode 100644 index dcebac73d92..00000000000 --- a/packages/celotool/src/cmds/geth/init.ts +++ /dev/null @@ -1,78 +0,0 @@ -import fs from 'fs' -import path from 'path' -import { switchToClusterFromEnv } from 'src/lib/cluster' -import { execCmdWithExitOnFailure } from 'src/lib/cmd-utils' -import { addCeloEnvMiddleware, CeloEnvArgv } from 'src/lib/env-utils' -import { getEnodesAddresses, writeStaticNodes } from 'src/lib/geth' -import { addCeloGethMiddleware } from 'src/lib/utils' -import yargs from 'yargs' -import { GethArgv } from '../geth' - -const STATIC_NODES_FILE_NAME = 'static-nodes.json' -const 
DEFAULT_GENESIS_FILE_NAME = 'genesis_default.json' - -export const command = 'init' - -export const describe = 'command for initializing geth' - -interface InitArgv extends CeloEnvArgv, GethArgv { - genesis: string | null - fetchStaticNodesFromNetwork: boolean | null -} - -export const builder = (argv: yargs.Argv) => { - return addCeloGethMiddleware(addCeloEnvMiddleware(argv)) - .option('genesis', { - type: 'string', - description: - 'path to genesis.json | default genesis_default.json will be used if not specified', - default: null, - }) - .option('fetch-static-nodes-from-network', { - type: 'boolean', - description: 'Automatically fetch static nodes from the network', - default: true, - }) -} - -const invalidArgumentExit = (argumentName: string, errorMessage: string) => { - console.error(`Invalid argument ${argumentName}: ${errorMessage}`) - process.exit(1) -} - -export const handler = async (argv: InitArgv) => { - const namespace = argv.celoEnv - const gethBinary = `${argv.gethDir}/build/bin/geth` - const datadir = argv.dataDir - const genesis = argv.genesis ? argv.genesis : path.resolve(__dirname, DEFAULT_GENESIS_FILE_NAME) - - if ( - fs.existsSync(path.resolve(datadir, STATIC_NODES_FILE_NAME)) || - fs.existsSync(path.resolve(datadir, 'geth')) - ) { - invalidArgumentExit('datadir', `Looks like geth has already been initialized in dir ${datadir}`) - } - - if (!fs.existsSync(datadir)) { - // Directory does not exist, create it.
- fs.mkdirSync(datadir) - } - - if (!fs.lstatSync(datadir).isDirectory()) { - invalidArgumentExit('datadir', `${datadir} is not a directory`) - } - - if (!fs.existsSync(genesis)) { - invalidArgumentExit('genesis', `No such file: ${genesis}`) - } - - await execCmdWithExitOnFailure(`${gethBinary} --datadir "${datadir}" init ${genesis}`) - - if (argv.fetchStaticNodesFromNetwork) { - await switchToClusterFromEnv(argv.celoEnv, false) - await getEnodesAddresses(namespace).then((enodes) => { - writeStaticNodes(enodes, datadir, STATIC_NODES_FILE_NAME) - console.info(`Geth has been initialized successfully! 😎`) - }) - } -} diff --git a/packages/celotool/src/cmds/geth/run.ts b/packages/celotool/src/cmds/geth/run.ts deleted file mode 100644 index c6f2be712f7..00000000000 --- a/packages/celotool/src/cmds/geth/run.ts +++ /dev/null @@ -1,155 +0,0 @@ -import { spawnSync } from 'child_process' -import fs from 'fs' -import path from 'path' -import { addCeloGethMiddleware, ensure0x, validateAccountAddress } from 'src/lib/utils' -import yargs from 'yargs' -import { GethArgv } from '../geth' - -const STATIC_NODES_FILE_NAME = 'static-nodes.json' - -export const command = 'run' - -export const describe = 'command for running geth' - -interface RunArgv extends GethArgv { - networkId: string - syncMode: string - mining: boolean - minerAddress: string - nodekeyhex: string - minerGasPrice: number - port: number - rpcport: number - wsport: number - verbosity: number -} - -export const builder = (argv: yargs.Argv) => { - return addCeloGethMiddleware(argv) - .option('network-id', { - type: 'string', - description: 'network id', - default: '1101', - }) - .option('sync-mode', { - choices: ['full', 'fast', 'light', 'lightest'], - demandOption: true, - }) - .option('mining', { - type: 'boolean', - description: 'Is mining enabled', - default: false, - }) - .option('miner-address', { - type: 'string', - description: 'Address of the miner', - default: null, - }) - .option('nodekeyhex', { - type: 
'string', - description: 'P2P node key as hex', - default: null, - }) - .option('miner-gas-price', { - type: 'number', - description: 'Mining gas price', - default: 0, - }) - .option('port', { - type: 'number', - description: 'Port', - default: 30303, - }) - .option('rpcport', { - type: 'number', - description: 'HTTP-RPC server listening port', - default: 8545, - }) - .option('wsport', { - type: 'number', - description: 'WS-RPC server listening port', - default: 8546, - }) - .option('verbosity', { - type: 'number', - description: 'Verbosity level', - default: 5, - }) - .coerce('miner-address', (minerAddress: string) => - minerAddress === null ? null : ensure0x(minerAddress) - ) -} - -export const handler = (argv: RunArgv) => { - const gethBinary = `${argv.gethDir}/build/bin/geth` - const datadir = argv.dataDir - const networkId = argv.networkId - const syncMode = argv.syncMode - const verbosity = argv.verbosity - const nodekeyhex = argv.nodekeyhex - const port = argv.port - const rpcport = argv.rpcport - const wsport = argv.wsport - - console.info(`sync mode is ${syncMode}`) - const mining = argv.mining - const minerAddress = argv.minerAddress - const minerGasPrice = argv.minerGasPrice - - if (!fs.existsSync(path.resolve(datadir, STATIC_NODES_FILE_NAME))) { - console.error(`Error: static-nodes.json was not found in datadir ${datadir}`) - console.info(`Try running "celotooljs geth static-nodes" or "celotooljs geth init"`) - process.exit(1) - } - - const gethArgs = [ - '--datadir', - datadir, - '--syncmode', - syncMode, - '--rpc', - '--ws', - `--wsport=${wsport}`, - '--wsorigins=*', - '--rpcapi=eth,net,web3,debug,admin,personal', - '--debug', - `--port=${port}`, - '--nodiscover', - `--rpcport=${rpcport}`, - '--rpcvhosts=*', - '--networkid', - networkId, - '--verbosity', - verbosity.toString(), - '--consoleoutput=stdout', // Send all logs to stdout - '--consoleformat=term', - '--istanbul.lookbackwindow=2', - ] - - if (nodekeyhex !== null && nodekeyhex.length > 0) { - 
gethArgs.push('--nodekeyhex', nodekeyhex) - } - - if (mining) { - if (syncMode !== 'full' && syncMode !== 'fast') { - console.error('Mining works only in full or fast mode') - process.exit(1) - } - - if (!validateAccountAddress(minerAddress)) { - console.error(`Miner address is incorrect: "${minerAddress}"`) - process.exit(1) - } - - gethArgs.push( - '--mine', - `--miner.gasprice=${minerGasPrice}`, - '--password=/dev/null', - `--unlock=${minerAddress}`, - '--light.serve=90', - '--allow-insecure-unlock' // geth1.9 to use http w/unlocking - ) - } - - spawnSync(gethBinary, gethArgs, { stdio: 'inherit' }) -} diff --git a/packages/celotool/src/cmds/geth/simulate_client.ts b/packages/celotool/src/cmds/geth/simulate_client.ts deleted file mode 100644 index 791c805d499..00000000000 --- a/packages/celotool/src/cmds/geth/simulate_client.ts +++ /dev/null @@ -1,177 +0,0 @@ -/* eslint-disable no-console */ -import BigNumber from 'bignumber.js' -import { - AccountType, - generateAddress, - generatePrivateKey, - privateKeyToAddress, -} from 'src/lib/generate_utils' -import { - MAX_LOADTEST_THREAD_COUNT, - TestMode, - faucetLoadTestThreads, - getIndexForLoadTestThread, - simulateClient, -} from 'src/lib/geth' -import * as yargs from 'yargs' -export const command = 'simulate-client' - -export const describe = 'command for simulating client behavior' - -interface SimulateClientArgv extends yargs.Argv { - blockscoutMeasurePercent: number - blockscoutUrl: string - delay: number - index: number - mnemonic: string - recipientIndex: number - contractAddress: string - contractData: string - clientCount: number - reuseClient: boolean - maxGasPrice: number - totalTxGas: number - testMode: string - web3Provider: string - chainId: number -} - -export const builder = () => { - return yargs - .option('blockscout-measure-percent', { - type: 'number', - description: - 'Percent of transactions to measure the time it takes for blockscout to process a transaction. 
Should be in the range of [0, 100]', - default: 100, - }) - .option('blockscout-url', { - type: 'string', - description: - 'URL of blockscout used for measuring the time for transactions to be indexed by blockscout', - }) - .option('delay', { - type: 'number', - description: 'Delay between sending transactions in milliseconds', - default: 10000, - }) - .option('index', { - type: 'number', - description: - 'Index of the load test account to send transactions from. Used to generate account address', - }) - .option('recipient-index', { - type: 'number', - description: - 'Index of the load test account to send transactions to. Used to generate account address', - default: 0, - }) - .options('contract-address', { - type: 'string', - description: `Contract Address to send to when using test mode: ${TestMode.ContractCall}`, - default: '', - }) - .options('contract-data', { - type: 'string', - description: `Data to send to when using test mode: ${TestMode.ContractCall}`, - default: '', - }) - .options('mnemonic', { - type: 'string', - description: 'Mnemonic used to generate account addresses', - demand: 'A mnemonic must be provided', - }) - .options('client-count', { - type: 'number', - description: `Number of clients to simulate, must not exceed ${MAX_LOADTEST_THREAD_COUNT}`, - default: 1, - }) - .check((argv) => argv['client-count'] <= MAX_LOADTEST_THREAD_COUNT) - .options('reuse-client', { - type: 'boolean', - description: 'Use the same client for all the threads/accounts', - default: false, - }) - .options('max-gas-price', { - type: 'number', - description: 'Max gasPrice to use for transactions', - default: 0, - }) - .options('total-tx-gas', { - type: 'number', - description: 'Gas Target when using data transfers', - default: 500000, - }) - .options('test-mode', { - type: 'string', - description: - 'Load test mode: mixed transaction types, big calldatas, simple transfers paid in CELO, transfers paid in cUSD, ordinals, or contract calls', - choices: [ - TestMode.Mixed, 
- TestMode.Data, - TestMode.Transfer, - TestMode.StableTransfer, - TestMode.ContractCall, - TestMode.Ordinals, - ], - default: TestMode.Mixed, - }) - .options('web3-provider', { - type: 'string', - description: 'web3 endpoint to use for sending transactions', - default: 'http://127.0.0.1:8545', - }) - .options('chainId', { - type: 'number', - description: 'ChainId to use for sending transactions', - default: '42220', - }) -} - -export const handler = async (argv: SimulateClientArgv) => { - await faucetLoadTestThreads( - argv.index, - argv.clientCount, - argv.mnemonic, - argv.web3Provider, - argv.chainId - ) - for (let thread = 0; thread < argv.clientCount; thread++) { - const senderIndex = getIndexForLoadTestThread(argv.index, thread) - const recipientIndex = getIndexForLoadTestThread(argv.recipientIndex, thread) - const senderPK = generatePrivateKey( - argv.mnemonic, - AccountType.LOAD_TESTING_ACCOUNT, - senderIndex - ) - const recipientAddress = generateAddress( - argv.mnemonic, - AccountType.LOAD_TESTING_ACCOUNT, - recipientIndex - ) - - console.log( - `PK for sender index ${ - argv.index - } thread ${thread}, final index ${senderIndex}: ${privateKeyToAddress(senderPK)}` - ) - console.info( - `Account for recipient index ${argv.recipientIndex} thread ${thread}, final index ${recipientIndex}: ${recipientAddress}` - ) - void simulateClient( - senderPK, - recipientAddress, - argv.contractAddress, - argv.contractData, - argv.delay, - argv.blockscoutUrl, - argv.blockscoutMeasurePercent, - argv.index, - argv.testMode as TestMode, - thread, - new BigNumber(argv.maxGasPrice), - argv.totalTxGas, - argv.web3Provider, - argv.chainId - ) - } -} diff --git a/packages/celotool/src/cmds/geth/start.ts b/packages/celotool/src/cmds/geth/start.ts deleted file mode 100644 index fc5a2f05957..00000000000 --- a/packages/celotool/src/cmds/geth/start.ts +++ /dev/null @@ -1,297 +0,0 @@ -import { readFileSync } from 'fs' -import { addCeloGethMiddleware } from 'src/lib/utils' -import yargs 
from 'yargs' -import { - AccountType, - getPrivateKeysFor, - getValidatorsInformation, - privateKeyToPublicKey, -} from '../../lib/generate_utils' -import { getEnodeAddress, migrateContracts, runGethNodes } from '../../lib/geth' -import { GethInstanceConfig } from '../../lib/interfaces/geth-instance-config' -import { GethRunConfig } from '../../lib/interfaces/geth-run-config' -import { GethArgv } from '../geth' - -export const command = 'start' -export const describe = 'command for running geth' - -interface StartArgv extends GethArgv { - networkId: string - syncMode: string - mining: boolean - blockTime: number - churritoBlock: number - donutBlock: number - port: number - rpcport: number - wsport: number - verbosity: number - verbose: boolean - instances: number - migrate: boolean - migrateTo: number - migrationOverrides: string - monorepoDir: string - purge: boolean - withProxy: boolean - ethstats: string - mnemonic: string - initialAccounts: string -} - -// hardForkBlockCoercer parses a hard fork activation block as follows: -// "null" => no activation -// "42" => activate at block 42 (and likewise for other numbers >= 0) -const hardForkBlockCoercer = (arg: string) => { - if (arg === 'null') { - return undefined - } else { - const value = parseInt(arg, 10) - if (typeof value === 'number' && value >= 0) { - return value - } else { - throw new Error(`Invalid value for hard fork activation block: '${arg}'`) - } - } -} - -export const builder = (argv: yargs.Argv) => { - return addCeloGethMiddleware(argv) - .option('network-id', { - type: 'string', - description: 'network id', - default: '1101', - }) - .option('sync-mode', { - choices: ['full', 'fast', 'light', 'ultralight', 'lightest'], - default: 'full', - }) - .option('mining', { - type: 'boolean', - description: 'Is mining enabled', - default: false, - }) - .option('port', { - type: 'number', - description: 'Port', - default: 30303, - }) - .option('rpcport', { - type: 'number', - description: 'HTTP-RPC server 
listening port', - default: 8545, - }) - .option('wsport', { - type: 'number', - description: 'WS-RPC server listening port', - default: 8546, - }) - .option('instances', { - type: 'number', - description: 'Number of instances to run', - default: 1, - }) - .option('with-proxy', { - type: 'boolean', - description: 'Start with proxy in front', - default: false, - }) - .option('verbosity', { - type: 'number', - description: 'Geth Verbosity level', - default: 5, - }) - .option('verbose', { - type: 'boolean', - description: 'Command verbosity flag', - default: false, - }) - .option('purge', { - type: 'boolean', - description: 'This will purge the data directory before starting.', - default: false, - }) - .option('ethstats', { - type: 'string', - description: 'address of the ethstats server', - }) - .option('mnemonic', { - type: 'string', - description: 'seed phrase to use for private key generation', - default: - 'jazz ripple brown cloth door bridge pen danger deer thumb cable prepare negative library vast', - }) - .option('blockTime', { - type: 'number', - description: 'Block Time', - default: 1, - }) - .option('churritoBlock', { - type: 'string', - coerce: hardForkBlockCoercer, - description: 'Churrito hard fork activation block number (use "null" for no activation)', - default: '0', - }) - .option('donutBlock', { - type: 'string', - coerce: hardForkBlockCoercer, - description: 'Donut hard fork activation block number (use "null" for no activation)', - default: '0', - }) - .option('migrate', { - type: 'boolean', - description: 'Migrate contracts', - default: false, - implies: 'monorepo-dir', - }) - .option('migrateTo', { - type: 'number', - description: 'Migrate contracts to level x', - implies: 'monorepo-dir', - }) - .option('migration-overrides', { - type: 'string', - description: 'Path to JSON file containing migration overrides', - implies: 'migrate', - }) - .option('monorepo-dir', { - type: 'string', - description: 'Directory of the mono repo', - }) - 
.option('initial-accounts', { - type: 'string', - description: - 'Path to JSON file containing accounts to place in the alloc property of the genesis.json file', - }) -} - -export const handler = async (argv: StartArgv) => { - const verbosity = argv.verbosity - const verbose = argv.verbose - - const networkId = parseInt(argv.networkId, 10) - const syncMode = argv.syncMode - const blockTime = argv.blockTime - const churritoBlock = argv.churritoBlock - const donutBlock = argv.donutBlock - - const port = argv.port - const rpcport = argv.rpcport - const wsport = argv.wsport - - const mining = argv.mining - const network = 'local' - const instances = argv.instances - const mnemonic = argv.mnemonic - const migrate = argv.migrate - const migrateTo = argv.migrateTo - const initialAccounts = argv.initialAccounts - ? JSON.parse(readFileSync(argv.initialAccounts).toString()) - : {} - const migrationOverrides = argv.migrationOverrides - ? JSON.parse(readFileSync(argv.migrationOverrides).toString()) - : {} - const monorepoDir = argv.monorepoDir - - const purge = argv.purge - const withProxy = argv.withProxy - - const ethstats = argv.ethstats - - const gethConfig: GethRunConfig = { - runPath: argv.dataDir, - keepData: !purge, - repository: { path: argv.gethDir }, - verbosity, - networkId, - migrate, - migrateTo, - migrationOverrides, - network, - instances: [], - genesisConfig: { - blockTime, - epoch: 17280, - initialAccounts, - churritoBlock, - donutBlock, - }, - } - - const validators = getValidatorsInformation(mnemonic, instances) - - const validatorPrivateKeys = getPrivateKeysFor(AccountType.VALIDATOR, mnemonic, instances) - - const proxyPrivateKeys = getPrivateKeysFor(AccountType.PROXY, mnemonic, instances) - - for (let x = 0; x < instances; x++) { - const node: GethInstanceConfig = { - name: `${x}-node`, - validating: mining, - syncmode: syncMode, - ethstats, - privateKey: validatorPrivateKeys[x], - port: port + x, - rpcport: rpcport + x * 2, - wsport: wsport + x * 2, - 
minerValidator: validators[x].address, - } - - let proxy: GethInstanceConfig | null = null - - if (withProxy) { - proxy = { - name: `${x}-proxy`, - validating: false, - isProxy: true, - syncmode: syncMode, - ethstats, - privateKey: proxyPrivateKeys[x], - port: port + x + 1000, - proxyport: port + x + 333, - rpcport: rpcport + x * 2 + 1000, - wsport: wsport + x * 2 + 1000, - } - - proxy.proxiedValidatorAddress = validators[x].address - proxy.proxy = validators[x].address - proxy.isProxy = true - - node.isProxied = true - node.proxyAllowPrivateIp = true - node.proxies = [ - getEnodeAddress(privateKeyToPublicKey(proxyPrivateKeys[x]), '127.0.0.1', proxy.proxyport), - getEnodeAddress(privateKeyToPublicKey(validatorPrivateKeys[x]), '127.0.0.1', node.port), - ] - } - - gethConfig.instances.push(node) - if (proxy) { - gethConfig.instances.push(proxy) - } - } - - await runGethNodes({ - gethConfig, - validators, - verbose, - }) - - if (gethConfig.migrate || gethConfig.migrateTo) { - const attestationKeys = getPrivateKeysFor(AccountType.ATTESTATION, mnemonic, instances) - - console.info('Migrating contracts (this will take a long time) ...') - - await migrateContracts( - monorepoDir, - validatorPrivateKeys, - attestationKeys, - validators.map((x) => x.address), - gethConfig.migrateTo, - gethConfig.migrationOverrides, - verbose - ) - - console.info('... 
done migrating contracts!') - } -} diff --git a/packages/celotool/src/cmds/geth/static_nodes.ts b/packages/celotool/src/cmds/geth/static_nodes.ts deleted file mode 100644 index 0c311606b0b..00000000000 --- a/packages/celotool/src/cmds/geth/static_nodes.ts +++ /dev/null @@ -1,46 +0,0 @@ -import { switchToClusterFromEnv } from 'src/lib/cluster' -import { addCeloEnvMiddleware, CeloEnvArgv } from 'src/lib/env-utils' -import { getEnodesWithExternalIPAddresses, writeStaticNodes } from 'src/lib/geth' -import yargs from 'yargs' - -export const command = 'static-nodes' - -export const describe = - 'command for creating static-nodes.json file containing nodes of transaction nodes in an environment' - -interface StaticNodesArgv extends CeloEnvArgv { - outputDir: string - outputFileName: string | null -} - -export const builder = (argv: yargs.Argv) => { - return addCeloEnvMiddleware(argv) - .option('output-dir', { - type: 'string', - description: 'path to directory where file with enodes addresses will be stored', - demand: 'Please specify the directory where to save the generated file', - }) - .option('output-file-name', { - type: 'string', - default: null, - alias: 'o', - description: - 'output file name | if not specified then {ENV_NAME}_static-nodes.json will be used', - }) -} - -export const handler = async (argv: StaticNodesArgv) => { - await switchToClusterFromEnv(argv.celoEnv, false) - - const namespace = argv.celoEnv - const outputDirPath = argv.outputDir - const outputFileName = argv.outputFileName - - await getEnodesWithExternalIPAddresses(namespace).then((enodes) => { - writeStaticNodes( - enodes, - outputDirPath, - outputFileName ? 
outputFileName : `${namespace}_static-nodes.json` - ) - }) -} diff --git a/packages/celotool/src/cmds/geth/trace.ts b/packages/celotool/src/cmds/geth/trace.ts deleted file mode 100644 index 07cea411eab..00000000000 --- a/packages/celotool/src/cmds/geth/trace.ts +++ /dev/null @@ -1,88 +0,0 @@ -import { getBlockscoutUrl } from 'src/lib/endpoints' -import { addCeloEnvMiddleware, CeloEnvArgv } from 'src/lib/env-utils' -import { checkGethStarted, getWeb3AndTokensContracts, traceTransactions } from 'src/lib/geth' -import yargs from 'yargs' -import { GethArgv } from '../geth' - -export const command = 'trace <address1> <address2>' - -export const describe = 'command for tracing token transfers between accounts' - -interface TraceArgv extends GethArgv, CeloEnvArgv { - address1: string - address2: string -} - -export const builder = (argv: yargs.Argv) => { - return addCeloEnvMiddleware(argv) - .option('data-dir', { - type: 'string', - description: 'path to datadir', - demand: 'Please, specify geth datadir', - }) - .positional('address1', { - description: 'sender address', - }) - .positional('address2', { - description: 'recipient address', - }) -} - -const sleep = (ms: number) => { - return new Promise((resolve) => setTimeout(resolve, ms)) -} - -export const handler = async (argv: TraceArgv) => { - const dataDir = argv.dataDir - const address1 = argv.address1 - const address2 = argv.address2 - - checkGethStarted(dataDir) - - let iterations = 70 - let web3AndContracts: Awaited<ReturnType<typeof getWeb3AndTokensContracts>> | null = null - outerwhile: while (iterations-- > 0) { - try { - web3AndContracts = await getWeb3AndTokensContracts() - const { kit: kit1 } = web3AndContracts - const latestBlock = await kit1.connection.getBlock('latest') - if (latestBlock.number === 0) { - throw new Error('Latest block is zero') - } else { - break outerwhile - } - } catch (ignored: any) { - console.warn(ignored.toString()) - if (iterations === 0) { - console.error('Geth start error') - } - await sleep(1000) - } - } - if (!web3AndContracts) { -
console.error('bad start -- could not get contracts') - process.exit(1) - } - - if (iterations <= 0) { - console.warn('Can not wait for geth to sync') - console.error('bad start') - process.exit(1) - } - - const { kit, goldToken, stableToken } = web3AndContracts - - // This is needed to turn off debug logging which is made in `sendTransaction` - // and needed only for mobile client. - console.debug = () => { - // empty - } - - await traceTransactions( - kit, - goldToken, - stableToken, - [address1, address2], - getBlockscoutUrl(argv.celoEnv) - ) -} diff --git a/packages/celotool/src/cmds/geth/transfer.ts b/packages/celotool/src/cmds/geth/transfer.ts deleted file mode 100644 index 376ced12487..00000000000 --- a/packages/celotool/src/cmds/geth/transfer.ts +++ /dev/null @@ -1,69 +0,0 @@ -import BigNumber from 'bignumber.js' -import { checkGethStarted, getWeb3AndTokensContracts, transferERC20Token } from 'src/lib/geth' -import yargs from 'yargs' -import { GethArgv } from '../geth' - -export const command = 'transfer <senderAddress> <receiverAddress> <token> <amount>' - -export const describe = 'command for transferring tokens between accounts' - -const CELO_GOLD = 'cGLD' -const CELO_DOLLARS = 'cUSD' - -interface TransferArgv extends GethArgv { - senderAddress: string - receiverAddress: string - token: string - amount: string - password: string -} - -export const builder = (argv: yargs.Argv) => { - return argv - .option('data-dir', { - type: 'string', - description: 'path to datadir', - demand: 'Please, specify geth datadir', - }) - .positional('senderAddress', { - description: 'sender address', - }) - .positional('receiverAddress', { - description: 'recipient address', - }) - .positional('token', { - choices: [CELO_GOLD, CELO_DOLLARS], - }) - .positional('amount', { - description: 'amount to transfer', - }) - .option('password', { - type: 'string', - description: 'sender account password', - default: '', - }) -} - -export const handler = async (argv: TransferArgv) => { - const dataDir = argv.dataDir - const
senderAddress = argv.senderAddress - const receiverAddress = argv.receiverAddress - const tokenType = argv.token - const amount = argv.amount - const password = argv.password - - checkGethStarted(dataDir) - - const { kit, goldToken, stableToken } = await getWeb3AndTokensContracts() - - const transferrableToken = tokenType === CELO_GOLD ? goldToken : stableToken - - await transferERC20Token( - kit, - transferrableToken, - senderAddress, - receiverAddress, - new BigNumber(amount), - password - ) -} diff --git a/packages/celotool/src/cmds/local_testnet.ts b/packages/celotool/src/cmds/local_testnet.ts deleted file mode 100644 index e65ca93763a..00000000000 --- a/packages/celotool/src/cmds/local_testnet.ts +++ /dev/null @@ -1,385 +0,0 @@ -import { newKit } from '@celo/contractkit' -import { extend, range } from 'lodash' -import { getHooks, sleep } from 'src/e2e-tests/utils' -import { privateKeyToPublicKey } from 'src/lib/generate_utils' -import { GethInstanceConfig } from 'src/lib/interfaces/geth-instance-config' -import { GethRunConfig } from 'src/lib/interfaces/geth-run-config' -import Web3 from 'web3' -import { Admin } from 'web3-eth-admin' -import yargs from 'yargs' - -const Account: any = require('eth-lib/lib/account') - -export const command = 'local-testnet' -export const describe = `Command to run a local testnet of geth instances. - -Running this command will create a number of geth nodes, connect them together to form a network, -and run smart contract migrations to initialize the core protocols. When this is complete, it will -open a NodeJS console with some preloaded objects to facilitate interactions with the test network. -Exiting this console will kill all running geth instances and exit. 
- -Examples: -* local-testnet -* local-testnet --local-geth ~/code/celo-blockchain -* local-testnet --validators 5 --proxies 3 --bootnode -* local-testnet --tx-nodes 2 --light-clients 3 -* local-testnet --migrate-to 19 --migration-override '{ "lockedGold": { "unlockingPeriod": 30 } }' -* local-testnet --migrate-to 19 --migration-override ../../node_modules/@celo/dev-utils/lib/migration-override.json -* local-testnet --no-migrate --genesis-override '{ "blockTime": 3, "epoch": 50 }' - -Network makeup is configured with the --validators, --tx-nodes, --light-clients, and --lightest-clients -flags. These flags will add the corresponding nodes to the network. - -A NodeJS REPL is provided to conveniently interact with the created network. A number of global -variables are defined with useful values including { - Web3 (Imported 'web3' module) - Admin (Imported 'web3-eth-admin' module) - testnet: GethRunConfig (Configuration values for the testnet) - [nodeType][index] (e.g. validator0, txNode2): { - web3: Web3 (A web3 object connected to the node over RPC) - kit: ContractKit (A contractkit object connected to the node over RPC) - admin: Admin (An Admin object connected to the node over RPC) - config: GethInstanceConfig (Configuration values for the node) - kill(signal?: string): (Send a signal, default SIGTERM, to the node. e.g. SIGINT, SIGSTOP) - } -} - -Tip: Export NODE_OPTIONS="--experimental-repl-await" in your terminal to natively use await. - -When the network is created without a bootnode, all nodes will be connected as follows: -* Proxy nodes are connected to their validators, other proxies and unproxied validators. -* Unproxied validator nodes are connected to all other proxies and unproxied validators. -* Transaction nodes are connected to proxies and unproxied validators and other transaction nodes. -* Light clients are connected to all transaction nodes.
- -If the network is started with the --bootnode flag, a bootnode will be created and all nodes will be -connected to it, rather than each other directly. - -By default, the celo-blockchain repository will be cloned to a temporary location and built from -master to produce the geth binary to run for each node. The --branch flag can be used to control -which branch is built in the cloned repository. Alternatively, an existing repository can be used -by specifying the --local-geth flag as the path to that repository root.` - -interface LocalTestnetArgs { - localgeth?: string - keepdata?: boolean - branch?: string - bootnode: boolean - validators: number - proxies: number - txnodes: number - lightclients: number - lightestclients: number - migrate: boolean - migrateTo: number - instances: string - genesisOverride: string - migrationOverride: string -} - -export const builder = (argv: yargs.Argv) => { - return argv - .option('local-geth', { - type: 'string', - description: 'Local path to celo-blockchain repository.', - alias: ['localGeth', 'localgeth'], - }) - .option('keep-data', { - type: 'boolean', - description: 'Keep the data directory from any previous runs.', - alias: ['keepData', 'keepdata'], - }) - .option('branch', { - type: 'string', - description: 'Branch name for remote celo-blockchain repository.', - }) - .option('bootnode', { - type: 'boolean', - allowNo: true, - description: 'Create a bootnode and connect all nodes to it instead of to each other.', - }) - .option('validators', { - type: 'number', - description: 'Number of validator nodes to create.', - default: 1, - }) - .option('proxies', { - type: 'number', - description: 'Number of proxy nodes to create; assigned to the first n validators.', - default: 0, - }) - .option('tx-nodes', { - type: 'number', - description: 'Number of transaction (i.e.
non-validating full nodes) nodes to create.', - default: 0, - alias: ['txnodes', 'txNodes'], - }) - .option('light-clients', { - type: 'number', - description: 'Number of light sync nodes to create.', - default: 0, - alias: ['lightClients', 'lightclients'], - }) - .option('lightest-clients', { - type: 'number', - description: 'Number of lightest sync nodes to create.', - default: 0, - alias: ['lightestClients', 'lightestclients'], - }) - .option('migrate', { - type: 'boolean', - description: 'Whether migrations should be run.', - default: true, - allowNo: true, - }) - .option('migrate-to', { - type: 'number', - description: 'Maximum migration number to run. Defaults to running all migrations.', - alias: ['migrateTo', 'migrateto'], - }) - .option('instances', { - type: 'string', - description: 'Manually enter a GethInstanceConfig[] json blob to add to the config.', - default: '[]', - }) - .option('genesis-override', { - type: 'string', - description: 'Genesis configuration overrides as a GenesisConfig JSON blob.', - default: '{}', - alias: ['genesisOverride', 'genesisoverride'], - }) - .option('migration-override', { - type: 'string', - description: 'Migration configuration overrides as a JSON blob.', - default: '{}', - alias: ['migrationOverride', 'migrationoverride'], - }) -} - -async function repl(config: GethRunConfig) { - const session = require('repl').start() - const formatName = (name: string) => - name - .split('-') - .map((token, i) => (i === 0 ? 
token[0] : token[0].toUpperCase()) + token.slice(1)) - .join('') - - extend(session.context, { - Web3, - Admin, - testnet: config, - ...config.instances.reduce( - (o, instance) => ({ - ...o, - [formatName(instance.name)]: { - web3: new Web3(getRpcUrl(instance)), - kit: newKit(getRpcUrl(instance)), - admin: new Admin(getRpcUrl(instance)), - config: instance, - kill: (signal?: string) => { - if (!instance.pid) { - throw new Error(`no pid registered for instance ${instance.name}`) - } - process.kill(instance.pid, signal) - }, - }, - }), - {} - ), - }) - - // Wait for the REPL to exit. - let exited = false - const exitHandler = () => { - exited = true - } - session.on('exit', exitHandler) - while (!exited) { - await sleep(0.1) - } - session.removeListener('exit', exitHandler) -} - -function bootnodeConfigs(count: number): GethInstanceConfig[] { - return range(count).map((i) => ({ - name: `bootnode-${i}`, - lightserv: false, - syncmode: 'full', - nodekey: generatePrivateKey(), - port: 0, - })) -} - -function validatorConfigs(count: number, proxyCount: number = 0): GethInstanceConfig[] { - const validators: GethInstanceConfig[] = range(count).map((i) => ({ - name: `validator-${i}`, - validating: true, - syncmode: 'full', - isProxied: i < proxyCount, - proxy: i < proxyCount ? `proxy-${i}` : undefined, - proxyAllowPrivateIp: i < proxyCount ? 
true : undefined, - port: 0, - })) - const proxies: GethInstanceConfig[] = range(proxyCount).map((i) => ({ - name: `proxy-${i}`, - syncmode: 'full', - isProxy: true, - port: 0, - })) - return validators.concat(proxies) -} - -function txNodeConfigs(count: number): GethInstanceConfig[] { - return range(count).map((i) => ({ - name: `tx-node-${i}`, - lightserv: true, - syncmode: 'full', - port: 0, - })) -} - -function lightClientConfigs(count: number): GethInstanceConfig[] { - return range(count).map((i) => ({ - name: `light-client-${i}`, - syncmode: 'light', - port: 0, - })) -} - -function lightestClientConfigs(count: number): GethInstanceConfig[] { - return range(count).map((i) => ({ - name: `lightest-client-${i}`, - syncmode: 'lightest', - port: 0, - })) -} - -// Populate network information in instance configs. -function populateConnectionInfo(configs: GethInstanceConfig[]): GethInstanceConfig[] { - // Choose ports for each instance. - for (const [i, config] of configs.entries()) { - if (!config.port) { - config.port = 30303 + 2 * i - } - if (config.isProxy && !config.proxyport) { - config.proxyport = 30503 + 2 * i - } - if (!config.rpcport && !config.wsport) { - config.rpcport = 8545 + 2 * i - config.wsport = 8546 + 2 * i - } - } - - // If a bootnode is provided, populate bootnode information in other nodes. - const bootnodes = configs.filter((config) => /bootnode/.test(config.name)) - if (bootnodes.length > 0) { - // Only one in-use bootnode is supported. 
- const bootnode = bootnodes[0] - for (const config of configs) { - if (config.name === bootnode.name || config.isProxied) { - continue - } - config.bootnodeEnode = getEnodeUrl(bootnode) - } - } - - return configs -} - -function getEnodeUrl(config: GethInstanceConfig) { - if (!config.nodekey) { - throw new Error('cannot get the enode url from a config without a nodekey') - } - return `enode://${privateKeyToPublicKey(config.nodekey)}@localhost:${config.port}` -} - -function generatePrivateKey() { - return Account.create(Web3.utils.randomHex(32)).privateKey.replace('0x', '') -} - -function getRpcUrl(config: GethInstanceConfig) { - return `${config.wsport ? 'ws' : 'http'}://localhost:${config.wsport || config.rpcport}` -} - -function getAdmin(config: GethInstanceConfig) { - if (!config.wsport && !config.rpcport) { - throw new Error('cannot connect to admin interface for config without port') - } - return new Admin(getRpcUrl(config)) -} - -async function getEnode(config: GethInstanceConfig) { - const admin = getAdmin(config) - return (await admin.getNodeInfo()).enode -} - -async function connectToEnodes(config: GethInstanceConfig, enodes: string[]) { - const admin = getAdmin(config) - await Promise.all(enodes.map((enode) => admin.addPeer(enode))) -} - -async function connectNodes(configs: GethInstanceConfig[]) { - // Connect tx nodes to validators and other tx nodes. - const validators = configs.filter( - (config) => (config.validating && !config.isProxied) || config.isProxy - ) - const validatorEnodes = await Promise.all(validators.map(getEnode)) - const txNodes = configs.filter((config) => !config.validating && config.syncmode === 'full') - const txNodeEnodes = await Promise.all(txNodes.map(getEnode)) - await Promise.all( - txNodes.map((txNode) => connectToEnodes(txNode, validatorEnodes.concat(txNodeEnodes))) - ) - - // Connect light clients to tx nodes.
- const lightClients = configs.filter((config) => ['light', 'lightest'].includes(config.syncmode)) - if (lightClients.length > 0 && txNodeEnodes.length === 0) { - throw new Error('connecting light clients to the network requires at least one tx-node') - } - await Promise.all(lightClients.map((lightClient) => connectToEnodes(lightClient, txNodeEnodes))) -} - -export const handler = async (argv: LocalTestnetArgs) => { - const repoPath = argv.localgeth || '/tmp/geth' - - const gethConfig: GethRunConfig = { - network: 'local', - networkId: 1101, - runPath: '/tmp/e2e', - keepData: argv.keepdata, - migrate: argv.migrate, - migrateTo: argv.migrate ? argv.migrateTo : undefined, - instances: populateConnectionInfo([ - ...validatorConfigs(argv.validators, argv.proxies), - ...txNodeConfigs(argv.txnodes), - ...lightClientConfigs(argv.lightclients), - ...lightestClientConfigs(argv.lightestclients), - ...bootnodeConfigs(argv.bootnode ? 1 : 0), - ...JSON.parse(argv.instances), - ]), - repository: { - path: repoPath, - branch: argv.branch, - remote: !argv.localgeth, - }, - genesisConfig: JSON.parse(argv.genesisOverride), - migrationOverrides: JSON.parse(argv.migrationOverride), - } - const hooks = getHooks(gethConfig) - await hooks.initialize() - - if (!argv.bootnode) { - await connectNodes(gethConfig.instances) - } - - console.info(`Local testnet is online with ${gethConfig.instances.length} nodes:`) - for (const instance of gethConfig.instances) { - console.info( - ` * ${instance.name} (pid:${instance.pid}) is listening on ${getRpcUrl(instance)}` - ) - } - console.info('\nPress CTRL+D to quit') - - await repl(gethConfig) - await hooks.after() - process.exit(0) -} diff --git a/packages/celotool/src/cmds/port_forward.ts b/packages/celotool/src/cmds/port_forward.ts deleted file mode 100644 index f60ce0a92a2..00000000000 --- a/packages/celotool/src/cmds/port_forward.ts +++ /dev/null @@ -1,30 +0,0 @@ -import { switchToClusterFromEnv } from 'src/lib/cluster' -import { 
addCeloEnvMiddleware, CeloEnvArgv } from 'src/lib/env-utils' -import { defaultPortsString, portForward } from 'src/lib/port_forward' -import yargs from 'yargs' -export const command = 'port-forward' - -export const describe = 'command for port-forwarding to a specific network' - -interface PortForwardArgv extends CeloEnvArgv { - component: string - ports: string -} - -export const builder = (argv: yargs.Argv) => { - return addCeloEnvMiddleware(argv) - .option('component', { - type: 'string', - description: 'K8s component name to forward to', - }) - .option('ports', { - type: 'string', - description: 'Ports to forward: space separated srcport:dstport string', - default: defaultPortsString, - }) -} - -export const handler = async (argv: PortForwardArgv) => { - await switchToClusterFromEnv(argv.celoEnv, false, true) - await portForward(argv.celoEnv, argv.component, argv.ports) -} diff --git a/packages/celotool/src/cmds/restore.ts b/packages/celotool/src/cmds/restore.ts deleted file mode 100644 index 253ba0fc7a8..00000000000 --- a/packages/celotool/src/cmds/restore.ts +++ /dev/null @@ -1,56 +0,0 @@ -import { switchToClusterFromEnv } from 'src/lib/cluster' -import { execCmdWithExitOnFailure } from 'src/lib/cmd-utils' -import { addCeloEnvMiddleware, CeloEnvArgv } from 'src/lib/env-utils' -import yargs from 'yargs' - -export const command = 'restore' - -export const describe = "command for restoring a miner's persistent volume (PVC) from snapshot" - -interface RestoreArgv extends CeloEnvArgv { - minername: string - snapshotname: string -} - -export const builder = (argv: yargs.Argv) => { - return addCeloEnvMiddleware(argv) - .option('minername', { - type: 'string', - description: 'Name of the miner node', - demand: 'Please specify the miner node to restore, eg. 
gethminer1', - }) - .option('snapshotname', { - type: 'string', - description: 'Name of the snapshot', - demand: 'Name of the snapshot (from gcloud compute snapshots list)', - }) -} - -export const handler = async (argv: RestoreArgv) => { - await switchToClusterFromEnv(argv.celoEnv, true, true) - - const minerName = argv.minername - // In the future, we can make this configurable. - // const zone = 'us-west1-a' - // In the future, we can make this configurable. - const clusterName = 'celo-networks-dev' - const diskType = 'pd-ssd' - const namespace = argv.celoEnv - const pvc = `${namespace}-${minerName}-pvc` - // TODO: figure out how to make this configurable - - const getPVCNameCommand = `kubectl get persistentvolumeclaim ${pvc} --namespace ${namespace} -o=jsonpath={.spec.volumeName}` - const pvcId = (await execCmdWithExitOnFailure(getPVCNameCommand))[0] - const pvcFullId = `gke-${clusterName}--${pvcId}` - console.debug(`PVC name is ${pvcFullId}`) - - // If the disk already exists, then this command will fail and in that case, - // the disk has to be deleted first via `gcloud compute disks delete ${pvcFullId}` - // That itself requires that the miner node be stopped. - // For now, this step is intentionally manual. - // When we encounter a real world use-case of restore, we can decide whether to automate this or not.
- const restoreSnapshotCmd = `gcloud compute disks create ${pvcFullId} --source-snapshot=${argv.snapshotname} --type ${diskType}` - await execCmdWithExitOnFailure(restoreSnapshotCmd) - // const gcloudSnapshotsUrl = 'https://console.cloud.google.com/compute/snapshots' - // console.info(`Snapshot \"${snapshotName}\" can be seen at ${gcloudSnapshotsUrl}`) -} diff --git a/packages/celotool/src/cmds/switch.ts b/packages/celotool/src/cmds/switch.ts deleted file mode 100644 index b2937e7492c..00000000000 --- a/packages/celotool/src/cmds/switch.ts +++ /dev/null @@ -1,14 +0,0 @@ -import { switchToClusterFromEnv } from 'src/lib/cluster' -import { addCeloEnvMiddleware, CeloEnvArgv } from 'src/lib/env-utils' -import yargs from 'yargs' - -export const command = 'switch' - -export const describe = 'command for switching to a particular environment' - -// sets environment variables from .env -export const builder = (argv: yargs.Argv) => addCeloEnvMiddleware(argv) - -export const handler = async (argv: CeloEnvArgv) => { - await switchToClusterFromEnv(argv.celoEnv, false, true) -} diff --git a/packages/celotool/src/cmds/transactions.ts b/packages/celotool/src/cmds/transactions.ts deleted file mode 100644 index f3aa090a3d0..00000000000 --- a/packages/celotool/src/cmds/transactions.ts +++ /dev/null @@ -1,16 +0,0 @@ -import { addCeloEnvMiddleware, CeloEnvArgv } from 'src/lib/env-utils' -import { Argv } from 'yargs' - -export const command = 'transactions ' - -export const describe = 'commands for reading transaction data' - -export type TransactionsArgv = CeloEnvArgv - -export function builder(argv: Argv) { - return addCeloEnvMiddleware(argv).commandDir('transactions', { extensions: ['ts'] }) -} - -export function handler() { - // empty -} diff --git a/packages/celotool/src/cmds/transactions/describe.ts b/packages/celotool/src/cmds/transactions/describe.ts deleted file mode 100644 index c17814b2a26..00000000000 --- a/packages/celotool/src/cmds/transactions/describe.ts +++ /dev/null @@ 
-1,82 +0,0 @@ -import { newKitFromWeb3 } from '@celo/contractkit' -import { newBlockExplorer } from '@celo/explorer/lib/block-explorer' -import { newLogExplorer } from '@celo/explorer/lib/log-explorer' -import { switchToClusterFromEnv } from 'src/lib/cluster' -import { getFornoUrl } from 'src/lib/endpoints' -import Web3 from 'web3' -import yargs from 'yargs' -import { TransactionsArgv } from '../transactions' -export const command = 'describe ' - -export const describe = 'fetch a transaction, attempt parsing and print it out to STDOUT' - -interface DescribeArgv extends TransactionsArgv { - transactionHash: string -} - -export const builder = (argv: yargs.Argv) => { - return argv.positional('transactionHash', { - description: 'The hash of the transaction', - }) -} - -export const handler = async (argv: DescribeArgv) => { - await switchToClusterFromEnv(argv.celoEnv, false) - - const web3 = new Web3(getFornoUrl(argv.celoEnv)) - const kit = newKitFromWeb3(web3) - const blockExplorer = await newBlockExplorer(kit) - const logExplorer = await newLogExplorer(kit) - const transaction = await web3.eth.getTransaction(argv.transactionHash) - const receipt = await web3.eth.getTransactionReceipt(argv.transactionHash) - - if (process.env.CELOTOOL_VERBOSE === 'true') { - console.info('Raw Transaction Data:') - console.info(transaction) - - console.info('Raw Transaction Receipt') - console.info(receipt) - } - - const parsedTransaction = await blockExplorer.tryParseTx(transaction) - - if (parsedTransaction === null) { - return - } - - console.info('Parsed Transaction Data') - console.info(parsedTransaction) - - if (receipt.logs) { - receipt.logs.forEach((log) => { - const parsedLog = logExplorer.tryParseLog(log) - - if (parsedLog === null) { - return - } - - console.info('Parsed Transaction Log') - console.info(parsedLog) - }) - } - - if (!receipt.status) { - console.info('Transaction reverted, attempting to recover revert reason ...') - - const called = await web3.eth.call( - { - 
data: transaction.input, - to: transaction.to ? transaction.to : undefined, - from: transaction.from, - }, - transaction.blockNumber - ) - - if (called.startsWith('0x08c379a')) { - console.info('Revert reason is:') - console.info(web3.eth.abi.decodeParameter('string', '0x' + called.substring(10))) - } else { - console.info('Could not retrieve revert reason') - } - } -} diff --git a/packages/celotool/src/cmds/transactions/list.ts b/packages/celotool/src/cmds/transactions/list.ts deleted file mode 100644 index b9fb06bb738..00000000000 --- a/packages/celotool/src/cmds/transactions/list.ts +++ /dev/null @@ -1,95 +0,0 @@ -import { newKitFromWeb3 } from '@celo/contractkit' -import { BlockExplorer, newBlockExplorer } from '@celo/explorer/lib/block-explorer' -import { LogExplorer, newLogExplorer } from '@celo/explorer/lib/log-explorer' -import fetch from 'node-fetch' -import { CONTRACTS_TO_COPY, copyContractArtifacts, downloadArtifacts } from 'src/lib/artifacts' -import { getWeb3Client } from 'src/lib/blockchain' -import { switchToClusterFromEnv } from 'src/lib/cluster' -import { getBlockscoutUrl } from 'src/lib/endpoints' -import Web3 from 'web3' -import yargs from 'yargs' -import { TransactionsArgv } from '../transactions' - -export const command = 'list
' - -export const describe = 'lists transactions to this address' - -interface ListArgv extends TransactionsArgv { - address: string -} - -export const builder = (argv: yargs.Argv) => { - return argv.positional('address', { - description: 'the address to search for', - }) -} - -export const handler = async (argv: ListArgv) => { - await switchToClusterFromEnv(argv.celoEnv, false) - - await downloadArtifacts(argv.celoEnv) - await copyContractArtifacts( - argv.celoEnv, - '../transaction-metrics-exporter/src/contracts', - CONTRACTS_TO_COPY - ) - - const web3 = await getWeb3Client(argv.celoEnv) - const blockscoutURL = getBlockscoutUrl(argv.celoEnv) - const kit = newKitFromWeb3(web3) - const blockExplorer = await newBlockExplorer(kit) - const logExplorer = await newLogExplorer(kit) - const resp = await fetch( - `${blockscoutURL}/api?module=account&action=txlist&address=${argv.address}&sort=desc` - ) - const jsonResp = await resp.json() - - if (jsonResp.result === undefined) { - return - } - - for (const blockscoutTx of jsonResp.result) { - await fetchTx(web3, blockExplorer, logExplorer, blockscoutTx) - } - process.exit(0) -} - -async function fetchTx( - web3: Web3, - blockExplorer: BlockExplorer, - logExplorer: LogExplorer, - blockscoutTx: { hash: string; timeStamp: string } -) { - const transaction = await web3.eth.getTransaction(blockscoutTx.hash) - const receipt = await web3.eth.getTransactionReceipt(blockscoutTx.hash) - - const parsedTransaction = await blockExplorer.tryParseTx(transaction) - - if (parsedTransaction === null) { - console.info(`Unparsable Transaction: ${transaction.hash}`) - return - } - - console.info( - `${parsedTransaction.callDetails.contract}#${ - parsedTransaction.callDetails.function - }(${JSON.stringify(parsedTransaction.callDetails.paramMap)}) ${parsedTransaction.tx.hash}` - ) - - if (receipt.logs) { - receipt.logs.forEach((log) => { - try { - const parsedLog = logExplorer.tryParseLog(log) - - if (parsedLog === null) { - 
console.info(`\tParsed log is null for log "${log.address}"`) - return - } - - console.info(`\t${parsedLog.event}(${JSON.stringify(parsedLog.returnValues)})`) - } catch (e) { - console.error(`Error while parsing log ${JSON.stringify(log)}`) - } - }) - } -} diff --git a/packages/celotool/src/cmds/unfreeze_contracts.ts b/packages/celotool/src/cmds/unfreeze_contracts.ts deleted file mode 100644 index edc75141e4a..00000000000 --- a/packages/celotool/src/cmds/unfreeze_contracts.ts +++ /dev/null @@ -1,103 +0,0 @@ -/* tslint:disable no-console */ -import { ContractKit, newKitFromWeb3 } from '@celo/contractkit' -import { switchToClusterFromEnv } from 'src/lib/cluster' -import { CeloEnvArgv, addCeloEnvMiddleware } from 'src/lib/env-utils' -import { portForwardAnd } from 'src/lib/port_forward' -import Web3 from 'web3' -import yargs from 'yargs' - -export const command = 'unfreeze-contracts' - -export const describe = 'command for unfreezing epoch rewards' - -interface UnfreezeContractsArgv extends CeloEnvArgv { - rewards: boolean - freeze: boolean - precheck: boolean - verify: boolean -} - -export const builder = (argv: yargs.Argv) => { - return addCeloEnvMiddleware(argv) - .option('rewards', { - type: 'boolean', - description: 'Affect epoch rewards', - default: true, - }) - .option('freeze', { - type: 'boolean', - description: 'Freeze contracts instead of unfreezing', - default: false, - }) - .option('precheck', { - type: 'boolean', - description: 'Check the contract freeze status before continuing', - default: true, - }) - .option('verify', { - type: 'boolean', - description: 'Verify the contract freeze status after', - default: true, - }) -} - -export const handler = async (argv: UnfreezeContractsArgv) => { - await switchToClusterFromEnv(argv.celoEnv) - - const cb = async () => { - const web3: Web3 = new Web3('http://localhost:8545') - const kit: ContractKit = newKitFromWeb3(web3) - const account = (await kit.connection.getAccounts())[0] - console.info(`Using account: 
${account}`) - kit.connection.defaultAccount = account - - const [epochRewards, freezerContract] = await Promise.all([ - argv.rewards ? kit._web3Contracts.getEpochRewards() : null, - kit._web3Contracts.getFreezer(), - ]) - - for (const [name, contract] of Object.entries({ epochRewards })) { - if (contract === null) { - continue - } - - const address = contract._address - if (argv.precheck) { - const frozen = await freezerContract.methods.isFrozen(address).call() - // console.debug(`${name}.frozen = ${frozen}`) - if (argv.freeze === frozen) { - console.error(`${name} is already ${argv.freeze ? 'frozen' : 'unfrozen'}. Skipping.`) - continue - } - } - - if (argv.freeze) { - console.info(`Sending freeze transaction to ${name} ...`) - await freezerContract.methods.freeze(address).send({ from: account }) - } else { - console.info(`Sending unfreeze transaction to ${name} ...`) - await freezerContract.methods.unfreeze(address).send({ from: account }) - } - - if (argv.verify) { - const frozen = await freezerContract.methods.isFrozen(address).call() - // console.debug(`${name}.frozen = ${frozen}`) - if (argv.freeze !== frozen) { - console.error( - `${name} is not ${argv.freeze ? 'frozen' : 'unfrozen'}. Something went wrong.` - ) - continue - } - console.info(`Successfully ${argv.freeze ? 'froze' : 'unfroze'} ${name}`) - } - } - } - - try { - await portForwardAnd(argv.celoEnv, cb) - } catch (error) { - console.error(`Unable to ${argv.freeze ?
'freeze' : 'unfreeze'} contracts on ${argv.celoEnv}`) - console.error(error) - process.exit(1) - } -} diff --git a/packages/celotool/src/e2e-tests/blockchain_parameters_tests.ts b/packages/celotool/src/e2e-tests/blockchain_parameters_tests.ts deleted file mode 100644 index 228a5944ef7..00000000000 --- a/packages/celotool/src/e2e-tests/blockchain_parameters_tests.ts +++ /dev/null @@ -1,83 +0,0 @@ -import { ContractKit, newKitFromWeb3 } from '@celo/contractkit' -import { BlockchainParametersWrapper } from '@celo/contractkit/lib/wrappers/BlockchainParameters' -import { assert } from 'chai' -import Web3 from 'web3' -import { GethRunConfig } from '../lib/interfaces/geth-run-config' -import { getHooks, sleep } from './utils' - -const TMP_PATH = '/tmp/e2e' -const rpcURL = 'http://localhost:8545' - -describe('Blockchain parameters tests', function (this: any) { - this.timeout(0) - - let kit: ContractKit - let parameters: BlockchainParametersWrapper - - const gethConfig: GethRunConfig = { - migrate: true, - runPath: TMP_PATH, - keepData: false, - networkId: 1101, - network: 'local', - genesisConfig: { - churritoBlock: 0, - donutBlock: 0, - espressoBlock: 0, - }, - instances: [ - { - name: 'validator', - validating: true, - syncmode: 'full', - port: 30303, - rpcport: 8545, - }, - ], - } - - const hooks = getHooks(gethConfig) - - before(async function (this: any) { - this.timeout(0) - await hooks.before() - }) - - after(async function (this: any) { - this.timeout(0) - await hooks.after() - }) - - const validatorAddress: string = '0x47e172f6cfb6c7d01c1574fa3e2be7cc73269d95' - - const restartGeth = async () => { - // Restart the validator node - await hooks.restart() - - // TODO(mcortesi): magic sleep. 
without it unlockAccount sometimes fails - await sleep(2) - - kit = newKitFromWeb3(new Web3(rpcURL)) - - await kit.connection.web3.eth.personal.unlockAccount(validatorAddress, '', 1000) - parameters = await kit.contracts.getBlockchainParameters() - } - - describe('when running a node', () => { - before(async () => { - await restartGeth() - }) - it('block limit should have been set using governance', async () => { - this.timeout(0) - const res = await parameters.getBlockGasLimit() - assert.equal(0, res.comparedTo(13000000)) - }) - it('changing the block gas limit', async () => { - this.timeout(0) - await parameters.setBlockGasLimit(23000000).send({ from: validatorAddress }) - await sleep(2) - const res = await parameters.getBlockGasLimit() - assert.equal(0, res.comparedTo(23000000)) - }) - }) -}) diff --git a/packages/celotool/src/e2e-tests/cip35_tests.ts b/packages/celotool/src/e2e-tests/cip35_tests.ts deleted file mode 100644 index b31e8f93aae..00000000000 --- a/packages/celotool/src/e2e-tests/cip35_tests.ts +++ /dev/null @@ -1,506 +0,0 @@ -import { CeloTx } from '@celo/connect' -import { ContractKit, newKitFromWeb3 } from '@celo/contractkit' -import { privateKeyToAddress } from '@celo/utils/lib/address' -import * as ejsRlp from '@ethereumjs/rlp' -import * as ejsUtil from '@ethereumjs/util' -import BigNumber from 'bignumber.js' -import { assert } from 'chai' -import { keccak256 } from 'ethereum-cryptography/keccak' -import { toHex } from 'ethereum-cryptography/utils' -import lodash from 'lodash' -import Web3 from 'web3' -import { AccountType, generatePrivateKey } from '../lib/generate_utils' -import { GethRunConfig } from '../lib/interfaces/geth-run-config' -import { ensure0x } from '../lib/utils' -import { getHooks, initAndSyncGethWithRetry, mnemonic, sleep } from './utils' - -const TMP_PATH = '/tmp/e2e' -const validatorUrl = 'http://localhost:8545' -const lightUrl = 'http://localhost:8546' - -const notYetActivatedError = 'support for eth-compatible transactions 
is not enabled' -const notCompatibleError = 'ethCompatible is true, but non-eth-compatible fields are present' -const noReplayProtectionError = 'only replay-protected (EIP-155) transactions allowed over RPC' - -const validatorPrivateKey = generatePrivateKey(mnemonic, AccountType.VALIDATOR, 0) -const validatorAddress = privateKeyToAddress(validatorPrivateKey) -// Arbitrary addresses to use in the transactions -const toAddress = '0x8c36775E95A5f7FEf6894Ba658628352Ac58605B' -const gatewayFeeRecipientAddress = '0xc77538d1e30C0e4ec44B0DcaD97FD3dc63fcaCC4' - -// Simple contract with a single constant -const bytecode = - '0x608060405260008055348015601357600080fd5b5060358060216000396000f3006080604052600080fd00a165627a7a72305820c7f3f7c299940bb1d9b122d25e8f288817e45bbdeaccdd2f6e8801677ed934e70029' - -const verbose = false - -/// ////// Configurable values to run only some of the tests during development //////////////// -// ReplayProtectionTests lets you skip or run only the replay-protection tests during dev -// Value when committing should be "run" -// eslint-disable-next-line -let replayProtectionTests: 'run' | 'skip' | 'only' = 'run' -// devFilter can be used during development to only run a subset of testcases. -// But if you're going to commit you should set them all back to undefined (i.e. no filter). -const devFilter: Filter = { - cipIsActivated: undefined, - lightNode: undefined, - ethCompatible: undefined, - contractCreation: undefined, - useFeeCurrency: undefined, - useGatewayFee: undefined, - useGatewayFeeRecipient: undefined, - sendRawTransaction: undefined, -} -/// //////////////////////////////////////////////////////////////////////////////////////////// - -// Filter specifies which subset of cases to generate. -// (e.g. 
{lightNode: true, sendRawTransaction: false} makes it only run cases which send through a light - // node using `eth_sendRawTransaction`) - type Filter = Partial<TestCase> - - // TestCase describes the specific case we want to test - interface TestCase { - cipIsActivated: boolean - lightNode: boolean - ethCompatible: boolean - contractCreation: boolean - useFeeCurrency: boolean - useGatewayFee: boolean - useGatewayFeeRecipient: boolean - sendRawTransaction: boolean // whether to use eth_sendRawTransaction or eth_sendTransaction - errorString: string | null - errorReason: string | null - } - - // generateTestCases is used to generate all the cases we want to test for a setup which - // is either pre-Donut or post-Donut (cipIsActivated true means post-Donut) - function generateTestCases(cipIsActivated: boolean) { - const cases: TestCase[] = [] - if (devFilter.cipIsActivated !== undefined && devFilter.cipIsActivated !== cipIsActivated) { - // The devFilter is incompatible with the cipIsActivated value, so there are no cases to run - return cases - } - const getValues = (fieldFilter: boolean | undefined) => { - return fieldFilter === undefined ?
[false, true] : [fieldFilter] - } - // Generate all possible combinations (but some are invalid and excluded using 'continue' below) - for (const lightNode of getValues(devFilter.lightNode)) { - for (const ethCompatible of getValues(devFilter.ethCompatible)) { - for (const contractCreation of getValues(devFilter.contractCreation)) { - for (const useFeeCurrency of getValues(devFilter.useFeeCurrency)) { - for (const useGatewayFee of getValues(devFilter.useGatewayFee)) { - for (const useGatewayFeeRecipient of getValues(devFilter.useGatewayFeeRecipient)) { - for (const sendRawTransaction of getValues(devFilter.sendRawTransaction)) { - let errorString: string | null = null - let errorReason: string | null = null - const hasCeloFields = useFeeCurrency || useGatewayFee || useGatewayFeeRecipient - if (ethCompatible && hasCeloFields) { - errorString = notCompatibleError - errorReason = 'transaction has celo-only fields' - } else if (ethCompatible && !cipIsActivated) { - errorString = notYetActivatedError - errorReason = 'Donut is not activated' - } - if (sendRawTransaction && ethCompatible && hasCeloFields) { - // Such scenarios don't make sense, since eth-compatible transactions in RLP can't have - // these fields. So skip these cases. - continue - } - cases.push({ - cipIsActivated, - lightNode, - ethCompatible, - contractCreation, - useFeeCurrency, - useGatewayFee, - useGatewayFeeRecipient, - sendRawTransaction, - errorString, - errorReason, - }) - } - } - } - } - } - } - } - return cases -} - -function getGethRunConfig(withDonut: boolean, withEspresso: boolean): GethRunConfig { - console.info('getGethRunConfig', withDonut) - return { - migrate: true, - runPath: TMP_PATH, - keepData: false, - networkId: 1101, - network: 'local', - genesisConfig: { - churritoBlock: 0, - donutBlock: withDonut ? 0 : null, - espressoBlock: withEspresso ? 
0 : null, - gingerbreadBlock: null, - }, - instances: [ - { - name: 'validator', - validating: true, - syncmode: 'full', - lightserv: true, - port: 30303, - rpcport: 8545, - }, - ], - } - } - - /** - * Copied from ethereumjs-utils - * Trims leading zeros from a `Buffer` or `Number[]`. - * @param a (Buffer|Uint8Array) - * @return (Buffer|Uint8Array) - */ - function stripZeros(a: any): Buffer | Uint8Array { - let first = a[0] - while (a.length > 0 && first.toString() === '0') { - a = a.slice(1) - first = a[0] - } - return a - } - - // TestEnv encapsulates a pre-Donut or post-Donut environment and the tests to run on it - class TestEnv { - testCases: TestCase[] - gethConfig: GethRunConfig - cipIsActivated: boolean - replayProtectionIsNotMandatory: boolean - hooks: ReturnType<typeof getHooks> - stableTokenAddr: string = '' - gasPrice: string = '' - - // There are three cases: (a), (b), and (c) below. - // And, for each of these three cases, we have one which connects to the validator and one which - // connects to the light client.
- // (a) contractkit instances without the private key, for transacting using `eth_sendTransaction` - kit: ContractKit - kitLight: ContractKit - // (b) contractkit instances with the private key, for signing locally (to then use `eth_sendRawTransaction`) - kitWithLocalWallet: ContractKit - kitWithLocalWalletLight: ContractKit - // (c) web3 instances with the private key, for generating and signing raw eth-compatible transactions (to then - // use with `eth_sendRawTransaction`) - web3: Web3 - web3Light: Web3 - - constructor(cipIsActivated: boolean, replayProtectionIsNotMandatory: boolean) { - this.gethConfig = getGethRunConfig(cipIsActivated, replayProtectionIsNotMandatory) - this.hooks = getHooks(this.gethConfig) - this.cipIsActivated = cipIsActivated - this.replayProtectionIsNotMandatory = replayProtectionIsNotMandatory - this.testCases = generateTestCases(cipIsActivated) - this.kit = newKitFromWeb3(new Web3(validatorUrl)) - this.kitLight = newKitFromWeb3(new Web3(lightUrl)) - this.kitWithLocalWallet = newKitFromWeb3(new Web3(validatorUrl)) - this.kitWithLocalWalletLight = newKitFromWeb3(new Web3(lightUrl)) - this.web3 = new Web3(validatorUrl) - this.web3Light = new Web3(lightUrl) - } - - // before() does all the setup needed to then enable the individual test cases to be run - async before() { - await this.hooks.before() - - // Restart the validator node and start the light node to connect to it and sync up - await this.hooks.restart() - const lightNodeConfig = { - name: 'light', - validating: false, - syncmode: 'light', - port: 30305, - rpcport: 8546, - } - await initAndSyncGethWithRetry( - this.gethConfig, - this.hooks.gethBinaryPath, - lightNodeConfig, - [...this.gethConfig.instances, lightNodeConfig], - verbose, - 3 - ) - - this.stableTokenAddr = (await this.kit.contracts.getStableToken()).address - const gasPriceMinimum = await (await this.kit.contracts.getGasPriceMinimum()).gasPriceMinimum() - this.gasPrice = gasPriceMinimum.times(5).toString() - - // 
TODO(mcortesi): magic sleep. without it unlockAccount sometimes fails - await sleep(2) - - // Make sure we can use the validator's address to send transactions - // For signing on the node, unlock the account (and add it first if it's the light node) - await this.kit.connection.web3.eth.personal.unlockAccount(validatorAddress, '', 1000) - await this.kitLight.connection.web3.eth.personal.importRawKey(validatorPrivateKey, '') - await this.kitLight.connection.web3.eth.personal.unlockAccount(validatorAddress, '', 1000) - // For the local wallets, add the private key. - // The web3 instances don't need that, because we use a function that takes in the private key. - this.kitWithLocalWallet.connection.addAccount(validatorPrivateKey) - this.kitWithLocalWalletLight.connection.addAccount(validatorPrivateKey) - } - - async generateUnprotectedTransaction(ethCompatible: boolean): Promise<string> { - const encode = ejsRlp.encode - const numToHex = (x: number | BigNumber) => ejsUtil.bufferToHex(ejsUtil.toBuffer(Number(x))) - const nonce = await this.kit.connection.nonce(validatorAddress) - const celoOnlyFields = ethCompatible ? [] : ['0x', '0x', '0x'] - const arr = [ - nonce > 0 ? numToHex(nonce) : '0x', - numToHex(parseInt(this.gasPrice, 10)), - numToHex(1000000), // plenty of gas - ...celoOnlyFields, - toAddress, // to - '0x05', // value: 5 wei - '0x', // no data - ] - // Creates SHA-3 hash of the RLP encoded version of the input.
- const signingHash = ejsUtil.toBuffer(keccak256(ejsRlp.encode(arr))) - const pk = ejsUtil.addHexPrefix(validatorPrivateKey) - const sig = ejsUtil.ecsign(signingHash, ejsUtil.toBuffer(pk)) - arr.push( - ejsUtil.bufferToHex(stripZeros(sig.v) as Buffer), - ejsUtil.bufferToHex(stripZeros(sig.r) as Buffer), - ejsUtil.bufferToHex(stripZeros(sig.s) as Buffer) - ) - return ensure0x(toHex(encode(arr))) - } - - runReplayProtectionTests() { - for (const ethCompatible of [false, true]) { - this.runReplayProtectionTest(ethCompatible) - } - } - - runReplayProtectionTest(ethCompatible: boolean) { - describe(`Transaction without replay protection, ethCompatible: ${ethCompatible}`, () => { - let minedTx: any = null // Use any because we haven't added `ethCompatible` to these types - let error: string | null = null - - before(async () => { - const tx = await this.generateUnprotectedTransaction(ethCompatible) - try { - const receipt = await (await this.kit.connection.sendSignedTransaction(tx)).waitReceipt() - minedTx = await this.kit.web3.eth.getTransaction(receipt.transactionHash) - error = null - } catch (err: any) { - error = err.message - } - }) - if (ethCompatible && !this.cipIsActivated) { - it('fails due to being ethereum-compatible', () => { - assert.isNull(minedTx, 'Transaction succeeded when it should have failed') - assert.equal(error, notYetActivatedError) - }) - } else if (this.cipIsActivated) { - if (this.replayProtectionIsNotMandatory) { - // Should succeed, since replay protection is optional after Espresso - it('succeeds', () => { - assert.isNull(error, 'Transaction failed when it should have succeeded') - }) - } else { - // Replay protection is mandatory, so the transaction should fail - it('fails due to replay protection being mandatory', () => { - assert.isNull(minedTx, 'Transaction succeeded when it should have failed') - assert.equal(error, noReplayProtectionError) - }) - } - } else { - // Should succeed, since replay protection is optional before Donut - 
it('succeeds', () => { - assert.isNull(error, 'Transaction failed when it should have succeeded') - assert.isFalse(minedTx.ethCompatible, 'Transaction has wrong ethCompatible value') - }) - } - }) - } - - runTestCase(testCase: TestCase) { - // Generate a human-readable summary of the test case - const options: string[] = [] - lodash.forEach(testCase, (value, key) => { - if (value === true) { - options.push(key) - } - }) - describe(`Testcase with: ${options.join(', ')}`, () => { - let minedTx: any // Use any because we haven't added `ethCompatible` to these types - let error: string | null = null - - before(async () => { - const tx: CeloTx = { - from: validatorAddress, - gas: 1000000, // plenty for both types of transaction - gasPrice: this.gasPrice, - chainId: this.gethConfig.networkId, - nonce: await this.kit.connection.nonce(validatorAddress), - } - if (testCase.useFeeCurrency) { - tx.feeCurrency = this.stableTokenAddr - } - if (testCase.useGatewayFee) { - tx.gatewayFee = '0x25' - } - if (testCase.useGatewayFeeRecipient) { - tx.gatewayFeeRecipient = gatewayFeeRecipientAddress - } - - if (testCase.contractCreation) { - tx.data = bytecode - } else { - tx.to = toAddress - tx.value = 5 - } - - try { - let txHash: string - // Use the right contractkit/web3 instances according to whether the testcase says to send - // the transaction through the validator or the light client - const k = testCase.lightNode ? this.kitLight : this.kit - const kLocal = testCase.lightNode ? this.kitWithLocalWalletLight : this.kitWithLocalWallet - const w3 = testCase.lightNode ? this.web3Light : this.web3 - - if (testCase.sendRawTransaction) { - // Sign the transaction locally and send using `eth_sendRawTransaction` - let raw: string - if (testCase.ethCompatible) { - const signed = await w3.eth.accounts.signTransaction(tx, validatorPrivateKey) - raw = signed.rawTransaction!
- } else { - const signed = await kLocal.connection.wallet.signTransaction(tx) - raw = signed.raw - } - // Once the transaction is signed and encoded, it doesn't matter whether we send it with web3 or contractkit - txHash = (await w3.eth.sendSignedTransaction(raw)).transactionHash - } else { - tx.chainId = undefined // clear the chainId b/c web3js won't format it as a hex bignum... - // Send using `eth_sendTransaction` - const params: any = tx // haven't added `ethCompatible` to the tx type - // Only include ethCompatible if it's true. This confirms that omitting it results in normal Celo - // transactions, but doesn't test that ethCompatible: false also does. But we will see in the resulting - // transaction object (from eth_getTransaction) that it has ethCompatible: false. - if (testCase.ethCompatible) { - params.ethCompatible = true - } - const res = await k.sendTransaction(params) - txHash = (await res.waitReceipt()).transactionHash - } - - minedTx = await k.web3.eth.getTransaction(txHash) - error = null - } catch (err: any) { - error = err.message - } - }) - - // Verify that sending the transaction either worked or failed as expected for this test case - if (testCase.errorString !== null) { - it(`fails with the expected error (${testCase.errorReason})`, () => { - assert.notEqual(error, null, "Expected an error but didn't get one") - assert.match( - error, - new RegExp(testCase.errorString, 'i'), - `Got "${error}", expected "${testCase.errorString}"` - ) - }) - } else { - it('succeeds', () => { - assert.equal(error, null, 'Got an error but expected the transaction to succeed') - }) - it(`ethCompatible is ${testCase.ethCompatible}`, () => { - assert.equal(minedTx.ethCompatible, testCase.ethCompatible) - }) - } - }) - } - } - - describe('CIP-35 >', function (this: any) { - this.timeout(0) - - describe('before activation', () => { - if (devFilter.cipIsActivated === true) { - return - } - const testEnv = new TestEnv(false, false) // not donut, not espresso -
before(async function (this) { - this.timeout(0) - await testEnv.before() - }) - - if (replayProtectionTests !== 'only') { - for (const testCase of testEnv.testCases) { - testEnv.runTestCase(testCase) - } - } - - if (replayProtectionTests !== 'skip') { - testEnv.runReplayProtectionTests() - } - - after(async function (this: any) { - this.timeout(0) - await testEnv.hooks.after() - }) - }) - - describe('after activation', () => { - if (devFilter.cipIsActivated === false) { - return - } - const testEnv = new TestEnv(true, false) // donut, not espresso - before(async function (this) { - this.timeout(0) - await testEnv.before() - }) - - if (replayProtectionTests !== 'only') { - for (const testCase of testEnv.testCases) { - testEnv.runTestCase(testCase) - } - } - - if (replayProtectionTests !== 'skip') { - testEnv.runReplayProtectionTests() - } - - after(async function (this: any) { - this.timeout(0) - await testEnv.hooks.after() - }) - }) - - describe('after cip50 (optional replay protection)', () => { - if (devFilter.cipIsActivated === false) { - return - } - const testEnv = new TestEnv(true, true) // donut and espresso - before(async function (this) { - this.timeout(0) - await testEnv.before() - }) - - if (replayProtectionTests !== 'only') { - for (const testCase of testEnv.testCases) { - testEnv.runTestCase(testCase) - } - } - - if (replayProtectionTests !== 'skip') { - testEnv.runReplayProtectionTests() - } - - after(async function (this: any) { - this.timeout(0) - await testEnv.hooks.after() - }) - }) -}) diff --git a/packages/celotool/src/e2e-tests/governance_tests.ts b/packages/celotool/src/e2e-tests/governance_tests.ts deleted file mode 100644 index d3cdbc042a7..00000000000 --- a/packages/celotool/src/e2e-tests/governance_tests.ts +++ /dev/null @@ -1,1114 +0,0 @@ -/* eslint-disable */ -import { ContractKit, newKitFromWeb3 } from '@celo/contractkit' -import { getBlsPoP, getBlsPublicKey } from '@celo/cryptographic-utils/lib/bls' -import { eqAddress, 
privateKeyToAddress } from '@celo/utils/lib/address' - import { concurrentMap } from '@celo/utils/lib/async' - import { fromFixed, toFixed } from '@celo/utils/lib/fixidity' - import { bitIsSet, parseBlockExtraData } from '@celo/utils/lib/istanbul' - import BigNumber from 'bignumber.js' - import { assert } from 'chai' - import Web3 from 'web3' - import { AccountType, generateAddress, generatePrivateKey } from '../lib/generate_utils' - import { connectBipartiteClique, connectPeers, initAndStartGeth } from '../lib/geth' - import { GethInstanceConfig } from '../lib/interfaces/geth-instance-config' - import { GethRunConfig } from '../lib/interfaces/geth-run-config' - import { - assertAlmostEqual, - getHooks, - mnemonic, - sleep, - waitForAnnounceToStabilize, - waitForBlock, - waitForEpochTransition, - waitToFinishInstanceSyncing, - } from './utils' - - interface MemberSwapper { - swap(): Promise<void> - } - - const TMP_PATH = '/tmp/e2e' - const verbose = false - const carbonOffsettingPartnerAddress = '0x1234567812345678123456781234567812345678' - const validatorAddress = '0x47e172f6cfb6c7d01c1574fa3e2be7cc73269d95' - // The tests calculate some expected values based on the rewards multiplier from the block - // before an epoch block. However, the actual rewards multiplier used for epoch rewards is - // calculated inside the epoch block. Since the multiplier depends on the timestamp, this means - // the expected values will never exactly match the actual values, so we need some tolerance. - // This constant defines the tolerance as a fraction of the expected - // values.
We use 10^-6, so they have to match to (nearly) 6 significant figures - const tolerance = new BigNumber(10).pow(new BigNumber(-6)) - - async function newMemberSwapper(kit: ContractKit, members: string[]): Promise<MemberSwapper> { - let index = 0 - const group = (await kit.connection.getAccounts())[0] - await Promise.all(members.slice(1).map((member) => removeMember(member))) - - async function removeMember(member: string) { - return (await kit.contracts.getValidators()) - .removeMember(member) - .sendAndWaitForReceipt({ from: group }) - } - - async function addMember(member: string) { - return ( - await (await kit.contracts.getValidators()).addMember(group, member) - ).sendAndWaitForReceipt({ from: group }) - } - - async function getGroupMembers() { - const groupInfo = await (await kit._web3Contracts.getValidators()).methods - .getValidatorGroup(group) - .call() - return groupInfo[0] - } - - return { - async swap() { - const removedMember = members[index % members.length] - await removeMember(members[index % members.length]) - index = index + 1 - const addedMember = members[index % members.length] - await addMember(members[index % members.length]) - const groupMembers = await getGroupMembers() - assert.include(groupMembers, addedMember) - assert.notInclude(groupMembers, removedMember) - }, - } - } - - interface KeyRotator { - rotate(): Promise<void> - } - - async function newKeyRotator( - kit: ContractKit, - kits: ContractKit[], - privateKeys: string[] - ): Promise<KeyRotator> { - let index = 0 - const validator = (await kit.connection.getAccounts())[0] - const accountsWrapper = await kit.contracts.getAccounts() - - async function authorizeValidatorSigner( - signer: string, - signerKit: ContractKit, - signerPrivateKey: string - ) { - const blsPublicKey = getBlsPublicKey(signerPrivateKey) - const blsPop = getBlsPoP(validator, signerPrivateKey) - const accounts = await signerKit.contracts.getAccounts() - const pop = await accounts.generateProofOfKeyPossession(validator, signer) - return ( - await
accountsWrapper.authorizeValidatorSignerAndBls(signer, pop, blsPublicKey, blsPop) - ).sendAndWaitForReceipt({ - from: validator, - }) - } - - return { - async rotate() { - if (index < kits.length) { - const signerKit = kits[index] - const signer: string = (await signerKit.connection.getAccounts())[0] - const signerPrivateKey = privateKeys[index] - await authorizeValidatorSigner(signer, signerKit, signerPrivateKey) - index += 1 - assert.equal(await accountsWrapper.getValidatorSigner(validator), signer) - } - }, - } - } - - async function calculateUptime( - kit: ContractKit, - validatorSetSize: number, - lastBlockNumberOfEpoch: number, - epochSize: number, - lookbackWindow: number - ): Promise<BigNumber[]> { - const firstBlockNumberOfEpoch = lastBlockNumberOfEpoch - epochSize + 1 - - const monitoringWindow: [number, number] = [ - firstBlockNumberOfEpoch + lookbackWindow - 1, // last block of first lookbackWindow - lastBlockNumberOfEpoch - 2, // we ignore last 2 blocks of epoch - ] - const monitoringWindowSize = monitoringWindow[1] - monitoringWindow[0] + 1 - - // we need to obtain monitoring window blocks shifted by 1, since - // we are interested in parentAggregatedSeal that lives on the next block - const blocks = await concurrentMap(10, [...Array(monitoringWindowSize).keys()], (i) => - kit.connection.getBlock(monitoringWindow[0] + 1 + i) - ) - const lastSignedBlock: number[] = new Array(validatorSetSize).fill(0) - const upBlocks: number[] = new Array(validatorSetSize).fill(0) - - // Follows updateUptime() in core/blockchain.go - for (const block of blocks) { - const blockNumber = block.number - 1 // we are actually monitoring prev block signatures - const bitmap = parseBlockExtraData(block.extraData).parentAggregatedSeal.bitmap - - const isMonitoredBlock = - blockNumber >= monitoringWindow[0] && blockNumber <= monitoringWindow[1] - - const currentLookbackWindow: [number, number] = [blockNumber - lookbackWindow + 1, blockNumber] - - for (let signerIndex = 0; signerIndex <
validatorSetSize; signerIndex++) { - if (bitIsSet(bitmap, signerIndex)) { - lastSignedBlock[signerIndex] = blockNumber - } - - const lastSignedWithinLookbackwindow = - lastSignedBlock[signerIndex] >= currentLookbackWindow[0] && - lastSignedBlock[signerIndex] <= currentLookbackWindow[1] - - if (isMonitoredBlock && lastSignedWithinLookbackwindow) { - upBlocks[signerIndex]++ - } - } - } - - const maxPotentialUpBlocks = monitoringWindowSize - return upBlocks.map((x) => new BigNumber(x / maxPotentialUpBlocks)) -} - -// TODO(asa): Test independent rotation of ecdsa, bls keys. -describe('governance tests', () => { - const gethConfig: GethRunConfig = { - runPath: TMP_PATH, - verbosity: 3, - migrate: true, - networkId: 1101, - network: 'local', - genesisConfig: { - churritoBlock: 0, - donutBlock: 0, - espressoBlock: 0, - epoch: 10, - }, - instances: [ - // Validators 0 and 1 are swapped in and out of the group. - { - name: 'validator0', - validating: true, - syncmode: 'full', - port: 30303, - rpcport: 8545, - }, - { - name: 'validator1', - validating: true, - syncmode: 'full', - port: 30305, - rpcport: 8547, - }, - // Validator 2 will authorize a validating key every other epoch. - { - name: 'validator2', - validating: true, - syncmode: 'full', - port: 30307, - rpcport: 8549, - }, - { - name: 'validator3', - validating: true, - syncmode: 'full', - port: 30309, - rpcport: 8551, - }, - { - name: 'validator4', - validating: true, - syncmode: 'full', - port: 30311, - rpcport: 8553, - }, - ], - } - - const hooks: any = getHooks(gethConfig) - - let web3: Web3 - let election: any - let stableToken: any - let sortedOracles: any - let epochRewards: any - let goldToken: any - let reserve: any - let validators: any - let accounts: any - let kit: ContractKit - - before(async function (this: any) { - this.timeout(0) - // Comment out the following line after a local run for a quick rerun. 
- await hooks.before() - }) - - after(async function (this: any) { - this.timeout(0) - await hooks.after() - }) - - const restart = async () => { - await hooks.restart() - web3 = new Web3('http://localhost:8545') - kit = newKitFromWeb3(web3) - // TODO(mcortesi): magic sleep. without it unlockAccount sometimes fails - await sleep(2) - // Assuming empty password - await kit.connection.web3.eth.personal.unlockAccount(validatorAddress, '', 1000000) - - goldToken = await kit._web3Contracts.getGoldToken() - stableToken = await kit._web3Contracts.getStableToken() - sortedOracles = await kit._web3Contracts.getSortedOracles() - validators = await kit._web3Contracts.getValidators() - reserve = await kit._web3Contracts.getReserve() - election = await kit._web3Contracts.getElection() - epochRewards = await kit._web3Contracts.getEpochRewards() - accounts = await kit._web3Contracts.getAccounts() - - await waitForBlock(web3, 1) - await waitForAnnounceToStabilize(web3) - - const er = await kit._web3Contracts.getEpochRewards() - const fraction = await er.methods.getCarbonOffsettingFraction().call() - await er.methods - .setCarbonOffsettingFund(carbonOffsettingPartnerAddress, fraction) - .send({ from: validatorAddress }) - } - - const getValidatorGroupMembers = async (blockNumber?: number) => { - if (blockNumber) { - const [groupAddress] = await validators.methods - .getRegisteredValidatorGroups() - .call({}, blockNumber) - const groupInfo = await validators.methods - .getValidatorGroup(groupAddress) - .call({}, blockNumber) - return groupInfo[0] - } else { - const [groupAddress] = await validators.methods.getRegisteredValidatorGroups().call() - const groupInfo = await validators.methods.getValidatorGroup(groupAddress).call() - return groupInfo[0] - } - } - - const getValidatorSigner = async (address: string, blockNumber?: number) => { - if (blockNumber) { - return accounts.methods.getValidatorSigner(address).call({}, blockNumber) - } else { - return 
accounts.methods.getValidatorSigner(address).call() - } - } - - const getValidatorGroupPrivateKey = async () => { - const [groupAddress] = await validators.methods.getRegisteredValidatorGroups().call() - // If we're using mycelo, we can just generate the validator group key directly - const myceloAddress = generateAddress(mnemonic, AccountType.VALIDATOR_GROUP, 0) - if (myceloAddress === groupAddress) { - return '0x' + generatePrivateKey(mnemonic, AccountType.VALIDATOR_GROUP, 0) - } - // Otherwise, the validator group key is encoded in its name (see 30_elect_validators.ts) - const name = await accounts.methods.getName(groupAddress).call() - const encryptedKeystore64 = name.split(' ')[1] - const encryptedKeystore = JSON.parse(Buffer.from(encryptedKeystore64, 'base64').toString()) - // The validator group ID is the validator group keystore encrypted with validator 0's - // private key. - const encryptionKey = `0x${gethConfig.instances[0].privateKey}` - const decryptedKeystore = web3.eth.accounts.decrypt(encryptedKeystore, encryptionKey) - return decryptedKeystore.privateKey - } - - const isLastBlockOfEpoch = (blockNumber: number, epochSize: number) => - blockNumber % epochSize === 0 - - const assertBalanceChanged = async ( - address: string, - blockNumber: number, - expected: BigNumber, - token: any - ) => { - const currentBalance = new BigNumber( - await token.methods.balanceOf(address).call({}, blockNumber) - ) - const previousBalance = new BigNumber( - await token.methods.balanceOf(address).call({}, blockNumber - 1) - ) - assert.isFalse(currentBalance.isNaN()) - assert.isFalse(previousBalance.isNaN()) - const margin = expected.times(tolerance) - assertAlmostEqual(currentBalance.minus(previousBalance), expected, margin) - } - - const assertTargetVotingYieldChanged = async (blockNumber: number, expected: BigNumber) => { - const currentTarget = new BigNumber( - (await epochRewards.methods.getTargetVotingYieldParameters().call({}, blockNumber))[0] - ) - const 
previousTarget = new BigNumber( - (await epochRewards.methods.getTargetVotingYieldParameters().call({}, blockNumber - 1))[0] - ) - const max = new BigNumber( - (await epochRewards.methods.getTargetVotingYieldParameters().call({}, blockNumber))[1] - ) - const expectedTarget = previousTarget.plus(expected) - if (expectedTarget.isGreaterThanOrEqualTo(max)) { - assert.equal(currentTarget.toFixed(), max.toFixed()) - } else if (expectedTarget.isLessThanOrEqualTo(0)) { - assert.isTrue(currentTarget.isZero()) - } else { - const difference = currentTarget.minus(previousTarget) - // Assert equal to 9 decimal places due to rounding errors. - assert.equal(fromFixed(difference).dp(9).toFixed(), fromFixed(expected).dp(9).toFixed()) - } - } - - const assertTargetVotingYieldUnchanged = (blockNumber: number) => - assertTargetVotingYieldChanged(blockNumber, new BigNumber(0)) - - const getLastEpochBlock = (blockNumber: number, epoch: number) => { - const epochNumber = Math.floor((blockNumber - 1) / epoch) - return epochNumber * epoch - } - - const assertGoldTokenTotalSupplyUnchanged = (blockNumber: number) => - assertGoldTokenTotalSupplyChanged(blockNumber, new BigNumber(0)) - - const assertGoldTokenTotalSupplyChanged = async (blockNumber: number, expected: BigNumber) => { - const currentSupply = new BigNumber(await goldToken.methods.totalSupply().call({}, blockNumber)) - const previousSupply = new BigNumber( - await goldToken.methods.totalSupply().call({}, blockNumber - 1) - ) - assertAlmostEqual(currentSupply.minus(previousSupply), expected) - } - - describe('when the validator set is changing', () => { - const blockNumbers: number[] = [] - - let epoch: number - let validatorAccounts: string[] - - before(async function (this: any) { - this.timeout(0) // Disable test timeout - - await restart() - - const groupPrivateKey = await getValidatorGroupPrivateKey() - - const validatorGroup: GethInstanceConfig = { - name: 'validatorGroup', - validating: false, - syncmode: 'full', - port: 
30313, - wsport: 8555, - rpcport: 8557, - privateKey: groupPrivateKey.slice(2), - } - - await initAndStartGeth(gethConfig, hooks.gethBinaryPath, validatorGroup, verbose) - - await connectPeers([...gethConfig.instances, validatorGroup], verbose) - - await waitToFinishInstanceSyncing(validatorGroup) - - validatorAccounts = await getValidatorGroupMembers() - assert.equal(validatorAccounts.length, 5) - epoch = new BigNumber(await validators.methods.getEpochSize().call()).toNumber() - assert.equal(epoch, 10) - - // Wait for an epoch transition so we can activate our vote. - await waitForEpochTransition(web3, epoch) - await sleep(5.5) - // Wait for an extra epoch transition to ensure everyone is connected to one another. - await waitForEpochTransition(web3, epoch) - - const groupWeb3Url = 'ws://localhost:8555' - - // Prepare for member swapping. - const groupWeb3 = new Web3(groupWeb3Url) - - const groupKit = newKitFromWeb3(groupWeb3) - - const group: string = (await groupWeb3.eth.getAccounts())[0] - // Send some funds to the group, so it can afford fees - await ( - await kit.sendTransaction({ - from: validatorAddress, - to: group, - value: Web3.utils.toWei('1', 'ether'), - }) - ).waitReceipt() - - // groupKit uses a different node than kit does, so wait a second in case kit's node - // got the new block before groupKit's node did. 
- await sleep(1) - const txos = await (await groupKit.contracts.getElection()).activate(group) - for (const txo of txos) { - await txo.sendAndWaitForReceipt({ from: group }) - } - - validators = await groupKit._web3Contracts.getValidators() - const membersToSwap = [validatorAccounts[0], validatorAccounts[1]] - const memberSwapper = await newMemberSwapper(groupKit, membersToSwap) - // The memberSwapper makes a change when it's created, so we wait for epoch change so it takes effect - await waitForEpochTransition(web3, epoch) - - const handled: any = {} - - let errorWhileChangingValidatorSet = '' - const changeValidatorSet = async (header: any) => { - try { - if (handled[header.number]) { - return - } - handled[header.number] = true - blockNumbers.push(header.number) - // At the start of epoch N, perform actions so the validator set is different for epoch N + 1. - // Note that all of these actions MUST complete within the epoch. - if (header.number % epoch === 0 && errorWhileChangingValidatorSet === '') { - // 1. Swap validator0 and validator1 so one is a member of the group and the other is not. - // 2. Rotate keys for validator 2 by authorizing a new validating key. - await memberSwapper.swap() - } - } catch (e: any) { - console.error(e) - errorWhileChangingValidatorSet = e.toString() - } - } - - const subscription = groupWeb3.eth.subscribe('newBlockHeaders') - subscription.on('data', changeValidatorSet) - - // Wait for a few epochs while changing the validator set. - while (blockNumbers.length < 40) { - // Prepare for member swapping. - await sleep(epoch) - } - ;(subscription as any).unsubscribe() - - // Wait for the current epoch to complete. 
- await sleep(epoch) - assert.equal(errorWhileChangingValidatorSet, '') - }) - - const getValidatorSetSignersAtBlock = async (blockNumber: number): Promise<string[]> => { - return election.methods.getCurrentValidatorSigners().call({}, blockNumber) - } - - const getValidatorSetAccountsAtBlock = async (blockNumber: number) => { - const signingKeys = await getValidatorSetSignersAtBlock(blockNumber) - return Promise.all( - signingKeys.map((address: string) => - accounts.methods.signerToAccount(address).call({}, blockNumber) - ) - ) - } - - it('should always return a validator set size equal to the number of group members at the end of the last epoch', async () => { - for (const blockNumber of blockNumbers) { - const lastEpochBlock = getLastEpochBlock(blockNumber, epoch) - const validatorSetSize = await election.methods - .numberValidatorsInCurrentSet() - .call({}, blockNumber) - const groupMembership = await getValidatorGroupMembers(lastEpochBlock) - assert.equal(validatorSetSize, groupMembership.length) - } - }) - - it('should always return a validator set equal to the signing keys of the group members at the end of the last epoch', async function (this: any) { - this.timeout(0) - for (const blockNumber of blockNumbers) { - const lastEpochBlock = getLastEpochBlock(blockNumber, epoch) - const memberAccounts = await getValidatorGroupMembers(lastEpochBlock) - const memberSigners = await Promise.all( - memberAccounts.map((v: string) => getValidatorSigner(v, lastEpochBlock)) - ) - const validatorSetSigners = await getValidatorSetSignersAtBlock(blockNumber) - const validatorSetAccounts = await getValidatorSetAccountsAtBlock(blockNumber) - assert.sameMembers(memberSigners, validatorSetSigners) - assert.sameMembers(memberAccounts, validatorSetAccounts) - } - }) - - it('should block propose in a round robin fashion', async () => { - let roundRobinOrder: string[] = [] - for (const blockNumber of blockNumbers) { - const lastEpochBlock = getLastEpochBlock(blockNumber, epoch) - // Fetch the
round robin order if it hasn't already been set for this epoch. - if (roundRobinOrder.length === 0 || blockNumber === lastEpochBlock + 1) { - const validatorSet = await getValidatorSetSignersAtBlock(blockNumber) - roundRobinOrder = await Promise.all( - validatorSet.map( - async (_, i) => (await web3.eth.getBlock(lastEpochBlock + i + 1)).miner - ) - ) - assert.sameMembers(roundRobinOrder, validatorSet) - } - const indexInEpoch = blockNumber - lastEpochBlock - 1 - const expectedProposer = roundRobinOrder[indexInEpoch % roundRobinOrder.length] - const block = await web3.eth.getBlock(blockNumber) - assert(eqAddress(block.miner, expectedProposer)) - } - }) - - it('should update the validator scores at the end of each epoch', async function (this: any) { - this.timeout(0) - const scoreParams = await validators.methods.getValidatorScoreParameters().call() - const exponent = new BigNumber(scoreParams[0]) - const adjustmentSpeed = fromFixed(new BigNumber(scoreParams[1])) - - const assertScoreUnchanged = async (validator: string, blockNumber: number) => { - const score = new BigNumber( - (await validators.methods.getValidator(validator).call({}, blockNumber)).score - ) - const previousScore = new BigNumber( - (await validators.methods.getValidator(validator).call({}, blockNumber - 1)).score - ) - assert.isFalse(score.isNaN()) - assert.isFalse(previousScore.isNaN()) - assert.equal(score.toFixed(), previousScore.toFixed()) - } - - const assertScoreChanged = async ( - validator: string, - blockNumber: number, - uptime: BigNumber - ) => { - const score = new BigNumber( - (await validators.methods.getValidator(validator).call({}, blockNumber)).score - ) - const previousScore = new BigNumber( - (await validators.methods.getValidator(validator).call({}, blockNumber - 1)).score - ) - assert.isFalse(score.isNaN()) - assert.isFalse(previousScore.isNaN()) - - const epochScore = uptime.exponentiatedBy(exponent) - const expectedScore = BigNumber.minimum( - epochScore, - adjustmentSpeed - 
.times(epochScore) - .plus(new BigNumber(1).minus(adjustmentSpeed).times(fromFixed(previousScore))) - ) - assertAlmostEqual(score, toFixed(expectedScore)) - } - - for (const blockNumber of blockNumbers) { - let expectUnchangedScores: string[] - let expectChangedScores: string[] - let electedValidators: string[] - let uptime: BigNumber[] - if (isLastBlockOfEpoch(blockNumber, epoch)) { - expectChangedScores = await getValidatorSetAccountsAtBlock(blockNumber) - expectUnchangedScores = validatorAccounts.filter((x) => !expectChangedScores.includes(x)) - electedValidators = await getValidatorSetAccountsAtBlock(blockNumber) - uptime = await calculateUptime(kit, electedValidators.length, blockNumber, epoch, 3) - } else { - expectUnchangedScores = validatorAccounts - expectChangedScores = [] - electedValidators = [] - uptime = [] - } - - for (const validator of expectUnchangedScores) { - await assertScoreUnchanged(validator, blockNumber) - } - - for (const validator of expectChangedScores) { - const signerIndex = electedValidators.map(eqAddress.bind(null, validator)).indexOf(true) - await assertScoreChanged(validator, blockNumber, uptime[signerIndex]) - } - } - }) - - it('should distribute epoch payments at the end of each epoch', async function (this: any) { - this.timeout(0) - const commission = 0.1 - const targetValidatorEpochPayment = new BigNumber( - await epochRewards.methods.targetValidatorEpochPayment().call() - ) - const [group] = await validators.methods.getRegisteredValidatorGroups().call() - - const assertBalanceUnchanged = async (validator: string, blockNumber: number) => { - await assertBalanceChanged(validator, blockNumber, new BigNumber(0), stableToken) - } - - const getExpectedTotalPayment = async (validator: string, blockNumber: number) => { - const score = new BigNumber( - (await validators.methods.getValidator(validator).call({}, blockNumber)).score - ) - assert.isFalse(score.isNaN()) - // We need to calculate the rewards multiplier for the previous 
block, before - // the rewards actually are awarded. - const rewardsMultiplier = new BigNumber( - await epochRewards.methods.getRewardsMultiplier().call({}, blockNumber - 1) - ) - return targetValidatorEpochPayment - .times(fromFixed(score)) - .times(fromFixed(rewardsMultiplier)) - } - - for (const blockNumber of blockNumbers) { - let expectUnchangedBalances: string[] - let expectChangedBalances: string[] - if (isLastBlockOfEpoch(blockNumber, epoch)) { - expectChangedBalances = await getValidatorSetAccountsAtBlock(blockNumber) - expectUnchangedBalances = validatorAccounts.filter( - (x) => !expectChangedBalances.includes(x) - ) - } else { - expectUnchangedBalances = validatorAccounts - expectChangedBalances = [] - } - - for (const validator of expectUnchangedBalances) { - await assertBalanceUnchanged(validator, blockNumber) - } - - let expectedGroupPayment = new BigNumber(0) - for (const validator of expectChangedBalances) { - const expectedTotalPayment = await getExpectedTotalPayment(validator, blockNumber) - const groupPayment = expectedTotalPayment.times(commission) - await assertBalanceChanged( - validator, - blockNumber, - expectedTotalPayment.minus(groupPayment), - stableToken - ) - expectedGroupPayment = expectedGroupPayment.plus(groupPayment) - } - await assertBalanceChanged(group, blockNumber, expectedGroupPayment, stableToken) - } - }) - - it('should distribute epoch rewards at the end of each epoch', async function (this: any) { - this.timeout(0) - const lockedGold = await kit._web3Contracts.getLockedGold() - const governance = await kit._web3Contracts.getGovernance() - const gasPriceMinimum = await kit._web3Contracts.getGasPriceMinimum() - const [group] = await validators.methods.getRegisteredValidatorGroups().call() - - const assertVotesChanged = async (blockNumber: number, expected: BigNumber) => { - const currentVotes = new BigNumber( - await election.methods.getTotalVotesForGroup(group).call({}, blockNumber) - ) - const previousVotes = new BigNumber( 
- await election.methods.getTotalVotesForGroup(group).call({}, blockNumber - 1) - ) - const margin = expected.times(tolerance) - assertAlmostEqual(currentVotes.minus(previousVotes), expected, margin) - } - - // Returns the gas fee base for a given block, which is distributed to the governance contract. - const blockBaseGasFee = async (blockNumber: number): Promise<BigNumber> => { - const gas = (await web3.eth.getBlock(blockNumber)).gasUsed - // @ts-ignore - TODO: remove when web3 upgrade completed - const gpm = await gasPriceMinimum.methods.gasPriceMinimum().call({}, blockNumber) - return new BigNumber(gpm).times(new BigNumber(gas)) - } - - const assertLockedGoldBalanceChanged = async (blockNumber: number, expected: BigNumber) => { - await assertBalanceChanged(lockedGold.options.address, blockNumber, expected, goldToken) - } - - const assertGovernanceBalanceChanged = async (blockNumber: number, expected: BigNumber) => { - await assertBalanceChanged(governance.options.address, blockNumber, expected, goldToken) - } - - const assertReserveBalanceChanged = async (blockNumber: number, expected: BigNumber) => { - await assertBalanceChanged(reserve.options.address, blockNumber, expected, goldToken) - } - - const assertCarbonOffsettingBalanceChanged = async ( - blockNumber: number, - expected: BigNumber - ) => { - await assertBalanceChanged(carbonOffsettingPartnerAddress, blockNumber, expected, goldToken) - } - - const assertVotesUnchanged = async (blockNumber: number) => { - await assertVotesChanged(blockNumber, new BigNumber(0)) - } - - const assertLockedGoldBalanceUnchanged = async (blockNumber: number) => { - await assertLockedGoldBalanceChanged(blockNumber, new BigNumber(0)) - } - - const assertReserveBalanceUnchanged = async (blockNumber: number) => { - await assertReserveBalanceChanged(blockNumber, new BigNumber(0)) - } - - const assertCarbonOffsettingBalanceUnchanged = async (blockNumber: number) => { - await assertCarbonOffsettingBalanceChanged(blockNumber, new BigNumber(0))
- } - - const getStableTokenSupplyChange = async (blockNumber: number) => { - const currentSupply = new BigNumber( - await stableToken.methods.totalSupply().call({}, blockNumber) - ) - const previousSupply = new BigNumber( - await stableToken.methods.totalSupply().call({}, blockNumber - 1) - ) - return currentSupply.minus(previousSupply) - } - - const getStableTokenExchangeRate = async (blockNumber: number) => { - const rate = await sortedOracles.methods - .medianRate(stableToken.options.address) - .call({}, blockNumber) - return new BigNumber(rate[0]).div(rate[1]) - } - - for (const blockNumber of blockNumbers) { - if (isLastBlockOfEpoch(blockNumber, epoch)) { - // We use the number of active votes from the previous block to calculate the expected - // epoch reward as the number of active votes for the current block will include the - // epoch reward. - const activeVotes = new BigNumber( - await election.methods.getActiveVotes().call({}, blockNumber - 1) - ) - assert.isFalse(activeVotes.isZero()) - - // We need to calculate the rewards multiplier for the previous block, before - // the rewards actually are awarded. 
- const rewardsMultiplier = new BigNumber( - await epochRewards.methods.getRewardsMultiplier().call({}, blockNumber - 1) - ) - assert.isFalse(rewardsMultiplier.isZero()) - - // This is the array of rewards that should have been distributed - const targetRewards = await epochRewards.methods - .calculateTargetEpochRewards() - .call({}, blockNumber - 1) - // This is with reward multiplier - const perValidatorReward = new BigNumber(targetRewards[0]) - const validatorSetSize = await election.methods - .numberValidatorsInCurrentSet() - .call({}, blockNumber - 1) - const exchangeRate = await getStableTokenExchangeRate(blockNumber) - // Calculate total validator reward in gold to calc infra reward - const maxPotentialValidatorReward = perValidatorReward - .times(validatorSetSize) - .div(exchangeRate) - // Calculate the expected voting reward - const targetVotingYield = new BigNumber( - (await epochRewards.methods.getTargetVotingYieldParameters().call({}, blockNumber))[0] - ) - assert.isFalse(targetVotingYield.isZero()) - const expectedVoterRewards = activeVotes - .times(fromFixed(targetVotingYield)) - .times(fromFixed(rewardsMultiplier)) - - // infra: (x / (1 - x)) * predicted supply increase * rewards mult - const communityRewardFrac = new BigNumber( - await epochRewards.methods.getCommunityRewardFraction().call({}, blockNumber) - ) - const carbonOffsettingFrac = new BigNumber( - await epochRewards.methods.getCarbonOffsettingFraction().call({}, blockNumber) - ) - - const fundFactor = new BigNumber(1) - .minus(fromFixed(communityRewardFrac)) - .minus(fromFixed(carbonOffsettingFrac)) - - const expectedCommunityReward = expectedVoterRewards - .plus(maxPotentialValidatorReward) - .times(fromFixed(communityRewardFrac)) - .div(fundFactor) - - const expectedCarbonOffsettingPartnerAward = expectedVoterRewards - .plus(maxPotentialValidatorReward) - .times(fromFixed(carbonOffsettingFrac)) - .div(fundFactor) - - const stableTokenSupplyChange = await 
getStableTokenSupplyChange(blockNumber) - const expectedGoldTotalSupplyChange = expectedCommunityReward - .plus(expectedVoterRewards) - .plus(expectedCarbonOffsettingPartnerAward) - .plus(stableTokenSupplyChange.div(exchangeRate)) - // Check TS calc'd rewards against solidity calc'd rewards - const totalVoterRewards = new BigNumber(targetRewards[1]) - const totalCommunityReward = new BigNumber(targetRewards[2]) - const carbonOffsettingPartnerAward = new BigNumber(targetRewards[3]) - assertAlmostEqual(expectedVoterRewards, totalVoterRewards) - assertAlmostEqual(expectedCommunityReward, totalCommunityReward) - assertAlmostEqual(expectedCarbonOffsettingPartnerAward, carbonOffsettingPartnerAward) - // Check TS calc'd rewards against what happened - await assertVotesChanged(blockNumber, expectedVoterRewards) - await assertLockedGoldBalanceChanged(blockNumber, expectedVoterRewards) - await assertGovernanceBalanceChanged( - blockNumber, - expectedCommunityReward.plus(await blockBaseGasFee(blockNumber)) - ) - await assertReserveBalanceChanged(blockNumber, stableTokenSupplyChange.div(exchangeRate)) - await assertGoldTokenTotalSupplyChanged(blockNumber, expectedGoldTotalSupplyChange) - await assertCarbonOffsettingBalanceChanged( - blockNumber, - expectedCarbonOffsettingPartnerAward - ) - } else { - await assertVotesUnchanged(blockNumber) - await assertGoldTokenTotalSupplyUnchanged(blockNumber) - await assertLockedGoldBalanceUnchanged(blockNumber) - await assertReserveBalanceUnchanged(blockNumber) - await assertGovernanceBalanceChanged(blockNumber, await blockBaseGasFee(blockNumber)) - await assertCarbonOffsettingBalanceUnchanged(blockNumber) - } - } - }) - - it('should update the target voting yield', async () => { - for (const blockNumber of blockNumbers) { - if (isLastBlockOfEpoch(blockNumber, epoch)) { - // We use the voting gold fraction from before the rewards are granted. 
- const votingGoldFraction = new BigNumber( - await epochRewards.methods.getVotingGoldFraction().call({}, blockNumber - 1) - ) - const targetVotingGoldFraction = new BigNumber( - await epochRewards.methods.getTargetVotingGoldFraction().call({}, blockNumber) - ) - const difference = targetVotingGoldFraction.minus(votingGoldFraction) - const adjustmentFactor = fromFixed( - new BigNumber( - (await epochRewards.methods.getTargetVotingYieldParameters().call({}, blockNumber))[2] - ) - ) - const delta = difference.times(adjustmentFactor) - await assertTargetVotingYieldChanged(blockNumber, delta) - } else { - await assertTargetVotingYieldUnchanged(blockNumber) - } - } - }) - - it('should have emitted the correct events when paying epoch rewards', async () => { - const currentBlock = (await web3.eth.getBlock('latest')).number - const events = [ - { - contract: epochRewards, - name: 'TargetVotingYieldUpdated', - }, - { - contract: validators, - name: 'ValidatorEpochPaymentDistributed', - }, - { - contract: validators, - name: 'ValidatorScoreUpdated', - }, - { - contract: election, - name: 'EpochRewardsDistributedToVoters', - }, - ] - for (const event of events) { - const eventLogs = await event.contract.getPastEvents(event.name, { - fromBlock: currentBlock - 10, - toBlock: currentBlock, - }) - assert( - eventLogs.every((a: any) => a.blockNumber % 10 === 0), - `every ${event.name} event occurred on the last block of the epoch` - ) - assert(eventLogs.length > 0, `at least one ${event.name} event occurred`) - } - }) - }) - - describe('when rotating keys', () => { - const blockNumbers: number[] = [] - const miners: string[] = [] - const rotation0PrivateKey = '0xa42ac9c99f6ab2c96ee6cae1b40d36187f65cd878737f6623cd363fb94ba7087' - const rotation1PrivateKey = '0x4519cae145fb9499358be484ca60c80d8f5b7f9c13ff82c88ec9e13283e9de1a' - - const rotation0Address = privateKeyToAddress(rotation0PrivateKey) - const rotation1Address = privateKeyToAddress(rotation1PrivateKey) - - let epoch: number - let
validatorAccounts: string[] - - before(async function (this: any) { - this.timeout(0) // Disable test timeout - - await restart() - - const groupPrivateKey = await getValidatorGroupPrivateKey() - - const validatorGroup: GethInstanceConfig = { - name: 'validatorGroup', - validating: false, - syncmode: 'full', - port: 30313, - wsport: 8555, - rpcport: 8557, - privateKey: groupPrivateKey.slice(2), - } - - await initAndStartGeth(gethConfig, hooks.gethBinaryPath, validatorGroup, verbose) - - await connectPeers([...gethConfig.instances, validatorGroup], verbose) - - console.info('wait for validatorGroup to finish syncing') - await waitToFinishInstanceSyncing(validatorGroup) - - const additionalValidatingNodes: GethInstanceConfig[] = [ - { - name: 'validator2KeyRotation0', - validating: true, - syncmode: 'full', - lightserv: false, - port: 30315, - wsport: 8559, - rpcport: 9559, - privateKey: rotation0PrivateKey.slice(2), - minerValidator: privateKeyToAddress(rotation0PrivateKey.slice(2)), - }, - { - name: 'validator2KeyRotation1', - validating: true, - syncmode: 'full', - lightserv: false, - port: 30317, - wsport: 8561, - rpcport: 9561, - privateKey: rotation1PrivateKey.slice(2), - minerValidator: privateKeyToAddress(rotation1PrivateKey.slice(2)), - }, - ] - - await Promise.all( - additionalValidatingNodes.map((nodeConfig: GethInstanceConfig) => - initAndStartGeth(gethConfig, hooks.gethBinaryPath, nodeConfig, verbose) - ) - ) - - // Connect the validating nodes to the non-validating nodes, to test that announce messages are properly gossiped. 
- await connectBipartiteClique(gethConfig.instances, additionalValidatingNodes, verbose) - - console.info('wait for new validators to sync') - await Promise.all(additionalValidatingNodes.map((i) => waitToFinishInstanceSyncing(i))) - - validatorAccounts = await getValidatorGroupMembers() - assert.equal(validatorAccounts.length, 5) - epoch = new BigNumber(await validators.methods.getEpochSize().call()).toNumber() - assert.equal(epoch, 10) - - console.info('wait for end of epoch') - // Wait for an epoch transition to ensure everyone is connected to one another. - await waitForEpochTransition(web3, epoch) - - const groupWeb3Url = 'ws://localhost:8555' - const groupWeb3 = new Web3(groupWeb3Url) - - const groupKit = newKitFromWeb3(groupWeb3) - - // Prepare for key rotation. - const validatorRpc = 'http://localhost:8549' - const validatorWeb3 = new Web3(validatorRpc) - const authWeb31 = 'ws://localhost:8559' - const authWeb32 = 'ws://localhost:8561' - const authorizedKits = [ - newKitFromWeb3(new Web3(authWeb31)), - newKitFromWeb3(new Web3(authWeb32)), - ] - const authorizedPrivateKeys = [rotation0PrivateKey, rotation1PrivateKey] - const keyRotator = await newKeyRotator( - newKitFromWeb3(validatorWeb3), - authorizedKits, - authorizedPrivateKeys - ) - - const handled: any = {} - - let errorWhileChangingValidatorSet = '' - let lastRotated = 0 - const changeValidatorSet = async (header: any) => { - try { - if (handled[header.number]) { - return - } - handled[header.number] = true - blockNumbers.push(header.number) - miners.push(header.miner) - // At the start of epoch N, perform actions so the validator set is different for epoch N + 1. - // Note that all of these actions MUST complete within the epoch. - if ( - header.number % 10 === 0 && - errorWhileChangingValidatorSet === '' && - lastRotated + 60 <= header.number - ) { - // Rotate keys for validator 2 by authorizing a new validating key. - lastRotated = header.number - await keyRotator.rotate() - } - } catch (e: any) { - console.error(e) - errorWhileChangingValidatorSet = e.toString() - } - } - - const subscription = groupKit.connection.web3.eth.subscribe('newBlockHeaders') - subscription.on('data', changeValidatorSet) - - // Wait for a few epochs while rotating keys. - while (blockNumbers.length < 180) { - await sleep(epoch) - } - ;(subscription as any).unsubscribe() - - // Wait for the current epoch to complete. - await sleep(epoch) - assert.equal(errorWhileChangingValidatorSet, '') - }) - - it('rotated key 0 should have signed at least one block', async () => { - const rotation0MinedBlock = miners.some((a) => eqAddress(a, rotation0Address)) - if (!rotation0MinedBlock) { - console.info(rotation0Address, rotation1Address, miners) - } - assert.isTrue(rotation0MinedBlock) - }) - - it('rotated key 1 should have signed at least one block', async () => { - const rotation1MinedBlock = miners.some((a) => eqAddress(a, rotation1Address)) - if (!rotation1MinedBlock) { - console.info(rotation0Address, rotation1Address, miners) - } - assert.isTrue(rotation1MinedBlock) - }) - }) - - describe('when rewards distribution is frozen', () => { - let epoch: number - let blockFrozen: number - let latestBlock: number - - before(async function (this: any) { - this.timeout(0) - await restart() - const validator = (await kit.connection.getAccounts())[0] - await kit.connection.web3.eth.personal.unlockAccount(validator, '', 1000000) - const freezer = await kit._web3Contracts.getFreezer() - await freezer.methods.freeze(epochRewards.options.address).send({ from: validator }) - blockFrozen = await kit.connection.getBlockNumber() - epoch = new BigNumber(await validators.methods.getEpochSize().call()).toNumber() - await waitForBlock(kit.connection.web3, blockFrozen + epoch * 2) - latestBlock = await
kit.connection.getBlockNumber() - }) - - it('should not update the target voting yield', async () => { - for (let blockNumber = blockFrozen; blockNumber < latestBlock; blockNumber++) { - await assertTargetVotingYieldUnchanged(blockNumber) - } - }) - - it('should not mint new Celo Gold', async () => { - for (let blockNumber = blockFrozen; blockNumber < latestBlock; blockNumber++) { - await assertGoldTokenTotalSupplyUnchanged(blockNumber) - } - }) - }) -}) diff --git a/packages/celotool/src/e2e-tests/replica_tests.ts b/packages/celotool/src/e2e-tests/replica_tests.ts deleted file mode 100644 index 6bc6e31d643..00000000000 --- a/packages/celotool/src/e2e-tests/replica_tests.ts +++ /dev/null @@ -1,263 +0,0 @@ -import { BlockHeader } from '@celo/connect' -import { HttpRpcCaller, RpcCaller } from '@celo/connect/lib/utils/rpc-caller' -import { privateKeyToAddress } from '@celo/utils/lib/address' -import { bitIsSet, parseBlockExtraData } from '@celo/utils/lib/istanbul' -import { assert } from 'chai' -import Web3 from 'web3' -import { privateKeyToPublicKey } from '../lib/generate_utils' -import { getEnodeAddress, initAndStartGeth } from '../lib/geth' -import { GethInstanceConfig } from '../lib/interfaces/geth-instance-config' -import { GethRunConfig } from '../lib/interfaces/geth-run-config' -import { getHooks, sleep, waitForEpochTransition, waitToFinishInstanceSyncing } from './utils' - -enum IstanbulManagement { - getSnapshot = 'istanbul_getSnapshot', - getValidators = 'istanbul_getValidators', - getValidatorsBLSPublicKeys = 'istanbul_getValidatorsBLSPublicKeys', - getProposer = 'istanbul_getProposer', - addProxy = 'istanbul_addProxy', - removeProxy = 'istanbul_removeProxy', - startAtBlock = 'istanbul_startValidatingAtBlock', - stopAtBlock = 'istanbul_stopValidatingAtBlock', - startValidating = 'istanbul_startValidating', - stopValidating = 'istanbul_stopValidating', - valEnodeTableInfo = 'istanbul_getValEnodeTable', - versionCertificateTableInfo =
'istanbul_getVersionCertificateTableInfo', - currentRoundState = 'istanbul_getCurrentRoundState', - proxies = 'istanbul_getProxiesInfo', - proxiedValidators = 'istanbul_getProxiedValidators', - validating = 'istanbul_isValidating', - replicaState = 'istanbul_getCurrentReplicaState', -} - -const TMP_PATH = '/tmp/e2e' -const verbose = false - -describe('replica swap tests', () => { - const gethConfig: GethRunConfig = { - migrate: true, - runPath: TMP_PATH, - verbosity: 4, - networkId: 1101, - network: 'local', - genesisConfig: { - blockTime: 1, - churritoBlock: 0, - donutBlock: 0, - espressoBlock: 0, - }, - instances: [ - { - name: 'validator0', - validating: true, - syncmode: 'full', - port: 30303, - rpcport: 8545, - proxy: 'validator0-proxy0', - isProxied: true, - proxyport: 30304, - proxyAllowPrivateIp: true, - }, - { - name: 'validator0-proxy0', - isProxy: true, - validating: false, - syncmode: 'full', - proxyport: 30304, - port: 30305, - rpcport: 8546, - }, - { - name: 'validator1', - validating: true, - syncmode: 'full', - port: 30307, - rpcport: 8547, - }, - { - name: 'validator2', - validating: true, - syncmode: 'full', - port: 30309, - rpcport: 8549, - }, - { - name: 'validator3', - validating: true, - syncmode: 'full', - port: 30311, - wsport: 8544, - rpcport: 8551, - }, - { - name: 'validator4', - validating: true, - syncmode: 'full', - port: 30313, - rpcport: 8553, - }, - ], - } - const numValidators = gethConfig.instances.filter((x) => x.validating).length - - const hooks: any = getHooks(gethConfig) - let web3: Web3 - - before(async function (this: any) { - this.timeout(0) - // Comment out the following line after a local run for a quick rerun. 
- await hooks.before() - }) - - after(async function (this: any) { - this.timeout(0) - await hooks.after() - }) - - const restart = async () => { - await hooks.restart() - web3 = new Web3('http://localhost:8545') - } - - describe('replica behind single proxy', () => { - let epoch: number - let blockCount = 0 - let proxyRPC: RpcCaller - let validatoRPC: RpcCaller - let replicaRPC: RpcCaller - let swapBlock: number - const missed: any = [] - - before(async function (this: any) { - this.timeout(0) // Disable test timeout - - await restart() - - const proxyPubKey = privateKeyToPublicKey(gethConfig.instances[1].nodekey) - const replica: GethInstanceConfig = { - name: 'validator0-replica0', - replica: true, - validating: true, - syncmode: 'full', - port: 30315, - rpcport: 8555, - privateKey: gethConfig.instances[0].privateKey, - minerValidator: privateKeyToAddress(gethConfig.instances[0].privateKey), - proxy: 'validator0-proxy0', - isProxied: true, - proxyport: 30304, - proxyAllowPrivateIp: true, - proxies: [ - getEnodeAddress(proxyPubKey, '127.0.0.1', 30304), - getEnodeAddress(proxyPubKey, '127.0.0.1', 30305), - ], - } - - await initAndStartGeth(gethConfig, hooks.gethBinaryPath, replica, verbose) - if (verbose) { - console.info('Starting sync w/ replica') - } - await waitToFinishInstanceSyncing(replica) - if (verbose) { - console.info('Replica synced') - } - - epoch = 20 - // Wait for an epoch transition to ensure everyone is connected to one another. 
- await waitForEpochTransition(web3, epoch) - - const validatorWSWeb3Url = 'ws://localhost:8544' - const validatorWSWeb3 = new Web3(validatorWSWeb3Url) - - validatoRPC = new HttpRpcCaller(new Web3.providers.HttpProvider('http://localhost:8545')) - proxyRPC = new HttpRpcCaller(new Web3.providers.HttpProvider('http://localhost:8546')) - replicaRPC = new HttpRpcCaller(new Web3.providers.HttpProvider('http://localhost:8555')) - - const handled: any = {} - let errorMsg = '' - let setSwap = false - const recordNewBlock = async (header: BlockHeader) => { - try { - if (handled[header.number]) { - return - } - if (!setSwap) { - swapBlock = header.number + 40 - if (verbose) { - console.info(`Swapping validators at block ${swapBlock}`) - } - let resp = await replicaRPC.call(IstanbulManagement.startAtBlock, [swapBlock]) - assert.equal(resp.error, null) - resp = await validatoRPC.call(IstanbulManagement.stopAtBlock, [swapBlock]) - assert.equal(resp.error, null) - setSwap = true - } - handled[header.number] = true - blockCount += 1 - const bitmap = parseBlockExtraData(header.extraData).parentAggregatedSeal.bitmap - for (let i = 0; i < numValidators; i += 1) { - if (!bitIsSet(bitmap, i)) { - missed.push({ idx: i, num: header.number }) - } - } - } catch (e: any) { - console.error(e) - errorMsg = e - } - } - - // Wait for nodes to reliably sign blocks - await sleep(2 * epoch) - const subscription = validatorWSWeb3.eth.subscribe('newBlockHeaders') - subscription.on('data', recordNewBlock) - - // Wait for a few epochs while rotating a validator. - while (blockCount < 80) { - if (verbose) { - console.info(`Waiting. ${blockCount}/80`) - } - await sleep(epoch) - } - ;(subscription as any).unsubscribe() - if (verbose) { - console.info('Unsubscribed from block headers') - } - - // Wait for the current epoch to complete. 
- await sleep(epoch) - assert.equal(errorMsg, '') - }) - - it('replica is validating', async () => { - const validating = (await replicaRPC.call(IstanbulManagement.validating, [])).result - assert.isTrue(validating) - }) - - it('primary is not validating', async () => { - const validating = (await validatoRPC.call(IstanbulManagement.validating, [])).result - assert.isFalse(validating) - }) - - it('replica should have good val enode table', async () => { - const resp = (await replicaRPC.call(IstanbulManagement.valEnodeTableInfo, [])).result - Object.keys(resp).forEach((k) => { - const enode = resp[k].enode - assert.isTrue((enode || '') !== '') - }) - }) - - it('proxy should be connected', async () => { - const resp = (await proxyRPC.call(IstanbulManagement.proxiedValidators, [])) - .result as string[][] - assert.equal(resp.length, 2) - }) - - it('should switch without downtime', () => { - if (missed.length !== 0) { - missed.forEach((x: any) => console.warn(`Validator idx ${x.idx} missed block ${x.num}`)) - console.warn(`Val idx 0 should have switched on block ${swapBlock}`) - } - assert.isBelow(missed.length, 4) - }) - }) -}) diff --git a/packages/celotool/src/e2e-tests/slashing_tests.ts b/packages/celotool/src/e2e-tests/slashing_tests.ts deleted file mode 100644 index 2c28ae1d522..00000000000 --- a/packages/celotool/src/e2e-tests/slashing_tests.ts +++ /dev/null @@ -1,427 +0,0 @@ -import { NULL_ADDRESS } from '@celo/base/lib/address' -import { ContractKit, newKitFromWeb3 } from '@celo/contractkit' -import { ensureLeading0x } from '@celo/utils/lib/address' -import BigNumber from 'bignumber.js' -import { assert } from 'chai' -import * as _ from 'lodash' -import * as rlp from 'rlp' -import Web3 from 'web3' -import { GethRunConfig } from '../lib/interfaces/geth-run-config' -import { getHooks, sleep, waitForBlock } from './utils' - -const headerHex = - 
'0xf901a6a07285abd5b24742f184ad676e31f6054663b3529bc35ea2fcad8a3e0f642a46f7948888f1f195afa192cfee860698584c030f4c9db1a0ecc60e00b3fe5ce9f6e1a10e5469764daf51f1fe93c22ec3f9a7583a80357217a0d35d334d87c0cc0a202e3756bf81fae08b1575f286c7ee7a3f8df4f0f3afc55da056e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421b901000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001825208845c47775c80' - -const TMP_PATH = '/tmp/e2e' - -const safeMarginBlocks = 4 - -function headerArray(block: any) { - if (!block.nonce) { - // Before Gingerbread fork - return [ - block.parentHash, - block.miner, - block.stateRoot, - block.transactionsRoot, - block.receiptsRoot, - block.logsBloom, - block.number, - block.gasUsed, - block.timestamp, - block.extraData, - ] - } - return [ - block.parentHash, - block.sha3Uncles, - block.miner, - block.stateRoot, - block.transactionsRoot, - block.receiptsRoot, - block.logsBloom, - new BigNumber(block.difficulty).toNumber(), - block.number, - block.gasLimit, - block.gasUsed, - block.timestamp, - block.extraData, - block.mixHash, - block.nonce, - block.baseFee, - ] -} - -function headerFromBlock(block: any) { - return ensureLeading0x(rlp.encode(headerArray(block)).toString('hex')) -} - -// Find a validator that double signed. Both blocks will have signatures from exactly 2F+1 validators. 
-async function findDoubleSignerIndex( - kit: ContractKit, - header: string, - other: string -): Promise<number> { - const slasher = await kit._web3Contracts.getDoubleSigningSlasher() - const bitmap1 = await slasher.methods.getVerifiedSealBitmapFromHeader(header).call() - const bitmap2 = await slasher.methods.getVerifiedSealBitmapFromHeader(other).call() - - let bmNum1 = new BigNumber(bitmap1).toNumber() - let bmNum2 = new BigNumber(bitmap2).toNumber() - bmNum1 = bmNum1 >> 1 - bmNum2 = bmNum2 >> 1 - let signerIdx = 1 - for (let i = 1; i < 5; i++) { - if ((bmNum1 & 1) === 1 && (bmNum2 & 1) === 1) { - break - } - signerIdx++ - bmNum1 = bmNum1 >> 1 - bmNum2 = bmNum2 >> 1 - } - return signerIdx -} - -async function generateValidIntervalArrays( - startBlock: number, - endBlock: number, - startEpoch: number, - slotSize: number, - kit: ContractKit -): Promise<{ - startBlocks: number[] - endBlocks: number[] -}> { - const startBlocks: number[] = [] - const endBlocks: number[] = [] - - const nextEpochStart = await kit.getFirstBlockNumberForEpoch(startEpoch + 1) - for (let currentSlotStart = startBlock; currentSlotStart <= endBlock; ) { - let currentSlotEnd = currentSlotStart + slotSize - 1 - currentSlotEnd = currentSlotEnd > endBlock ? endBlock : currentSlotEnd - // avoids crossing the epoch - currentSlotEnd = - currentSlotEnd >= nextEpochStart && currentSlotStart < nextEpochStart - ?
nextEpochStart - 1 - : currentSlotEnd - startBlocks.push(currentSlotStart) - endBlocks.push(currentSlotEnd) - currentSlotStart = currentSlotEnd + 1 - } - - return { startBlocks, endBlocks } -} - -describe('slashing tests', function (this: any) { - const gethConfig: GethRunConfig = { - network: 'local', - networkId: 1101, - runPath: TMP_PATH, - migrate: true, - genesisConfig: { - churritoBlock: 0, - donutBlock: 0, - espressoBlock: 0, - }, - instances: [ - { - name: 'validator0', - validating: true, - syncmode: 'full', - port: 30303, - rpcport: 8545, - }, - { - name: 'validator1', - validating: true, - syncmode: 'full', - port: 30305, - rpcport: 8547, - }, - { - name: 'validator2', - validating: true, - syncmode: 'full', - port: 30307, - rpcport: 8549, - }, - { - name: 'validator3', - validating: true, - syncmode: 'full', - port: 30309, - rpcport: 8551, - }, - { - name: 'validator4', - validating: true, - syncmode: 'full', - port: 30311, - rpcport: 8553, - }, - ], - } - - // Do a shallow copy so that the instance objects are the same (even after the init step fills private keys, etc.) 
- const gethConfigDown = _.clone(gethConfig) - // Exclude the last validator to simulate it being down - gethConfigDown.instances = gethConfig.instances.slice(0, gethConfig.instances.length - 1) - - const hooks: any = getHooks(gethConfig) - const hooksDown: any = getHooks(gethConfigDown) - let web3: Web3 - let kit: ContractKit - - before(async function (this: any) { - this.timeout(0) - await hooks.before() - }) - - after(async function (this: any) { - this.timeout(0) - await hooks.after() - }) - - const restart = async () => { - await hooks.restart() - web3 = new Web3('http://localhost:8545') - kit = newKitFromWeb3(web3) - await sleep(1) - } - - const restartWithDowntime = async () => { - await hooksDown.restart() - web3 = new Web3('http://localhost:8545') - kit = newKitFromWeb3(web3) - await sleep(1) - } - - describe('when running a network', () => { - before(async function (this: any) { - this.timeout(0) // Disable test timeout - await restartWithDowntime() - }) - - it('should parse blockNumber from test header', async () => { - this.timeout(0) - const contract = await kit._web3Contracts.getElection() - const blockNumber = await contract.methods.getBlockNumberFromHeader(headerHex).call() - assert.equal(blockNumber, '1') - }) - - it('should parse blockNumber from current header', async () => { - const contract = await kit._web3Contracts.getElection() - const current = await kit.connection.getBlockNumber() - const block = await kit.connection.getBlock(current) - const header = headerFromBlock(block) - const blockNumber = await contract.methods.getBlockNumberFromHeader(header).call() - assert.equal(blockNumber, current.toString()) - }) - - it('should hash test header correctly', async () => { - const contract = await kit._web3Contracts.getElection() - const hash = await contract.methods.hashHeader(headerHex).call() - assert.equal(hash, '0x2e14ef428293e41c5f81a108b5d36f892b2bee3e34aec4223474c4a31618ea69') - }) - - it('should hash current header correctly', async () 
=> { - const contract = await kit._web3Contracts.getElection() - const current = await kit.connection.getBlockNumber() - const block = await kit.connection.getBlock(current) - const header = headerFromBlock(block) - const blockHash = await contract.methods.hashHeader(header).call() - assert.equal(blockHash, block.hash) - }) - }) - - let doubleSigningBlock: any - - describe('test slashing for downtime', () => { - before(async function (this: any) { - this.timeout(0) // Disable test timeout - await restartWithDowntime() - }) - - it('slash for downtime', async function (this: any) { - this.timeout(0) // Disable test timeout - const slasher = await kit._web3Contracts.getDowntimeSlasher() - const slashableDowntime = new BigNumber(await slasher.methods.slashableDowntime().call()) - await waitForBlock(web3, 1) - const blockNumber = await kit.connection.getBlockNumber() - await waitForBlock(web3, blockNumber + slashableDowntime.toNumber() + 2 * safeMarginBlocks) - - // Store this block for testing double signing - doubleSigningBlock = await kit.connection.getBlock(blockNumber + 2 * safeMarginBlocks) - - const signer = await slasher.methods.validatorSignerAddressFromSet(4, blockNumber).call() - const validator = (await kit.connection.getAccounts())[0] - await kit.connection.web3.eth.personal.unlockAccount(validator, '', 1000000) - const lockedGold = await kit.contracts.getLockedGold() - - const validatorsContract = await kit._web3Contracts.getValidators() - const history = await validatorsContract.methods.getMembershipHistory(signer).call() - const historyIndex = history[0].length - 1 - - const slotSize = slashableDowntime.dividedToIntegerBy(3).toNumber() - - const startBlock = blockNumber + safeMarginBlocks - const endBlock = startBlock + slashableDowntime.toNumber() - 1 - const startEpoch = await kit.getEpochNumberOfBlock(startBlock) - - const intervalArrays = await generateValidIntervalArrays( - startBlock, - endBlock, - startEpoch, - slotSize, - kit - ) - - for (let i = 
0; i < intervalArrays.startBlocks.length; i += 1) { - await slasher.methods - .setBitmapForInterval(intervalArrays.startBlocks[i], intervalArrays.endBlocks[i]) - .send({ from: validator, gas: 5000000 }) - } - - await slasher.methods - .slash( - intervalArrays.startBlocks, - intervalArrays.endBlocks, - [4, 4], - historyIndex, - [], - [], - [], - [NULL_ADDRESS], - [NULL_ADDRESS], - [0] - ) - .send({ from: validator, gas: 5000000 }) - - const balance = await lockedGold.getAccountTotalLockedGold(signer) - // Penalty is defined to be 100 cGLD in migrations, locked gold is 10000 cGLD for a validator - assert.equal(balance.toString(10), '9900000000000000000000') - }) - }) - - describe('test slashing for downtime with contractkit', () => { - before(async function (this: any) { - this.timeout(0) // Disable test timeout - await restartWithDowntime() - }) - - it('slash for downtime with contractkit', async function (this: any) { - this.timeout(0) // Disable test timeout - const slasher = await kit.contracts.getDowntimeSlasher() - const blockNumber = await kit.connection.getBlockNumber() - const slashableDowntime = await slasher.slashableDowntime() - - await waitForBlock(web3, blockNumber + slashableDowntime + 2 * safeMarginBlocks) - - const user = (await kit.connection.getAccounts())[0] - await kit.connection.web3.eth.personal.unlockAccount(user, '', 1000000) - - const endBlock = blockNumber + safeMarginBlocks + slashableDowntime - 1 - const intervals = await slasher.slashableDowntimeIntervalsBefore(endBlock) - - for (const interval of intervals) { - await slasher.setBitmapForInterval(interval).send({ from: user, gas: 5000000 }) - } - - const election = await kit.contracts.getElection() - const signer = await election.validatorSignerAddressFromSet(4, intervals[0].start) - - const tx = await slasher.slashValidator(signer, intervals) - const txResult = await tx.send({ from: user, gas: 5000000 }) - const txRcpt = await txResult.waitReceipt() - assert.equal(txRcpt.status, true) - 
- const lockedGold = await kit.contracts.getLockedGold() - const balance = await lockedGold.getAccountTotalLockedGold(signer) - // Penalty is defined to be 100 cGLD in migrations, locked gold is 10000 cGLD for a validator - assert.equal(balance.toString(10), '9900000000000000000000') - }) - }) - - describe('test slashing for double signing', () => { - before(async function (this: any) { - this.timeout(0) // Disable test timeout - await restart() - }) - - it('slash for double signing', async function (this: any) { - this.timeout(0) // Disable test timeout - const slasher = await kit._web3Contracts.getDoubleSigningSlasher() - - await waitForBlock(web3, doubleSigningBlock.number) - - const other = headerFromBlock(doubleSigningBlock) - - const num = await slasher.methods.getBlockNumberFromHeader(other).call() - - const header = headerFromBlock(await kit.connection.getBlock(num)) - - const signerIdx = await findDoubleSignerIndex(kit, header, other) - const signer = await slasher.methods.validatorSignerAddressFromSet(signerIdx, num).call() - const validator = (await kit.connection.getAccounts())[0] - await kit.connection.web3.eth.personal.unlockAccount(validator, '', 1000000) - - const lockedGold = await kit.contracts.getLockedGold() - const validatorsContract = await kit._web3Contracts.getValidators() - const history = await validatorsContract.methods.getMembershipHistory(signer).call() - const historyIndex = history[0].length - 1 - - await slasher.methods - .slash( - signer, - signerIdx, - header, - other, - historyIndex, - [], - [], - [], - [NULL_ADDRESS], - [NULL_ADDRESS], - [0] - ) - .send({ from: validator, gas: 5000000 }) - - // Penalty is defined to be 9000 cGLD in migrations, locked gold is 10000 cGLD for a validator, so after slashing locked gold is 1000cGld - const balance = await lockedGold.getAccountTotalLockedGold(signer) - assert.equal(balance.toString(10), '1000000000000000000000') - }) - }) - - describe('test slashing for double signing with 
contractkit', () => { - before(async function (this: any) { - this.timeout(0) // Disable test timeout - await restart() - }) - - it('slash for double signing with contractkit', async function (this: any) { - this.timeout(0) // Disable test timeout - const slasher = await kit.contracts.getDoubleSigningSlasher() - const election = await kit.contracts.getElection() - await waitForBlock(web3, doubleSigningBlock.number) - - const other = headerFromBlock(doubleSigningBlock) - const num = await slasher.getBlockNumberFromHeader(other) - const header = headerFromBlock(await kit.connection.getBlock(num)) - const signerIdx = await findDoubleSignerIndex(kit, header, other) - const signer = await election.validatorSignerAddressFromSet(signerIdx, num) - - const validator = (await kit.connection.getAccounts())[0] - await kit.connection.web3.eth.personal.unlockAccount(validator, '', 1000000) - - const tx = await slasher.slashSigner(signer, header, other) - const txResult = await tx.send({ from: validator, gas: 5000000 }) - const txRcpt = await txResult.waitReceipt() - assert.equal(txRcpt.status, true) - - // Penalty is defined to be 9000 cGLD in migrations, locked gold is 10000 cGLD for a validator, so after slashing locked gold is 1000cGld - const lockedGold = await kit.contracts.getLockedGold() - const balance = await lockedGold.getAccountTotalLockedGold(signer) - assert.equal(balance.toString(10), '1000000000000000000000') - }) - }) -}) diff --git a/packages/celotool/src/e2e-tests/sync_tests.ts b/packages/celotool/src/e2e-tests/sync_tests.ts deleted file mode 100644 index ffb73e8d45a..00000000000 --- a/packages/celotool/src/e2e-tests/sync_tests.ts +++ /dev/null @@ -1,168 +0,0 @@ -import { assert } from 'chai' -import Web3 from 'web3' -import { GethInstanceConfig } from '../lib/interfaces/geth-instance-config' -import { GethRunConfig } from '../lib/interfaces/geth-run-config' -import { getHooks, initAndSyncGethWithRetry, killInstance, waitForBlock } from './utils' - -const 
TMP_PATH = '/tmp/e2e' -const verbose = false - -describe('sync tests', function (this: any) { - this.timeout(0) - - const gethConfig: GethRunConfig = { - networkId: 1101, - network: 'local', - runPath: TMP_PATH, - migrate: true, - verbosity: 2, - genesisConfig: { - churritoBlock: 0, - donutBlock: 0, - espressoBlock: 0, - }, - instances: [ - { - name: 'validator0', - validating: true, - syncmode: 'full', - port: 30303, - rpcport: 8545, - }, - { - name: 'validator1', - validating: true, - syncmode: 'full', - port: 30305, - rpcport: 8547, - }, - { - name: 'validator2', - validating: true, - syncmode: 'full', - port: 30307, - rpcport: 8549, - }, - { - name: 'validator3', - validating: true, - syncmode: 'full', - port: 30309, - rpcport: 8551, - }, - ], - } - - const fullNode: GethInstanceConfig = { - name: 'txfull', - validating: false, - syncmode: 'full', - lightserv: true, - port: 30311, - rpcport: 8553, - } - - const hooks = getHooks(gethConfig) - - before(async function (this: any) { - this.timeout(0) - // Start validator nodes and migrate contracts. - await hooks.before() - // Restart validator nodes. 
- await hooks.restart() - await initAndSyncGethWithRetry( - gethConfig, - hooks.gethBinaryPath, - fullNode, - [...gethConfig.instances, fullNode], - verbose, - 3 - ) - }) - - after(async function (this: any) { - this.timeout(0) - await hooks.after() - }) - - const syncModes = ['full', 'fast', 'light', 'lightest'] - for (const syncmode of syncModes) { - describe(`when syncing with a ${syncmode} node`, () => { - let syncNode: GethInstanceConfig - - beforeEach(async () => { - syncNode = { - name: syncmode, - validating: false, - syncmode, - port: 30313, - wsport: 9555, - rpcport: 8555, - lightserv: syncmode !== 'light' && syncmode !== 'lightest', - } - await initAndSyncGethWithRetry( - gethConfig, - hooks.gethBinaryPath, - syncNode, - [fullNode, syncNode], - verbose, - 3 - ) - }) - - afterEach(() => killInstance(syncNode)) - - it('should sync the latest block', async () => { - const validatingWeb3 = new Web3(`http://localhost:8545`) - const validatingFirstBlock = await validatingWeb3.eth.getBlockNumber() - console.info(`At block ${validatingFirstBlock}, waiting for next block`) - await waitForBlock(validatingWeb3, validatingFirstBlock + 1) - const validatingLatestBlock = await validatingWeb3.eth.getBlockNumber() - - const syncWeb3 = new Web3(`http://localhost:8555`) - console.info(`Waiting to sync to block ${validatingFirstBlock}`) - await waitForBlock(syncWeb3, validatingLatestBlock) - const syncLatestBlock = await syncWeb3.eth.getBlockNumber() - - assert.isAbove(validatingLatestBlock, 1) - // Assert that the validator is still producing blocks. - assert.isAbove(validatingLatestBlock, validatingFirstBlock) - // Assert that the syncing node has synced with the validator. 
- assert.isAtLeast(syncLatestBlock, validatingLatestBlock) - }) - }) - } - describe(`when a validator's data directory is deleted`, () => { - beforeEach(async function (this: any) { - this.timeout(0) // Disable test timeout - await hooks.restart() - }) - - it('should continue to block produce', async function (this: any) { - this.timeout(0) - const instance: GethInstanceConfig = gethConfig.instances[1] - await killInstance(instance) - // copy instance - const additionalInstance = { ...instance } - await initAndSyncGethWithRetry( - gethConfig, - hooks.gethBinaryPath, - additionalInstance, - [gethConfig.instances[0], additionalInstance], - verbose, - 3 - ) - - const web3 = new Web3(`http://localhost:${additionalInstance.rpcport}`) - const address = (await web3.eth.getAccounts())[0] - const currentBlock = await web3.eth.getBlock('latest') - for (let i = 1; i < 500; i++) { - await waitForBlock(web3, currentBlock.number + i) - if ((await web3.eth.getBlock(currentBlock.number + i)).miner === address) { - return // A block proposed by validator who lost randomness was found, hence randomness was recovered - } - } - assert.fail('Reset validator did not propose any new blocks') - }) - }) -}) diff --git a/packages/celotool/src/e2e-tests/transfer_tests.ts b/packages/celotool/src/e2e-tests/transfer_tests.ts deleted file mode 100644 index 2de49034b2e..00000000000 --- a/packages/celotool/src/e2e-tests/transfer_tests.ts +++ /dev/null @@ -1,942 +0,0 @@ -import { CeloTxPending, CeloTxReceipt, TransactionResult } from '@celo/connect' -import { ContractKit, newKitFromWeb3 } from '@celo/contractkit' -import { CeloTokenType, EachCeloToken, StableToken, Token } from '@celo/contractkit/lib/celo-tokens' -import { eqAddress, toChecksumAddress } from '@celo/utils/lib/address' -import { toFixed } from '@celo/utils/lib/fixidity' -import BigNumber from 'bignumber.js' -import { assert } from 'chai' -import Web3 from 'web3' -import { GethInstanceConfig } from 
'../lib/interfaces/geth-instance-config' -import { GethRunConfig } from '../lib/interfaces/geth-run-config' -import { getHooks, initAndSyncGethWithRetry, killInstance, sleep } from './utils' - -const TMP_PATH = '/tmp/e2e' -const verbose = false - -/** - * Helper Class to change StableToken Inflation in tests - */ -class InflationManager { - private kit: ContractKit - private readonly minUpdateDelay = 10 - - constructor( - readonly validatorUri: string, - readonly validatorAddress: string, - readonly token: StableToken - ) { - this.kit = newKitFromWeb3(new Web3(validatorUri)) - this.kit.connection.defaultAccount = validatorAddress - } - - now = async (): Promise<number> => { - return Number((await this.kit.connection.getBlock('pending')).timestamp) - } - - getNextUpdateRate = async (): Promise<number> => { - const stableToken = await this.getStableToken() - // Compute the necessary `updateRate` so the inflationFactor adjustment takes place on the next operation - const { factorLastUpdated } = await stableToken.getInflationParameters() - - // Wait until the minimum update delay has passed so we can set a rate that gives us some - // buffer time to make the transaction in the next available update window. - let timeSinceLastUpdated = (await this.now()) - factorLastUpdated.toNumber() - while (timeSinceLastUpdated < this.minUpdateDelay) { - await sleep(this.minUpdateDelay - timeSinceLastUpdated) - timeSinceLastUpdated = (await this.now()) - factorLastUpdated.toNumber() - } - - return timeSinceLastUpdated - } - - getParameters = async () => { - const stableToken = await this.getStableToken() - return stableToken.getInflationParameters() - } - - setInflationRateForNextTransfer = async (rate: BigNumber) => { - // Possibly update the inflation factor and ensure it won't update again.
- await this.setInflationParameters(new BigNumber(1), Number.MAX_SAFE_INTEGER) - - const updateRate = await this.getNextUpdateRate() - await this.setInflationParameters(rate, updateRate) - } - - setInflationParameters = async (rate: BigNumber, updatePeriod: number) => { - const stableToken = await this.getStableToken() - await stableToken - .setInflationParameters(toFixed(rate).toFixed(), updatePeriod.toFixed()) - .sendAndWaitForReceipt({ from: this.validatorAddress }) - } - - getStableToken = async () => { - return this.kit.celoTokens.getWrapper(this.token) - } -} - -const setIntrinsicGas = async (validatorUri: string, validatorAddress: string, gasCost: number) => { - const kit = newKitFromWeb3(new Web3(validatorUri)) - const parameters = await kit.contracts.getBlockchainParameters() - await parameters - .setIntrinsicGasForAlternativeFeeCurrency(gasCost.toString()) - .sendAndWaitForReceipt({ from: validatorAddress }) -} - -// Intrinsic gas for a basic transaction -const INTRINSIC_TX_GAS_COST = 21000 - -// Additional intrinsic gas for a transaction with fee currency specified -const ADDITIONAL_INTRINSIC_TX_GAS_COST = 50000 - -// If the To address has zero as the balance, the cost of writing that address is: -const sstoreSetGasEIP2200 = 20000 -const sstoreResetGasEIP2200 = 5000 -const coldSloadCostEIP2929 = 800 // The Eip2929 set this to 2100, but our Cip48 sets it back to 800 -const coldAccountAccessCostEIP2929 = 900 // The Eip2929 set this to 2600, but our Cip48 sets it back to 900 -const warmStorageReadCostEIP2929 = 100 // Eip2929 and Cip48 - -// This number represents the gasUsed in the execution of the StableToken transfer assuming: -// - Nothing was preloaded in the state accessList, so the first storage calls will cost: -// * ColdSloadCostEIP2929 = 800 -// * ColdAccountAccessCostEIP2929 = 900 -// - The From and To address -// * HAVE funds -// * none of them will be zero after the transfer -// * none of them were modified before (as part of the same tx) -// * This means that both
SSTORE (From and To) will cost: -// SstoreResetGasEIP2200 [5000] - ColdSloadCostEIP2929 [800] => 4200 -// - No intrinsic gas involved BUT 630 gas charged for the amount of bytes sent -const basicStableTokenTransferGasCost = 31253 - -// As the basicStableTokenTransferGasCost assumes that the transfer To has funds, we should -// only add the difference to calculate the gas (sstoreSetGasEIP2200 - 4200) => 15800 -const emptyFundsBeforeForBasicCalc = - sstoreSetGasEIP2200 - (sstoreResetGasEIP2200 - coldSloadCostEIP2929) // 15800 - -// The StableToken transfer, paid with the same StableToken, preloads a lot of state -// when the fee is subtracted from the account, which makes the basicStableTokenTransferGasCost -// lower. The actual differences: -// - SLOADS ColdSloadCostEIP2929 -> WarmStorageReadCostEIP2929 (-700 each) -// * 6 from the stableToken contract -// * 2 from the celoRegistry contract -// * 2 from the Freeze contract -// - Account Check ( EXTCODEHASH | EXTCODESIZE | ext BALANCE) -// coldAccountAccessCostEIP2929 -> WarmStorageReadCostEIP2929 (-800 each) -// * 3 from the stableToken contract -// * 1 from the celoRegistry contract -// * 1 from the Freeze contract -// - The From account has already modified the state for that address -// * This means that instead of SstoreResetGasEIP2200 [5000] - ColdSloadCostEIP2929 [800] => 4200 -// it will cost WarmStorageReadCostEIP2929 [100] (-4100) -const savingGasStableTokenTransferPaidWithSameStable = - (coldSloadCostEIP2929 - warmStorageReadCostEIP2929) * 10 + - (coldAccountAccessCostEIP2929 - warmStorageReadCostEIP2929) * 5 + - (sstoreResetGasEIP2200 - coldSloadCostEIP2929 - warmStorageReadCostEIP2929) // 15100 - -/** Helper to watch balance changes over accounts */ -interface BalanceWatcher { - update(): Promise<void> - - delta(address: string, token: CeloTokenType): BigNumber - - current(address: string, token: CeloTokenType): BigNumber - - initial(address: string, token: CeloTokenType): BigNumber - -
debugPrint(address: string, token: CeloTokenType): void -} - -async function newBalanceWatcher(kit: ContractKit, accounts: string[]): Promise<BalanceWatcher> { - async function fetch() { - const balances: Record<string, EachCeloToken<BigNumber>> = {} - await Promise.all( - accounts.map(async (a) => { - balances[a] = await kit.celoTokens.balancesOf(a) - }) - ) - return balances - } - - const initial = await fetch() - let current = initial - return { - async update() { - current = await fetch() - }, - delta(address: string, token: CeloTokenType) { - return (current[address][token] || new BigNumber(0)).minus(initial[address][token] || 0) - }, - current(address: string, token: CeloTokenType) { - return current[address][token] || new BigNumber(0) - }, - initial(address: string, token: CeloTokenType) { - return initial[address][token] || new BigNumber(0) - }, - debugPrint(address: string, token: CeloTokenType) { - // eslint-disable-next-line: no-console - console.info({ - initial: initial[address][token]?.toString(), - current: current[address][token]?.toString(), - delta: (current[address][token] || new BigNumber(0)) - .minus(initial[address][token] || 0) - .toString(), - }) - }, - } -} - -function assertEqualBN(value: BigNumber, expected: BigNumber) { - assert.equal(value.toString(), expected.toString()) -} - -describe('Transfer tests', function (this: any) { - this.timeout(0) - - let kit: ContractKit - const TransferAmount: BigNumber = new BigNumber(Web3.utils.toWei('1', 'ether')) - - let currentGethInstance: GethInstanceConfig - - let governanceAddress: string // set later on using the contract itself - const validatorAddress = '0x47e172f6cfb6c7d01c1574fa3e2be7cc73269d95' - const DEF_FROM_PK = 'f2f48ee19680706196e2e339e5da3491186e0c4c5030670656b0e0164837257d' - const FromAddress = '0x5409ed021d9299bf6814279a6a1411a7e866a631' - - // Arbitrary addresses.
- const txFeeRecipientAddress = '0x5555555555555555555555555555555555555555' - const ToAddress = '0xbBae99F0E1EE565404465638d40827b54D343638' - const gatewayFeeRecipientAddress = '0x4f5f8a3f45d179553e7b95119ce296010f50f6f1' - - const syncModes = ['full', 'fast', 'light', 'lightest'] - const gethConfig: GethRunConfig = { - migrate: true, - networkId: 1101, - network: 'local', - runPath: TMP_PATH, - genesisConfig: { - churritoBlock: 0, - donutBlock: 0, - espressoBlock: 0, - }, - instances: [ - { - name: 'validator', - validating: true, - minerValidator: validatorAddress, - // Separate address for tx fees, so that we can easily identify balance changes due to them - txFeeRecipient: txFeeRecipientAddress, - syncmode: 'full', - port: 30303, - rpcport: 8545, - }, - ], - } - - const hooks = getHooks(gethConfig) - - before(async function (this: any) { - this.timeout(0) - await hooks.before() - }) - - after(async function (this: any) { - this.timeout(0) - await hooks.after() - }) - - // Spin up a node that we can sync with. - const fullInstance: GethInstanceConfig = { - name: 'txFull', - validating: false, - syncmode: 'full', - lightserv: true, - gatewayFee: new BigNumber(10000), - port: 30305, - rpcport: 8547, - // We need to set an etherbase here so that the full node will accept transactions from - // light clients. - minerValidator: gatewayFeeRecipientAddress, - txFeeRecipient: gatewayFeeRecipientAddress, - } - - const restartWithCleanNodes = async () => { - await hooks.restart() - - kit = newKitFromWeb3(new Web3('http://localhost:8545')) - kit.connection.defaultGasInflationFactor = 1 - - // TODO(mcortesi): magic sleep.
without it unlockAccount sometimes fails - await sleep(2) - // Assuming empty password - await kit.connection.web3.eth.personal.unlockAccount(validatorAddress, '', 1000000) - - await initAndSyncGethWithRetry( - gethConfig, - hooks.gethBinaryPath, - fullInstance, - [...gethConfig.instances, fullInstance], - verbose, - 3 - ) - - governanceAddress = (await kit._web3Contracts.getGovernance()).options.address - // The tests below check the balance of the governance contract (i.e. the community fund) - // before and after transactions to verify the correct amount has been received from the fees. - // This causes flakiness due to the fund also receiving epoch rewards (if an epoch change falls - // between the blocks the balance checker uses as its before and after, the test will fail due - // to the unexpected change from the epoch rewards). - // To avoid this, we set the community fund's fraction of epoch rewards to zero. - // Another option would have been to make the epoch size large enough so no epoch changes happen - // during the test. - const epochRewards = await kit._web3Contracts.getEpochRewards() - await epochRewards.methods.setCommunityRewardFraction(0).send({ from: validatorAddress }) - - // Give the account we will send transfers as sufficient gold and dollars.
- const startBalance = TransferAmount.times(500) - const resDollars = await transferCeloDollars(validatorAddress, FromAddress, startBalance) - const resGold = await transferCeloGold(validatorAddress, FromAddress, startBalance) - await Promise.all([resDollars.waitReceipt(), resGold.waitReceipt()]) - } - - const startSyncNode = async (syncmode: string) => { - if (currentGethInstance != null) { - await killInstance(currentGethInstance) - } - - const light = syncmode === 'light' || syncmode === 'lightest' - currentGethInstance = { - name: syncmode, - validating: false, - syncmode, - port: 30307, - rpcport: 8549, - lightserv: !light, - // TODO(nategraf): Remove this when light clients can query for gateway fee. - gatewayFee: light ? new BigNumber(10000) : undefined, - privateKey: DEF_FROM_PK, - } - - // Spin up the node to run transfers as. - await initAndSyncGethWithRetry( - gethConfig, - hooks.gethBinaryPath, - currentGethInstance, - [fullInstance, currentGethInstance], - verbose, - 3 - ) - - // Reset contracts to send RPCs through transferring node. - kit.connection.setProvider(new Web3.providers.HttpProvider('http://localhost:8549')) - - // Give the node time to sync the latest block. 
- const upstream = await new Web3('http://localhost:8545').eth.getBlock('latest') - while ((await kit.connection.getBlock('latest')).number < upstream.number) { - await sleep(0.5) - } - - // Unlock Node account - await kit.connection.web3.eth.personal.unlockAccount(FromAddress, '', 1000000) - } - - const transferCeloGold = async ( - fromAddress: string, - toAddress: string, - amount: BigNumber, - txOptions: { - gas?: number - gasPrice?: string - feeCurrency?: string - gatewayFeeRecipient?: string - gatewayFee?: string - } = {} - ) => { - const res = await kit.connection.sendTransaction({ - from: fromAddress, - to: toAddress, - value: amount.toString(), - ...txOptions, - }) - return res - } - - const transferCeloDollars = async ( - fromAddress: string, - toAddress: string, - amount: BigNumber, - txOptions: { - gas?: number - gasPrice?: string - feeCurrency?: string - gatewayFeeRecipient?: string - gatewayFee?: string - } = {} - ) => { - const kitStableToken = await kit.contracts.getStableToken() - const res = await kitStableToken.transfer(toAddress, amount.toString()).send({ - from: fromAddress, - ...txOptions, - }) - - return res - } - - const getGasPriceMinimum = async (feeCurrency: string | undefined) => { - const gasPriceMinimum = await kit._web3Contracts.getGasPriceMinimum() - if (feeCurrency) { - return gasPriceMinimum.methods.getGasPriceMinimum(feeCurrency).call() - } else { - return gasPriceMinimum.methods.gasPriceMinimum().call() - } - } - - interface Fees { - total: BigNumber - tip: BigNumber - base: BigNumber - gateway: BigNumber - } - - interface GasUsage { - used?: number - expected: number - } - - interface TestTxResults { - ok: boolean - fees: Fees - gas: GasUsage - events: any[] - } - - const TRANSFER_TOPIC = '0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef' - - function truncateTopic(hex: string) { - return '0x' + hex.substring(26) - } - - function parseEvents(receipt: CeloTxReceipt | undefined) { - if (!receipt) { - return [] - } 
- if (receipt.events && receipt.events.Transfer) { - let events: any = receipt.events.Transfer - if (!(events instanceof Array)) { - events = [events] - } - return events.map((a: any) => ({ to: a.returnValues.to, from: a.returnValues.from })) - } - if (receipt.logs) { - return receipt.logs - .filter((a) => a.topics[0] === TRANSFER_TOPIC) - .map((a) => ({ to: truncateTopic(a.topics[2]), from: truncateTopic(a.topics[1]) })) - } - } - - const runTestTransaction = async ( - txResult: TransactionResult, - expectedGasUsed: number, - feeCurrency?: string - ): Promise => { - const minGasPrice = await getGasPriceMinimum(feeCurrency) - assert.isAbove(parseInt(minGasPrice, 10), 0) - - let ok = false - let receipt: CeloTxReceipt | undefined - try { - receipt = await txResult.waitReceipt() - ok = true - } catch (err) { - ok = false - } - - const events = parseEvents(receipt) - - if (receipt != null && receipt.gasUsed !== expectedGasUsed) { - // eslint-disable-next-line: no-console - console.info('OOPSS: Different Gas', receipt.gasUsed, expectedGasUsed) - } - - const gasVal = receipt ? 
receipt.gasUsed : expectedGasUsed - assert.isAbove(gasVal, 0) - const txHash = await txResult.getHash() - const tx: CeloTxPending = await kit.connection.getTransaction(txHash) - assert.isAbove(parseInt(tx.gasPrice, 10), 0) - const txFee = new BigNumber(gasVal).times(tx.gasPrice) - const txFeeBase = new BigNumber(gasVal).times(minGasPrice) - const txFeeTip = txFee.minus(txFeeBase) - const gatewayFee = new BigNumber(tx.gatewayFee || 0) - assert.equal( - tx.gatewayFeeRecipient === null || tx.gatewayFeeRecipient === undefined, - gatewayFee.eq(0) - ) - - const fees: Fees = { - total: txFee.plus(gatewayFee), - base: txFeeBase, - tip: txFeeTip, - gateway: gatewayFee, - } - const gas: GasUsage = { - used: receipt && receipt.gasUsed, - expected: expectedGasUsed, - } - return { ok, fees, gas, events } - } - - function testTxPoolFiltering({ - feeToken, - gas, - expectedError, - }: { - feeToken: CeloTokenType - gas: number - expectedError: string - }) { - it('should not add the transaction to the pool', async () => { - const feeCurrency = await kit.celoTokens.getFeeCurrencyAddress(feeToken) - try { - const res = await transferCeloGold(FromAddress, ToAddress, TransferAmount, { - gas, - feeCurrency, - }) - await res.waitReceipt() - assert.fail('no error was thrown') - } catch (error: any) { - assert.include(error.toString(), expectedError) - } - }) - } - - const toTemplate = '0xbBae99F0E1EE565404465638d40827b54D343' // last 3 hex digits trimmed - // Starts with 1, otherwise the address could have the last byte as 00 which would change - // the gas consumption - let toCounter = 1 - - function generateCleanAddress(): string { - toCounter += 1 - // avoid the '00' at the end (check toCounter comment) - toCounter = toCounter % 100 === 0 ? 
toCounter + 1 : toCounter - - return toChecksumAddress(toTemplate + toCounter.toString().padStart(3, '0')) - } - - function testTwiceFirstAndSecondFundToANewAddress(testObject: { - transferToken: CeloTokenType - feeToken: CeloTokenType - expectedGas: number - expectSuccess?: boolean - txOptions?: { - gas?: number - gatewayFeeRecipient?: string - gatewayFee?: string - } - fromAddress?: string - toAddress?: string - }) { - const expectedGasAux = testObject.expectedGas - testObject.toAddress = generateCleanAddress() - // Add the fee to save to an empty address - testObject.expectedGas += emptyFundsBeforeForBasicCalc - describe('first fund to the To account', () => { - testTransferToken(testObject) - }) - testObject.expectedGas = expectedGasAux - describe('second fund to the To account', () => { - testTransferToken(testObject) - }) - } - - function testTransferToken({ - transferToken, - feeToken, - expectedGas, - txOptions, - expectSuccess = true, - fromAddress = FromAddress, - toAddress = ToAddress, - }: { - transferToken: CeloTokenType - feeToken: CeloTokenType - expectedGas: number - expectSuccess?: boolean - txOptions?: { - gas?: number - gatewayFeeRecipient?: string - gatewayFee?: string - } - fromAddress?: string - toAddress?: string - }) { - let txRes: TestTxResults - let balances: BalanceWatcher - - before(async () => { - const feeCurrency = await kit.celoTokens.getFeeCurrencyAddress(feeToken) - - const accounts = [ - fromAddress, - toAddress, - txFeeRecipientAddress, - gatewayFeeRecipientAddress, - governanceAddress, - ] - balances = await newBalanceWatcher(kit, accounts) - - const transferFn = transferToken === StableToken.cUSD ? 
transferCeloDollars : transferCeloGold - const txResult = await transferFn(fromAddress, toAddress, TransferAmount, { - ...txOptions, - feeCurrency, - }) - - txRes = await runTestTransaction(txResult, expectedGas, feeCurrency) - - await balances.update() - }) - - if (expectSuccess) { - it(`should succeed`, () => assert.isTrue(txRes.ok)) - - it(`should use the expected amount of gas`, () => - assert.equal(txRes.gas.used, txRes.gas.expected)) - - it(`should increment the receiver's ${transferToken} balance by the transfer amount`, () => - assertEqualBN(balances.delta(toAddress, transferToken), TransferAmount)) - - it('should have emitted transfer events for the fee token if not using CELO', () => { - if (kit.celoTokens.isStableToken(feeToken)) { - assert( - txRes.events.find( - (a) => eqAddress(a.to, governanceAddress) && eqAddress(a.from, fromAddress) - ) - ) - } - }) - - if (transferToken === feeToken) { - it(`should decrement the sender's ${transferToken} balance by the transfer amount plus fees`, () => { - const expectedBalanceChange = txRes.fees.total.plus(TransferAmount) - assertEqualBN(balances.delta(fromAddress, transferToken).negated(), expectedBalanceChange) - }) - } else { - it(`should decrement the sender's ${transferToken} balance by the transfer amount`, () => - assertEqualBN(balances.delta(fromAddress, transferToken).negated(), TransferAmount)) - - it(`should decrement the sender's ${feeToken} balance by the total fees`, () => - assertEqualBN(balances.delta(fromAddress, feeToken).negated(), txRes.fees.total)) - } - } else { - it(`should fail`, () => assert.isFalse(txRes.ok)) - - it(`should decrement the sender's ${feeToken} balance by the total fees`, () => - assertEqualBN(balances.delta(fromAddress, feeToken).negated(), txRes.fees.total)) - - it(`should not change the receiver's ${transferToken} balance`, () => { - assertEqualBN( - balances.initial(toAddress, transferToken), - balances.current(toAddress, transferToken) - ) - }) - - if (transferToken 
!== feeToken) { - it(`should not change the sender's ${transferToken} balance`, () => { - assertEqualBN( - balances.initial(fromAddress, transferToken), - balances.current(fromAddress, transferToken) - ) - }) - } - } - - it(`should increment the gateway fee recipient's ${feeToken} balance by the gateway fee`, () => - assertEqualBN(balances.delta(gatewayFeeRecipientAddress, feeToken), txRes.fees.gateway)) - - it(`should increment the infrastructure fund's ${feeToken} balance by the base portion of the gas fee`, () => - assertEqualBN(balances.delta(governanceAddress, feeToken), txRes.fees.base)) - - it(`should increment the tx fee recipient's ${feeToken} balance by the rest of the gas fee`, () => { - assertEqualBN(balances.delta(txFeeRecipientAddress, feeToken), txRes.fees.tip) - }) - } - describe('Normal Transfer >', () => { - before(restartWithCleanNodes) - - for (const syncMode of syncModes) { - describe(`${syncMode} Node >`, () => { - before(`start geth on sync: ${syncMode}`, () => startSyncNode(syncMode)) - - describe('Transfer CeloGold >', () => { - describe('with feeCurrency = CeloGold >', () => { - if (syncMode === 'light' || syncMode === 'lightest') { - describe('when running in light/lightest sync mode', () => { - const recipient = (choice: string) => { - switch (choice) { - case 'peer': - return gatewayFeeRecipientAddress - case 'random': - return Web3.utils.randomHex(20) - default: - // unset - return undefined - } - } - const feeValue = (choice: string) => { - switch (choice) { - case 'sufficient': - return '0x10000' - case 'insufficient': - return '0x1' - default: - // unset - return undefined - } - } - for (const recipientChoice of ['peer', 'random', 'unset']) { - describe(`when the gateway fee recipient is ${recipientChoice}`, () => { - for (const feeValueChoice of ['sufficient', 'insufficient', 'unset']) { - describe(`when the gateway fee value is ${feeValueChoice}`, () => { - const txOptions = { - gatewayFeeRecipient: recipient(recipientChoice), - 
gatewayFee: feeValue(feeValueChoice), - } - if (recipientChoice === 'random' || feeValueChoice === 'insufficient') { - it('should get rejected by the sending node before being added to the tx pool', async () => { - try { - const res = await transferCeloGold( - FromAddress, - ToAddress, - TransferAmount, - txOptions - ) - await res.waitReceipt() - assert.fail('no error was thrown') - } catch (error: any) { - assert.include(error.toString(), `Error: no suitable peers available`) - } - }) - } else { - testTransferToken({ - expectedGas: INTRINSIC_TX_GAS_COST, - transferToken: Token.CELO, - feeToken: Token.CELO, - txOptions, - }) - } - }) - } - }) - } - }) - } else { - testTransferToken({ - expectedGas: INTRINSIC_TX_GAS_COST, - transferToken: Token.CELO, - feeToken: Token.CELO, - }) - } - }) - - describe('feeCurrency = CeloDollars >', () => { - const intrinsicGas = INTRINSIC_TX_GAS_COST + ADDITIONAL_INTRINSIC_TX_GAS_COST - - describe('when there is no demurrage', () => { - describe('when setting a gas amount greater than the amount of gas necessary', () => - testTransferToken({ - expectedGas: intrinsicGas, - transferToken: Token.CELO, - feeToken: StableToken.cUSD, - })) - - describe('when setting a gas amount less than the intrinsic gas amount', () => { - it('should not add the transaction to the pool', async () => { - const gas = intrinsicGas - 1 - const feeCurrency = await kit.celoTokens.getFeeCurrencyAddress(StableToken.cUSD) - try { - const res = await transferCeloGold(FromAddress, ToAddress, TransferAmount, { - gas, - feeCurrency, - }) - await res.getHash() - assert.fail('no error was thrown') - } catch (error: any) { - assert.include(error.toString(), 'Error: intrinsic gas too low') - } - }) - }) - }) - }) - }) - - describe('Transfer CeloDollars', () => { - describe('feeCurrency = CeloDollars >', () => { - testTwiceFirstAndSecondFundToANewAddress({ - expectedGas: - basicStableTokenTransferGasCost + - INTRINSIC_TX_GAS_COST + - ADDITIONAL_INTRINSIC_TX_GAS_COST - - 
savingGasStableTokenTransferPaidWithSameStable, - transferToken: StableToken.cUSD, - feeToken: StableToken.cUSD, - }) - }) - - describe('feeCurrency = CeloGold >', () => { - testTwiceFirstAndSecondFundToANewAddress({ - expectedGas: basicStableTokenTransferGasCost + INTRINSIC_TX_GAS_COST, - transferToken: StableToken.cUSD, - feeToken: Token.CELO, - }) - }) - }) - }) - } - }) - - describe('Transfer with changed intrinsic gas cost >', () => { - const changedIntrinsicGasForAlternativeFeeCurrency = 34000 - - before(restartWithCleanNodes) - - for (const syncMode of syncModes) { - describe(`${syncMode} Node >`, () => { - before(`start geth on sync: ${syncMode}`, async () => { - try { - await startSyncNode(syncMode) - await setIntrinsicGas( - 'http://localhost:8545', - validatorAddress, - changedIntrinsicGasForAlternativeFeeCurrency - ) - } catch (err) { - console.debug('some error', err) - } - }) - - describe('Transfer CeloGold >', () => { - describe('feeCurrency = CeloDollars >', () => { - const intrinsicGas = - changedIntrinsicGasForAlternativeFeeCurrency + INTRINSIC_TX_GAS_COST - describe('when there is no demurrage', () => { - describe('when setting a gas amount greater than the amount of gas necessary', () => - testTransferToken({ - expectedGas: intrinsicGas, - transferToken: Token.CELO, - feeToken: StableToken.cUSD, - })) - - describe('when setting a gas amount less than the intrinsic gas amount', () => { - testTxPoolFiltering({ - gas: intrinsicGas - 1, - feeToken: StableToken.cUSD, - expectedError: 'Error: intrinsic gas too low', - }) - }) - }) - }) - }) - - describe('Transfer CeloDollars', () => { - describe('feeCurrency = CeloDollars >', () => { - testTwiceFirstAndSecondFundToANewAddress({ - expectedGas: - basicStableTokenTransferGasCost + - changedIntrinsicGasForAlternativeFeeCurrency + - INTRINSIC_TX_GAS_COST - - savingGasStableTokenTransferPaidWithSameStable, - transferToken: StableToken.cUSD, - feeToken: StableToken.cUSD, - }) - }) - }) - }) - } - }) - - 
describe('Transfer with Demurrage >', () => { - for (const syncMode of syncModes) { - describe(`${syncMode} Node >`, () => { - let inflationManager: InflationManager - before(`start geth on sync: ${syncMode}`, async () => { - await restartWithCleanNodes() - inflationManager = new InflationManager( - 'http://localhost:8545', - validatorAddress, - StableToken.cUSD - ) - await startSyncNode(syncMode) - }) - - describe('when there is demurrage of 50% applied', () => { - describe('when setting a gas amount greater than the amount of gas necessary', () => { - let balances: BalanceWatcher - let expectedFees: Fees - let txRes: TestTxResults - - before(async () => { - balances = await newBalanceWatcher(kit, [ - FromAddress, - ToAddress, - gatewayFeeRecipientAddress, - governanceAddress, - ]) - - await inflationManager.setInflationRateForNextTransfer(new BigNumber(2)) - const feeCurrency = await kit.celoTokens.getFeeCurrencyAddress(StableToken.cUSD) - txRes = await runTestTransaction( - await transferCeloGold(FromAddress, ToAddress, TransferAmount, { - feeCurrency, - }), - INTRINSIC_TX_GAS_COST + ADDITIONAL_INTRINSIC_TX_GAS_COST, - feeCurrency - ) - - await balances.update() - expectedFees = txRes.fees - }) - - it('should succeed', () => assert.isTrue(txRes.ok)) - - it('should use the expected amount of gas', () => - assert.equal(txRes.gas.used, txRes.gas.expected)) - - it("should decrement the sender's Celo Gold balance by the transfer amount", () => { - assertEqualBN(balances.delta(FromAddress, Token.CELO).negated(), TransferAmount) - }) - - it("should increment the receiver's Celo Gold balance by the transfer amount", () => { - assertEqualBN(balances.delta(ToAddress, Token.CELO), TransferAmount) - }) - - it("should halve the sender's Celo Dollar balance due to demurrage and decrement it by the total fees", () => { - assertEqualBN( - balances - .initial(FromAddress, StableToken.cUSD) - .idiv(2) - .minus(balances.current(FromAddress, StableToken.cUSD)), - expectedFees.total 
- ) - }) - - it("should halve the gateway fee recipient's Celo Dollar balance then increase it by the gateway fee", () => { - assertEqualBN( - balances - .current(gatewayFeeRecipientAddress, StableToken.cUSD) - .minus(balances.initial(gatewayFeeRecipientAddress, StableToken.cUSD).idiv(2)), - expectedFees.gateway - ) - }) - - it("should halve the infrastructure fund's Celo Dollar balance then increment it by the base portion of the gas fees", () => { - assertEqualBN( - balances - .current(governanceAddress, StableToken.cUSD) - .minus(balances.initial(governanceAddress, StableToken.cUSD).idiv(2)), - expectedFees.base - ) - }) - }) - }) - }) - } - }) -}) diff --git a/packages/celotool/src/e2e-tests/utils.ts b/packages/celotool/src/e2e-tests/utils.ts deleted file mode 100644 index cf73dc47253..00000000000 --- a/packages/celotool/src/e2e-tests/utils.ts +++ /dev/null @@ -1,407 +0,0 @@ -import BigNumber from 'bignumber.js' -import { assert } from 'chai' -import fs from 'fs' -import { join as joinPath, resolve as resolvePath } from 'path' -import readLastLines from 'read-last-lines' -import Web3 from 'web3' -import { spawnCmd, spawnCmdWithExitOnFailure } from '../lib/cmd-utils' -import { envVar, fetchEnvOrFallback } from '../lib/env-utils' -import { - AccountType, - getPrivateKeysFor, - getValidatorsInformation, - privateKeyToAddress, - privateKeyToPublicKey, -} from '../lib/generate_utils' -import { - buildGeth, - buildGethAll, - checkoutGethRepo, - connectPeers, - connectValidatorPeers, - getEnodeAddress, - getLogFilename, - initAndStartGeth, - initGeth, - migrateContracts, - resetDataDir, - restoreDatadir, - snapshotDatadir, - startGeth, - writeGenesis, - writeGenesisWithMigrations, -} from '../lib/geth' -import { GethInstanceConfig } from '../lib/interfaces/geth-instance-config' -import { GethRepository } from '../lib/interfaces/geth-repository' -import { GethRunConfig } from '../lib/interfaces/geth-run-config' -import { stringToBoolean } from '../lib/utils' - -const 
MonorepoRoot = resolvePath(joinPath(__dirname, '../..', '../..')) -const verboseOutput = false -// The mnemonic used for the e2e tests -export const mnemonic = - 'jazz ripple brown cloth door bridge pen danger deer thumb cable prepare negative library vast' - -export async function initAndSyncGethWithRetry( - gethConfig: GethRunConfig, - gethBinaryPath: string, - instance: GethInstanceConfig, - connectInstances: GethInstanceConfig[], - verbose: boolean, - retries: number -) { - for (let i = 1; i <= retries; i++) { - try { - await initAndStartGeth(gethConfig, gethBinaryPath, instance, verbose) - await connectPeers(connectInstances, verbose) - await waitToFinishInstanceSyncing(instance) - break - } catch (error) { - // eslint-disable-next-line @typescript-eslint/restrict-template-expressions - console.info(`initAndSyncGethWithRetry error: ${error}`) - const logFilename = getLogFilename(gethConfig.runPath, instance) - console.info(`tail -50 ${logFilename}`) - console.info(await readLastLines.read(logFilename, 50)) - if (i === retries) { - throw error - } else { - console.info(`Retrying ${i}/${retries} ...`) - await killInstance(instance) - continue - } - } - } - return instance -} - -export async function waitToFinishInstanceSyncing(instance: GethInstanceConfig) { - const { wsport, rpcport } = instance - console.info(`${instance.name}: syncing start`) - await waitToFinishSyncing(new Web3(`${rpcport ? 
'http' : 'ws'}://localhost:${rpcport || wsport}`)) - console.info(`${instance.name}: syncing finished`) -} - -export async function waitToFinishSyncing(web3: any) { - while ((await web3.eth.isSyncing()) || (await web3.eth.getBlockNumber()) === 0) { - await sleep(0.1) - } -} - -export async function waitForBlock(web3: Web3, blockNumber: number) { - // const epoch = new BigNumber(await validators.methods.getEpochSize().call()).toNumber() - let currentBlock: number - do { - currentBlock = await web3.eth.getBlockNumber() - await sleep(0.1) - } while (currentBlock < blockNumber) -} - -export async function waitForEpochTransition(web3: Web3, epoch: number) { - // const epoch = new BigNumber(await validators.methods.getEpochSize().call()).toNumber() - let blockNumber: number - do { - blockNumber = await web3.eth.getBlockNumber() - await sleep(0.1) - } while (blockNumber % epoch !== 1) -} - -export async function waitForAnnounceToStabilize(web3: Web3) { - // Due to a problem in the announce protocol's settings, it can take a minute for all the validators - // to be aware of each other even though they are connected. This can lead to the first validator missing - // block signatures initially. So we wait for that to pass. - // Before we used mycelo, this wasn't noticeable because the migrations meant that the network would have - // been running for close to 10 minutes already, which was more than enough time. - // TODO: This function and its uses can be removed after the announce startup behavior has been resolved. 
- await waitForBlock(web3, 70) -} - -export function assertAlmostEqual( - actual: BigNumber, - expected: BigNumber, - delta: BigNumber = new BigNumber(10).pow(12).times(5) -) { - if (expected.isZero()) { - assert.equal(actual.toFixed(), expected.toFixed()) - } else { - const isCloseTo = actual.minus(expected).abs().lte(delta) - assert( - isCloseTo, - `expected ${actual.toString()} to almost equal ${expected.toString()} +/- ${delta.toString()}` - ) - } -} - -type Signal = 'TERM' | 'KILL' | 'INT' | 'STOP' | 'CONT' - -export async function signalProcess(identifier: string | number, signal: Signal): Promise { - const result = - typeof identifier === 'number' - ? await spawnCmd('kill', ['-s', signal, identifier.toString()], { silent: true }) - : await spawnCmd('pkill', [`-SIG${signal}`, identifier], { silent: true }) - - if (result !== 0) { - console.warn(`Attempt to send signal ${signal} to ${identifier} exited with code ${result}`) - } -} - -export async function processIsRunning(identifier: string | number): Promise { - if (typeof identifier === 'number') { - return (await spawnCmd('kill', ['-0', identifier.toString()], { silent: true })) === 0 - } else { - return (await spawnCmd('pgrep', [identifier], { silent: true })) === 0 - } -} - -export async function killGeth() { - console.info(`Killing ALL geth instances`) - await shutdownOrKill('geth') -} - -export async function killInstance(instance: GethInstanceConfig) { - if (instance.pid) { - await signalProcess(instance.pid, 'KILL') - } -} - -export async function shutdownOrKill(identifier: string | number, signal: Signal = 'INT') { - await signalProcess(identifier, signal) - - // Poll for remaining processes for up to ~30s with exponential backoff. 
- let processRemaining = true - for (let i = 0; i < 10 && processRemaining; i++) { - await sleep(0.03 * Math.pow(2, i)) - processRemaining = await processIsRunning(identifier) - } - - if (processRemaining) { - console.warn('shutdownOrKill: clean shutdown failed') - await signalProcess(identifier, 'KILL') - } - - // Sleep an additional 3 seconds to give time for the ports to be free. - await sleep(3.0) -} - -export function sleep(seconds: number, verbose = false) { - if (verbose) { - console.info(`Sleeping for ${seconds} seconds. Stay tuned!`) - } - return new Promise((resolve) => setTimeout(resolve, seconds * 1000)) -} - -export async function assertRevert(promise: any, errorMessage: string = ''): Promise { - try { - await promise - assert.fail('Expected revert not received') - } catch (error: any) { - const revertFound = error.message.search('revert') >= 0 - if (errorMessage === '') { - assert(revertFound, `Expected "revert", got ${error} instead`) - } else { - assert(revertFound, errorMessage) - } - } -} - -function gethRepositoryFromFlags() { - const argv = require('minimist')(process.argv.slice(2)) - return { - path: argv.localgeth || '/tmp/geth', - remote: !argv.localgeth, - branch: argv.branch, - } -} - -export function getHooks(gethConfig: GethRunConfig) { - return getContext(gethConfig, true).hooks -} - -export function getContext(gethConfig: GethRunConfig, verbose: boolean = verboseOutput) { - // Use of mycelo can be enabled through gethConfig or through an env variable - const useMycelo = - !!gethConfig.useMycelo || - stringToBoolean(fetchEnvOrFallback(envVar.E2E_TESTS_FORCE_USE_MYCELO, 'false')) - const validatorInstances = gethConfig.instances.filter((x: any) => x.validating) - - const numValidators = validatorInstances.length - - const validatorPrivateKeys = getPrivateKeysFor(AccountType.VALIDATOR, mnemonic, numValidators) - const attestationKeys = getPrivateKeysFor(AccountType.ATTESTATION, mnemonic, numValidators) - const validators = 
getValidatorsInformation(mnemonic, numValidators) - - const proxyInstances = gethConfig.instances.filter((x: any) => x.isProxy) - const numProxies = proxyInstances.length - - const proxyNodeKeys = getPrivateKeysFor(AccountType.PROXY, mnemonic, numProxies) - const proxyEnodes = proxyNodeKeys.map((x: string, i: number) => [ - proxyInstances[i].name, - getEnodeAddress(privateKeyToPublicKey(x), '127.0.0.1', proxyInstances[i].proxyport), - getEnodeAddress(privateKeyToPublicKey(x), '127.0.0.1', proxyInstances[i].port), - ]) - - const repo: GethRepository = gethConfig.repository || gethRepositoryFromFlags() - const gethBinaryPath = `${repo.path}/build/bin/geth` - const initialize = async () => { - if (repo.remote) { - await checkoutGethRepo(repo.branch || 'master', repo.path) - } - - if (useMycelo) { - await buildGethAll(repo.path) - } else { - await buildGeth(repo.path) - } - - if (!gethConfig.keepData && fs.existsSync(gethConfig.runPath)) { - await resetDataDir(gethConfig.runPath, verbose) - } - - if (!fs.existsSync(gethConfig.runPath)) { - fs.mkdirSync(gethConfig.runPath, { recursive: true }) - } - - if (useMycelo) { - // Compile the contracts first because mycelo assumes they are compiled already, unless told not to - if (!gethConfig.myceloSkipCompilingContracts) { - await spawnCmdWithExitOnFailure('yarn', ['truffle', 'compile'], { - cwd: `${MonorepoRoot}/packages/protocol`, - }) - } - await writeGenesisWithMigrations(gethConfig, repo.path, mnemonic, validators.length, verbose) - } else { - writeGenesis(gethConfig, validators, verbose) - } - - let validatorIndex = 0 - let proxyIndex = 0 - - for (const instance of gethConfig.instances) { - if (instance.isProxied) { - // Proxied validators should connect to only the proxy - // Find this proxied validator's proxy - const proxyEnode = proxyEnodes.filter((x: any) => x[0] === instance.proxy) - - if (proxyEnode.length !== 1) { - throw new Error('proxied validator must have exactly one proxy') - } - - instance.proxies = 
[proxyEnode[0][1], proxyEnode[0][2]] - } - - // Set the private key for the validator or proxy instance - if (instance.validating) { - instance.privateKey = instance.privateKey || validatorPrivateKeys[validatorIndex] - validatorIndex++ - } else if (instance.isProxy) { - instance.nodekey = instance.privateKey || proxyNodeKeys[proxyIndex] - proxyIndex++ - } - - if (!instance.minerValidator && (instance.validating || instance.isProxied)) { - instance.minerValidator = privateKeyToAddress(instance.privateKey) - } - } - - // The proxies will need to know their proxied validator's address - for (const instance of gethConfig.instances) { - if (instance.isProxy) { - const proxiedValidator = gethConfig.instances.filter( - (x: GethInstanceConfig) => x.proxy === instance.name - ) - - if (proxiedValidator.length !== 1) { - throw new Error('proxied validator must have exactly one proxy') - } - - instance.proxiedValidatorAddress = privateKeyToAddress(proxiedValidator[0].privateKey) - } - } - - if (useMycelo || !(gethConfig.migrate || gethConfig.migrateTo)) { - // Just need to initialize the nodes in this case. No need to actually start the network - // since we don't need to run the migrations against it. - for (const instance of gethConfig.instances) { - await initGeth(gethConfig, gethBinaryPath, instance, verbose) - } - return - } - - // Start all the instances - for (const instance of gethConfig.instances) { - await initAndStartGeth(gethConfig, gethBinaryPath, instance, verbose) - } - - // Directly connect validator peers that are not using a bootnode or proxy. 
- await connectValidatorPeers(gethConfig.instances) - - await Promise.all( - gethConfig.instances.filter((i) => i.validating).map((i) => waitToFinishInstanceSyncing(i)) - ) - - await migrateContracts( - MonorepoRoot, - validatorPrivateKeys, - attestationKeys, - validators.map((x) => x.address), - gethConfig.migrateTo, - gethConfig.migrationOverrides - ) - } - - const before = async () => { - await initialize() - - await killGeth() - - // Snapshot the datadir after the contract migrations so we can start from a "clean slate" - // for every test. - for (const instance of gethConfig.instances) { - await snapshotDatadir(gethConfig.runPath, instance, verbose) - } - } - - const restart = async () => { - await killGeth() - - // just in case - gethConfig.keepData = true - - let validatorIndex = 0 - const validatorIndices: number[] = [] - - for (const instance of gethConfig.instances) { - validatorIndices.push(validatorIndex) - if (instance.validating) { - validatorIndex++ - } - } - - // restore data dirs - await Promise.all( - gethConfig.instances.map((instance) => restoreDatadir(gethConfig.runPath, instance)) - ) - - // do in sequence, not concurrently to avoid flaky errors - for (let i = 0; i < gethConfig.instances.length; i++) { - const instance = gethConfig.instances[i] - if (!instance.privateKey && instance.validating) { - instance.privateKey = validatorPrivateKeys[validatorIndices[i]] - } - - if (!instance.minerValidator && (instance.validating || instance.isProxied)) { - instance.minerValidator = privateKeyToAddress(instance.privateKey!) 
- } - - await startGeth(gethConfig, gethBinaryPath, instance, verbose) - } - - await connectValidatorPeers(gethConfig.instances) - } - - const after = () => killGeth() - - return { - validators, - hooks: { initialize, before, after, restart, gethBinaryPath }, - } -} diff --git a/packages/celotool/src/e2e-tests/validator_order_tests.ts b/packages/celotool/src/e2e-tests/validator_order_tests.ts deleted file mode 100644 index 2f8b16ec3b4..00000000000 --- a/packages/celotool/src/e2e-tests/validator_order_tests.ts +++ /dev/null @@ -1,101 +0,0 @@ -import { assert } from 'chai' -import _ from 'lodash' -import Web3 from 'web3' -import { GethRunConfig } from '../lib/interfaces/geth-run-config' -import { getContext, sleep } from './utils' - -const VALIDATORS = 5 -const EPOCH = 20 -const EPOCHS_TO_WAIT = 3 -const BLOCK_COUNT = EPOCH * EPOCHS_TO_WAIT - -const TMP_PATH = '/tmp/e2e' - -describe('governance tests', () => { - const gethConfig: GethRunConfig = { - networkId: 1101, - network: 'local', - runPath: TMP_PATH, - migrate: true, - instances: _.range(VALIDATORS).map((i) => ({ - name: `validator${i}`, - validating: true, - syncmode: 'full', - port: 30303 + 2 * i, - rpcport: 8545 + 2 * i, - })), - genesisConfig: { - epoch: EPOCH, - churritoBlock: 0, - donutBlock: 0, - espressoBlock: 0, - }, - } - - const context: any = getContext(gethConfig) - let web3: Web3 - - before(async function (this: any) { - this.timeout(0) - await context.hooks.before() - }) - - after(async function (this: any) { - this.timeout(0) - await context.hooks.after() - }) - - describe('Validator ordering', () => { - before(async function () { - this.timeout(0) - web3 = new Web3('http://localhost:8545') - await context.hooks.restart() - }) - - it('properly orders validators randomly', async function (this: any) { - this.timeout(320000 /* 320 seconds */) - // If a consensus round fails during this test, the results are inconclusive. - // Retry up to two times to mitigate this issue. 
Restarting the nodes is not needed. - this.retries(2) - - const latestBlockNumber = (await web3.eth.getBlock('latest')).number - const indexInEpoch = ((latestBlockNumber % EPOCH) + EPOCH - 1) % EPOCH - const nextEpoch = latestBlockNumber + (EPOCH - indexInEpoch) - - // Wait for enough blocks. - while ((await web3.eth.getBlock('latest')).number < nextEpoch + BLOCK_COUNT) { - await sleep(2) - } - - // Fetch the validator for each block. - const blocks = await Promise.all( - _.range(BLOCK_COUNT).map(async (i) => web3.eth.getBlock(i + nextEpoch)) - ) - const validators = blocks.map((block) => block.miner) - - // Ensure each validator has an equal number of blocks. - const expectedCount = BLOCK_COUNT / VALIDATORS - for (const [validator, count] of Object.entries(_.countBy(validators))) { - assert.equal(count, expectedCount, `${validator} should have mined ${expectedCount} blocks`) - } - - const orderings: string[][] = [] - for (let i = 0; i < EPOCHS_TO_WAIT; i++) { - const epochValidators = validators.slice(i * EPOCH, (i + 1) * EPOCH) - const ordering = epochValidators.slice(0, VALIDATORS) - - // Ensure within an epoch, ordering is consistent. - for (const [index, validator] of ordering.entries()) { - assert.equal(validator, epochValidators[VALIDATORS + index]) - } - - // Ensure each epoch has a unique ordering. - // Note: This has a 1/(VALIDATORS!) chance of failing. With 10 validators, this is negligible. 
- for (const prevOrdering of orderings) { - assert(!_.isEqual(prevOrdering, ordering), 'ordering is not unique') - } - orderings.push(ordering) - } - }) - }) -}) diff --git a/packages/celotool/src/lib/artifacts.ts b/packages/celotool/src/lib/artifacts.ts deleted file mode 100644 index 24dadb37214..00000000000 --- a/packages/celotool/src/lib/artifacts.ts +++ /dev/null @@ -1,111 +0,0 @@ -/* tslint:disable: no-console */ -import { existsSync, mkdirSync, readFileSync, writeFile } from 'fs' -import { promisify } from 'util' -import { execCmd } from './cmd-utils' -import { doCheckOrPromptIfStagingOrProduction, envVar, fetchEnv, isProduction } from './env-utils' - -export const CONTRACTS_TO_COPY = [ - 'Attestations', - 'Escrow', - 'Exchange', - 'GoldToken', - 'Registry', - 'Reserve', - 'StableToken', -] - -export async function downloadArtifacts(celoEnv: string) { - let baseCmd = `yarn --cwd ../protocol run download-artifacts -n ${celoEnv}` - console.info(`Downloading artifacts for ${celoEnv}`) - if (isProduction()) { - baseCmd += ` -b contract_artifacts_production` - } - try { - await execCmd(baseCmd) - } catch (error) { - console.error(`Unable to download artifacts for ${celoEnv}`) - console.error(error) - process.exit(1) - } -} - -export async function uploadArtifacts(celoEnv: string, checkOrPromptIfStagingOrProduction = true) { - if (checkOrPromptIfStagingOrProduction) { - await doCheckOrPromptIfStagingOrProduction() - } - - let baseCmd = `yarn --cwd ../protocol run upload-artifacts -n ${celoEnv}` - if (isProduction()) { - baseCmd += ` -b contract_artifacts_production` - } - console.info(`Uploading artifacts for ${celoEnv}`) - try { - await execCmd(baseCmd) - } catch (error) { - console.error(`Unable to upload artifacts for ${celoEnv}`) - console.error(error) - process.exit(1) - } -} - -function getContract(fileData: string) { - const json = JSON.parse(fileData) - return { - abi: json.abi, - contractName: json.contractName, - schemaVersion: json.schemaVersion, - 
updatedAt: json.updatedAt, - } -} - -const toFile = promisify(writeFile) - -export async function copyContractArtifacts( - celoEnv: string, - outputPath: string, - contractList: string[] -) { - const baseContractPath = `../protocol/build/${celoEnv}/contracts` - - if (!existsSync(outputPath)) { - mkdirSync(outputPath) - } - - await Promise.all( - contractList.map(async (contract) => { - const json = getContract(readFileSync(`${baseContractPath}/${contract}.json`).toString()) - - const proxyJson = JSON.parse( - readFileSync(`${baseContractPath}/${contract}Proxy.json`).toString() - ) - const address = proxyJson.networks[fetchEnv(envVar.NETWORK_ID)].address - - await toFile( - `${outputPath}/${contract}.ts`, - `import Web3 from 'web3'\n\n` + - `export default async function getInstance(web3: Web3) {\n` + - ` return new web3.eth.Contract(\n` + - ` ${JSON.stringify(json.abi, null, 2)},\n` + - ` "${address}"\n` + - ` )\n` + - `}` - ) - }) - ) -} - -export async function getContractAddresses(celoEnv: string, contractList: string[]) { - const baseContractPath = `../protocol/build/${celoEnv}/contracts` - - const contracts: Record = {} - - for (const contract of contractList) { - const proxyJson = JSON.parse( - readFileSync(`${baseContractPath}/${contract}Proxy.json`).toString() - ) - const address = proxyJson.networks[fetchEnv(envVar.NETWORK_ID)].address - contracts[contract] = address - } - - return contracts -} diff --git a/packages/celotool/src/lib/azure.ts b/packages/celotool/src/lib/azure.ts deleted file mode 100644 index e1addeed91b..00000000000 --- a/packages/celotool/src/lib/azure.ts +++ /dev/null @@ -1,314 +0,0 @@ -import sleep from 'sleep-promise' -import { execCmdWithExitOnFailure } from 'src/lib/cmd-utils' -import { retryCmd } from 'src/lib/utils' -import { getAksClusterConfig } from './context-utils' -import { AksClusterConfig } from './k8s-cluster/aks' - -/** - * getIdentity gets basic info on an existing identity. 
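Both `copyContractArtifacts` and `getContractAddresses` above resolve a contract's deployed address from its proxy's Truffle artifact via `networks[networkId].address`; the lookup can be sketched in isolation (the interface is a minimal assumed slice of a Truffle artifact, and the artifact literal is illustrative, not a real deployment):

```typescript
// The slice of a Truffle proxy artifact that the deleted helpers read.
interface ProxyArtifact {
  networks: { [networkId: string]: { address: string } }
}

// Resolve the deployed proxy address for a network ID, failing loudly
// instead of returning undefined when the network has no deployment.
function resolveProxyAddress(artifact: ProxyArtifact, networkId: string): string {
  const entry = artifact.networks[networkId]
  if (!entry) {
    throw new Error(`no deployment recorded for network ${networkId}`)
  }
  return entry.address
}

// Hypothetical artifact, for illustration only.
const registryProxy: ProxyArtifact = {
  networks: { '42220': { address: '0x000000000000000000000000000000000000ce10' } },
}
```

The deleted code indexes the artifact directly and would surface a missing network as a `TypeError`; the explicit guard here just makes the failure mode readable.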
If the identity doesn't - * exist, undefined is returned - */ -export async function getIdentity(clusterConfig: AksClusterConfig, identityName: string) { - const [matchingIdentitiesStr] = await execCmdWithExitOnFailure( - `az identity list -g ${clusterConfig.resourceGroup} --query "[?name == '${identityName}']" -o json` - ) - const matchingIdentities = JSON.parse(matchingIdentitiesStr) - if (!matchingIdentities.length) { - return - } - // There should only be one exact match by name - return matchingIdentities[0] -} - -// createIdentityIdempotent creates an identity if it doesn't already exist. -// Returns an object including basic info on the identity. -export async function createIdentityIdempotent( - clusterConfig: AksClusterConfig, - identityName: string -) { - const identity = await getIdentity(clusterConfig, identityName) - if (identity) { - console.info( - `Skipping identity creation, ${identityName} in resource group ${clusterConfig.resourceGroup} already exists` - ) - return identity - } - console.info(`Creating identity ${identityName} in resource group ${clusterConfig.resourceGroup}`) - // This command is idempotent-- if the identity exists, the existing one is given - const [results] = await execCmdWithExitOnFailure( - `az identity create -n ${identityName} -g ${clusterConfig.resourceGroup} -o json` - ) - return JSON.parse(results) -} - -/** - * deleteIdentity gets basic info on an existing identity - */ -export function deleteIdentity(clusterConfig: AksClusterConfig, identityName: string) { - return execCmdWithExitOnFailure( - `az identity delete -n ${identityName} -g ${clusterConfig.resourceGroup} -o json` - ) -} - -async function roleIsAssigned(assignee: string, scope: string, role: string) { - const [matchingAssignedRoles] = await retryCmd( - () => - execCmdWithExitOnFailure( - `az role assignment list --assignee ${assignee} --scope ${scope} --query "length([?roleDefinitionName == '${role}'])" -o tsv` - ), - 10 - ) - return 
parseInt(matchingAssignedRoles.trim(), 10) > 0 -} - -export async function assignRoleIdempotent( - assigneeObjectId: string, - assigneePrincipalType: string, - scope: string, - role: string -) { - if (await roleIsAssigned(assigneeObjectId, scope, role)) { - console.info( - `Skipping role assignment, role ${role} already assigned to ${assigneeObjectId} for scope ${scope}` - ) - return - } - console.info( - `Assigning role ${role} to ${assigneeObjectId} type ${assigneePrincipalType} for scope ${scope}` - ) - await retryCmd( - () => - execCmdWithExitOnFailure( - `az role assignment create --role "${role}" --assignee-object-id ${assigneeObjectId} --assignee-principal-type ${assigneePrincipalType} --scope ${scope}` - ), - 10 - ) -} - -export async function getAKSNodeResourceGroup(clusterConfig: AksClusterConfig) { - const [nodeResourceGroup] = await execCmdWithExitOnFailure( - `az aks show --name ${clusterConfig.clusterName} --resource-group ${clusterConfig.resourceGroup} --query nodeResourceGroup -o tsv` - ) - return nodeResourceGroup.trim() -} - -/** - * Gets the AKS Service Principal Object ID if one exists. Otherwise, an empty string is given. - */ -export async function getAKSServicePrincipalObjectId(clusterConfig: AksClusterConfig) { - // Get the correct object ID depending on the cluster configuration - // See https://github.com/Azure/aad-pod-identity/blob/b547ba86ab9b16d238db8a714aaec59a046afdc5/docs/readmes/README.role-assignment.md#obtaining-the-id-of-the-managed-identity--service-principal - const [rawServicePrincipalClientId] = await execCmdWithExitOnFailure( - `az aks show -n ${clusterConfig.clusterName} --query servicePrincipalProfile.clientId -g ${clusterConfig.resourceGroup} -o tsv` - ) - const servicePrincipalClientId = rawServicePrincipalClientId.trim() - // This will be the value of the service principal client ID if a managed service identity - // is being used instead of a service principal. 
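`roleIsAssigned` and `assignRoleIdempotent` above both wrap their `az` calls in `retryCmd(fn, 10)`. `retryCmd` is defined in `src/lib/utils`, so the following is only an assumed minimal shape inferred from those call sites, not the real implementation:

```typescript
// Retry an async command up to maxAttempts times, returning the first
// success and rethrowing the last error once the budget is exhausted.
async function retryCmd<T>(cmd: () => Promise<T>, maxAttempts: number): Promise<T> {
  let lastError: unknown
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await cmd()
    } catch (error) {
      lastError = error
    }
  }
  throw lastError
}
```

The real helper may also sleep between attempts; the deleted callers only depend on the retry count.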
- if (servicePrincipalClientId === 'msi') { - return '' - } - const [rawObjectId] = await execCmdWithExitOnFailure( - `az ad sp show --id ${servicePrincipalClientId} --query id -o tsv` - ) - return rawObjectId.trim() -} - -/** - * If an AKS cluster is using a managed service identity, the objectId is returned. - * Otherwise, an empty string is given. - */ -export async function getAKSManagedServiceIdentityObjectId(clusterConfig: AksClusterConfig) { - const [managedIdentityObjectId] = await execCmdWithExitOnFailure( - `az aks show -n ${clusterConfig.clusterName} --query identityProfile.kubeletidentity.objectId -g ${clusterConfig.resourceGroup} -o tsv` - ) - return managedIdentityObjectId.trim() -} - -export async function registerStaticIPIfNotRegistered(name: string, resourceGroupIP: string) { - // This returns an array of matching IP addresses. If there is no matching IP - // address, an empty array is returned. We expect at most 1 matching IP - const [existingIpsStr] = await execCmdWithExitOnFailure( - `az network public-ip list --resource-group ${resourceGroupIP} --query "[?name == '${name}' && sku.name == 'Standard'].ipAddress" -o json` - ) - const existingIps = JSON.parse(existingIpsStr) - if (existingIps.length) { - console.info(`Skipping IP address registration, ${name} on ${resourceGroupIP} exists`) - // We expect only 1 matching IP - return existingIps[0] - } - console.info(`Registering IP address ${name} on ${resourceGroupIP}`) - const [address] = await execCmdWithExitOnFailure( - `az network public-ip create --resource-group ${resourceGroupIP} --name ${name} --allocation-method Static --sku Standard --query publicIp.ipAddress -o tsv` - ) - return address.trim() -} - -export async function deallocateStaticIP(name: string, resourceGroupIP: string) { - console.info(`Deallocating IP address ${name} on ${resourceGroupIP}`) - return execCmdWithExitOnFailure( - `az network public-ip delete --resource-group ${resourceGroupIP} --name ${name}` - ) -} - -export 
async function waitForStaticIPDetachment(name: string, resourceGroup: string) { - const maxTryCount = 15 - const tryIntervalMs = 3000 - for (let tryCount = 0; tryCount < maxTryCount; tryCount++) { - const [allocated] = await execCmdWithExitOnFailure( - `az network public-ip show --resource-group ${resourceGroup} --name ${name} --query ipConfiguration.id -o tsv` - ) - if (allocated.trim() === '') { - return true - } - await sleep(tryIntervalMs) - } - throw Error(`Too many tries waiting for static IP association ID removal`) -} - -/** - * This creates an Azure identity to access a key vault - */ -export async function createKeyVaultIdentityIfNotExists( - context: string, - identityName: string, - keyVaultName: string, - keyVaultResourceGroup: string | null | undefined, - keyPermissions: string[] | null, - secretPermissions: string[] | null -) { - const clusterConfig = getAksClusterConfig(context) - const identity = await createIdentityIdempotent(clusterConfig, identityName) - // We want to grant the identity for the cluster permission to manage the odis signer identity. - // Get the correct object ID depending on the cluster configuration, either - // the service principal or the managed service identity. 
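`waitForStaticIPDetachment` above is a bounded poll: re-run a check, sleep a fixed interval, give up after `maxTryCount` tries. The same loop as a generic helper (the name and the generic shape are mine; the deleted code inlines this around `az network public-ip show`):

```typescript
// Poll an async predicate until it holds, sleeping intervalMs between
// attempts and throwing once maxTries attempts have all come back false.
async function pollUntil(
  check: () => Promise<boolean>,
  maxTries: number,
  intervalMs: number
): Promise<void> {
  for (let tryCount = 0; tryCount < maxTries; tryCount++) {
    if (await check()) {
      return
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs))
  }
  throw new Error(`condition not met after ${maxTries} tries`)
}
```

With the deleted function's parameters (15 tries, 3000 ms), the caller waits at most about 45 seconds for the IP configuration ID to clear before giving up.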
- // See https://github.com/Azure/aad-pod-identity/blob/b547ba86ab9b16d238db8a714aaec59a046afdc5/docs/readmes/README.role-assignment.md#obtaining-the-id-of-the-managed-identity--service-principal - let assigneeObjectId = await getAKSServicePrincipalObjectId(clusterConfig) - let assigneePrincipalType = 'ServicePrincipal' - // TODO Check how to manage the MSI type - if (!assigneeObjectId) { - assigneeObjectId = await getAKSManagedServiceIdentityObjectId(clusterConfig) - // assigneePrincipalType = 'MSI' - assigneePrincipalType = 'ServicePrincipal' - } - await assignRoleIdempotent( - assigneeObjectId, - assigneePrincipalType, - identity.id, - 'Managed Identity Operator' - ) - // Allow the odis signer identity to access the correct key vault - await setKeyVaultPolicyIfNotSet( - clusterConfig, - keyVaultName, - keyVaultResourceGroup, - identity, - keyPermissions, - secretPermissions - ) - return identity -} - -async function setKeyVaultPolicyIfNotSet( - clusterConfig: AksClusterConfig, - keyVaultName: string, - keyVaultResourceGroup: string | null | undefined, - azureIdentity: any, - keyPermissions: string[] | null, - secretPermissions: string[] | null -) { - const kvResourceGroup = keyVaultResourceGroup - ? keyVaultResourceGroup - : clusterConfig.resourceGroup - - const queryFilters = [`?objectId == '${azureIdentity.principalId}'`] - if (keyPermissions) { - queryFilters.push( - `sort(permissions.keys) == [${keyPermissions.map((perm) => `'${perm}'`).join(', ')}]` - ) - } - if (secretPermissions) { - queryFilters.push( - `sort(permissions.secrets) == [${secretPermissions.map((perm) => `'${perm}'`).join(', ')}]` - ) - } - - const [keyVaultPoliciesStr] = await execCmdWithExitOnFailure( - `az keyvault show --name ${keyVaultName} -g ${kvResourceGroup} --query "properties.accessPolicies[${queryFilters.join( - ' && ' - )}]"` - ) - const keyVaultPolicies = JSON.parse(keyVaultPoliciesStr) - if (keyVaultPolicies.length) { - const keyPermStr = keyPermissions ? 
`key permissions: ${keyPermissions.join(' ')}` : '' - const secretPermStr = secretPermissions - ? `secret permissions: ${secretPermissions.join(' ')}` - : '' - console.info( - `Skipping setting policy {${keyPermStr}, ${secretPermStr}}. Already set for vault ${keyVaultName} and identity objectId ${azureIdentity.principalId}` - ) - return - } - - if (keyPermissions) { - console.info( - `Setting key permissions ${keyPermissions.join( - ' ' - )} for vault ${keyVaultName} and identity objectId ${azureIdentity.principalId}` - ) - return execCmdWithExitOnFailure( - `az keyvault set-policy --name ${keyVaultName} --key-permissions ${keyPermissions.join( - ' ' - )} --object-id ${azureIdentity.principalId} -g ${kvResourceGroup}` - ) - } - - if (secretPermissions) { - console.info( - `Setting secret permissions ${secretPermissions.join( - ' ' - )} for vault ${keyVaultName} and identity objectId ${azureIdentity.principalId}` - ) - return execCmdWithExitOnFailure( - `az keyvault set-policy --name ${keyVaultName} --secret-permissions ${secretPermissions.join( - ' ' - )} --object-id ${azureIdentity.principalId} -g ${kvResourceGroup}` - ) - } -} - -/** - * deleteAzureKeyVaultIdentity deletes the key vault policy and the managed identity - */ -export async function deleteAzureKeyVaultIdentity( - context: string, - identityName: string, - keyVaultName: string -) { - const clusterConfig = getAksClusterConfig(context) - await deleteKeyVaultPolicy(clusterConfig, identityName, keyVaultName) - return deleteIdentity(clusterConfig, identityName) -} - -async function deleteKeyVaultPolicy( - clusterConfig: AksClusterConfig, - identityName: string, - keyVaultName: string -) { - const azureIdentity = await getIdentity(clusterConfig, identityName) - return execCmdWithExitOnFailure( - `az keyvault delete-policy --name ${keyVaultName} --object-id ${azureIdentity.principalId} -g ${clusterConfig.resourceGroup}` - ) -} - -/** - * @return the intended name of an azure identity given a key vault name - 
*/ -export function getAzureKeyVaultIdentityName( - context: string, - prefix: string, - keyVaultName: string -) { - // from https://docs.microsoft.com/en-us/azure/azure-resource-manager/management/resource-name-rules#microsoftmanagedidentity - const maxIdentityNameLength = 128 - return `${prefix}-${keyVaultName}-${context}`.substring(0, maxIdentityNameLength) -} diff --git a/packages/celotool/src/lib/blockchain.ts b/packages/celotool/src/lib/blockchain.ts deleted file mode 100644 index ee16f11f7d4..00000000000 --- a/packages/celotool/src/lib/blockchain.ts +++ /dev/null @@ -1,7 +0,0 @@ -import { getRandomTxNodeIP } from 'src/lib/kubernetes' -import Web3 from 'web3' - -export async function getWeb3Client(celoEnv: string) { - const transactionNodeIP = await getRandomTxNodeIP(celoEnv) - return new Web3(`ws://${transactionNodeIP}:8546`) -} diff --git a/packages/celotool/src/lib/celostats.ts b/packages/celotool/src/lib/celostats.ts deleted file mode 100644 index 8517ba5ffb7..00000000000 --- a/packages/celotool/src/lib/celostats.ts +++ /dev/null @@ -1,76 +0,0 @@ -import { - installGenericHelmChart, - removeGenericHelmChart, - upgradeGenericHelmChart, -} from 'src/lib/helm_deploy' -import { getBlockscoutUrl, getFornoUrl } from './endpoints' -import { envVar, fetchEnv, fetchEnvOrFallback } from './env-utils' -import { AccountType, getAddressesFor } from './generate_utils' - -const helmChartPath = '../helm-charts/celostats' - -export async function installHelmChart(celoEnv: string) { - return installGenericHelmChart({ - namespace: celoEnv, - releaseName: releaseName(celoEnv), - chartDir: helmChartPath, - parameters: helmParameters(celoEnv), - }) -} - -export async function removeHelmRelease(celoEnv: string) { - await removeGenericHelmChart(releaseName(celoEnv), celoEnv) -} - -export async function upgradeHelmChart(celoEnv: string) { - await upgradeGenericHelmChart({ - namespace: celoEnv, - releaseName: releaseName(celoEnv), - chartDir: helmChartPath, - parameters: 
helmParameters(celoEnv), - }) -} - -function helmParameters(celoEnv: string) { - return [ - `--set domain.name=${fetchEnv(envVar.CLUSTER_DOMAIN_NAME)}`, - `--set celostats.image.server.repository=${fetchEnv( - envVar.CELOSTATS_SERVER_DOCKER_IMAGE_REPOSITORY - )}`, - `--set celostats.image.server.tag=${fetchEnv(envVar.CELOSTATS_SERVER_DOCKER_IMAGE_TAG)}`, - `--set celostats.image.frontend.repository=${fetchEnv( - envVar.CELOSTATS_FRONTEND_DOCKER_IMAGE_REPOSITORY - )}`, - `--set celostats.image.frontend.tag=${fetchEnv(envVar.CELOSTATS_FRONTEND_DOCKER_IMAGE_TAG)}`, - `--set celostats.trusted_addresses='${String(generateAuthorizedAddresses()).replace( - /,/g, - '\\,' - )}'`, - `--set celostats.banned_addresses='${String( - fetchEnv(envVar.CELOSTATS_BANNED_ADDRESSES) - ).replace(/,/g, '\\,')}'`, - `--set celostats.reserved_addresses='${String( - fetchEnv(envVar.CELOSTATS_RESERVED_ADDRESSES) - ).replace(/,/g, '\\,')}'`, - `--set celostats.network_name='Celo ${celoEnv}'`, - `--set celostats.blockscout_url='${getBlockscoutUrl(celoEnv)}'`, - `--set celostats.jsonrpc='${getFornoUrl(celoEnv)}'`, - ] -} - -function releaseName(celoEnv: string) { - return `${celoEnv}-celostats` -} - -function generateAuthorizedAddresses() { - // TODO: Add the Proxy eth addresses when available - const mnemonic = fetchEnv(envVar.MNEMONIC) - const publicKeys = [] - const txNodes = parseInt(fetchEnv(envVar.TX_NODES), 0) - const validatorNodes = parseInt(fetchEnv(envVar.VALIDATORS), 0) - publicKeys.push(getAddressesFor(AccountType.TX_NODE, mnemonic, txNodes)) - publicKeys.push(getAddressesFor(AccountType.VALIDATOR, mnemonic, validatorNodes)) - - publicKeys.push(fetchEnvOrFallback(envVar.CELOSTATS_TRUSTED_ADDRESSES, '').split(',')) - return publicKeys.reduce((accumulator, value) => accumulator.concat(value), []).filter((_) => !!_) -} diff --git a/packages/celotool/src/lib/chaoskube.ts b/packages/celotool/src/lib/chaoskube.ts deleted file mode 100644 index f83e86890d3..00000000000 --- 
a/packages/celotool/src/lib/chaoskube.ts +++ /dev/null @@ -1,19 +0,0 @@ -import { makeHelmParameters } from 'src/lib/helm_deploy' -import { envVar, fetchEnv } from './env-utils' - -export function helmReleaseName(celoEnv: string) { - return celoEnv + '-chaoskube' -} - -export const helmChartDir = 'stable/chaoskube' - -export function helmParameters(celoEnv: string) { - return makeHelmParameters({ - interval: fetchEnv(envVar.CHAOS_TEST_KILL_INTERVAL), - labels: 'component=validators', - namespaces: celoEnv, - dryRun: 'false', - 'rbac.create': 'true', - 'rbac.serviceAccountName': `${celoEnv}-chaoskube`, - }) -} diff --git a/packages/celotool/src/lib/cloud-storage.ts b/packages/celotool/src/lib/cloud-storage.ts deleted file mode 100644 index 566b79de83c..00000000000 --- a/packages/celotool/src/lib/cloud-storage.ts +++ /dev/null @@ -1,81 +0,0 @@ -const { Storage } = require('@google-cloud/storage') - -const sleep = async (time: number) => { - return new Promise((resolve) => setTimeout(resolve, time)) -} - -export const createClient = (credentials?: any) => { - return new Storage({ credentials }) -} - -// Location ref: https://cloud.google.com/storage/docs/locations -// Storage classes ref: https://cloud.google.com/storage/docs/storage-classes -export const createBucket = async ( - client: any, - bucketName: string, - location: string = 'US-CENTRAL1', - storageClass: string = 'COLDLINE' -) => { - await client.createBucket(bucketName, { - location, - storageClass, - }) - await sleep(2000) -} - -export const createBucketIfNotExists = async ( - client: any, - bucketName: string, - location: string = 'US-CENTRAL1', - storageClass: string = 'COLDLINE' -) => { - if (!(await checkBucketExists(client, bucketName))) { - await createBucket(client, bucketName, location, storageClass) - } -} - -export const getBuckets = async (client: any) => { - const [buckets] = await client.getBuckets() - return buckets -} - -export const getFiles = async (client: any, bucketName: string) => { - 
const [files] = await client.bucket(bucketName).getFiles() - return files -} - -export const checkBucketExists = async (client: any, bucketName: string) => { - const buckets = await getBuckets(client) - return buckets.some((bucket: any) => bucket.name === bucketName) -} - -export const deleteBucket = async (client: any, bucketName: string) => { - await client.bucket(bucketName).delete() -} - -export const fileUpload = async ( - client: any, - bucketName: string, - srcFileName: string, - useCache: boolean = true -) => { - const uploadOptions = { - gzip: true, - metadata: { - cacheControl: useCache ? 'public, max-age=31536000' : 'no-cache', - }, - } - - await client.bucket(bucketName).upload(srcFileName, uploadOptions) -} - -export const fileDownload = async ( - client: any, - bucketName: string, - srcFileName: string, - dstFileName: string -) => { - await client.bucket(bucketName).file(srcFileName).download({ - destination: dstFileName, - }) -} diff --git a/packages/celotool/src/lib/cluster.ts b/packages/celotool/src/lib/cluster.ts deleted file mode 100644 index 95f6661cb75..00000000000 --- a/packages/celotool/src/lib/cluster.ts +++ /dev/null @@ -1,251 +0,0 @@ -import sleep from 'sleep-promise' -import { execCmd, execCmdWithExitOnFailure } from './cmd-utils' -import { getClusterConfigForContext, switchToContextCluster } from './context-utils' -import { doCheckOrPromptIfStagingOrProduction, envTypes, envVar, fetchEnv } from './env-utils' -import { - checkHelmVersion, - createAndUploadBackupSecretIfNotExists, - getServiceAccountName, - grantRoles, - installAndEnableMetricsDeps, - installCertManagerAndNginx, - installGCPSSDStorageClass, - isCelotoolHelmDryRun, - networkName, -} from './helm_deploy' -import { createServiceAccountIfNotExists } from './service-account-utils' -import { outputIncludes, switchToProjectFromEnv } from './utils' - -const SYSTEM_HELM_RELEASES = [ - 'nginx-ingress-release', - 'kube-lego-release', - 'cert-manager-cluster-issuers', -] -const 
HELM_RELEASE_REGEX = new RegExp(/(.*)-\d+\.\d+\.\d+$/) - -export async function switchToClusterFromEnv( - celoEnv: string, - checkOrPromptIfStagingOrProduction = true, - skipClusterSetup = true -) { - if (checkOrPromptIfStagingOrProduction) { - await doCheckOrPromptIfStagingOrProduction() - } - await checkHelmVersion() - - await switchToProjectFromEnv() - - if (!skipClusterSetup) { - if (!isCelotoolHelmDryRun()) { - // In this case we create the cluster if it does not exist - const createdCluster = await createClusterIfNotExists() - // Install common helm charts - await setupCluster(celoEnv, createdCluster) - } else { - console.info(`Skipping cluster setup due to --helmdryrun`) - } - } - - let currentCluster = null - try { - ;[currentCluster] = await execCmd('kubectl config current-context') - } catch (error) { - console.info('No cluster currently set') - } - - const projectName = fetchEnv(envVar.TESTNET_PROJECT_NAME) - const kubernetesClusterName = fetchEnv(envVar.KUBERNETES_CLUSTER_NAME) - const kubernetesClusterZone = fetchEnv(envVar.KUBERNETES_CLUSTER_ZONE) - - const expectedCluster = `gke_${projectName}_${kubernetesClusterZone}_${kubernetesClusterName}` - - if (currentCluster === null || currentCluster.trim() !== expectedCluster) { - await execCmdWithExitOnFailure( - `gcloud container clusters get-credentials ${kubernetesClusterName} --project ${projectName} --zone ${kubernetesClusterZone}` - ) - } - await execCmdWithExitOnFailure(`kubectl config set-context --current --namespace default`) -} - -export async function createClusterIfNotExists() { - const kubernetesClusterName = fetchEnv(envVar.KUBERNETES_CLUSTER_NAME) - const kubernetesClusterZone = fetchEnv(envVar.KUBERNETES_CLUSTER_ZONE) - await switchToProjectFromEnv() - const clusterExists = await outputIncludes( - `gcloud container clusters list --zone ${kubernetesClusterZone} --filter NAME=${kubernetesClusterName}`, - kubernetesClusterName, - `Cluster ${kubernetesClusterName} exists, skipping creation` - 
) - - if (!clusterExists) { - const network = networkName(fetchEnv(envVar.CELOTOOL_CELOENV)) - console.info(`Creating cluster ${kubernetesClusterName} on network ${network}...`) - await execCmdWithExitOnFailure( - `gcloud container clusters create ${kubernetesClusterName} --zone ${kubernetesClusterZone} ${fetchEnv( - envVar.CLUSTER_CREATION_FLAGS - )} --network ${network}` - ) - return true - } - - return false -} - -export async function createNamespaceIfNotExists(namespace: string) { - const namespaceExists = await outputIncludes( - `kubectl get namespaces ${namespace} || true`, - namespace, - `Namespace ${namespace} exists, skipping creation` - ) - if (!namespaceExists) { - const cmd = `kubectl create namespace ${namespace} ${ - isCelotoolHelmDryRun() ? ' --dry-run=server' : '' - }` - await execCmdWithExitOnFailure(cmd) - } -} - -export async function setupCluster(celoEnv: string, createdCluster: boolean) { - const envType = fetchEnv(envVar.ENV_TYPE) - - await checkHelmVersion() - - await createNamespaceIfNotExists(celoEnv) - - const blockchainBackupServiceAccountName = getServiceAccountName('blockchain-backup-for') - console.info(`Service account for blockchain backup is \"${blockchainBackupServiceAccountName}\"`) - - await createServiceAccountIfNotExists(blockchainBackupServiceAccountName) - // This role is required for "compute.snapshots.get" permission - // Source: https://cloud.google.com/compute/docs/access/iam - await grantRoles(blockchainBackupServiceAccountName, 'roles/compute.storageAdmin') - // This role is required for "gcloud.container.clusters.get-credentials" permission - // This role is required for "container.clusters.get" permission - // Source: https://cloud.google.com/kubernetes-engine/docs/how-to/iam - await grantRoles(blockchainBackupServiceAccountName, 'roles/container.viewer') - - await createAndUploadBackupSecretIfNotExists(blockchainBackupServiceAccountName, celoEnv) - - // poll for cluster availability - if (createdCluster) { - await 
pollForRunningCluster() - } - - console.info('Deploying Tiller and Cert Manager Helm chart...') - - await installGCPSSDStorageClass() - - await installCertManagerAndNginx(celoEnv) - - if (envType !== envTypes.DEVELOPMENT) { - console.info('Installing metric tools installation') - await installAndEnableMetricsDeps(true) - } else { - console.info('Skipping metric tools installation for this development env') - } - - await setClusterLabels(celoEnv) -} - -export async function pollForRunningCluster() { - const kubernetesClusterName = fetchEnv(envVar.KUBERNETES_CLUSTER_NAME) - const kubernetesClusterZone = fetchEnv(envVar.KUBERNETES_CLUSTER_ZONE) - - await switchToProjectFromEnv() - - let attempts = 0 - while (attempts < 10) { - console.info('Waiting for cluster to be in status=RUNNING') - const [status] = await execCmdWithExitOnFailure( - `gcloud container clusters describe --zone ${kubernetesClusterZone} ${kubernetesClusterName} --format="value(status)"` - ) - if (status.trim() === 'RUNNING') { - return - } - attempts += 1 - await sleep(Math.pow(2, attempts) * 1000) - } - - console.error('Waited for too long for running cluster') - process.exit(1) -} - -export async function deleteCluster() { - const kubernetesClusterName = fetchEnv(envVar.KUBERNETES_CLUSTER_NAME) - const kubernetesClusterZone = fetchEnv(envVar.KUBERNETES_CLUSTER_ZONE) - - await switchToProjectFromEnv() - const clusterExists = await outputIncludes( - `gcloud container clusters list --zone ${kubernetesClusterZone} --filter NAME=${kubernetesClusterName}`, - kubernetesClusterName - ) - if (!clusterExists) { - console.error(`Cluster ${kubernetesClusterName} does not exist`) - process.exit(1) - } - - await execCmdWithExitOnFailure( - `gcloud container clusters delete ${kubernetesClusterName} --zone ${kubernetesClusterZone} --quiet` - ) -} - -export async function setClusterLabels(celoEnv: string) { - const envType = fetchEnv(envVar.ENV_TYPE) - const labelfn = async (key: string, value: string) => { - await 
execCmdWithExitOnFailure( - `gcloud container clusters update ${fetchEnv( - envVar.KUBERNETES_CLUSTER_NAME - )} --update-labels ${key}=${value} --zone ${fetchEnv(envVar.KUBERNETES_CLUSTER_ZONE)}` - ) - } - await labelfn('environment', envType) - await labelfn('envtype', envType === envTypes.PRODUCTION ? 'production' : 'nonproduction') - await labelfn('envinstance', celoEnv) -} - -export function getKubernetesClusterRegion(zone?: string): string { - if (!zone) { - zone = fetchEnv(envVar.KUBERNETES_CLUSTER_ZONE) - } - const matches = zone.match('^[a-z]+-[a-z]+[0-9]') - if (matches) { - return matches[0] - } else { - console.error('Unable to find kubernetes cluster region') - process.exit(1) - // Make the compiler happy - return '' - } -} -export interface HelmRelease { - Name: string - Chart: string - Status: string - Updated: string - Namespace: string -} - -export async function getNonSystemHelmReleases(): Promise { - const [json] = await execCmdWithExitOnFailure(`helm list -A --output json`) - const releases: HelmRelease[] = JSON.parse(json).Releases - return releases.filter((release) => !SYSTEM_HELM_RELEASES.includes(release.Name)) -} - -export function getPackageName(name: string) { - const prefix = HELM_RELEASE_REGEX.exec(name) - if (!prefix) { - return '' - } - - return prefix[1] === 'ethereum' ? 'testnet' : prefix[1] -} - -export async function switchToClusterFromEnvOrContext(argv: any, skipClusterSetup = false) { - if (argv.context === undefined) { - // GCP top level cluster. 
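Two small regexes in the deleted cluster.ts deserve worked examples: `HELM_RELEASE_REGEX` strips a trailing semver suffix from a release name (used by `getPackageName`), and `getKubernetesClusterRegion` keeps a zone's leading region portion. Mirrors of both, with the patterns copied verbatim (the wrapper names are mine, and the `ethereum` → `testnet` remapping in `getPackageName` is deliberately omitted):

```typescript
// Strip a trailing -X.Y.Z suffix the way HELM_RELEASE_REGEX does;
// '' signals that the name carried no version suffix.
function releasePackageName(name: string): string {
  const match = /(.*)-\d+\.\d+\.\d+$/.exec(name)
  return match ? match[1] : ''
}

// Keep the region prefix of a GCP zone, e.g. 'us-west1-a' -> 'us-west1'.
function zoneToRegion(zone: string): string {
  const matches = zone.match('^[a-z]+-[a-z]+[0-9]')
  return matches ? matches[0] : ''
}
```

Note that the zone pattern relies on GCP's `region-letter` naming, where the region always ends in a digit and the zone adds a single-letter suffix.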
- await switchToClusterFromEnv(argv.celoEnv, true, skipClusterSetup) - } else { - await switchToContextCluster(argv.celoEnv, argv.context, skipClusterSetup) - return getClusterConfigForContext(argv.context) - } -} diff --git a/packages/celotool/src/lib/cmd-utils.ts b/packages/celotool/src/lib/cmd-utils.ts deleted file mode 100644 index 65a575c15a8..00000000000 --- a/packages/celotool/src/lib/cmd-utils.ts +++ /dev/null @@ -1,122 +0,0 @@ -import { exec, spawn, SpawnOptions } from 'child_process' - -export async function execCmdAndParseJson( - cmd: string, - execOptions: any = {}, - rejectWithOutput = false, - pipeOutput = false -) { - const [output] = await execCmd(cmd, execOptions, rejectWithOutput, pipeOutput) - return JSON.parse(output) -} - -// Returns a Promise which resolves to [stdout, stderr] array -export function execCmd( - cmd: string, - execOptions: any = {}, - rejectWithOutput = false, - pipeOutput = false -): Promise<[string, string]> { - return new Promise((resolve, reject) => { - if (process.env.CELOTOOL_VERBOSE === 'true') { - console.debug('$ ' + cmd) - pipeOutput = true - } - - exec(cmd, { maxBuffer: 1024 * 10000, ...execOptions }, (err, stdout, stderr) => { - if (pipeOutput) { - console.debug(stdout.toString()) - } - if (err || pipeOutput) { - console.error(stderr.toString()) - } - if (err) { - if (rejectWithOutput) { - reject([err, stdout.toString(), stderr.toString()]) - } else { - reject(err) - } - } else { - resolve([stdout.toString(), stderr.toString()]) - } - }) - }) -} - -export function spawnCmd( - cmd: string, - args: string[], - options?: SpawnOptions & { silent?: boolean } -) { - return new Promise(async (resolve, reject) => { - const { silent, ...spawnOptions } = options || { silent: false } - if (!silent) { - console.debug('$ ' + [cmd].concat(args).join(' ')) - } - const process = spawn(cmd, args, { ...spawnOptions, stdio: silent ? 
'ignore' : 'inherit' }) - process.on('close', (code) => { - try { - resolve(code) - } catch (error) { - reject(error) - } - }) - }) -} - -// Returns a Promise which resolves to [stdout, stderr] array -export function execCmdWithExitOnFailure( - cmd: string, - options: any = {}, - pipeOutput = false -): Promise<[string, string]> { - return new Promise((resolve, reject) => { - try { - resolve(execCmd(cmd, options, false, pipeOutput)) - } catch (error) { - console.error(error) - process.exit(1) - // To make the compiler happy. - reject(error) - } - }) -} - -export async function spawnCmdWithExitOnFailure( - cmd: string, - args: string[], - options?: SpawnOptions & { silent?: boolean } -) { - const code = await spawnCmd(cmd, args, options) - if (code !== 0) { - console.error('spawnCmd failed for: ' + [cmd].concat(args).join(' ')) - process.exit(1) - } -} - -export function execBackgroundCmd(cmd: string) { - if (process.env.CELOTOOL_VERBOSE === 'true') { - console.debug('$ ' + cmd) - } - return exec(cmd, { maxBuffer: 1024 * 10000 }, (err, stdout, stderr) => { - if (process.env.CELOTOOL_VERBOSE === 'true') { - console.debug(stdout) - console.error(stderr) - } - if (err) { - console.error(err) - process.exit(1) - } - }) -} - -export async function outputIncludes(cmd: string, matchString: string, matchMessage?: string) { - const [stdout] = await execCmdWithExitOnFailure(cmd) - if (stdout.includes(matchString)) { - if (matchMessage) { - console.info(matchMessage) - } - return true - } - return false -} diff --git a/packages/celotool/src/lib/context-utils.ts b/packages/celotool/src/lib/context-utils.ts deleted file mode 100644 index f14d9eb2c07..00000000000 --- a/packages/celotool/src/lib/context-utils.ts +++ /dev/null @@ -1,218 +0,0 @@ -import { Argv } from 'yargs' -import { - addCeloEnvMiddleware, - doCheckOrPromptIfStagingOrProduction, - DynamicEnvVar, - envVar, - fetchEnv, - getDynamicEnvVarValue, -} from './env-utils' -import { AksClusterConfig } from 
'./k8s-cluster/aks' -import { BaseClusterConfig, BaseClusterManager, CloudProvider } from './k8s-cluster/base' -import { GCPClusterConfig } from './k8s-cluster/gcp' -import { getClusterManager } from './k8s-cluster/utils' - -/** - * Env vars corresponding to each value for the AksClusterConfig for a particular context - */ -const contextAksClusterConfigDynamicEnvVars: { - [k in keyof Omit<AksClusterConfig, 'cloudProvider'>]: DynamicEnvVar -} = { - clusterName: DynamicEnvVar.KUBERNETES_CLUSTER_NAME, - subscriptionId: DynamicEnvVar.AZURE_SUBSCRIPTION_ID, - tenantId: DynamicEnvVar.AZURE_TENANT_ID, - resourceGroup: DynamicEnvVar.AZURE_KUBERNETES_RESOURCE_GROUP, - regionName: DynamicEnvVar.AZURE_REGION_NAME, -} - -/** - * Env vars corresponding to each value for the GCPClusterConfig for a particular context - */ -const contextGCPClusterConfigDynamicEnvVars: { - [k in keyof Omit<GCPClusterConfig, 'cloudProvider'>]: DynamicEnvVar -} = { - clusterName: DynamicEnvVar.KUBERNETES_CLUSTER_NAME, - projectName: DynamicEnvVar.GCP_PROJECT_NAME, - zone: DynamicEnvVar.GCP_ZONE, -} - -const clusterConfigGetterByCloudProvider: { - [key in CloudProvider]: (context: string) => BaseClusterConfig -} = { - [CloudProvider.AZURE]: getAksClusterConfig, - [CloudProvider.GCP]: getGCPClusterConfig, -} - -export function getCloudProviderFromContext(context: string): CloudProvider { - for (const cloudProvider of Object.values(CloudProvider)) { - if (context.startsWith(cloudProvider as string)) { - return CloudProvider[cloudProvider as keyof typeof CloudProvider] - } - } - throw Error(`Context ${context} must start with one of ${Object.values(CloudProvider)}`) -} - -/** - * Fetches the env vars for a particular context - * @param context the context to use - * @return an AksClusterConfig for the context - */ -export function getAksClusterConfig(context: string): AksClusterConfig { - const azureDynamicEnvVars = getContextDynamicEnvVarValues( - contextAksClusterConfigDynamicEnvVars, - context - ) - const clusterConfig: AksClusterConfig = { - cloudProvider:
CloudProvider.AZURE, - ...azureDynamicEnvVars, - } - return clusterConfig -} - -/** - * Fetches the env vars for a particular context - * @param context the context to use - * @return an GCPClusterConfig for the context - */ -export function getGCPClusterConfig(context: string): GCPClusterConfig { - const gcpDynamicEnvVars = getContextDynamicEnvVarValues( - contextGCPClusterConfigDynamicEnvVars, - context - ) - const clusterConfig: GCPClusterConfig = { - cloudProvider: CloudProvider.GCP, - ...gcpDynamicEnvVars, - } - return clusterConfig -} - -/** - * Helper function used to extract multiple dynamic env vars based on - * an object of dynamic props used for templating. - * @param dynamicEnvVars an object whose values correspond to the desired - * dynamic env vars to fetch. - * @param dynamicProps the properties used for templatin the variables - * @param defaultValues Optional default values if the dynamic env vars are not found - * @return an object with the same keys as dynamicEnvVars, but the values are - * the values of the dynamic env vars for the particular context - */ -export function getDynamicEnvVarValues<T, P>( - dynamicEnvVars: { [k in keyof T]: DynamicEnvVar }, - dynamicProps: P, - defaultValues?: { [k in keyof T]: string } -): { - [k in keyof T]: string -} { - return Object.keys(dynamicEnvVars).reduce((values: any, k: string) => { - const key = k as keyof T - const dynamicEnvVar = dynamicEnvVars[key] - const defaultValue = defaultValues ? defaultValues[key] : undefined - const value = getDynamicEnvVarValue(dynamicEnvVar, dynamicProps, defaultValue) - return { - ...values, - [key]: value, - } - }, {}) -} - -/** - * Given if the desired context is primary, gives the appropriate OracleAzureContext - * Gives an object with the values of dynamic environment variables for a context. - * @param dynamicEnvVars an object whose values correspond to the desired - * dynamic env vars to fetch.
- * @param context The context - * @param defaultValues Optional default values if the dynamic env vars are not found - * @return an object with the same keys as dynamicEnvVars, but the values are - * the values of the dynamic env vars for the particular context - */ -export function getContextDynamicEnvVarValues<T>( - dynamicEnvVars: { [k in keyof T]: DynamicEnvVar }, - context: string, - defaultValues?: { [k in keyof T]: string } -): { - [k in keyof T]: string -} { - return getDynamicEnvVarValues(dynamicEnvVars, { context }, defaultValues) -} - -/** - * Reads the context and switches to the appropriate Azure Cluster - */ -export async function switchToContextCluster( - celoEnv: string, - context: string, - checkOrPromptIfStagingOrProduction: boolean = true, - skipClusterSetup: boolean = false -) { - if (!isValidContext(context)) { - throw Error(`Invalid context, must be one of ${fetchEnv(envVar.CONTEXTS)}`) - } - if (checkOrPromptIfStagingOrProduction) { - await doCheckOrPromptIfStagingOrProduction() - } - const clusterManager: BaseClusterManager = getClusterManagerForContext(celoEnv, context) - await clusterManager.switchToClusterContext(skipClusterSetup, context) - return clusterManager -} - -export function getClusterManagerForContext(celoEnv: string, context: string) { - const cloudProvider: CloudProvider = getCloudProviderFromContext(context) - const clusterConfig = clusterConfigGetterByCloudProvider[cloudProvider](context) - return getClusterManager(cloudProvider, celoEnv, clusterConfig) -} - -export function getClusterConfigForContext(context: string) { - const cloudProvider: CloudProvider = getCloudProviderFromContext(context) - return clusterConfigGetterByCloudProvider[cloudProvider](context) -} - -/** - * yargs argv type for an command that requires a context - */ -export interface ContextArgv { - context: string -} - -/** - * Coerces the value of context to be all upper-case and underscore-separated - * rather than dash-separated.
If the resulting context does not match a regex - * requiring all caps, alphanumeric, and dash-only characters - * (must start with letter and not end with an underscore), it will throw. - */ -export function coerceContext(rawContextStr: string) { - const context = rawContextStr.toUpperCase().replace(/-/g, '_') - if (!RegExp('^[A-Z][A-Z0-9_]*[A-Z0-9]$').test(context)) { - throw Error(`Invalid context. Raw ${rawContextStr}, implied ${context}`) - } - return context -} - -export function readableContext(context: string) { - const readable = context.toLowerCase().replace(/_/g, '-') - if (!RegExp('^[A-Z][A-Z0-9_]*[A-Z0-9]$').test(context)) { - throw Error(`Invalid context. Context ${context}, readable ${readable}`) - } - return readable -} - -export function isValidContext(context: string) { - const validContexts = fetchEnv(envVar.CONTEXTS).split(',') - const validContextsCoerced = validContexts.map(coerceContext) - return validContextsCoerced.includes(context) -} - -/** - * Middleware for a context related command. - * Must be one of the contexts specified in the environment - * variable CONTEXTS. 
- */ -export function addOptionalContextMiddleware(argv: Argv) { - return addCeloEnvMiddleware(argv).option('context', { - description: 'Context to perform the deployment in', - type: 'string', - }) -} - -export function addContextMiddleware(argv: Argv) { - return addOptionalContextMiddleware(argv).coerce('context', coerceContext) -} diff --git a/packages/celotool/src/lib/contract-utils.ts b/packages/celotool/src/lib/contract-utils.ts deleted file mode 100644 index ff5a76f71e5..00000000000 --- a/packages/celotool/src/lib/contract-utils.ts +++ /dev/null @@ -1,12 +0,0 @@ -import { GoldTokenWrapper } from '@celo/contractkit/lib/wrappers/GoldTokenWrapper' -import { StableTokenWrapper } from '@celo/contractkit/lib/wrappers/StableTokenWrapper' -import { BigNumber } from 'bignumber.js' - -export async function convertToContractDecimals( - value: number | BigNumber, - contract: StableTokenWrapper | GoldTokenWrapper -) { - const decimals = new BigNumber(await contract.decimals()) - const one = new BigNumber(10).pow(decimals.toNumber()) - return one.times(value) -} diff --git a/packages/celotool/src/lib/endpoints.ts b/packages/celotool/src/lib/endpoints.ts deleted file mode 100644 index 6cf910396aa..00000000000 --- a/packages/celotool/src/lib/endpoints.ts +++ /dev/null @@ -1,49 +0,0 @@ -import { envVar, fetchEnv } from './env-utils' - -export function getBlockscoutUrl(celoEnv: string) { - return `https://${celoEnv}-blockscout.${fetchEnv(envVar.CLUSTER_DOMAIN_NAME)}.org` -} - -export function getBlockscoutClusterInternalUrl(celoEnv: string) { - return `${celoEnv}-blockscout-web` -} - -export function getEthstatsUrl(celoEnv: string) { - return `https://${celoEnv}-ethstats.${fetchEnv(envVar.CLUSTER_DOMAIN_NAME)}.org` -} - -export function getBlockchainApiUrl(celoEnv: string) { - return `https://${celoEnv}-dot-${fetchEnv(envVar.TESTNET_PROJECT_NAME)}.appspot.com` -} - -export function getGenesisGoogleStorageUrl(celoEnv: string) { - return 
`https://www.googleapis.com/storage/v1/b/genesis_blocks/o/${celoEnv}?alt=media` -} - -export function getFornoUrl(celoEnv: string) { - return celoEnv === 'rc1' - ? `https://forno.celo.org` - : `https://${celoEnv}-forno.${fetchEnv(envVar.CLUSTER_DOMAIN_NAME)}.org` -} - -export function getFornoWebSocketUrl(celoEnv: string) { - return celoEnv === 'rc1' - ? `wss://forno.celo.org/ws` - : `wss://${celoEnv}-forno.${fetchEnv(envVar.CLUSTER_DOMAIN_NAME)}.org/ws` -} - -export function getFullNodeHttpRpcInternalUrl(celoEnv: string) { - return `http://${celoEnv}-fullnodes-rpc.${celoEnv}.svc.cluster.local:8545` -} - -export function getFullNodeWebSocketRpcInternalUrl(celoEnv: string) { - return `ws://${celoEnv}-fullnodes-rpc.${celoEnv}.svc.cluster.local:8546` -} - -export function getLightNodeHttpRpcInternalUrl(celoEnv: string) { - return `http://${celoEnv}-lightnodes-rpc.${celoEnv}.svc.cluster.local:8545` -} - -export function getLightNodeWebSocketRpcInternalUrl(celoEnv: string) { - return `ws://${celoEnv}-lightnodes-rpc.${celoEnv}.svc.cluster.local:8546` -} diff --git a/packages/celotool/src/lib/env-utils.ts b/packages/celotool/src/lib/env-utils.ts deleted file mode 100644 index 7d6898a17eb..00000000000 --- a/packages/celotool/src/lib/env-utils.ts +++ /dev/null @@ -1,348 +0,0 @@ -import { config } from 'dotenv' -import { existsSync } from 'fs' -import path from 'path' -import prompts from 'prompts' -import yargs from 'yargs' - -export interface CeloEnvArgv extends yargs.Argv { - celoEnv: string -} - -export enum envVar { - BLOCK_TIME = 'BLOCK_TIME', - BLOCKSCOUT_DB_SUFFIX = 'BLOCKSCOUT_DB_SUFFIX', - BLOCKSCOUT_DB_NEW_SUFFIX = 'BLOCKSCOUT_DB_NEW_SUFFIX', - BLOCKSCOUT_DOCKER_IMAGE_TAG = 'BLOCKSCOUT_DOCKER_IMAGE_TAG', - CELOCLI_STANDALONE_IMAGE_REPOSITORY = 'CELOCLI_STANDALONE_IMAGE_REPOSITORY', - CELOCLI_STANDALONE_IMAGE_TAG = 'CELOCLI_STANDALONE_IMAGE_TAG', - CELOSTATS_BANNED_ADDRESSES = 'CELOSTATS_BANNED_ADDRESSES', - CELOSTATS_FRONTEND_DOCKER_IMAGE_REPOSITORY = 
'CELOSTATS_FRONTEND_DOCKER_IMAGE_REPOSITORY', - CELOSTATS_FRONTEND_DOCKER_IMAGE_TAG = 'CELOSTATS_FRONTEND_DOCKER_IMAGE_TAG', - CELOSTATS_RESERVED_ADDRESSES = 'CELOSTATS_RESERVED_ADDRESSES', - CELOSTATS_SERVER_DOCKER_IMAGE_REPOSITORY = 'CELOSTATS_SERVER_DOCKER_IMAGE_REPOSITORY', - CELOSTATS_SERVER_DOCKER_IMAGE_TAG = 'CELOSTATS_SERVER_DOCKER_IMAGE_TAG', - CELOSTATS_TRUSTED_ADDRESSES = 'CELOSTATS_TRUSTED_ADDRESSES', - CELOTOOL_CELOENV = 'CELOTOOL_CELOENV', - CELOTOOL_CONFIRMED = 'CELOTOOL_CONFIRMED', - CELOTOOL_DOCKER_IMAGE_REPOSITORY = 'CELOTOOL_DOCKER_IMAGE_REPOSITORY', - CELOTOOL_DOCKER_IMAGE_TAG = 'CELOTOOL_DOCKER_IMAGE_TAG', - CHAOS_TEST_DURATION = 'CHAOS_TEST_DURATION', - CHAOS_TEST_INTERVAL = 'CHAOS_TEST_INTERVAL', - CHAOS_TEST_KILL_INTERVAL = 'CHAOS_TEST_KILL_INTERVAL', - CHAOS_TEST_NETWORK_DELAY = 'CHAOS_TEST_NETWORK_DELAY', - CHAOS_TEST_NETWORK_JITTER = 'CHAOS_TEST_NETWORK_JITTER', - CHAOS_TEST_NETWORK_LOSS = 'CHAOS_TEST_NETWORK_LOSS', - CHAOS_TEST_NETWORK_RATE = 'CHAOS_TEST_NETWORK_RATE', - CHURRITO_BLOCK = 'CHURRITO_BLOCK', - CLUSTER_CREATION_FLAGS = 'CLUSTER_CREATION_FLAGS', - CLUSTER_DOMAIN_NAME = 'CLUSTER_DOMAIN_NAME', - CONSENSUS_TYPE = 'CONSENSUS_TYPE', - CONTEXTS = 'CONTEXTS', - DONUT_BLOCK = 'DONUT_BLOCK', - E2E_TESTS_FORCE_USE_MYCELO = 'E2E_TESTS_FORCE_USE_MYCELO', - ENV_TYPE = 'ENV_TYPE', - EPOCH = 'EPOCH', - ESPRESSO_BLOCK = 'ESPRESSO_BLOCK', - FAUCET_CUSD_WEI = 'FAUCET_CUSD_WEI', - FAUCET_GENESIS_ACCOUNTS = 'FAUCET_GENESIS_ACCOUNTS', - FAUCET_GENESIS_BALANCE = 'FAUCET_GENESIS_BALANCE', - FORNO_BANNED_CIDR = 'FORNO_BANNED_CIDR', - FORNO_DOMAINS = 'FORNO_DOMAINS', - FORNO_FULL_NODE_CONTEXTS = 'FORNO_FULL_NODE_CONTEXTS', - FORNO_VPC_NETWORK_NAME = 'FORNO_VPC_NETWORK_NAME', - FULL_NODE_READINESS_CHECK_BLOCK_AGE = 'FULL_NODE_READINESS_CHECK_BLOCK_AGE', - GENESIS_ACCOUNTS = 'GENESIS_ACCOUNTS', - GETH_ACCOUNT_SECRET = 'GETH_ACCOUNT_SECRET', - GETH_BOOTNODE_DOCKER_IMAGE_REPOSITORY = 'GETH_BOOTNODE_DOCKER_IMAGE_REPOSITORY', - 
GETH_BOOTNODE_DOCKER_IMAGE_TAG = 'GETH_BOOTNODE_DOCKER_IMAGE_TAG', - GETH_BOOTNODE_OVERWRITE_PKEY = 'GETH_BOOTNODE_OVERWRITE_PKEY', - GETH_DEBUG = 'GETH_DEBUG', - GETH_ENABLE_METRICS = 'GETH_ENABLE_METRICS', - GETH_NODE_DOCKER_IMAGE_REPOSITORY = 'GETH_NODE_DOCKER_IMAGE_REPOSITORY', - GETH_NODE_DOCKER_IMAGE_TAG = 'GETH_NODE_DOCKER_IMAGE_TAG', - GETH_NODES_SSD_DISKS = 'GETH_NODES_SSD_DISKS', - GETH_USE_MYCELO = 'GETH_USE_MYCELO', - GETH_MYCELO_COMMIT = 'GETH_MYCELO_COMMIT', - GETH_VERBOSITY = 'GETH_VERBOSITY', - GOOGLE_APPLICATION_CREDENTIALS = 'GOOGLE_APPLICATION_CREDENTIALS', - GRAFANA_CLOUD_PROJECT_ID = 'GRAFANA_CLOUD_PROJECT_ID', - GRAFANA_CLOUD_SECRET_NAME = 'GRAFANA_CLOUD_SECRET_NAME', - GRAFANA_CLOUD_SECRET_VERSION = 'GRAFANA_CLOUD_SECRET_VERSION', - GRAFANA_LOCAL_ADMIN_PASSWORD = 'GRAFANA_LOCAL_ADMIN_PASSWORD', - GRAFANA_LOCAL_OAUTH2_CLIENT_ID = 'GRAFANA_LOCAL_OAUTH2_CLIENT_ID', - GRAFANA_LOCAL_OAUTH2_CLIENT_SECRET = 'GRAFANA_LOCAL_OAUTH2_CLIENT_SECRET', - IN_MEMORY_DISCOVERY_TABLE = 'IN_MEMORY_DISCOVERY_TABLE', - ISTANBUL_REQUEST_TIMEOUT_MS = 'ISTANBUL_REQUEST_TIMEOUT_MS', - KUBECONFIG = 'KUBECONFIG', - KUBERNETES_CLUSTER_NAME = 'KUBERNETES_CLUSTER_NAME', - KUBERNETES_CLUSTER_ZONE = 'KUBERNETES_CLUSTER_ZONE', - LEADERBOARD_CREDENTIALS = 'LEADERBOARD_CREDENTIALS', - LEADERBOARD_DOCKER_IMAGE_REPOSITORY = 'LEADERBOARD_DOCKER_IMAGE_REPOSITORY', - LEADERBOARD_DOCKER_IMAGE_TAG = 'LEADERBOARD_DOCKER_IMAGE_TAG', - LEADERBOARD_SHEET = 'LEADERBOARD_SHEET', - LEADERBOARD_TOKEN = 'LEADERBOARD_TOKEN', - LOAD_TEST_CLIENTS = 'LOAD_TEST_CLIENTS', - LOAD_TEST_THREADS = 'LOAD_TEST_THREADS', - LOAD_TEST_GENESIS_BALANCE = 'LOAD_TEST_GENESIS_BALANCE', - LOAD_TEST_TX_DELAY_MS = 'LOAD_TEST_TX_DELAY_MS', - LOAD_TEST_USE_RANDOM_RECIPIENT = 'LOAD_TEST_USE_RANDOM_RECIPIENT', - LOKI_URL = 'LOKI_URL', - LOKI_KEY = 'LOKI_KEY', - LOKI_USERNAME = 'LOKI_USERNAME', - LOOKBACK = 'LOOKBACK', - MNEMONIC = 'MNEMONIC', - MOBILE_WALLET_PLAYSTORE_LINK = 'MOBILE_WALLET_PLAYSTORE_LINK', - 
MOCK_ORACLE_CRON_SCHEDULE = 'MOCK_ORACLE_CRON_SCHEDULE', - MOCK_ORACLE_DOCKER_IMAGE_REPOSITORY = 'MOCK_ORACLE_DOCKER_IMAGE_REPOSITORY', - MOCK_ORACLE_DOCKER_IMAGE_TAG = 'MOCK_ORACLE_DOCKER_IMAGE_TAG', - MOCK_ORACLE_GENESIS_BALANCE = 'MOCK_ORACLE_GENESIS_BALANCE', - NETWORK_ID = 'NETWORK_ID', - NODE_DISK_SIZE_GB = 'NODE_DISK_SIZE_GB', - ODIS_SIGNER_DOCKER_IMAGE_REPOSITORY = 'ODIS_SIGNER_DOCKER_IMAGE_REPOSITORY', - ODIS_SIGNER_DOCKER_IMAGE_TAG = 'ODIS_SIGNER_DOCKER_IMAGE_TAG', - ODIS_SIGNER_BLOCKCHAIN_PROVIDER = 'ODIS_SIGNER_BLOCKCHAIN_PROVIDER', - ODIS_SIGNER_DOMAINS_API_ENABLED = 'ODIS_SIGNER_DOMAINS_API_ENABLED', - ODIS_SIGNER_PNP_API_ENABLED = 'ODIS_SIGNER_PNP_API_ENABLED', - ORACLE_DOCKER_IMAGE_REPOSITORY = 'ORACLE_DOCKER_IMAGE_REPOSITORY', - ORACLE_DOCKER_IMAGE_TAG = 'ORACLE_DOCKER_IMAGE_TAG', - ORACLE_UNUSED_ORACLE_ADDRESSES = 'ORACLE_UNUSED_ORACLE_ADDRESSES', - ORACLE_FX_ADAPTERS_API_KEYS = 'ORACLE_FX_ADAPTERS_API_KEYS', - PRIVATE_NODE_DISK_SIZE_GB = 'PRIVATE_NODE_DISK_SIZE_GB', - PRIVATE_TX_NODES = 'PRIVATE_TX_NODES', - PROMETHEUS_DISABLE_STACKDRIVER_SIDECAR = 'PROMETHEUS_DISABLE_STACKDRIVER_SIDECAR', - PROMETHEUS_REMOTE_WRITE_PASSWORD = 'PROMETHEUS_REMOTE_WRITE_PASSWORD', - PROMETHEUS_REMOTE_WRITE_URL = 'PROMETHEUS_REMOTE_WRITE_URL', - PROMETHEUS_REMOTE_WRITE_USERNAME = 'PROMETHEUS_REMOTE_WRITE_USERNAME', - PROXIED_VALIDATORS = 'PROXIED_VALIDATORS', - PROXY_ROLLING_UPDATE_PARTITION = 'PROXY_ROLLING_UPDATE_PARTITION', - SECONDARIES_ROLLING_UPDATE_PARTITION = 'SECONDARIES_ROLLING_UPDATE_PARTITION', - STATIC_IPS_FOR_GETH_NODES = 'STATIC_IPS_FOR_GETH_NODES', - TESTNET_PROJECT_NAME = 'TESTNET_PROJECT_NAME', - TIMESTAMP = 'TIMESTAMP', - TRANSACTION_METRICS_EXPORTER_BLOCK_INTERVAL = 'TRANSACTION_METRICS_EXPORTER_BLOCK_INTERVAL', - TRANSACTION_METRICS_EXPORTER_DOCKER_IMAGE_REPOSITORY = 'TRANSACTION_METRICS_EXPORTER_DOCKER_IMAGE_REPOSITORY', - TRANSACTION_METRICS_EXPORTER_DOCKER_IMAGE_TAG = 'TRANSACTION_METRICS_EXPORTER_DOCKER_IMAGE_TAG', - 
TRANSACTION_METRICS_EXPORTER_FROM_BLOCK = 'TRANSACTION_METRICS_EXPORTER_FROM_BLOCK', - TRANSACTION_METRICS_EXPORTER_SUFFIX = 'TRANSACTION_METRICS_EXPORTER_SUFFIX', - TRANSACTION_METRICS_EXPORTER_TO_BLOCK = 'TRANSACTION_METRICS_EXPORTER_TO_BLOCK', - TRANSACTION_METRICS_EXPORTER_WATCH_ADDRESS = 'TRANSACTION_METRICS_EXPORTER_WATCH_ADDRESS', - TX_NODES = 'TX_NODES', - TX_NODES_PRIVATE_ROLLING_UPDATE_PARTITION = 'TX_NODES_PRIVATE_ROLLING_UPDATE_PARTITION', - TX_NODES_ROLLING_UPDATE_PARTITION = 'TX_NODES_ROLLING_UPDATE_PARTITION', - USE_GSTORAGE_DATA = 'USE_GSTORAGE_DATA', - VALIDATOR_GENESIS_BALANCE = 'VALIDATOR_GENESIS_BALANCE', - VALIDATOR_PROXY_COUNTS = 'VALIDATOR_PROXY_COUNTS', - VALIDATOR_ZERO_GENESIS_BALANCE = 'VALIDATOR_ZERO_GENESIS_BALANCE', - VALIDATORS = 'VALIDATORS', - VALIDATORS_ROLLING_UPDATE_PARTITION = 'VALIDATORS_ROLLING_UPDATE_PARTITION', - VOTING_BOT_BALANCE = 'VOTING_BOT_BALANCE', - VOTING_BOT_CHANGE_BASELINE = 'VOTING_BOT_CHANGE_BASELINE', - VOTING_BOT_CRON_SCHEDULE = 'VOTING_BOT_CRON_SCHEDULE', - VOTING_BOT_EXPLORE_PROBABILITY = 'VOTING_BOT_EXPLORE_PROBABILITY', - VOTING_BOT_SCORE_SENSITIVITY = 'VOTING_BOT_SCORE_SENSITIVITY', - VOTING_BOT_WAKE_PROBABILITY = 'VOTING_BOT_WAKE_PROBABILITY', - VOTING_BOTS = 'VOTING_BOTS', - WALLET_CONNECT_IMAGE_REPOSITORY = 'WALLET_CONNECT_IMAGE_REPOSITORY', - WALLET_CONNECT_IMAGE_TAG = 'WALLET_CONNECT_IMAGE_TAG', - WALLET_CONNECT_REDIS_CLUSTER_ENABLED = 'WALLET_CONNECT_REDIS_CLUSTER_ENABLED', - WALLET_CONNECT_REDIS_CLUSTER_USEPASSWORD = 'WALLET_CONNECT_REDIS_CLUSTER_USEPASSWORD', -} - -/** - * Dynamic env vars are env var names that can be dynamically constructed - * using templates. 
- */ - -export enum DynamicEnvVar { - AZURE_SUBSCRIPTION_ID = '{{ context }}_AZURE_SUBSCRIPTION_ID', - AZURE_KUBERNETES_RESOURCE_GROUP = '{{ context }}_AZURE_KUBERNETES_RESOURCE_GROUP', - AZURE_REGION_NAME = '{{ context }}_AZURE_REGION_NAME', - AZURE_TENANT_ID = '{{ context }}_AZURE_TENANT_ID', - FULL_NODES_COUNT = '{{ context }}_FULL_NODES_COUNT', - FULL_NODES_DISK_SIZE = '{{ context }}_FULL_NODES_DISK_SIZE', - FULL_NODES_GETH_GC_MODE = '{{ context }}_FULL_NODES_GETH_GC_MODE', - FULL_NODES_NODEKEY_DERIVATION_STRING = '{{ context }}_FULL_NODES_NODEKEY_DERIVATION_STRING', - FULL_NODES_ROLLING_UPDATE_PARTITION = '{{ context }}_FULL_NODES_ROLLING_UPDATE_PARTITION', - FULL_NODES_RPC_API_METHODS = '{{ context }}_FULL_NODES_RPC_API_METHODS', - FULL_NODES_STATIC_NODES_FILE_SUFFIX = '{{ context }}_FULL_NODES_STATIC_NODES_FILE_SUFFIX', - FULL_NODES_USE_GSTORAGE_DATA = '{{ context }}_FULL_NODES_USE_GSTORAGE_DATA', - FULL_NODES_WS_PORT = '{{ context }}_FULL_NODES_WS_PORT', - GCP_PROJECT_NAME = '{{ context }}_GCP_PROJECT_NAME', - GCP_ZONE = '{{ context }}_GCP_ZONE', - KUBERNETES_CLUSTER_NAME = '{{ context }}_KUBERNETES_CLUSTER_NAME', - ORACLE_ADDRESS_AZURE_KEY_VAULTS = '{{ context }}_{{ currencyPair }}_ORACLE_ADDRESS_AZURE_KEY_VAULTS', - ORACLE_ADDRESSES_FROM_MNEMONIC_COUNT = '{{ context }}_{{ currencyPair}}_ORACLE_ADDRESSES_FROM_MNEMONIC_COUNT', - ODIS_SIGNER_BLOCKCHAIN_API_KEY = '{{ context }}_ODIS_SIGNER_BLOCKCHAIN_API_KEY', - ODIS_SIGNER_AZURE_KEYVAULT_NAME = '{{ context }}_ODIS_SIGNER_AZURE_KEYVAULT_NAME', - ODIS_SIGNER_AZURE_KEYVAULT_PNP_KEY_NAME_BASE = '{{ context }}_ODIS_SIGNER_AZURE_KEYVAULT_PNP_KEY_NAME_BASE', - ODIS_SIGNER_AZURE_KEYVAULT_PNP_KEY_LATEST_VERSION = '{{ context }}_ODIS_SIGNER_AZURE_KEYVAULT_PNP_KEY_LATEST_VERSION', - ODIS_SIGNER_AZURE_KEYVAULT_DOMAINS_KEY_NAME_BASE = '{{ context }}_ODIS_SIGNER_AZURE_KEYVAULT_DOMAINS_KEY_NAME_BASE', - ODIS_SIGNER_AZURE_KEYVAULT_DOMAINS_KEY_LATEST_VERSION = '{{ context 
}}_ODIS_SIGNER_AZURE_KEYVAULT_DOMAINS_KEY_LATEST_VERSION', - ODIS_SIGNER_DOMAINS_API_ENABLED = '{{ context }}_ODIS_SIGNER_DOMAINS_API_ENABLED', - ODIS_SIGNER_PHONE_NUMBER_PRIVACY_API_ENABLED = '{{ context }}_ODIS_SIGNER_PNP_API_ENABLED', - ODIS_SIGNER_DB_HOST = '{{ context }}_ODIS_SIGNER_DB_HOST', - ODIS_SIGNER_DB_PORT = '{{ context }}_ODIS_SIGNER_DB_PORT', - ODIS_SIGNER_DB_USERNAME = '{{ context }}_ODIS_SIGNER_DB_USERNAME', - ODIS_SIGNER_DB_PASSWORD = '{{ context }}_ODIS_SIGNER_DB_PASSWORD', - ODIS_SIGNER_LOG_LEVEL = 'ODIS_SIGNER_LOG_LEVEL', - ODIS_SIGNER_LOG_FORMAT = 'ODIS_SIGNER_LOG_FORMAT', - ODIS_NETWORK = '{{ context }}_ODIS_NETWORK', - PROM_SIDECAR_DISABLED = '{{ context }}_PROM_SIDECAR_DISABLED', - PROM_SIDECAR_GCP_PROJECT = '{{ context }}_PROM_SIDECAR_GCP_PROJECT', - PROM_SIDECAR_GCP_REGION = '{{ context }}_PROM_SIDECAR_GCP_REGION', - PROM_SCRAPE_JOB_NAME = '{{ context }}_PROM_SCRAPE_JOB_NAME', - PROM_SCRAPE_LABELS = '{{ context }}_PROM_SCRAPE_LABELS', - PROM_SCRAPE_TARGETS = '{{ context }}_PROM_SCRAPE_TARGETS', - PROM_REMOTE_WRITE_PASSWORD = '{{ context }}_PROM_REMOTE_WRITE_PASSWORD', - PROM_REMOTE_WRITE_USERNAME = '{{ context }}_PROM_REMOTE_WRITE_USERNAME', - PROM_REMOTE_WRITE_URL = '{{ context }}_PROM_REMOTE_WRITE_URL', -} - -export enum envTypes { - DEVELOPMENT = 'development', - INTEGRATION = 'integration', - STAGING = 'staging', - PRODUCTION = 'production', -} - -export function fetchEnv(env: string, customErrorMessage?: string): string { - if (process.env[env] === undefined) { - console.error( - customErrorMessage !== undefined ? customErrorMessage : `Requires variable ${env} to be set` - ) - process.exit(1) - } - return process.env[env]! 
-} - -export const monorepoRoot = path.resolve(process.cwd(), './../..') -export const genericEnvFilePath = path.resolve(monorepoRoot, '.env') - -export function getEnvFile(celoEnv: string, envBegining: string = '') { - const filePath: string = path.resolve(monorepoRoot, `.env${envBegining}.${celoEnv}`) - if (existsSync(filePath)) { - return filePath - } else { - return `${genericEnvFilePath}${envBegining}` - } -} - -export function fetchEnvOrFallback(env: string, fallback: string) { - return process.env[env] || fallback -} - -export function validateAndSwitchToEnv(celoEnv: string) { - if (!isValidCeloEnv(celoEnv)) { - console.error( - `${celoEnv} does not conform to specification ^[a-z][a-z0-9]*$. We need to it to conform to that regex because it is used as URL components, Kubernetes namespace names, keys in configuration objects, etc.` - ) - process.exit(1) - } - - const envResult = config({ path: getEnvFile(celoEnv) }) - const envMemonicResult = config({ path: getEnvFile(celoEnv, '.mnemonic') }) - - const convinedParsedResults: { [s: string]: string } = {} - - for (const result of [envResult, envMemonicResult]) { - if (result.error) { - throw result.error - } - Object.assign(convinedParsedResults, result.parsed) - } - - // Override any env variables that weren't set by config. - if (convinedParsedResults) { - for (const k of Object.keys(convinedParsedResults)) { - process.env[k] = convinedParsedResults[k] - } - } - - process.env.CELOTOOL_CELOENV = celoEnv -} - -export function isProduction() { - return fetchEnv(envVar.ENV_TYPE).toLowerCase() === envTypes.PRODUCTION -} - -export function isValidCeloEnv(celoEnv: string) { - return new RegExp('^[a-z][a-z0-9]*$').test(celoEnv) -} - -export function getDynamicEnvVarValue( - dynamicEnvVar: DynamicEnvVar, - templateValues: any, - defaultValue?: string -) { - const envVarName = getDynamicEnvVarName(dynamicEnvVar, templateValues) - return defaultValue !== undefined - ? 
fetchEnvOrFallback(envVarName, defaultValue) - : fetchEnv(envVarName) -} - -/** - * Replaces a dynamic env var's template strings with values from an object. - * For each template value that was given, it replaces the corresponding template - * string. - * For example, if the DynamicEnvVar is: - * '{{ thing }}-is-{{noun}}!!!' - * and templateValues is the object: - * { thing: 'celo', noun: 'cool' } - * then we can expect this function to return the string: - * 'celo-is-cool!!!' - * Returns the name of the env var. - */ -export function getDynamicEnvVarName(dynamicEnvVar: DynamicEnvVar, templateValues: any) { - return Object.keys(templateValues).reduce((agg: string, templateKey: string) => { - return agg.replace(new RegExp(`{{ *${templateKey} *}}`, 'g'), templateValues[templateKey]) - }, dynamicEnvVar) -} - -function celoEnvMiddleware(argv: CeloEnvArgv) { - validateAndSwitchToEnv(argv.celoEnv) -} - -export async function doCheckOrPromptIfStagingOrProduction() { - if (process.env.CELOTOOL_CONFIRMED !== 'true' && isProduction()) { - await confirmAction( - `You are about to apply a possibly irreversible action on a production env: ${process.env.CELOTOOL_CELOENV}. 
Are you sure?` - ) - process.env.CELOTOOL_CONFIRMED = 'true' - } -} - -export async function confirmAction( - message: string, - onConfirmFailed?: () => Promise<void>, - onConfirmSuccess?: () => Promise<void> -) { - const response = await prompts({ - type: 'confirm', - name: 'confirmation', - message: `${message} (y/n)`, - }) - if (!response.confirmation) { - console.info('Aborting due to user response') - if (onConfirmFailed) { - await onConfirmFailed() - } - process.exit(0) - } - if (onConfirmSuccess) { - await onConfirmSuccess() - } -} - -export function addCeloEnvMiddleware(argv: yargs.Argv) { - return ( - argv - .option('celo-env', { - demand: 'Please specify a valid CELO_ENV', - alias: 'e', - required: true, - description: 'the environment in which you want to execute this command', - }) - // @ts-ignore Since we pass it right above, we know that celoEnv will be there at runtime - .middleware([celoEnvMiddleware]) - ) -} diff --git a/packages/celotool/src/lib/fullnodes.ts b/packages/celotool/src/lib/fullnodes.ts deleted file mode 100644 index bdbbae673da..00000000000 --- a/packages/celotool/src/lib/fullnodes.ts +++ /dev/null @@ -1,186 +0,0 @@ -import stringHash from 'string-hash' -import { - getAksClusterConfig, - getCloudProviderFromContext, - getContextDynamicEnvVarValues, - getGCPClusterConfig, -} from './context-utils' -import { DynamicEnvVar, envVar, fetchEnv, getDynamicEnvVarValue } from './env-utils' -import { CloudProvider } from './k8s-cluster/base' -import { AksFullNodeDeploymentConfig } from './k8s-fullnode/aks' -import { BaseFullNodeDeploymentConfig } from './k8s-fullnode/base' -import { GCPFullNodeDeploymentConfig } from './k8s-fullnode/gcp' -import { getFullNodeDeployer } from './k8s-fullnode/utils' -import { uploadStaticNodesToGoogleStorage } from './testnet-utils' - -/** - * Env vars corresponding to values required for a BaseFullNodeDeploymentConfig - */ -const contextFullNodeDeploymentEnvVars: { - [k in keyof BaseFullNodeDeploymentConfig]: DynamicEnvVar
-} = { - diskSizeGb: DynamicEnvVar.FULL_NODES_DISK_SIZE, - replicas: DynamicEnvVar.FULL_NODES_COUNT, - rollingUpdatePartition: DynamicEnvVar.FULL_NODES_ROLLING_UPDATE_PARTITION, - rpcApis: DynamicEnvVar.FULL_NODES_RPC_API_METHODS, - gcMode: DynamicEnvVar.FULL_NODES_GETH_GC_MODE, - wsPort: DynamicEnvVar.FULL_NODES_WS_PORT, - useGstoreData: DynamicEnvVar.FULL_NODES_USE_GSTORAGE_DATA, -} - -/** - * Maps each cloud provider to the correct function to get the appropriate full - * node deployment config - */ -const deploymentConfigGetterByCloudProvider: { - [key in CloudProvider]: (context: string) => BaseFullNodeDeploymentConfig -} = { - [CloudProvider.AZURE]: getAksFullNodeDeploymentConfig, - [CloudProvider.GCP]: getGCPFullNodeDeploymentConfig, -} - -/** - * Gets the appropriate cloud platform's full node deployer given the celoEnv - * and context. - */ -export function getFullNodeDeployerForContext( - celoEnv: string, - context: string, - generateNodeKeys: boolean, - createNEG: boolean -) { - const cloudProvider: CloudProvider = getCloudProviderFromContext(context) - let deploymentConfig = deploymentConfigGetterByCloudProvider[cloudProvider](context) - if (generateNodeKeys) { - deploymentConfig = { - ...deploymentConfig, - nodeKeyGenerationInfo: { - mnemonic: fetchEnv(envVar.MNEMONIC), - derivationIndex: stringHash(getNodeKeyDerivationString(context)), - }, - } - } - if (createNEG) { - if (cloudProvider !== CloudProvider.GCP) { - throw Error('Cannot create NEG for cloud providers other than GCP') - } - deploymentConfig = { - ...deploymentConfig, - createNEG: true, - } as unknown as GCPFullNodeDeploymentConfig // make typescript happy - } - return getFullNodeDeployer(cloudProvider, celoEnv, deploymentConfig) -} - -/** - * Uses the appropriate cloud platform's full node deployer to install the full - * node chart. 
- */ -export async function installFullNodeChart( - celoEnv: string, - context: string, - staticNodes: boolean = false, - createNEG: boolean = false -) { - const deployer = getFullNodeDeployerForContext(celoEnv, context, staticNodes, createNEG) - const enodes = await deployer.installChart(context) - if (enodes) { - await uploadStaticNodeEnodes(celoEnv, context, enodes) - } -} - -/** - * Uses the appropriate cloud platform's full node deployer to upgrade the full - * node chart. - */ -export async function upgradeFullNodeChart( - celoEnv: string, - context: string, - reset: boolean, - generateNodeKeys: boolean = false, - createNEG: boolean = false -) { - const deployer = getFullNodeDeployerForContext(celoEnv, context, generateNodeKeys, createNEG) - const enodes = await deployer.upgradeChart(context, reset) - if (enodes) { - await uploadStaticNodeEnodes(celoEnv, context, enodes) - } -} - -/** - * Uses the appropriate cloud platform's full node deployer to remove the full - * node chart. - */ -export async function removeFullNodeChart(celoEnv: string, context: string) { - const deployer = getFullNodeDeployerForContext(celoEnv, context, false, false) - await deployer.removeChart() - // Remove any previous static nodes - await uploadStaticNodeEnodes(celoEnv, context, []) -} - -function uploadStaticNodeEnodes(celoEnv: string, context: string, enodes: string[]) { - const suffix = getStaticNodesFileSuffix(context) - // Use mainnet instead of rc1 - const env = celoEnv === 'rc1' ? 
'mainnet' : celoEnv - return uploadStaticNodesToGoogleStorage(`${env}.${suffix}`, enodes) -} - -function getNodeKeyDerivationString(context: string) { - return getDynamicEnvVarValue(DynamicEnvVar.FULL_NODES_NODEKEY_DERIVATION_STRING, { - context, - }) -} - -function getStaticNodesFileSuffix(context: string) { - return getDynamicEnvVarValue(DynamicEnvVar.FULL_NODES_STATIC_NODES_FILE_SUFFIX, { - context, - }) -} - -/** - * Returns the BaseFullNodeDeploymentConfig that is not specific to a cloud - * provider for a context. - */ -function getFullNodeDeploymentConfig(context: string): BaseFullNodeDeploymentConfig { - const fullNodeDeploymentEnvVarValues = getContextDynamicEnvVarValues( - contextFullNodeDeploymentEnvVars, - context - ) - - const fullNodeDeploymentConfig: BaseFullNodeDeploymentConfig = { - diskSizeGb: parseInt(fullNodeDeploymentEnvVarValues.diskSizeGb, 10), - replicas: parseInt(fullNodeDeploymentEnvVarValues.replicas, 10), - rollingUpdatePartition: parseInt(fullNodeDeploymentEnvVarValues.rollingUpdatePartition, 10), - rpcApis: fullNodeDeploymentEnvVarValues.rpcApis, - gcMode: fullNodeDeploymentEnvVarValues.gcMode, - wsPort: parseInt(fullNodeDeploymentEnvVarValues.wsPort, 10), - useGstoreData: fullNodeDeploymentEnvVarValues.useGstoreData, - } - return fullNodeDeploymentConfig -} - -/** - * For a given context, returns the appropriate AksFullNodeDeploymentConfig - */ -function getAksFullNodeDeploymentConfig(context: string): AksFullNodeDeploymentConfig { - const fullNodeDeploymentConfig: BaseFullNodeDeploymentConfig = - getFullNodeDeploymentConfig(context) - return { - ...fullNodeDeploymentConfig, - clusterConfig: getAksClusterConfig(context), - } -} - -/** - * For a given context, returns the appropriate getGCPFullNodeDeploymentConfig - */ -function getGCPFullNodeDeploymentConfig(context: string): GCPFullNodeDeploymentConfig { - const fullNodeDeploymentConfig: BaseFullNodeDeploymentConfig = - getFullNodeDeploymentConfig(context) - return { - 
...fullNodeDeploymentConfig, - clusterConfig: getGCPClusterConfig(context), - // Default value - createNEG: false, - } -} diff --git a/packages/celotool/src/lib/gcloud_utils.ts b/packages/celotool/src/lib/gcloud_utils.ts deleted file mode 100644 index d617be46e5d..00000000000 --- a/packages/celotool/src/lib/gcloud_utils.ts +++ /dev/null @@ -1,151 +0,0 @@ -import { SecretManagerServiceClient } from '@google-cloud/secret-manager' -import { execCmd } from './cmd-utils' -import { DynamicEnvVar, envVar, fetchEnv, getDynamicEnvVarValue } from './env-utils' - -export async function getCurrentGcloudAccount() { - const [output] = await execCmd('gcloud config get-value account') - if (output.trim() === '') { - throw new Error('No Gcloud account set') - } - return output.trim() -} - -async function ensureGcloudInstalled() { - try { - await execCmd(`gcloud version`) - } catch (error) { - console.error('Gcloud is not installed') - console.error(error) - process.exit(1) - } -} - -export async function ensureAuthenticatedGcloudAccount() { - await ensureGcloudInstalled() - try { - await getCurrentGcloudAccount() - } catch (error) { - // Try authenticating with a Keyfile under GOOGLE_APPLICATION_CREDENTIALS - - console.debug('Authenticating gcloud with keyfile') - await execCmd( - `gcloud auth activate-service-account --key-file=${fetchEnv( - envVar.GOOGLE_APPLICATION_CREDENTIALS, - 'gcloud is not authenticated, and thus needs GOOGLE_APPLICATION_CREDENTIALS for automatic authentication' - )}` - ) - } - - try { - await getCurrentGcloudAccount() - } catch (error) { - console.error('Could not setup gcloud with authentication') - process.exit(1) - } -} - -export async function linkSAForWorkloadIdentity(celoEnv: string, context: string) { - if ( - getDynamicEnvVarValue( - DynamicEnvVar.FULL_NODES_USE_GSTORAGE_DATA, - { context }, - 'false' - ).toLowerCase() === 'true' - ) { - await execCmd( - `gcloud iam service-accounts add-iam-policy-binding --project ${fetchEnv( - 
envVar.TESTNET_PROJECT_NAME - )} \ - --role roles/iam.workloadIdentityUser \ - --member "serviceAccount:${fetchEnv( - envVar.TESTNET_PROJECT_NAME - )}.svc.id.goog[${celoEnv}/gcloud-storage-access]" chaindata-download@${fetchEnv( - envVar.TESTNET_PROJECT_NAME - )}.iam.gserviceaccount.com` - ) - } -} - -export async function delinkSAForWorkloadIdentity(celoEnv: string, context: string) { - if ( - getDynamicEnvVarValue( - DynamicEnvVar.FULL_NODES_USE_GSTORAGE_DATA, - { context }, - 'false' - ).toLowerCase() === 'true' - ) { - await execCmd( - `gcloud iam service-accounts remove-iam-policy-binding --project ${fetchEnv( - envVar.TESTNET_PROJECT_NAME - )} \ - --role roles/iam.workloadIdentityUser \ - --member "serviceAccount:${fetchEnv( - envVar.TESTNET_PROJECT_NAME - )}.svc.id.goog[${celoEnv}/gcloud-storage-access]" chaindata-download@${fetchEnv( - envVar.TESTNET_PROJECT_NAME - )}.iam.gserviceaccount.com` - ) - } -} - -export async function kubectlAnnotateKSA(celoEnv: string, context: string) { - if ( - getDynamicEnvVarValue( - DynamicEnvVar.FULL_NODES_USE_GSTORAGE_DATA, - { context }, - 'false' - ).toLowerCase() === 'true' - ) { - await execCmd( - `kubectl annotate serviceaccount \ - --namespace ${celoEnv} \ - gcloud-storage-access \ - --overwrite \ - iam.gke.io/gcp-service-account=chaindata-download@${fetchEnv( - envVar.TESTNET_PROJECT_NAME - )}.iam.gserviceaccount.com` - ) - } -} - -export async function removeKubectlAnnotateKSA(celoEnv: string, context: string) { - if ( - getDynamicEnvVarValue( - DynamicEnvVar.FULL_NODES_USE_GSTORAGE_DATA, - { context }, - 'false' - ).toLowerCase() === 'true' - ) { - await execCmd( - `kubectl annotate serviceaccount \ - --namespace ${celoEnv} \ - gcloud-storage-access \ - iam.gke.io/gcp-service-account=chaindata-download@${fetchEnv( - envVar.TESTNET_PROJECT_NAME - )}.iam.gserviceaccount.com-` - ) - } -} - -export async function accessSecretVersion( - projectId: string, - secretName: string, - secretVersion: string -) { - try { - 
const client = new SecretManagerServiceClient() - const [version] = await client.accessSecretVersion({ - name: `projects/${projectId}/secrets/${secretName}/versions/${secretVersion}`, - }) - - const privateKey = version?.payload?.data?.toString()! - - if (!privateKey) { - throw new Error('Key is empty or undefined') - } - - return privateKey - } catch (error) { - console.info('Error retrieving key') - } -} diff --git a/packages/celotool/src/lib/generate_utils.ts b/packages/celotool/src/lib/generate_utils.ts deleted file mode 100644 index 312e35b9283..00000000000 --- a/packages/celotool/src/lib/generate_utils.ts +++ /dev/null @@ -1,574 +0,0 @@ -// @ts-ignore -import * as bls12377js from '@celo/bls12377js' -import { blsPrivateKeyToProcessedPrivateKey } from '@celo/cryptographic-utils/lib/bls' -import BigNumber from 'bignumber.js' -import { BIP32Factory, BIP32Interface } from 'bip32' -import * as bip39 from 'bip39' -import fs from 'fs' -import { merge, range, repeat } from 'lodash' -import { tmpdir } from 'os' -import path from 'path' -import * as rlp from 'rlp' -import { MyceloGenesisConfig } from 'src/lib/interfaces/mycelo-genesis-config' -import { CurrencyPair } from 'src/lib/k8s-oracle/base' -import * as ecc from 'tiny-secp256k1' -import Web3 from 'web3' -import { spawnCmd, spawnCmdWithExitOnFailure } from './cmd-utils' -import { envVar, fetchEnv, fetchEnvOrFallback, monorepoRoot } from './env-utils' -import { - CONTRACT_OWNER_STORAGE_LOCATION, - GENESIS_MSG_HASH, - GETH_CONFIG_OLD, - ISTANBUL_MIX_HASH, - REGISTRY_ADDRESS, - TEMPLATE, -} from './genesis_constants' -import { getIndexForLoadTestThread } from './geth' -import { GenesisConfig } from './interfaces/genesis-config' -import { ensure0x, strip0x } from './utils' - -const bip32 = BIP32Factory(ecc) - -export enum AccountType { - VALIDATOR = 0, - LOAD_TESTING_ACCOUNT = 1, - TX_NODE = 2, - BOOTNODE = 3, - FAUCET = 4, - ATTESTATION = 5, - PRICE_ORACLE = 6, - PROXY = 7, - ATTESTATION_BOT = 8, - VOTING_BOT = 9, - 
TX_NODE_PRIVATE = 10, - VALIDATOR_GROUP = 11, - ADMIN = 12, - TX_FEE_RECIPIENT = 13, -} - -export enum ConsensusType { - CLIQUE = 'clique', - ISTANBUL = 'istanbul', -} - -export interface Validator { - address: string - blsPublicKey: string - balance?: string -} - -export interface AccountAndBalance { - address: string - balance: string -} - -export const MNEMONIC_ACCOUNT_TYPE_CHOICES = [ - 'validator', - 'load_testing', - 'tx_node', - 'bootnode', - 'faucet', - 'attestation', - 'price_oracle', - 'proxy', - 'attestation_bot', - 'voting_bot', - 'tx_node_private', - 'validator_group', - 'admin', - 'tx_fee_recipient', -] - -export const add0x = (str: string) => { - return '0x' + str -} - -export const coerceMnemonicAccountType = (raw: string): AccountType => { - const index = MNEMONIC_ACCOUNT_TYPE_CHOICES.indexOf(raw) - if (index === -1) { - throw new Error('Invalid mnemonic account type') - } - return index -} - -export const generatePrivateKey = (mnemonic: string, accountType: AccountType, index: number) => { - return generatePrivateKeyWithDerivations(mnemonic, [accountType, index]) -} - -export const generateOraclePrivateKey = ( - mnemonic: string, - currencyPair: CurrencyPair, - index: number -) => { - let derivationPath: number[] - if (currencyPair === 'CELOUSD') { - // For backwards compatibility we don't add currencyPair to - // the derivation path for CELOUSD - derivationPath = [AccountType.PRICE_ORACLE, index] - } else { - // Deterministically convert the currency pair string to a path segment - // keccak(currencyPair) modulo 2^31 - const currencyDerivation = new BigNumber(Web3.utils.keccak256(currencyPair), 16) - .mod(2 ** 31) - .toNumber() - derivationPath = [AccountType.PRICE_ORACLE, currencyDerivation, index] - } - - return generatePrivateKeyWithDerivations(mnemonic, derivationPath) -} - -export const generatePrivateKeyWithDerivations = (mnemonic: string, derivations: number[]) => { - const seed = bip39.mnemonicToSeedSync(mnemonic) - const node = 
bip32.fromSeed(seed) - const newNode = derivations.reduce((n: BIP32Interface, derivation: number) => { - return n.derive(derivation) - }, node) - return newNode.privateKey!.toString('hex') -} - -export const generatePublicKey = (mnemonic: string, accountType: AccountType, index: number) => { - return privateKeyToPublicKey(generatePrivateKey(mnemonic, accountType, index)) -} - -export const generateAddress = (mnemonic: string, accountType: AccountType, index: number) => - privateKeyToAddress(generatePrivateKey(mnemonic, accountType, index)) - -export const privateKeyToPublicKey = (privateKey: string): string => { - // NOTE: elliptic is disabled elsewhere in this library to prevent - // accidental signing of truncated messages. - // eslint-disable-next-line:import-blacklist - const EC = require('elliptic').ec - const ec = new EC('secp256k1') - const ecPrivateKey = ec.keyFromPrivate(Buffer.from(privateKey, 'hex')) - const ecPublicKey: string = ecPrivateKey.getPublic('hex') - return ecPublicKey.slice(2) -} - -export const privateKeyToAddress = (privateKey: string) => { - // @ts-ignore - return new Web3.modules.Eth().accounts.privateKeyToAccount(ensure0x(privateKey)).address -} - -export const privateKeyToStrippedAddress = (privateKey: string) => - strip0x(privateKeyToAddress(privateKey)) - -const validatorZeroBalance = () => - fetchEnvOrFallback(envVar.VALIDATOR_ZERO_GENESIS_BALANCE, '103010030000000000000000000') // 103,010,030 CG -const validatorBalance = () => - fetchEnvOrFallback(envVar.VALIDATOR_GENESIS_BALANCE, '10011000000000000000000') // 10,011 CG -const faucetBalance = () => - fetchEnvOrFallback(envVar.FAUCET_GENESIS_BALANCE, '10011000000000000000000') // 10,011 CG -const oracleBalance = () => - fetchEnvOrFallback(envVar.MOCK_ORACLE_GENESIS_BALANCE, '100000000000000000000') // 100 CG -const votingBotBalance = () => - fetchEnvOrFallback(envVar.VOTING_BOT_BALANCE, '10000000000000000000000') // 10,000 CG - -export const getPrivateKeysFor = (accountType: 
AccountType, mnemonic: string, n: number) => - range(0, n).map((i) => generatePrivateKey(mnemonic, accountType, i)) - -export const getOraclePrivateKeysFor = (currencyPair: CurrencyPair, mnemonic: string, n: number) => - range(0, n).map((i) => generateOraclePrivateKey(mnemonic, currencyPair, i)) - -export const getAddressesFor = (accountType: AccountType, mnemonic: string, n: number) => - getPrivateKeysFor(accountType, mnemonic, n).map(privateKeyToAddress) - -export const getStrippedAddressesFor = (accountType: AccountType, mnemonic: string, n: number) => - getAddressesFor(accountType, mnemonic, n).map(strip0x) - -export const getValidatorsInformation = (mnemonic: string, n: number): Validator[] => { - return getPrivateKeysFor(AccountType.VALIDATOR, mnemonic, n).map((key, i) => { - const blsKeyBytes = blsPrivateKeyToProcessedPrivateKey(key) - return { - address: strip0x(privateKeyToAddress(key)), - blsPublicKey: bls12377js.BLS.privateToPublicBytes(blsKeyBytes).toString('hex'), - balance: i === 0 ? 
validatorZeroBalance() : validatorBalance(), - } - }) -} - -export const getAddressFromEnv = (accountType: AccountType, n: number) => { - const mnemonic = fetchEnv(envVar.MNEMONIC) - const privateKey = generatePrivateKey(mnemonic, accountType, n) - return privateKeyToAddress(privateKey) -} - -const getFaucetedAccountsFor = ( - accountType: AccountType, - mnemonic: string, - n: number, - balance: string -) => { - return getStrippedAddressesFor(accountType, mnemonic, n).map((address) => ({ - address, - balance, - })) -} - -const getFaucetedAccountsForLoadTest = ( - accountType: AccountType, - mnemonic: string, - clients: number, - threads: number, - balance: string -) => { - const addresses: string[] = [] - for (const podIndex of range(0, clients)) { - for (const threadIndex of range(0, threads)) { - const index = getIndexForLoadTestThread(podIndex, threadIndex) - addresses.push(strip0x(generateAddress(mnemonic, accountType, parseInt(`${index}`, 10)))) - } - } - return addresses.map((address) => ({ - address, - balance, - })) -} - -export const getFaucetedAccounts = (mnemonic: string) => { - const numFaucetAccounts = parseInt(fetchEnvOrFallback(envVar.FAUCET_GENESIS_ACCOUNTS, '0'), 10) - const faucetAccounts = getFaucetedAccountsFor( - AccountType.FAUCET, - mnemonic, - numFaucetAccounts, - faucetBalance() - ) - - const numLoadTestAccounts = parseInt(fetchEnvOrFallback(envVar.LOAD_TEST_CLIENTS, '0'), 10) - const numLoadTestThreads = parseInt(fetchEnvOrFallback(envVar.LOAD_TEST_THREADS, '0'), 10) - - const loadTestAccounts = getFaucetedAccountsForLoadTest( - AccountType.LOAD_TESTING_ACCOUNT, - mnemonic, - numLoadTestAccounts, - numLoadTestThreads, - faucetBalance() - ) - - const oracleAccounts = getFaucetedAccountsFor( - AccountType.PRICE_ORACLE, - mnemonic, - 1, - oracleBalance() - ) - - const numVotingBotAccounts = parseInt(fetchEnvOrFallback(envVar.VOTING_BOTS, '0'), 10) - const votingBotAccounts = getFaucetedAccountsFor( - AccountType.VOTING_BOT, - mnemonic, - 
numVotingBotAccounts, - votingBotBalance() - ) - - return [...faucetAccounts, ...loadTestAccounts, ...oracleAccounts, ...votingBotAccounts] -} - -const hardForkActivationBlock = (key: string) => { - const value = fetchEnvOrFallback(key, '') - if (value === '') { - return undefined - } else { - return parseInt(value, 10) - } -} - -export const generateGenesisFromEnv = (enablePetersburg: boolean = true) => { - const mnemonic = fetchEnv(envVar.MNEMONIC) - const validatorEnv = parseInt(fetchEnv(envVar.VALIDATORS), 10) - const genesisAccountsEnv = fetchEnvOrFallback(envVar.GENESIS_ACCOUNTS, '') - const validators = getValidatorsInformation(mnemonic, validatorEnv) - - const consensusType = fetchEnv(envVar.CONSENSUS_TYPE) as ConsensusType - - if (![ConsensusType.CLIQUE, ConsensusType.ISTANBUL].includes(consensusType)) { - console.error('Unsupported CONSENSUS_TYPE') - process.exit(1) - } - - const blockTime = parseInt(fetchEnv(envVar.BLOCK_TIME), 10) - const requestTimeout = parseInt( - fetchEnvOrFallback(envVar.ISTANBUL_REQUEST_TIMEOUT_MS, '3000'), - 10 - ) - const epoch = parseInt(fetchEnvOrFallback(envVar.EPOCH, '30000'), 10) - // allow 12 blocks in prod for the uptime metric - const lookbackwindow = parseInt(fetchEnvOrFallback(envVar.LOOKBACK, '12'), 10) - const chainId = parseInt(fetchEnv(envVar.NETWORK_ID), 10) - - const initialAccounts = getFaucetedAccounts(mnemonic) - if (genesisAccountsEnv !== '') { - const genesisAccountsPath = path.resolve(monorepoRoot, genesisAccountsEnv) - const genesisAccounts = JSON.parse(fs.readFileSync(genesisAccountsPath).toString()) - for (const addr of genesisAccounts.addresses) { - initialAccounts.push({ - address: addr, - balance: genesisAccounts.value, - }) - } - } - - // Allocate voting bot account(s) - const numVotingBotAccounts = parseInt(fetchEnvOrFallback(envVar.VOTING_BOTS, '0'), 10) - initialAccounts.concat( - getStrippedAddressesFor(AccountType.VOTING_BOT, mnemonic, numVotingBotAccounts).map((addr) => { - return { - address: 
addr, - balance: fetchEnvOrFallback(envVar.VOTING_BOT_BALANCE, '100000000000000000000'), - } - }) - ) - - // Celo hard fork activation blocks. Default is undefined, which means not activated. - const churritoBlock = hardForkActivationBlock(envVar.CHURRITO_BLOCK) - const donutBlock = hardForkActivationBlock(envVar.DONUT_BLOCK) - const espressoBlock = hardForkActivationBlock(envVar.ESPRESSO_BLOCK) - - // network start timestamp - const timestamp = parseInt(fetchEnvOrFallback(envVar.TIMESTAMP, '0'), 10) - - return generateGenesis({ - validators, - consensusType, - blockTime, - initialAccounts, - epoch, - lookbackwindow, - chainId, - requestTimeout, - enablePetersburg, - timestamp, - churritoBlock, - donutBlock, - espressoBlock, - }) -} - -export const generateIstanbulExtraData = (validators: Validator[]) => { - const istanbulVanity = GENESIS_MSG_HASH - // Vanity prefix is 32 bytes (1 hex char/.5 bytes * 32 bytes = 64 hex chars) - if (istanbulVanity.length !== 32 * 2) { - throw new Error('Istanbul vanity must be 32 bytes') - } - const blsSignatureVanity = 96 - const ecdsaSignatureVanity = 65 - return ( - '0x' + - istanbulVanity + - rlp - // @ts-ignore - .encode([ - // Added validators - validators.map((validator) => Buffer.from(validator.address, 'hex')), - validators.map((validator) => Buffer.from(validator.blsPublicKey, 'hex')), - // Removed validators - Buffer.alloc(0), - // Seal - Buffer.from(repeat('0', ecdsaSignatureVanity * 2), 'hex'), - [ - // AggregatedSeal.Bitmap - Buffer.alloc(0), - // AggregatedSeal.Signature - Buffer.from(repeat('0', blsSignatureVanity * 2), 'hex'), - // AggregatedSeal.Round - Buffer.alloc(0), - ], - [ - // ParentAggregatedSeal.Bitmap - Buffer.alloc(0), - // ParentAggregatedSeal.Signature - Buffer.from(repeat('0', blsSignatureVanity * 2), 'hex'), - // ParentAggregatedSeal.Round - Buffer.alloc(0), - ], - ]) - .toString('hex') - ) -} - -export const generateGenesis = ({ - validators, - consensusType = ConsensusType.ISTANBUL, - 
initialAccounts: otherAccounts = [], - blockTime, - epoch, - lookbackwindow, - chainId, - requestTimeout, - enablePetersburg = true, - timestamp = 0, - churritoBlock, - donutBlock, - espressoBlock, - gingerbreadBlock, -}: GenesisConfig): string => { - const genesis: any = { ...TEMPLATE } - - if (!enablePetersburg) { - genesis.config = GETH_CONFIG_OLD - } - - if (typeof churritoBlock === 'number') { - genesis.config.churritoBlock = churritoBlock - } - if (typeof donutBlock === 'number') { - genesis.config.donutBlock = donutBlock - } - if (typeof espressoBlock === 'number') { - genesis.config.espressoBlock = espressoBlock - } - if (typeof gingerbreadBlock === 'number') { - genesis.config.gingerbreadBlock = gingerbreadBlock - } - - genesis.config.chainId = chainId - - if (consensusType === ConsensusType.CLIQUE) { - genesis.config.clique = { - period: blockTime, - epoch, - } - } else if (consensusType === ConsensusType.ISTANBUL) { - genesis.mixHash = ISTANBUL_MIX_HASH - genesis.difficulty = '0x1' - if (validators) { - genesis.extraData = generateIstanbulExtraData(validators) - } - genesis.config.istanbul = { - // see github.com/celo-org/celo-blockchain/blob/master/consensus/istanbul/config.go#L21-L25 - // 0 = RoundRobin, 1 = Sticky, 2 = ShuffledRoundRobin - policy: 2, - blockperiod: blockTime, - requesttimeout: requestTimeout, - epoch, - lookbackwindow, - } - } - - if (validators) { - for (const validator of validators) { - genesis.alloc[validator.address] = { - balance: validator.balance, - } - } - } - - for (const account of otherAccounts) { - genesis.alloc[account.address] = { - balance: account.balance, - } - } - - const contracts = [REGISTRY_ADDRESS] - const contractBuildPath = path.resolve( - monorepoRoot, - 'packages/protocol/build/contracts/Proxy.json' - ) - - if (validators && validators.length > 0) { - for (const contract of contracts) { - genesis.alloc[contract] = { - code: JSON.parse(fs.readFileSync(contractBuildPath).toString()).deployedBytecode, - 
storage: { - [CONTRACT_OWNER_STORAGE_LOCATION]: validators[0].address, - }, - balance: '0', - } - } - } - - genesis.timestamp = timestamp > 0 ? timestamp.toString() : '0x0' - - return JSON.stringify(genesis, null, 2) -} - -// This function assumes that mycelo has already been built using 'make all' -export const generateGenesisWithMigrations = async ({ - gethRepoPath, - genesisConfig, - mnemonic, - numValidators, - verbose, -}: MyceloGenesisConfig): Promise => { - const tmpDir = path.join(tmpdir(), `mycelo-genesis-${Date.now()}`) - fs.mkdirSync(tmpDir) - const envFile = path.join(tmpDir, 'env.json') - const configFile = path.join(tmpDir, 'genesis-config.json') - const myceloBinaryPath = path.join(gethRepoPath!, '/build/bin/mycelo') - await spawnCmdWithExitOnFailure( - myceloBinaryPath, - [ - 'genesis-config', - '--template', - 'monorepo', - '--mnemonic', - mnemonic, - '--validators', - numValidators.toString(), - ], - { - silent: !verbose, - cwd: tmpDir, - } - ) - const mcEnv = JSON.parse(fs.readFileSync(envFile).toString()) - const mcConfig = JSON.parse(fs.readFileSync(configFile).toString()) - - // Customize and overwrite the env.json file - merge(mcEnv, { - chainId: genesisConfig.chainId, - accounts: { - validators: numValidators, - }, - }) - fs.writeFileSync(envFile, JSON.stringify(mcEnv, undefined, 2)) - - // Customize and overwrite the genesis-config.json file - if (genesisConfig.chainId) { - mcConfig.chainId = genesisConfig.chainId - } - if (genesisConfig.epoch) { - mcConfig.istanbul.epoch = genesisConfig.epoch - } - if (genesisConfig.lookbackwindow) { - mcConfig.istanbul.lookbackwindow = genesisConfig.lookbackwindow - } - if (genesisConfig.blockTime) { - mcConfig.istanbul.blockperiod = genesisConfig.blockTime - } - if (genesisConfig.requestTimeout) { - mcConfig.istanbul.requesttimeout = genesisConfig.requestTimeout - } - if (genesisConfig.churritoBlock !== undefined) { - mcConfig.hardforks.churritoBlock = genesisConfig.churritoBlock - } - if 
(genesisConfig.donutBlock !== undefined) { - mcConfig.hardforks.donutBlock = genesisConfig.donutBlock - } - if (genesisConfig.espressoBlock !== undefined) { - mcConfig.hardforks.espressoBlock = genesisConfig.espressoBlock - } - if (genesisConfig.gingerbreadBlock !== undefined) { - mcConfig.hardforks.gingerbreadBlock = genesisConfig.gingerbreadBlock - } - if (genesisConfig.timestamp !== undefined) { - mcConfig.genesisTimestamp = genesisConfig.timestamp - } - - // TODO: overrides for migrations - - fs.writeFileSync(configFile, JSON.stringify(mcConfig, undefined, 2)) - - // Generate the genesis file, and return its contents - const contractsBuildPath = path.resolve(monorepoRoot, 'packages/protocol/build/contracts/') - await spawnCmdWithExitOnFailure( - myceloBinaryPath, - ['genesis-from-config', tmpDir, '--buildpath', contractsBuildPath], - { - silent: !verbose, - cwd: tmpDir, - } - ) - const genesis = fs.readFileSync(path.join(tmpDir, 'genesis.json')).toString() - // Clean up the tmp dir as it's no longer needed - await spawnCmd('rm', ['-rf', tmpDir], { silent: true }) - return genesis -} diff --git a/packages/celotool/src/lib/genesis_constants.ts b/packages/celotool/src/lib/genesis_constants.ts deleted file mode 100644 index d1e67ddcabf..00000000000 --- a/packages/celotool/src/lib/genesis_constants.ts +++ /dev/null @@ -1,45 +0,0 @@ -export const GETH_CONFIG_OLD = { - chainId: 1101, - homesteadBlock: 1, - eip150Block: 2, - eip150Hash: '0x0000000000000000000000000000000000000000000000000000000000000000', - eip155Block: 3, - eip158Block: 3, - byzantiumBlock: 4, -} - -export const TEMPLATE = { - config: { - homesteadBlock: 0, - eip150Block: 0, - eip150Hash: '0x0000000000000000000000000000000000000000000000000000000000000000', - eip155Block: 0, - eip158Block: 0, - byzantiumBlock: 0, - constantinopleBlock: 0, - petersburgBlock: 0, - istanbulBlock: 0, - }, - nonce: '0x0', - timestamp: '0x5b843511', - gasLimit: '0x8000000', - extraData: - 
'0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000feE1a22F43BeeCB912B5a4912ba87527682ef0fC889F21CE69dcc25a4594f73230A55896d67038065372d2bbBaBaAf1495182E31cF13dB0d18463B0EF71690ea7E0c67827d8968882FAC0c4cBBD65BCE0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000', - difficulty: '0x0400', - coinbase: '0x0000000000000000000000000000000000000000', - alloc: {}, - number: '0x0', - gasUsed: '0x0', - mixHash: '0x0000000000000000000000000000000000000000000000000000000000000000', - parentHash: '0x0000000000000000000000000000000000000000000000000000000000000000', -} - -export const REGISTRY_ADDRESS = '000000000000000000000000000000000000ce10' - -export const CONTRACT_OWNER_STORAGE_LOCATION = - '0xb53127684a568b3173ae13b9f8a6016e243e63b6e8ee1178d6a717850b5d6103' -export const ISTANBUL_MIX_HASH = - '0x63746963616c2062797a616e74696e65206661756c7420746f6c6572616e6365' - -// Keccak256 of "The Times 09/Apr/2020 With $2.3 Trillion Injection, Fed’s Plan Far Exceeds Its 2008 Rescue" -export const GENESIS_MSG_HASH = 'ecc833a7747eaa8327335e8e0c6b6d8aa3a38d0063591e43ce116ccf5c89753e' diff --git a/packages/celotool/src/lib/geth.ts b/packages/celotool/src/lib/geth.ts deleted file mode 100644 index 8c7da6d364d..00000000000 --- a/packages/celotool/src/lib/geth.ts +++ /dev/null @@ -1,1676 +0,0 @@ -/* eslint-disable no-console */ -import { CeloTxReceipt, TransactionResult } from '@celo/connect' -import { CeloContract, ContractKit, newKitFromWeb3 } from '@celo/contractkit' -import { GoldTokenWrapper } from '@celo/contractkit/lib/wrappers/GoldTokenWrapper' -import { StableTokenWrapper } from '@celo/contractkit/lib/wrappers/StableTokenWrapper' -import BigNumber from 'bignumber.js' -import { spawn } from 'child_process' -import { randomBytes } from 'crypto' -import fs from 'fs' -import { merge, range } from 'lodash' -import fetch from 'node-fetch' -import path 
from 'path' -import sleep from 'sleep-promise' -import Web3 from 'web3' -import { Admin } from 'web3-eth-admin' -import { numberToHex } from 'web3-utils' -import { spawnCmd, spawnCmdWithExitOnFailure } from './cmd-utils' -import { convertToContractDecimals } from './contract-utils' -import { envVar, fetchEnv, fetchEnvOrFallback } from './env-utils' -import { - AccountType, - Validator, - generateGenesis, - generateGenesisWithMigrations, - generatePrivateKey, - privateKeyToAddress, - privateKeyToPublicKey, -} from './generate_utils' -import { retrieveClusterIPAddress, retrieveIPAddress } from './helm_deploy' -import { GethInstanceConfig } from './interfaces/geth-instance-config' -import { GethRunConfig } from './interfaces/geth-run-config' -import { waitForPortOpen } from './port-utils' -import { ensure0x } from './utils' - -export async function unlockAccount( - web3: Web3, - duration: number, - password: string, - accountAddress: string | null = null -) { - if (accountAddress === null) { - const accounts = await web3.eth.getAccounts() - accountAddress = accounts[0] - } - await web3.eth.personal.unlockAccount(accountAddress!, password, duration) - return accountAddress! 
-} - -type HandleErrorCallback = (isError: boolean, data: { location: string; error: string }) => void - -const DEFAULT_TRANSFER_AMOUNT = new BigNumber('0.00000000000001') -const LOAD_TEST_TRANSFER_WEI = new BigNumber(10000) - -const GETH_IPC = 'geth.ipc' -const DISCOVERY_PORT = 30303 -const BOOTNODE_DISCOVERY_PORT = 30301 - -const BLOCKSCOUT_TIMEOUT = 12000 // ~ 12 seconds needed to see the transaction in the blockscout - -// for log messages which indicate that blockscout where not able to provide -// information about transaction in a "timely" (15s for now) manner -export const LOG_TAG_BLOCKSCOUT_TIMEOUT = 'blockscout_timeout' -// for log messages which show time (+- 150-200ms) needed for blockscout to -// fetch and publish information about transaction -export const LOG_TAG_BLOCKSCOUT_TIME_MEASUREMENT = 'blockscout_time_measurement' -// for log messages which show the error about validating transaction receipt -export const LOG_TAG_BLOCKSCOUT_VALIDATION_ERROR = 'validate_blockscout_error' -// for log messages which show the error occurred when fetching a contract address -export const LOG_TAG_CONTRACT_ADDRESS_ERROR = 'contract_address_error' -// for log messages which show the error while validating geth rpc response -export const LOG_TAG_GETH_RPC_ERROR = 'geth_rpc_error' -// for log messages which show the error occurred when the transaction has -// been sent -export const LOG_TAG_TRANSACTION_ERROR = 'transaction_error' -// message indicating that the tx hash has been received in callback within sendTransaction -export const LOG_TAG_TRANSACTION_HASH_RECEIVED = 'tx_hash_received' -// for log messages which show the error about validating transaction receipt -export const LOG_TAG_TRANSACTION_VALIDATION_ERROR = 'validate_transaction_error' -// for log messages which show time needed to receive the receipt after -// the transaction has been sent -export const LOG_TAG_TX_TIME_MEASUREMENT = 'tx_time_measurement' -// max number of threads used for load testing 
-export const MAX_LOADTEST_THREAD_COUNT = 10000
-
-export const getEnodeAddress = (nodeId: string, ipAddress: string, port: number) => {
-  return `enode://${nodeId}@${ipAddress}:${port}`
-}
-
-export const getBootnodeEnode = async (namespace: string) => {
-  const ip = await retrieveBootnodeIPAddress(namespace)
-  const privateKey = generatePrivateKey(fetchEnv(envVar.MNEMONIC), AccountType.BOOTNODE, 0)
-  const nodeId = privateKeyToPublicKey(privateKey)
-  return [getEnodeAddress(nodeId, ip, BOOTNODE_DISCOVERY_PORT)]
-}
-
-export const retrieveBootnodeIPAddress = async (namespace: string) => {
-  // The Baklava bootnode address comes from a VM and has a different name (it is not possible to update the name after creation)
-  const resourceName =
-    namespace === 'baklava' ? `${namespace}-bootnode-address` : `${namespace}-bootnode`
-  if (fetchEnv(envVar.STATIC_IPS_FOR_GETH_NODES) === 'true') {
-    return retrieveIPAddress(resourceName)
-  } else {
-    return retrieveClusterIPAddress('service', resourceName, namespace)
-  }
-}
-
-const retrieveTxNodeAddresses = async (namespace: string, txNodesNum: number) => {
-  const txNodesRange = range(0, txNodesNum)
-  return Promise.all(txNodesRange.map((i) => retrieveIPAddress(`${namespace}-tx-nodes-${i}`)))
-}
-
-const getEnodesWithIpAddresses = async (namespace: string, getExternalIP: boolean) => {
-  const txNodesNum = parseInt(fetchEnv(envVar.TX_NODES), 10)
-  const txAddresses = await retrieveTxNodeAddresses(namespace, txNodesNum)
-  const txNodesRange = range(0, txNodesNum)
-  return Promise.all(
-    txNodesRange.map(async (index) => {
-      const privateKey = generatePrivateKey(fetchEnv(envVar.MNEMONIC), AccountType.TX_NODE, index)
-      const nodeId = privateKeyToPublicKey(privateKey)
-      let address: string
-      if (getExternalIP) {
-        address = txAddresses[index]
-      } else {
-        address = await retrieveClusterIPAddress(
-          'service',
-          `${namespace}-service-${index}`,
-          namespace
-        )
-        if (address.length === 0) {
-          console.error('IP address is empty for transaction node')
-          throw new Error('IP address is empty for transaction node')
-        }
-      }
-      return getEnodeAddress(nodeId, address, DISCOVERY_PORT)
-    })
-  )
-}
-
-export const getEnodesAddresses = async (namespace: string) => {
-  return getEnodesWithIpAddresses(namespace, false)
-}
-
-export const getEnodesWithExternalIPAddresses = async (namespace: string) => {
-  return getEnodesWithIpAddresses(namespace, true)
-}
-
-export function getPrivateTxNodeClusterIP(celoEnv: string) {
-  return retrieveClusterIPAddress('service', 'tx-nodes-private', celoEnv)
-}
-
-export const fetchPassword = (passwordFile: string) => {
-  if (!fs.existsSync(passwordFile)) {
-    console.error(`Password file at ${passwordFile} does not exist!`)
-    process.exit(1)
-  }
-  return fs.readFileSync(passwordFile).toString()
-}
-
-export const writeStaticNodes = (
-  enodes: string[],
-  outputDirPath: string,
-  outputFileName: string,
-  spacing: number = 2
-) => {
-  const encodedJSON = JSON.stringify(enodes, null, spacing)
-
-  fs.writeFile(path.join(outputDirPath, outputFileName), encodedJSON, (err) => {
-    if (err) {
-      console.error(err)
-      process.exit(1)
-    }
-  })
-}
-
-export const checkGethStarted = (dataDir: string) => {
-  if (!fs.existsSync(path.resolve(dataDir, GETH_IPC))) {
-    console.error(`Looks like there are no local geth nodes running in ${dataDir}`)
-    console.info(
-      `Please make sure you specified the correct data directory. You can also run a geth node with "celotooljs geth run"`
-    )
-    process.exit(1)
-  }
-}
-
-export const getWeb3AndTokensContracts = async () => {
-  const kit = newKitFromWeb3(new Web3('http://localhost:8545'))
-  const [goldToken, stableToken] = await Promise.all([
-    kit.contracts.getGoldToken(),
-    kit.contracts.getStableToken(),
-  ])
-
-  return {
-    kit,
-    goldToken,
-    stableToken,
-  }
-}
-
-export const getRandomInt = (from: number, to: number) => {
-  return Math.floor(Math.random() * (to - from)) + from
-}
-
-const getRandomToken = (goldToken: GoldTokenWrapper, stableToken: StableTokenWrapper) => {
-  const tokenType = getRandomInt(0, 2)
-  if (tokenType === 0) {
-    return goldToken
-  } else {
-    return stableToken
-  }
-}
-
-const validateGethRPC = async (
-  kit: ContractKit,
-  txHash: string,
-  from: string,
-  handleError: HandleErrorCallback
-) => {
-  const transaction = await kit.connection.getTransaction(txHash)
-  handleError(!transaction || !transaction.from, {
-    location: '[GethRPC]',
-    error: `Contractkit did not return a valid transaction`,
-  })
-  if (transaction == null) {
-    return
-  }
-  const txFrom = transaction.from.toLowerCase()
-  const expectedFrom = from.toLowerCase()
-  handleError(!transaction.from || expectedFrom !== txFrom, {
-    location: '[GethRPC]',
-    error: `Expected "from" to equal ${expectedFrom}, but found ${txFrom}`,
-  })
-}
-
-const checkBlockscoutResponse = (
-  json: any /* response */,
-  txHash: string,
-  from: string,
-  handleError: HandleErrorCallback
-) => {
-  const location = '[Blockscout]'
-
-  handleError(json.status !== '1', {
-    location,
-    error: `Invalid status: expected '1', received ${json.status}`,
-  })
-  handleError(!json.result, { location, error: `No result found for tx ${txHash}` })
-  const resultFrom = json.result.from.toLowerCase()
-  const expectedFrom = from.toLowerCase()
-  handleError(resultFrom !== expectedFrom, {
-    location,
-    error: `Expected "from" to equal ${expectedFrom}, but found ${resultFrom}`,
-  })
-  handleError(json.result.hash !== txHash, {
-    location,
-    error: `Expected "hash" to equal ${txHash}, but found ${json.result.hash}`,
-  })
-}
-
-const fetchBlockscoutTxInfo = async (url: string, txHash: string) => {
-  const response = await fetch(`${url}/api?module=transaction&action=gettxinfo&txhash=${txHash}`)
-  return response.json() as any
-}
-
-const validateBlockscout = async (
-  url: string,
-  txHash: string,
-  from: string,
-  handleError: HandleErrorCallback
-) => {
-  const json = await fetchBlockscoutTxInfo(url, txHash)
-
-  checkBlockscoutResponse(json, txHash, from, handleError)
-}
-
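The `getFirstValidBlockscoutResponse` helper defined just below retries `fetchBlockscoutTxInfo` on a fixed interval until a valid response arrives or a total timeout is exhausted. A minimal, dependency-free sketch of that poll-until-valid pattern, where `fetchOnce` and `isValid` are hypothetical stand-ins for `fetchBlockscoutTxInfo` and the `json.status === '1'` check:

```typescript
// Sketch of the poll-until-valid pattern: call fetchOnce every retryMs,
// return [value, Date.now()] for the first valid response, or
// [null, null] once timeoutMs worth of attempts is exhausted.
const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms))

async function pollUntilValid<T>(
  fetchOnce: () => Promise<T>,
  isValid: (value: T) => boolean,
  retryMs: number,
  timeoutMs: number
): Promise<[T, number] | [null, null]> {
  const attempts = Math.floor(timeoutMs / retryMs)
  for (let attempt = 0; attempt < attempts; attempt++) {
    const value = await fetchOnce()
    if (isValid(value)) {
      return [value, Date.now()]
    }
    await sleep(retryMs)
  }
  return [null, null]
}

// Example: a fake response that only becomes valid on the third call
let calls = 0
const fakeFetch = async () => ({ status: ++calls >= 3 ? '1' : '0' })

pollUntilValid(fakeFetch, (json) => json.status === '1', 10, 1000).then(([json, at]) => {
  console.log(json?.status, at !== null)
})
```

The `[null, null]` return mirrors how the caller (`measureBlockscout`) distinguishes a timeout from a validation failure.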
-// Maximal time given for blockscout to provide info about tx -// If the transaction does not appear in blockscout within 15 seconds, -// blockscout is considered to be not working in a timely manner -const MAXIMAL_BLOCKSCOUT_TIMEOUT = 15000 - -// Try to fetch info about transaction every 150 ms -const BLOCKSCOUT_FETCH_RETRY_TIME = 150 - -// within MAXIMAL_BLOCKSCOUT_TIMEOUT ms -const getFirstValidBlockscoutResponse = async (url: string, txHash: string) => { - const attempts = MAXIMAL_BLOCKSCOUT_TIMEOUT / BLOCKSCOUT_FETCH_RETRY_TIME - for (let attemptId = 0; attemptId < attempts; attemptId++) { - const json = await fetchBlockscoutTxInfo(url, txHash) - if (json.status !== '1') { - await sleep(BLOCKSCOUT_FETCH_RETRY_TIME) - } else { - return [json, Date.now()] - } - } - return [null, null] -} - -const validateTransactionAndReceipt = ( - from: string, - txReceipt: any, - handleError: HandleErrorCallback -) => { - const location = '[TX & Receipt]' - - handleError(!txReceipt, { location, error: 'No transaction receipt received!' 
-  })
-  handleError(txReceipt.status !== true, {
-    location,
-    error: `Transaction receipt status (${txReceipt.status}) is not true!`,
-  })
-  handleError(txReceipt.from.toLowerCase() !== from.toLowerCase(), {
-    location,
-    error: `Transaction receipt from (${txReceipt.from}) is not equal to sender address (${from}).`,
-  })
-}
-
-const tracerLog = (logMessage: any) => {
-  console.info(JSON.stringify(logMessage))
-}
-
-const exitTracerTool = (logMessage: any) => {
-  tracerLog(logMessage)
-  process.exit(1)
-}
-
-const transferAndTrace = async (
-  kit: ContractKit,
-  goldToken: GoldTokenWrapper,
-  stableToken: StableTokenWrapper,
-  from: string,
-  to: string,
-  password: string,
-  blockscoutUrl: string
-) => {
-  console.info('Transfer')
-
-  const token = getRandomToken(goldToken, stableToken)
-  const feeCurrencyToken = getRandomToken(goldToken, stableToken)
-
-  const [tokenName, feeCurrencySymbol] = await Promise.all([
-    token.symbol(),
-    feeCurrencyToken.symbol(),
-  ])
-
-  const logMessage: any = {
-    severity: 'CRITICAL',
-    senderAddress: from,
-    receiverAddress: to,
-    blockscout: blockscoutUrl,
-    token: tokenName,
-    error: '',
-    location: '',
-    txHash: '',
-  }
-
-  const txParams: any = {}
-  // Fill txParams below
-  // getRandomInt(0, 2) returns 0 or 1, so use a non-CELO fee currency for roughly half of the transfers
-  if (getRandomInt(0, 2) === 1) {
-    txParams.feeCurrency = feeCurrencyToken.address
-    logMessage.feeCurrency = feeCurrencySymbol
-  }
-
-  const transferToken = new Promise(async (resolve) => {
-    await transferERC20Token(
-      kit,
-      token,
-      from,
-      to,
-      DEFAULT_TRANSFER_AMOUNT,
-      password,
-      txParams,
-      undefined,
-      (receipt: any) => {
-        resolve(receipt)
-      },
-      (error: any) => {
-        logMessage.error = error
-        exitTracerTool(logMessage)
-      }
-    )
-  })
-
-  const txReceipt: any = await transferToken
-  const txHash = txReceipt ? txReceipt.transactionHash : ''
-
-  // Need to wait for a bit to make sure that blockscout had enough time
-  // to see the transaction and display it
-  await sleep(BLOCKSCOUT_TIMEOUT)
-
-  logMessage.txHash = txHash
-
-  const handleError = (isError: boolean, data: { location: string; error: string }) => {
-    if (isError) {
-      exitTracerTool({ ...logMessage, ...data })
-    }
-  }
-
-  validateTransactionAndReceipt(from, txReceipt!, handleError)
-  await validateBlockscout(blockscoutUrl, txHash, from, handleError)
-  await validateGethRPC(kit, txHash, from, handleError)
-}
-
-export const traceTransactions = async (
-  kit: ContractKit,
-  goldToken: GoldTokenWrapper,
-  stableToken: StableTokenWrapper,
-  addresses: string[],
-  blockscoutUrl: string
-) => {
-  console.info('Starting simulation')
-
-  await transferAndTrace(kit, goldToken, stableToken, addresses[0], addresses[1], '', blockscoutUrl)
-
-  await transferAndTrace(kit, goldToken, stableToken, addresses[1], addresses[0], '', blockscoutUrl)
-
-  console.info('Simulation finished successfully!')
-}
-
-const measureBlockscout = async (
-  blockscoutUrl: string,
-  txHash: string,
-  from: string,
-  obtainReceiptTime: number,
-  baseLogMessage: any
-) => {
-  const [json, receivedTime] = await getFirstValidBlockscoutResponse(blockscoutUrl, txHash)
-  if (receivedTime === null) {
-    tracerLog({
-      tag: LOG_TAG_BLOCKSCOUT_TIMEOUT,
-      ...baseLogMessage,
-    })
-  } else {
-    tracerLog({
-      tag: LOG_TAG_BLOCKSCOUT_TIME_MEASUREMENT,
-      p_time: receivedTime - obtainReceiptTime,
-      ...baseLogMessage,
-    })
-    checkBlockscoutResponse(json, txHash, from, (isError, data) => {
-      if (isError) {
-        tracerLog({
-          tag: LOG_TAG_BLOCKSCOUT_VALIDATION_ERROR,
-          ...data,
-          ...baseLogMessage,
-        })
-      }
-    })
-  }
-}
-
-export const transferCalldata = async (
-  kit: ContractKit,
-  fromAddress: string,
-  toAddress: string,
-  amount: BigNumber,
-  dataStr?: string,
-  txOptions: {
-    chainId?: number
-    gas?: number
-    gasPrice?: string
-    feeCurrency?: string
gatewayFeeRecipient?: string - gatewayFee?: string - nonce?: number - } = {} -) => { - return kit.sendTransaction({ - from: fromAddress, - to: toAddress, - chainId: numberToHex(txOptions.chainId || 0), - value: amount.toString(), - data: dataStr, - gas: txOptions.gas, - gasPrice: txOptions.gasPrice, - gatewayFeeRecipient: txOptions.gatewayFeeRecipient, - gatewayFee: txOptions.gatewayFee, - nonce: txOptions.nonce, - }) -} - -// Reference: https://celoscan.io/tx/0x88928b2abfcfb915077341087defc4ba345ce771ed6190bc9d21f34fdc1d34e1 -export const transferOrdinals = async ( - kit: ContractKit, - fromAddress: string, - toAddress: string, - amount: BigNumber, - dataStr?: string, - txOptions: { - chainId?: number - gas?: number - gasPrice?: string - feeCurrency?: string - gatewayFeeRecipient?: string - gatewayFee?: string - nonce?: number - } = {} -) => { - return kit.connection.sendTransaction({ - from: fromAddress, - to: fromAddress, - value: '0', - chainId: numberToHex(txOptions.chainId || 0), - data: Buffer.from('data:,{"p":"cls-20","op":"mint","tick":"cels","amt":"100000000"}').toString( - 'hex' - ), - gas: '40000', - maxFeePerGas: Web3.utils.toWei('5', 'gwei'), - maxPriorityFeePerGas: '1', - nonce: txOptions.nonce, - }) -} - -export const transferCeloGold = async ( - kit: ContractKit, - fromAddress: string, - toAddress: string, - amount: BigNumber, - _?: string, - txOptions: { - chainId?: number - gas?: number - gasPrice?: string - feeCurrency?: string - gatewayFeeRecipient?: string - gatewayFee?: string - nonce?: number - } = {} -) => { - const kitGoldToken = await kit.contracts.getGoldToken() - return kitGoldToken.transfer(toAddress, amount.toString()).send({ - from: fromAddress, - chainId: numberToHex(txOptions.chainId || 0), - gas: txOptions.gas, - gasPrice: txOptions.gasPrice, - feeCurrency: txOptions.feeCurrency || undefined, - gatewayFeeRecipient: txOptions.gatewayFeeRecipient, - gatewayFee: txOptions.gatewayFee, - nonce: txOptions.nonce, - }) -} - -export const 
transferCeloDollars = async ( - kit: ContractKit, - fromAddress: string, - toAddress: string, - amount: BigNumber, - _?: string, - txOptions: { - chainId?: number - gas?: number - gasPrice?: string - feeCurrency?: string - gatewayFeeRecipient?: string - gatewayFee?: string - nonce?: number - } = {} -) => { - const kitStableToken = await kit.contracts.getStableToken() - return kitStableToken.transfer(toAddress, amount.toString()).send({ - from: fromAddress, - chainId: numberToHex(txOptions.chainId || 0), - gas: txOptions.gas, - gasPrice: txOptions.gasPrice, - feeCurrency: txOptions.feeCurrency || undefined, - gatewayFeeRecipient: txOptions.gatewayFeeRecipient, - gatewayFee: txOptions.gatewayFee, - nonce: txOptions.nonce, - }) -} - -export const unlock = async ( - kit: ContractKit, - address: string, - password: string, - unlockPeriod: number -) => { - try { - await kit.web3.eth.personal.unlockAccount(address, password, unlockPeriod) - } catch (error) { - console.error(`Unlock account ${address} failed:`, error) - } -} - -export enum TestMode { - Mixed = 'mixed', - Data = 'data', - Transfer = 'transfer', - StableTransfer = 'stable_transfer', - ContractCall = 'contract_call', - Ordinals = 'ordinals', -} - -export const simulateClient = async ( - senderPK: string, - recipientAddress: string, - contractAddress: string, - contractData: string, - txPeriodMs: number, // time between new transactions in ms - blockscoutUrl: string, - blockscoutMeasurePercent: number, // percent of time in range [0, 100] to measure blockscout for a tx - index: number, - testMode: TestMode, - thread: number, - maxGasPrice: BigNumber = new BigNumber(0), - totalTxGas: number = 500000, // aim for half million gas txs - web3Provider: string = 'http://127.0.0.1:8545', - chainId: number = 42220 -) => { - // Assume the node is accessible via localhost with senderAddress unlocked - const kit = newKitFromWeb3(new Web3(web3Provider)) - - let lastNonce: number = -1 - let lastTx: string = '' - let 
lastGasPriceMinimum: BigNumber = new BigNumber(0) - let nonce: number = 0 - let recipientAddressFinal: string = recipientAddress - const useRandomRecipient = fetchEnvOrFallback(envVar.LOAD_TEST_USE_RANDOM_RECIPIENT, 'false') - - kit.connection.addAccount(senderPK) - kit.defaultAccount = privateKeyToAddress(senderPK) - - const sleepTime = 5000 - while (await kit.connection.isSyncing()) { - console.info( - `LoadTestId ${index} waiting for web3Provider to be synced. Sleeping ${sleepTime}ms` - ) - await sleep(sleepTime) - } - kit.addAccount(senderPK) - kit.defaultAccount = privateKeyToAddress(senderPK) - kit.connection.addAccount(senderPK) - kit.connection.defaultAccount = privateKeyToAddress(senderPK) - - // sleep a random amount of time in the range [0, txPeriodMs) before starting so - // that if multiple simulations are started at the same time, they don't all - // submit transactions at the same time - const randomSleep = Math.random() * txPeriodMs - console.info(`Sleeping for ${randomSleep} ms`) - await sleep(randomSleep) - - const txConf = await getTxConf(testMode) - const intrinsicGas = txConf.feeCurrencyGold ? 21000 : 71000 - // const totalTxGas = 500000 // aim for half million gas txs - const calldataGas = totalTxGas - intrinsicGas - const calldataSize = calldataGas / 4 // 119750 < tx pool size limit (128k) - let dataStr = testMode === TestMode.Data ? 
getBigData(calldataSize) : undefined // aim for half million gas txs - // Also running below the 128kb limit from the tx pool - let transferAmount = LOAD_TEST_TRANSFER_WEI - - if (testMode === TestMode.ContractCall) { - if (!contractData || !contractAddress) { - throw new Error('Contract address and data must be provided for TestMode.ContractCall') - } - dataStr = contractData - recipientAddressFinal = contractAddress - transferAmount = new BigNumber(0) - } - - const baseLogMessage: any = { - loadTestID: index, - threadID: thread, - sender: kit.defaultAccount, - nonce: '', - gasPrice: '', - recipient: recipientAddressFinal, - feeCurrency: '', - txHash: '', - tokenName: txConf.tokenName, - } - - while (true) { - const sendTransactionTime = Date.now() - const txConf = await getTxConf(testMode) - baseLogMessage.tokenName = txConf.tokenName - - // randomly choose the recipientAddress if configured - if (useRandomRecipient === 'true') { - recipientAddressFinal = `0x${randomBytes(20).toString('hex')}` - baseLogMessage.recipient = recipientAddressFinal - } - - let txOptions - const feeCurrency = await getFeeCurrency(kit, txConf.feeCurrencyGold, baseLogMessage) - - baseLogMessage.feeCurrency = feeCurrency - try { - let gasPrice = await getGasPrice(kit, feeCurrency) - - // Check if last tx was mined. 
If not, reuse the same nonce - const nonceResult = await getNonce( - kit, - kit.defaultAccount, - lastTx, - lastNonce, - gasPrice, - lastGasPriceMinimum - ) - nonce = nonceResult.nonce - gasPrice = nonceResult.newPrice - baseLogMessage.nonce = nonce - baseLogMessage.gasPrice = gasPrice.toString() - if (maxGasPrice.isGreaterThan(0)) { - gasPrice = BigNumber.min(gasPrice, maxGasPrice) - } - lastGasPriceMinimum = gasPrice - txOptions = { - chainId, - gasPrice: gasPrice.toString(), - feeCurrency, - nonce, - } - } catch (error: any) { - tracerLog({ - tag: LOG_TAG_CONTRACT_ADDRESS_ERROR, - error: error.toString(), - ...baseLogMessage, - }) - } - - if (testMode === TestMode.ContractCall) { - if (!contractData || !contractAddress) { - throw new Error('Contract address and data must be provided for TestMode.ContractCall') - } - dataStr = contractData - recipientAddressFinal = contractAddress - } - - await txConf - .transferFn( - kit, - kit.defaultAccount, - recipientAddressFinal, - transferAmount, - dataStr, - txOptions - ) - .then(async (txResult: TransactionResult) => { - lastTx = await txResult.getHash() - lastNonce = (await kit.web3.eth.getTransaction(lastTx)).nonce - await onLoadTestTxResult( - kit, - kit.defaultAccount!, - txResult, - sendTransactionTime, - baseLogMessage, - blockscoutUrl, - blockscoutMeasurePercent - ) - }) - .catch((error: any) => { - console.error('Load test transaction failed with error:', error) - tracerLog({ - tag: LOG_TAG_TRANSACTION_ERROR, - error: error.toString(), - ...baseLogMessage, - }) - }) - if (sendTransactionTime + txPeriodMs > Date.now()) { - await sleep(sendTransactionTime + txPeriodMs - Date.now()) - } - } -} - -const getBigData = (size: number) => { - return '0x' + '00'.repeat(size) -} - -const getTxConf = async (testMode: TestMode) => { - switch (testMode) { - case TestMode.Ordinals: - return { - feeCurrencyGold: true, - tokenName: 'cGLD', - transferFn: transferOrdinals, - } - case TestMode.Data: - return { - feeCurrencyGold: 
true, - tokenName: 'cGLD.L', - transferFn: transferCalldata, - } - case TestMode.Transfer: - return { - feeCurrencyGold: true, - tokenName: 'cGLD', - transferFn: transferCeloGold, - } - case TestMode.StableTransfer: - return { - feeCurrencyGold: false, - tokenName: 'cUSD', - transferFn: transferCeloDollars, - } - case TestMode.Mixed: - // randomly choose which token to use - const useGold = Boolean(Math.round(Math.random())) - const _transferFn = useGold ? transferCeloGold : transferCeloDollars - const _tokenName = useGold ? 'cGLD' : 'cUSD' - - // randomly choose which gas currency to use - const _feeCurrencyGold = Boolean(Math.round(Math.random())) - return { - feeCurrencyGold: _feeCurrencyGold, - tokenName: _tokenName, - transferFn: _transferFn, - } - case TestMode.ContractCall: - return { - feeCurrencyGold: true, - tokenName: 'contract', // For logging - transferFn: transferCalldata, - } - default: - throw new Error(`Unimplemented TestMode: ${testMode}`) - } -} - -const getNonce = async ( - kit: ContractKit, - senderAddress: string, - lastTx: any, - lastNonce: any, - gasPrice: BigNumber, - lastGasPriceMinimum: BigNumber -) => { - let _nonce, _newPrice - _newPrice = gasPrice - if (lastTx === '' || lastNonce === -1) { - _nonce = await kit.web3.eth.getTransactionCount(senderAddress, 'latest') - } else if ((await kit.connection.getTransactionReceipt(lastTx))?.blockNumber) { - _nonce = await kit.web3.eth.getTransactionCount(senderAddress, 'latest') - } else { - _nonce = (await kit.web3.eth.getTransactionCount(senderAddress, 'latest')) - 1 - _newPrice = BigNumber.max(gasPrice.toNumber(), lastGasPriceMinimum.times(1.02)).dp(0) - console.warn( - `TX ${lastTx} was not mined. Replacing tx reusing nonce ${_nonce} and gasPrice ${_newPrice}` - ) - } - return { - newPrice: _newPrice, - nonce: _nonce, - } -} - -const getFeeCurrency = async (kit: ContractKit, feeCurrencyGold: boolean, baseLogMessage: any) => { - try { - return feeCurrencyGold ? 
'' : await kit.registry.addressFor(CeloContract.StableToken) - } catch (error: any) { - tracerLog({ - tag: LOG_TAG_CONTRACT_ADDRESS_ERROR, - error: error.toString(), - ...baseLogMessage, - }) - } -} - -const getGasPrice = async (kit: ContractKit, feeCurrency?: string) => { - const gasPriceMinimum = await kit.contracts.getGasPriceMinimum() - const gasPriceBase = feeCurrency - ? await gasPriceMinimum.getGasPriceMinimum(feeCurrency) - : await gasPriceMinimum.gasPriceMinimum() - return new BigNumber(gasPriceBase).times(2).dp(0) -} - -export const onLoadTestTxResult = async ( - kit: ContractKit, - senderAddress: string, - txResult: TransactionResult, - sendTransactionTime: number, - baseLogMessage: any, - blockscoutUrl: string, - blockscoutMeasurePercent: number -) => { - const txReceipt = await txResult.waitReceipt() - const txHash = txReceipt.transactionHash - baseLogMessage.txHash = txHash - - const receiptTime = Date.now() - - tracerLog({ - tag: LOG_TAG_TX_TIME_MEASUREMENT, - p_time: receiptTime - sendTransactionTime, - ...baseLogMessage, - }) - // Continuing only with receipt received - validateTransactionAndReceipt(senderAddress, txReceipt, (isError, data) => { - if (isError) { - tracerLog({ - tag: LOG_TAG_TRANSACTION_VALIDATION_ERROR, - ...baseLogMessage, - ...data, - }) - } - }) - - if (Math.random() * 100 < blockscoutMeasurePercent) { - await measureBlockscout( - blockscoutUrl, - txReceipt.transactionHash, - senderAddress, - receiptTime, - baseLogMessage - ) - } - - await validateGethRPC(kit, txHash, senderAddress, (isError, data) => { - if (isError) { - tracerLog({ - tag: LOG_TAG_GETH_RPC_ERROR, - ...data, - ...baseLogMessage, - }) - } - }) -} - -export async function faucetLoadTestThreads( - index: number, - threads: number, - mnemonic: string, - web3Provider: string = 'http://localhost:8545', - chainId: number = 42220 -) { - const minimumEthBalance = 5 - const kit = newKitFromWeb3(new Web3(web3Provider)) - const privateKey = generatePrivateKey(mnemonic, 
AccountType.LOAD_TESTING_ACCOUNT, index)
-  kit.addAccount(privateKey)
-  const fundingAddress = privateKeyToAddress(privateKey)
-  console.info(`Adding account ${fundingAddress} to kit`)
-  kit.defaultAccount = privateKeyToAddress(privateKey)
-  const sleepTime = 5000
-  while ((await kit.connection.isSyncing()) || (await kit.connection.getBlockNumber()) < 1) {
-    console.info(`Sleeping ${sleepTime}ms while waiting for web3Provider to be synced.`)
-    await sleep(sleepTime)
-  }
-  const [goldToken, stableToken] = await Promise.all([
-    kit.contracts.getGoldToken(),
-    kit.contracts.getStableToken(),
-  ])
-  const [goldAmount, stableTokenAmount] = await Promise.all([
-    convertToContractDecimals(minimumEthBalance, goldToken),
-    convertToContractDecimals(minimumEthBalance, stableToken),
-  ])
-  for (let thread = 0; thread < threads; thread++) {
-    const senderIndex = getIndexForLoadTestThread(index, thread)
-    const threadPkey = generatePrivateKey(mnemonic, AccountType.LOAD_TESTING_ACCOUNT, senderIndex)
-    const threadAddress = privateKeyToAddress(threadPkey)
-    console.info(`Funding account ${threadAddress} using ${kit.defaultAccount}`)
-    if ((await goldToken.balanceOf(threadAddress)).lt(goldAmount)) {
-      console.log(`Sending gold to ${threadAddress}`)
-      await goldToken
-        .transfer(threadAddress, goldAmount.toFixed())
-        .send({ from: fundingAddress, chainId: numberToHex(chainId) })
-    } else {
-      console.log(`Account ${threadAddress} already has enough gold`)
-    }
-    if ((await stableToken.balanceOf(threadAddress)).lt(stableTokenAmount)) {
-      console.log(`Sending cusd to ${threadAddress} using ${kit.defaultAccount}`)
-      await stableToken
-        .transfer(threadAddress, stableTokenAmount.toFixed())
-        .send({ from: fundingAddress, chainId: numberToHex(chainId) })
-    } else {
-      console.log(`Account ${threadAddress} already has enough cusd`)
-    }
-  }
-}
-
-/**
- * This method generates the key derivation index for loadtest clients and threads
- *
- * @param pod the pod replica number
- * @param thread the thread number inside the pod
- */
-export function getIndexForLoadTestThread(pod: number, thread: number) {
-  if (thread > MAX_LOADTEST_THREAD_COUNT) {
-    throw new Error(`thread count must be smaller than ${MAX_LOADTEST_THREAD_COUNT}`)
-  }
-  // max number of threads to avoid overlap is [0, MAX_LOADTEST_THREAD_COUNT)
-  return pod * MAX_LOADTEST_THREAD_COUNT + thread
-}
-
-/**
- * This method sends ERC20 tokens
- *
- * @param kit instance of the contract kit
- * @param token the token contract to use
- * @param from sender to send the token from
- * @param to receiver that gets the tokens
- * @param amount the amount of tokens to be sent
- * @param password the password of the account to use
- * @param txParams additional transaction parameters
- * @param onTransactionHash callback, fired when the transaction hash is generated
- * @param onReceipt callback, fired when the receipt is returned
- * @param onError callback, fired in case of an error, containing the error
- */
-export const transferERC20Token = async (
-  kit: ContractKit,
-  token: GoldTokenWrapper | StableTokenWrapper,
-  from: string,
-  to: string,
-  amount: BigNumber,
-  password: string,
-  txParams: any = {},
-  onTransactionHash?: (hash: string) => void,
-  onReceipt?: (receipt: CeloTxReceipt) => void,
-  onError?: (error: any) => void
-) => {
-  txParams.from = from
-  await unlockAccount(kit.connection.web3, 0, password, from)
-
-  const convertedAmount = await convertToContractDecimals(amount, token)
-
-  try {
-    const result = await token.transfer(to, convertedAmount.toString()).send()
-    if (onTransactionHash) {
-      onTransactionHash(await result.getHash())
-    }
-    if (onReceipt) {
-      const receipt = await result.waitReceipt()
-      onReceipt(receipt)
-    }
-  } catch (error) {
-    if (onError) {
-      onError(error)
-    }
-  }
-}
-
-export const runGethNodes = async ({
-  gethConfig,
-  validators,
-  verbose,
-}: {
-  gethConfig: GethRunConfig
-  validators: Validator[]
-  verbose: boolean
-}) => {
-  const gethBinaryPath =
path.join( - (gethConfig.repository && gethConfig.repository.path) || '', - 'build/bin/geth' - ) - - if (!fs.existsSync(gethBinaryPath)) { - console.error(`Geth binary at ${gethBinaryPath} not found!`) - return - } - - if (!gethConfig.keepData && fs.existsSync(gethConfig.runPath)) { - await resetDataDir(gethConfig.runPath, verbose) - } - - if (!fs.existsSync(gethConfig.runPath)) { - // @ts-ignore - fs.mkdirSync(gethConfig.runPath, { recursive: true }) - } - - await writeGenesis(gethConfig, validators, verbose) - - if (verbose) { - const validatorAddresses = validators.map((validator) => validator.address) - console.info('Validators', JSON.stringify(validatorAddresses, null, 2)) - } - - for (const instance of gethConfig.instances) { - await initAndStartGeth(gethConfig, gethBinaryPath, instance, verbose) - } - - await connectValidatorPeers(gethConfig.instances) -} - -function getInstanceDir(runPath: string, instance: GethInstanceConfig) { - return path.join(runPath, instance.name) -} - -function getSnapshotdir(runPath: string, instance: GethInstanceConfig) { - return path.join(getInstanceDir(runPath, instance), 'snapshot') -} - -export function importGenesis(genesisPath: string) { - return JSON.parse(fs.readFileSync(genesisPath).toString()) -} - -export function getLogFilename(runPath: string, instance: GethInstanceConfig) { - return path.join(getDatadir(runPath, instance), 'logs.txt') -} - -function getDatadir(runPath: string, instance: GethInstanceConfig) { - const dir = path.join(getInstanceDir(runPath, instance), 'datadir') - // @ts-ignore - fs.mkdirSync(dir, { recursive: true }) - return dir -} - -/** - * @returns Promise the geth pid number - */ -export async function initAndStartGeth( - gethConfig: GethRunConfig, - gethBinaryPath: string, - instance: GethInstanceConfig, - verbose: boolean -) { - await initGeth(gethConfig, gethBinaryPath, instance, verbose) - return startGeth(gethConfig, gethBinaryPath, instance, verbose) -} - -export async function initGeth( - 
gethConfig: GethRunConfig,
-  gethBinaryPath: string,
-  instance: GethInstanceConfig,
-  verbose: boolean
-) {
-  const datadir = getDatadir(gethConfig.runPath, instance)
-  const genesisPath = path.join(gethConfig.runPath, 'genesis.json')
-  if (verbose) {
-    console.info(`geth:${instance.name}: init datadir ${datadir}`)
-    console.info(`init geth with genesis at ${genesisPath}`)
-  }
-
-  await spawnCmdWithExitOnFailure('rm', ['-rf', datadir], { silent: !verbose })
-  await spawnCmdWithExitOnFailure(gethBinaryPath, ['--datadir', datadir, 'init', genesisPath], {
-    silent: !verbose,
-  })
-  if (instance.privateKey) {
-    await importPrivateKey(gethConfig, gethBinaryPath, instance, verbose)
-  }
-}
-
-export async function importPrivateKey(
-  gethConfig: GethRunConfig,
-  gethBinaryPath: string,
-  instance: GethInstanceConfig,
-  verbose: boolean
-) {
-  const keyFile = path.join(getDatadir(gethConfig.runPath, instance), 'key.txt')
-  if (!instance.privateKey) {
-    throw new Error('Unexpected empty private key')
-  }
-  fs.writeFileSync(keyFile, instance.privateKey, { flag: 'a' })
-
-  if (verbose) {
-    console.info(`geth:${instance.name}: import account`)
-  }
-
-  const args = [
-    'account',
-    'import',
-    '--datadir',
-    getDatadir(gethConfig.runPath, instance),
-    '--password',
-    '/dev/null',
-    keyFile,
-  ]
-
-  if (verbose) {
-    console.info(gethBinaryPath, ...args)
-  }
-
-  await spawnCmdWithExitOnFailure(gethBinaryPath, args, { silent: true })
-}
-
-export async function getEnode(peer: string, ws: boolean = false) {
-  // do we already have an enode?
-  if (peer.toLowerCase().startsWith('enode')) {
-    // yes, return the peer as-is
-    return peer
-  }
-
-  // no, try to build it
-  const p = ws ?
'ws' : 'http' - const enodeRpcUrl = `${p}://localhost:${peer}` - const admin = new Admin(enodeRpcUrl) - - let nodeInfo: any = { - enode: null, - } - - try { - nodeInfo = await admin.getNodeInfo() - } catch { - console.error(`Unable to get node info from ${enodeRpcUrl}`) - } - - return nodeInfo.enode -} - -export async function addStaticPeers(datadir: string, peers: string[], verbose: boolean) { - const staticPeersPath = path.join(datadir, 'static-nodes.json') - if (verbose) { - console.info(`Writing static peers to ${staticPeersPath}`) - } - - const enodes = await Promise.all(peers.map((peer) => getEnode(peer))) - const enodesString = JSON.stringify(enodes, null, 2) - - if (verbose) { - console.info('eNodes', enodesString) - } - - fs.writeFileSync(staticPeersPath, enodesString) -} - -export async function addProxyPeer( - runPath: string, - gethBinaryPath: string, - instance: GethInstanceConfig -) { - if (instance.proxies) { - await spawnCmdWithExitOnFailure(gethBinaryPath, [ - '--datadir', - getDatadir(runPath, instance), - 'attach', - '--exec', - `istanbul.addProxy('${instance.proxies[0]!}', '${instance.proxies[1]!}')`, - ]) - } -} - -export async function startGeth( - gethConfig: GethRunConfig, - gethBinaryPath: string, - instance: GethInstanceConfig, - verbose: boolean -) { - if (verbose) { - console.info('starting geth with config', JSON.stringify(instance, null, 2)) - } else { - console.info(`${instance.name}: starting.`) - } - - const datadir = getDatadir(gethConfig.runPath, instance) - - const { - syncmode, - port, - rpcport, - wsport, - validating, - replica, - validatingGasPrice, - bootnodeEnode, - isProxy, - proxyAllowPrivateIp, - isProxied, - proxyport, - ethstats, - gatewayFee, - } = instance - - const privateKey = instance.privateKey || '' - const lightserv = instance.lightserv || false - const minerValidator = instance.minerValidator - if (instance.validating && !minerValidator) { - throw new Error('miner.validator address from the instance is 
required') - } - const verbosity = gethConfig.verbosity ? gethConfig.verbosity : '3' - - instance.args = [ - '--datadir', - datadir, - '--syncmode', - syncmode, - '--log.debug', - '--metrics', - '--port', - port.toString(), - '--networkid', - gethConfig.networkId.toString(), - `--verbosity=${verbosity}`, - '--consoleoutput=stdout', // Send all logs to stdout - '--consoleformat=term', - '--nat', - 'extip:127.0.0.1', - '--allow-insecure-unlock', // geth1.9 to use http w/unlocking - '--gcmode=archive', // Needed to retrieve historical state - '--rpc.gasinflationrate=1', // InflationRate=1 (no inflation) - ] - - if (minerValidator) { - const txFeeRecipient = instance.txFeeRecipient || minerValidator - instance.args.push('--miner.validator', minerValidator, '--tx-fee-recipient', txFeeRecipient) - } - - if (rpcport) { - instance.args.push( - '--http', - '--http.port', - rpcport.toString(), - '--http.corsdomain=*', - '--http.vhosts=*', - '--http.api=eth,net,web3,debug,admin,personal,txpool,istanbul' - ) - } - - if (wsport) { - instance.args.push( - '--ws', - '--ws.origins=*', - '--ws.port', - wsport.toString(), - '--ws.api=eth,net,web3,debug,admin,personal,txpool,istanbul' - ) - } - - if (lightserv) { - instance.args.push('--light.serve=90') - instance.args.push('--light.maxpeers=10') - } else if (syncmode === 'full' || syncmode === 'fast') { - instance.args.push('--light.serve=0') - } - - if (instance.nodekey) { - instance.args.push(`--nodekeyhex=${instance.nodekey}`) - } else if (!validating || !replica) { - instance.args.push(`--nodekeyhex=${privateKey}`) - } - - if (gatewayFee) { - instance.args.push(`--light.gatewayfee=${gatewayFee.toString()}`) - } - - if (validating) { - instance.args.push('--mine') - - if (validatingGasPrice) { - instance.args.push(`--miner.gasprice=${validatingGasPrice}`) - } - - if (isProxied) { - instance.args.push('--proxy.proxied') - } - if (replica) { - instance.args.push('--istanbul.replica') - } - } else if (isProxy) { - 
instance.args.push('--proxy.proxy') - if (proxyport) { - instance.args.push(`--proxy.internalendpoint=:${proxyport.toString()}`) - } - instance.args.push(`--proxy.proxiedvalidatoraddress=${instance.proxiedValidatorAddress}`) - } - - if (bootnodeEnode) { - instance.args.push(`--bootnodes=${bootnodeEnode}`) - } else { - instance.args.push('--nodiscover') - } - - if (isProxied && instance.proxies) { - if (proxyAllowPrivateIp) { - instance.args.push('--proxy.allowprivateip=true') - } - instance.args.push(`--proxy.proxyenodeurlpairs=${instance.proxies[0]!};${instance.proxies[1]!}`) - } - - if (privateKey || ethstats) { - instance.args.push('--password=/dev/null', `--unlock=0`) - } - - if (ethstats) { - instance.args.push(`--ethstats=${instance.name}@${ethstats}`, '--etherbase=0') - } - - const gethProcess = spawnWithLog(gethBinaryPath, instance.args, `${datadir}/logs.txt`, verbose) - instance.pid = gethProcess.pid - - gethProcess.on('error', (err: Error) => { - throw new Error(`geth:${instance.name} failed to start! 
${err}`) - }) - - gethProcess.on('exit', (code: number) => { - if (code === 0) { - console.info(`geth:${instance.name} exited`) - } else { - console.error(`geth:${instance.name} exited with code ${code}`) - } - instance.pid = undefined - }) - - // Give some time for geth to come up - const secondsToWait = 30 - if (rpcport) { - const isOpen = await waitForPortOpen('localhost', rpcport, secondsToWait) - if (!isOpen) { - console.error( - `geth:${instance.name}: jsonRPC port ${rpcport} didn't open after ${secondsToWait} seconds` - ) - process.exit(1) - } else if (verbose) { - console.info(`geth:${instance.name}: jsonRPC port open ${rpcport}`) - } - } - - if (wsport) { - const isOpen = await waitForPortOpen('localhost', wsport, secondsToWait) - if (!isOpen) { - console.error( - `geth:${instance.name}: ws port ${wsport} didn't open after ${secondsToWait} seconds` - ) - process.exit(1) - } else if (verbose) { - console.info(`geth:${instance.name}: ws port open ${wsport}`) - } - } - - // Geth startup isn't fully done even when the port is open, so poll until it responds - const maxTries = 5 - let tries = 0 - let block = null - while (tries < maxTries) { - tries++ - try { - block = await new Web3('http://localhost:8545').eth.getBlock('latest') - } catch (e) { - console.info(`Failed to fetch test block: ${e}`) - } - if (block) { - break - } - console.info('Could not fetch test block. Wait one second, then retry.') - await sleep(1000) - } - if (!block) { - throw new Error(`Geth did not respond within ${maxTries} seconds`) - } - - console.info( - `${instance.name}: running.`, - rpcport ? `RPC: ${rpcport}` : '', - wsport ? `WS: ${wsport}` : '', - proxyport ?
`PROXY: ${proxyport}` : '' - ) - - return instance -} - -export function writeGenesis(gethConfig: GethRunConfig, validators: Validator[], verbose: boolean) { - const genesis: string = generateGenesis({ - validators, - blockTime: 1, - epoch: 10, - lookbackwindow: 3, - requestTimeout: 3000, - chainId: gethConfig.networkId, - ...gethConfig.genesisConfig, - }) - - const genesisPath = path.join(gethConfig.runPath, 'genesis.json') - - if (verbose) { - console.info('writing genesis') - } - - fs.writeFileSync(genesisPath, genesis) - - if (verbose) { - console.info(`wrote genesis to ${genesisPath}`) - } -} - -export async function writeGenesisWithMigrations( - gethConfig: GethRunConfig, - gethRepoPath: string, - mnemonic: string, - numValidators: number, - verbose: boolean = false -) { - const genesis: string = await generateGenesisWithMigrations({ - gethRepoPath, - mnemonic, - numValidators, - verbose, - genesisConfig: { - blockTime: 1, - epoch: 10, - lookbackwindow: 3, - requestTimeout: 3000, - chainId: gethConfig.networkId, - ...gethConfig.genesisConfig, - }, - }) - - const genesisPath = path.join(gethConfig.runPath, 'genesis.json') - - if (verbose) { - console.info('writing genesis') - } - - fs.writeFileSync(genesisPath, genesis) - - if (verbose) { - console.info(`wrote genesis to ${genesisPath}`) - } -} - -export async function snapshotDatadir( - runPath: string, - instance: GethInstanceConfig, - verbose: boolean -) { - if (verbose) { - console.info('snapshotting data dir') - } - - // Sometimes the socket is still present, preventing us from snapshotting. 
- await spawnCmd('rm', [`${getDatadir(runPath, instance)}/geth.ipc`], { silent: true }) - await spawnCmdWithExitOnFailure('cp', [ - '-r', - getDatadir(runPath, instance), - getSnapshotdir(runPath, instance), - ]) -} - -export async function restoreDatadir(runPath: string, instance: GethInstanceConfig) { - const datadir = getDatadir(runPath, instance) - const snapshotdir = getSnapshotdir(runPath, instance) - - console.info(`geth:${instance.name}: restore datadir: ${datadir}`) - - await spawnCmdWithExitOnFailure('rm', ['-rf', datadir], { silent: true }) - await spawnCmdWithExitOnFailure('cp', ['-r', snapshotdir, datadir], { silent: true }) -} - -export async function buildGeth(gethPath: string) { - await spawnCmdWithExitOnFailure('make', ['geth'], { cwd: gethPath }) -} - -export async function buildGethAll(gethPath: string) { - await spawnCmdWithExitOnFailure('make', ['all'], { cwd: gethPath }) -} - -export async function resetDataDir(dataDir: string, verbose: boolean) { - await spawnCmd('rm', ['-rf', dataDir], { silent: !verbose }) - await spawnCmd('mkdir', [dataDir], { silent: !verbose }) -} - -export async function checkoutGethRepo(branch: string, gethPath: string) { - await spawnCmdWithExitOnFailure('rm', ['-rf', gethPath]) - await spawnCmdWithExitOnFailure('git', [ - 'clone', - '--depth', - '1', - 'https://github.com/celo-org/celo-blockchain.git', - gethPath, - '-b', - branch, - ]) - await spawnCmdWithExitOnFailure('git', ['checkout', branch], { cwd: gethPath }) -} - -export function spawnWithLog(cmd: string, args: string[], logsFilepath: string, verbose: boolean) { - try { - fs.unlinkSync(logsFilepath) - } catch (error) { - // nothing to do - } - - const logStream = fs.createWriteStream(logsFilepath, { flags: 'a' }) - - if (verbose) { - console.info(cmd, ...args) - } - - const p = spawn(cmd, args) - - p.stdout.pipe(logStream) - p.stderr.pipe(logStream) - - if (verbose) { - p.stdout.pipe(process.stdout) - p.stderr.pipe(process.stderr) - } - - return p -} - -// 
Create a fully connected clique of peer connections with the given instances. -export async function connectPeers(instances: GethInstanceConfig[], verbose: boolean = false) { - await connectBipartiteClique(instances, instances, verbose) -} - -// Fully connect all peers in the "left" set to all peers in the "right" set, forming a bipartite clique. -export async function connectBipartiteClique( - left: GethInstanceConfig[], - right: GethInstanceConfig[], - verbose: boolean = false -) { - const admins = (instances: GethInstanceConfig[]) => - instances.map( - ({ wsport, rpcport }) => - new Admin(`${rpcport ? 'http' : 'ws'}://localhost:${rpcport || wsport}`) - ) - - const connect = async (sources: GethInstanceConfig[], targets: GethInstanceConfig[]) => { - const targetEnodes = await Promise.all( - admins(targets).map(async (a) => (await a.getNodeInfo()).enode) - ) - - await Promise.all( - admins(sources).map(async (admin) => { - const sourceEnode = (await admin.getNodeInfo()).enode - await Promise.all( - targetEnodes.map(async (enode) => { - if (sourceEnode === enode) { - return - } - if (verbose) { - console.info(`connecting ${sourceEnode} with ${enode}`) - } - const success = await admin.addPeer(enode) - if (!success) { - throw new Error('Connecting geth peers failed!') - } - }) - ) - }) - ) - } - - await connect(left, right) - await connect(right, left) -} - -// Add validator 0 as a peer of each other validator. -export async function connectValidatorPeers(instances: GethInstanceConfig[]) { - const validators = instances.filter( - (node) => (node.validating && !node.isProxied) || node.isProxy - ) - // Determine which validators are isolated (currently this just means they are not using a bootnode) - const isolated = validators.filter((node) => !node.bootnodeEnode) - if (isolated.length <= 0) { - return - } - - // Determine the root node to connect other validators to. It should be able to join the whole network of validators.
- const root = validators.find((node) => node.bootnodeEnode) ?? validators[0] - await connectBipartiteClique([root], isolated) -} - -export async function migrateContracts( - monorepoRoot: string, - validatorPrivateKeys: string[], - attestationKeys: string[], - validators: string[], - to: number = 1000, - overrides: any = {}, - verbose: boolean = true -) { - const migrationOverrides = merge( - { - stableToken: { - initialBalances: { - addresses: validators.map(ensure0x), - values: validators.map(() => '10000000000000000000000'), - }, - oracles: validators.map(ensure0x), - }, - validators: { - validatorKeys: validatorPrivateKeys.map(ensure0x), - attestationKeys: attestationKeys.map(ensure0x), - }, - blockchainParameters: { - uptimeLookbackWindow: 3, // same as our default in `writeGenesis()` - }, - }, - overrides - ) - - const args = [ - '--cwd', - `${monorepoRoot}/packages/protocol`, - 'init-network', - '-n', - 'testing', - '-m', - JSON.stringify(migrationOverrides), - '-t', - to.toString(), - ] - - await spawnCmdWithExitOnFailure('yarn', args, { silent: !verbose }) -} diff --git a/packages/celotool/src/lib/helm_deploy.ts b/packages/celotool/src/lib/helm_deploy.ts deleted file mode 100644 index ac5975c0964..00000000000 --- a/packages/celotool/src/lib/helm_deploy.ts +++ /dev/null @@ -1,1408 +0,0 @@ -import { concurrentMap } from '@celo/utils/lib/async' -import compareVersions from 'compare-versions' -import fs from 'fs' -import { entries, range } from 'lodash' -import os from 'os' -import path from 'path' -import sleep from 'sleep-promise' -import { GCPClusterConfig } from 'src/lib/k8s-cluster/gcp' -import stringHash from 'string-hash' -import { getKubernetesClusterRegion, switchToClusterFromEnv } from './cluster' -import { - execCmd, - execCmdWithExitOnFailure, - outputIncludes, - spawnCmd, - spawnCmdWithExitOnFailure, -} from './cmd-utils' -import { envTypes, envVar, fetchEnv, fetchEnvOrFallback, monorepoRoot } from './env-utils' -import { 
ensureAuthenticatedGcloudAccount } from './gcloud_utils' -import { generateGenesisFromEnv } from './generate_utils' -import { - buildGethAll, - checkoutGethRepo, - getEnodesWithExternalIPAddresses, - retrieveBootnodeIPAddress, -} from './geth' -import { BaseClusterConfig, CloudProvider } from './k8s-cluster/base' -import { getStatefulSetReplicas, scaleResource } from './kubernetes' -import { installPrometheusIfNotExists } from './prometheus' -import { - getGenesisBlockFromGoogleStorage, - getProxiesPerValidator, - getProxyName, - uploadGenesisBlockToGoogleStorage, -} from './testnet-utils' -import { stringToBoolean } from './utils' - -const generator = require('generate-password') - -const CLOUDSQL_SECRET_NAME = 'blockscout-cloudsql-credentials' -const BACKUP_GCS_SECRET_NAME = 'backup-blockchain-credentials' -const TIMEOUT_FOR_LOAD_BALANCER_POLL = 1000 * 60 * 25 // 25 minutes -const LOAD_BALANCER_POLL_INTERVAL = 1000 * 10 // 10 seconds - -const TESTNET_CHART_DIR = '../helm-charts/testnet' -export type HelmAction = 'install' | 'upgrade' - -async function validateExistingCloudSQLInstance(instanceName: string) { - await ensureAuthenticatedGcloudAccount() - try { - await execCmd(`gcloud sql instances describe ${instanceName}`) - } catch (error) { - console.error(`Cloud SQL DB ${instanceName} does not exist, bailing`) - console.error(error) - process.exit(1) - } -} - -async function failIfSecretMissing(secretName: string, namespace: string) { - try { - await execCmd(`kubectl get secret ${secretName} --namespace ${namespace}`) - } catch (error) { - console.error( - `Couldn't retrieve service account secret, cluster is likely not setup correctly for deployment` - ) - console.error(error) - process.exit(1) - } -} - -async function copySecret(secretName: string, srcNamespace: string, destNamespace: string) { - console.info(`Copying secret ${secretName} from namespace ${srcNamespace} to ${destNamespace}`) - await execCmdWithExitOnFailure(`kubectl get secret ${secretName} 
--namespace ${srcNamespace} -o yaml |\ - grep -v creationTimestamp | grep -v resourceVersion | grep -v selfLink | grep -v uid |\ - sed 's/default/${destNamespace}/' | kubectl apply --namespace=${destNamespace} -f -`) -} - -export async function createCloudSQLInstance(celoEnv: string, instanceName: string) { - await ensureAuthenticatedGcloudAccount() - console.info('Creating Cloud SQL database, this might take a minute or two ...') - - await failIfSecretMissing(CLOUDSQL_SECRET_NAME, 'default') - - try { - await execCmd(`gcloud sql instances describe ${instanceName}`) - // if we get to here, that means the instance already exists - console.warn( - `A Cloud SQL instance named ${instanceName} already exists, so in all likelihood you cannot deploy initial with ${instanceName}` - ) - } catch (error: any) { - if ( - error.message.trim() !== - `Command failed: gcloud sql instances describe ${instanceName}\nERROR: (gcloud.sql.instances.describe) HTTPError 404: The Cloud SQL instance does not exist.` - ) { - console.error(error.message.trim()) - process.exit(1) - } - } - - // Quite often these commands timeout, but actually succeed anyway. By ignoring errors we allow them to be re-run. 
- - try { - await execCmd( - `gcloud sql instances create ${instanceName} --zone ${fetchEnv( - envVar.KUBERNETES_CLUSTER_ZONE - )} --database-version POSTGRES_9_6 --cpu 1 --memory 4G` - ) - } catch (error: any) { - console.error(error.message.trim()) - } - - const envType = fetchEnv(envVar.ENV_TYPE) - if (envType !== envTypes.DEVELOPMENT) { - try { - await execCmdWithExitOnFailure( - `gcloud sql instances create ${instanceName}-replica --master-instance-name=${instanceName} --zone ${fetchEnv( - envVar.KUBERNETES_CLUSTER_ZONE - )}` - ) - } catch (error: any) { - console.error(error.message.trim()) - } - } - - await execCmdWithExitOnFailure( - `gcloud sql instances patch ${instanceName} --backup-start-time 17:00` - ) - - const passwordOptions = { - length: 22, - numbers: true, - symbols: false, - lowercase: true, - uppercase: true, - strict: true, - } - - const blockscoutDBUsername = generator.generate(passwordOptions) - const blockscoutDBPassword = generator.generate(passwordOptions) - - console.info('Creating SQL user') - await execCmdWithExitOnFailure( - `gcloud sql users create ${blockscoutDBUsername} -i ${instanceName} --password ${blockscoutDBPassword}` - ) - - console.info('Creating blockscout database') - await execCmdWithExitOnFailure(`gcloud sql databases create blockscout -i ${instanceName}`) - - console.info('Copying blockscout service account secret to namespace') - await copySecret(CLOUDSQL_SECRET_NAME, 'default', celoEnv) - - const [blockscoutDBConnectionName] = await execCmdWithExitOnFailure( - `gcloud sql instances describe ${instanceName} --format="value(connectionName)"` - ) - - return [blockscoutDBUsername, blockscoutDBPassword, blockscoutDBConnectionName.trim()] -} - -export async function cloneCloudSQLInstance( - celoEnv: string, - instanceName: string, - cloneInstanceName: string, - dbSuffix: string -) { - await ensureAuthenticatedGcloudAccount() - console.info('Cloning Cloud SQL database, this might take a minute or two ...') - - await 
failIfSecretMissing(CLOUDSQL_SECRET_NAME, 'default') - - try { - await execCmd(`gcloud sql instances describe ${cloneInstanceName}`) - // if we get to here, that means the instance already exists - console.warn( - `A Cloud SQL instance named ${cloneInstanceName} already exists, so in all likelihood you cannot deploy cloning with ${cloneInstanceName}` - ) - } catch (error: any) { - if ( - error.message.trim() !== - `Command failed: gcloud sql instances describe ${cloneInstanceName}\nERROR: (gcloud.sql.instances.describe) HTTPError 404: The Cloud SQL instance does not exist.` - ) { - console.error(error.message.trim()) - process.exit(1) - } - } - - try { - await execCmdWithExitOnFailure( - `gcloud sql instances clone ${instanceName} ${cloneInstanceName} ` - ) - } catch (error: any) { - console.error(error.message.trim()) - } - - await execCmdWithExitOnFailure( - `gcloud sql instances patch ${cloneInstanceName} --backup-start-time 17:00` - ) - - const [blockscoutDBUsername, blockscoutDBPassword] = await retrieveCloudSQLConnectionInfo( - celoEnv, - instanceName, - dbSuffix - ) - - console.info('Copying blockscout service account secret to namespace') - await copySecret(CLOUDSQL_SECRET_NAME, 'default', celoEnv) - - const [blockscoutDBConnectionName] = await execCmdWithExitOnFailure( - `gcloud sql instances describe ${cloneInstanceName} --format="value(connectionName)"` - ) - - return [blockscoutDBUsername, blockscoutDBPassword, blockscoutDBConnectionName.trim()] -} - -export async function createSecretInSecretManagerIfNotExists( - secretId: string, - secretLabels: string[], - secretValue: string -) { - try { - await execCmd(`gcloud secrets describe ${secretId}`) - - console.info(`Secret ${secretId} already exists, skipping creation...`) - } catch (error) { - await execCmd( - `echo -n "${secretValue}" | gcloud secrets create ${secretId} --data-file=- --replication-policy="automatic" --labels ${secretLabels.join( - ',' - )}` - ) - } -} - -export async function 
deleteSecretFromSecretManager(secretId: string) { - try { - await execCmd(`gcloud secrets delete ${secretId}`) - } catch { - console.info(`Couldn't delete secret ${secretId} -- skipping`) - } -} - -async function createAndUploadKubernetesSecretIfNotExists( - secretName: string, - serviceAccountName: string, - celoEnv: string -) { - await switchToClusterFromEnv(celoEnv) - const keyfilePath = `/tmp/${serviceAccountName}_key.json` - const secretExists = await outputIncludes( - `kubectl get secrets`, - secretName, - `secret exists, skipping creation: ${secretName}` - ) - if (!secretExists) { - console.info(`Creating secret ${secretName}`) - await execCmdWithExitOnFailure( - `gcloud iam service-accounts keys create ${keyfilePath} --iam-account ${serviceAccountName}@${fetchEnv( - envVar.TESTNET_PROJECT_NAME - )}.iam.gserviceaccount.com` - ) - await execCmdWithExitOnFailure( - `kubectl create secret generic ${secretName} --from-file=credentials.json=${keyfilePath}` - ) - } -} - -export async function createAndUploadCloudSQLSecretIfNotExists( - serviceAccountName: string, - celoEnv: string -) { - return createAndUploadKubernetesSecretIfNotExists( - CLOUDSQL_SECRET_NAME, - serviceAccountName, - celoEnv - ) -} - -export async function createAndUploadBackupSecretIfNotExists( - serviceAccountName: string, - celoEnv: string -) { - return createAndUploadKubernetesSecretIfNotExists( - BACKUP_GCS_SECRET_NAME, - serviceAccountName, - celoEnv - ) -} - -export function getServiceAccountName(prefix: string) { - // NOTE: trim to meet the max size requirements of service account names - return `${prefix}-${fetchEnv(envVar.KUBERNETES_CLUSTER_NAME)}`.slice(0, 30) -} - -export async function installGCPSSDStorageClass() { - // A previous version installed this directly with `kubectl` instead of helm. - // To be backward compatible, we don't install the chart if the storage class - // already exists. 
- const storageClassExists = await outputIncludes( - `kubectl get storageclass`, - `ssd`, - `SSD StorageClass exists, skipping install` - ) - if (!storageClassExists) { - const gcpSSDHelmChartPath = '../helm-charts/gcp-ssd' - await execCmdWithExitOnFailure(`helm upgrade -i gcp-ssd ${gcpSSDHelmChartPath}`) - } -} - -export async function installCertManagerAndNginx( - celoEnv: string, - clusterConfig?: BaseClusterConfig -) { - const nginxChartVersion = '4.2.1' - const nginxChartNamespace = 'default' - - // Check whether cert-manager is installed in any namespace, - // because cert-manager CRDs are cluster-wide and different - // CRD versions cannot coexist in the same cluster - const certManagerExists = - (await outputIncludes(`helm list -n default`, `cert-manager-cluster-issuers`)) || - (await outputIncludes(`helm list -n cert-manager`, `cert-manager-cluster-issuers`)) - - if (certManagerExists) { - console.info('cert-manager-cluster-issuers exists, skipping install') - } else { - await installCertManager() - } - - const nginxIngressReleaseExists = await outputIncludes( - `helm list -n default`, - `nginx-ingress-release`, - `nginx-ingress-release exists, skipping install` - ) - if (!nginxIngressReleaseExists) { - const valueFilePath = `/tmp/${celoEnv}-nginx-testnet-values.yaml` - await nginxHelmParameters(valueFilePath, celoEnv, clusterConfig) - - await helmAddAndUpdateRepos() - await execCmdWithExitOnFailure(`helm install \ - -n ${nginxChartNamespace} \ - --version ${nginxChartVersion} \ - nginx-ingress-release ingress-nginx/ingress-nginx \ - -f ${valueFilePath} - `) - } -} - -async function nginxHelmParameters( - valueFilePath: string, - celoEnv: string, - clusterConfig?: BaseClusterConfig -) { - const logFormat = `{"timestamp": "$time_iso8601", "requestID": "$req_id", "proxyUpstreamName": - "$proxy_upstream_name", "proxyAlternativeUpstreamName": "$proxy_alternative_upstream_name","upstreamStatus": - "$upstream_status", "upstreamAddr":
"$upstream_addr","httpRequest":{"requestMethod": - "$request_method", "requestUrl": "$host$request_uri", "status": $status,"requestSize": - "$request_length", "responseSize": "$upstream_response_length", "userAgent": - "$http_user_agent", "remoteIp": "$remote_addr", "referer": "$http_referer", - "latency": "$upstream_response_time s", "protocol":"$server_protocol"}}` - - let loadBalancerIP = '' - if (clusterConfig == null || clusterConfig?.cloudProvider === CloudProvider.GCP) { - loadBalancerIP = await getOrCreateNginxStaticIp(celoEnv, clusterConfig) - } - - const valueFileContent = ` -controller: - autoscaling: - enabled: "true" - minReplicas: 1 - maxReplicas: 10 - targetCPUUtilizationPercentage: 80 - targetMemoryUtilizationPercentage: 80 - config: - log-format-escape-json: "true" - log-format-upstream: '${logFormat}' - metrics: - enabled: "true" - service: - annotations: - prometheus.io/scrape: "true" - prometheus.io/port: "10254" - service: - loadBalancerIP: ${loadBalancerIP} - resources: - requests: - cpu: 300m - memory: 600Mi -` - fs.writeFileSync(valueFilePath, valueFileContent) -} - -async function getOrCreateNginxStaticIp(celoEnv: string, clusterConfig?: BaseClusterConfig) { - const staticIpName = clusterConfig?.clusterName - ? `${clusterConfig?.clusterName}-nginx` - : `${celoEnv}-nginx` - let staticIpAddress - if (clusterConfig !== undefined && clusterConfig.hasOwnProperty('zone')) { - const zone = (clusterConfig as GCPClusterConfig).zone - await registerIPAddress(staticIpName, zone) - staticIpAddress = await retrieveIPAddress(staticIpName, zone) - } else { - await registerIPAddress(staticIpName) - staticIpAddress = await retrieveIPAddress(staticIpName) - } - console.info(`nginx-ingress static ip --> ${staticIpName}: ${staticIpAddress}`) - return staticIpAddress -} - -// Add a Helm repository and updates the local cache. If repository already exists, it is executed -// without error. 
-export async function helmAddRepoAndUpdate(repository: string, name?: string) { - if (name === undefined) { - const repoArray = repository.split('/') - name = repoArray[repoArray.length - 1] - } - console.info(`Adding Helm repository ${name} with URL ${repository}`) - await execCmdWithExitOnFailure(`helm repo add ${name} ${repository}`) - await execCmdWithExitOnFailure(`helm repo update`) -} - -// Add common helm repositories -export async function helmAddAndUpdateRepos() { - await helmAddRepoAndUpdate('https://kubernetes.github.io/ingress-nginx') - await helmAddRepoAndUpdate('https://charts.helm.sh/stable') - await execCmdWithExitOnFailure(`helm repo update`) -} - -export async function installCertManager() { - const clusterIssuersHelmChartPath = `../helm-charts/cert-manager-cluster-issuers` - - console.info('Create the namespace for cert-manager') - await execCmdWithExitOnFailure(`kubectl create namespace cert-manager`) - - console.info('Installing cert-manager CustomResourceDefinitions') - await execCmdWithExitOnFailure( - `kubectl apply -f https://github.com/jetstack/cert-manager/releases/download/v1.9.1/cert-manager.crds.yaml` - ) - console.info('Updating cert-manager-cluster-issuers chart dependencies') - await execCmdWithExitOnFailure(`helm dependency update ${clusterIssuersHelmChartPath}`) - console.info('Installing cert-manager-cluster-issuers') - await execCmdWithExitOnFailure( - `helm install cert-manager-cluster-issuers ${clusterIssuersHelmChartPath} -n cert-manager` - ) -} - -export async function installAndEnableMetricsDeps( - installPrometheus: boolean, - context?: string, - clusterConfig?: BaseClusterConfig -) { - const kubeStateMetricsReleaseExists = await outputIncludes( - `helm list -n default`, - `kube-state-metrics`, - `kube-state-metrics exists, skipping install` - ) - if (!kubeStateMetricsReleaseExists) { - await execCmdWithExitOnFailure( - `helm install kube-state-metrics stable/kube-state-metrics --set rbac.create=true -n default` - ) - } 
- if (installPrometheus) { - await installPrometheusIfNotExists(context, clusterConfig) - } -} - -export async function grantRoles(serviceAccountName: string, role: string) { - const projectName = fetchEnv(envVar.TESTNET_PROJECT_NAME) - - const serviceAccountFullName = `${serviceAccountName}@${projectName}.iam.gserviceaccount.com` - const commandRolesAlreadyGranted = `gcloud projects get-iam-policy ${projectName} \ - --flatten="bindings[].members" \ - --format='table(bindings.role)' \ - --filter="bindings.members:serviceAccount:${serviceAccountFullName}"` - const rolesAlreadyGranted = await outputIncludes( - commandRolesAlreadyGranted, - role, - `Role ${role} already granted for account ${serviceAccountFullName}, skipping binding` - ) - if (!rolesAlreadyGranted) { - const cmd = - `gcloud projects add-iam-policy-binding ${projectName} ` + - `--role=${role} ` + - `--member=serviceAccount:${serviceAccountFullName}` - await execCmd(cmd) - } - return -} - -export async function retrieveCloudSQLConnectionInfo( - celoEnv: string, - instanceName: string, - dbSuffix: string -) { - await validateExistingCloudSQLInstance(instanceName) - const secretName = `${celoEnv}-blockscout${dbSuffix}` - const [blockscoutDBUsername] = await execCmdWithExitOnFailure( - `kubectl get secret ${secretName} -o jsonpath='{.data.DATABASE_USER}' -n ${celoEnv} | base64 --decode` - ) - const [blockscoutDBPassword] = await execCmdWithExitOnFailure( - `kubectl get secret ${secretName} -o jsonpath='{.data.DATABASE_PASSWORD}' -n ${celoEnv} | base64 --decode` - ) - const [blockscoutDBConnectionName] = await execCmdWithExitOnFailure( - `gcloud sql instances describe ${instanceName} --format="value(connectionName)"` - ) - - return [blockscoutDBUsername, blockscoutDBPassword, blockscoutDBConnectionName.trim()] -} - -export async function deleteCloudSQLInstance( - instanceName: string -): Promise<[string, string, string]> { - console.info(`Deleting Cloud SQL instance ${instanceName}, this might take a minute 
or two ...`) - try { - await execCmd(`gcloud sql instances delete ${instanceName} --quiet`) - } catch { - console.info(`Couldn't delete Cloud SQL instance ${instanceName} -- skipping`) - } - return ['', '', ''] -} - -export async function resetCloudSQLInstance(instanceName: string) { - await validateExistingCloudSQLInstance(instanceName) - - console.info('Deleting blockscout database from instance') - await execCmdWithExitOnFailure( - `gcloud sql databases delete blockscout -i ${instanceName} --quiet` - ) - - console.info('Creating blockscout database') - await execCmdWithExitOnFailure(`gcloud sql databases create blockscout -i ${instanceName}`) -} - -export async function registerIPAddress(name: string, zone?: string) { - console.info(`Registering IP address ${name}`) - try { - await execCmd( - `gcloud compute addresses create ${name} --region ${getKubernetesClusterRegion(zone)}` - ) - } catch (error: any) { - if (!error.toString().includes('already exists')) { - console.error(error) - process.exit(1) - } - } -} - -export async function deleteIPAddress(name: string, zone?: string) { - console.info(`Deleting IP address ${name}`) - try { - if (isCelotoolVerbose()) { - console.info(`IP Address ${name} would be deleted`) - } else { - await execCmd( - `gcloud compute addresses delete ${name} --region ${getKubernetesClusterRegion(zone)} -q` - ) - } - } catch (error: any) { - if (!error.toString().includes('was not found')) { - console.error(error) - process.exit(1) - } - } -} - -export async function retrieveIPAddress(name: string, zone?: string) { - const regionFlag = zone === 'global' ? 
'--global' : `--region ${getKubernetesClusterRegion(zone)}` - const [address] = await execCmdWithExitOnFailure( - `gcloud compute addresses describe ${name} ${regionFlag} --format="value(address)"` - ) - return address.replace(/\n*$/, '') -} - -export async function retrieveIPAddresses(prefix: string, zone?: string) { - const [address] = await execCmdWithExitOnFailure( - `gcloud compute addresses list --filter="name~'${prefix}-' AND name!~'${prefix}-private-' AND region:( ${getKubernetesClusterRegion( - zone - )} )" --format="value(name)"` - ) - return address.split('\n') -} - -// returns the IP address of a resource internal to the cluster (ie 10.X.X.X) -export async function retrieveClusterIPAddress( - resourceType: string, - resourceName: string, - namespace: string -) { - const [address] = await execCmdWithExitOnFailure( - `kubectl get ${resourceType} ${resourceName} -n ${namespace} -o jsonpath={.spec.clusterIP}` - ) - return address -} - -export async function createStaticIPs(celoEnv: string) { - console.info(`Creating static IPs for ${celoEnv}`) - - const numTxNodes = parseInt(fetchEnv(envVar.TX_NODES), 10) - await concurrentMap(5, range(numTxNodes), (i) => registerIPAddress(`${celoEnv}-tx-nodes-${i}`)) - - if (useStaticIPsForGethNodes()) { - await registerIPAddress(`${celoEnv}-bootnode`) - - const validatorCount = parseInt(fetchEnv(envVar.VALIDATORS), 10) - const proxiesPerValidator = getProxiesPerValidator() - // only create IPs for validators that are not proxied - for (let i = 0; i < validatorCount; i++) { - if (proxiesPerValidator[i] === 0) { - await registerIPAddress(`${celoEnv}-validators-${i}`) - } - } - - // and create IPs for all the proxies - let validatorIndex = 0 - for (const proxyCount of proxiesPerValidator) { - for (let i = 0; i < proxyCount; i++) { - await registerIPAddress(getProxyName(celoEnv, validatorIndex, i)) - } - validatorIndex++ - } - - // Create IPs for the private tx nodes - const numPrivateTxNodes = 
parseInt(fetchEnv(envVar.PRIVATE_TX_NODES), 10) - await concurrentMap(5, range(numPrivateTxNodes), (i) => - registerIPAddress(`${celoEnv}-tx-nodes-private-${i}`) - ) - } -} - -export async function upgradeStaticIPs(celoEnv: string) { - const newTxNodeCount = parseInt(fetchEnv(envVar.TX_NODES), 10) - await upgradeNodeTypeStaticIPs(celoEnv, 'tx-nodes', newTxNodeCount) - - if (useStaticIPsForGethNodes()) { - const prevValidatorNodeCount = await getStatefulSetReplicas(celoEnv, `${celoEnv}-validators`) - const newValidatorNodeCount = parseInt(fetchEnv(envVar.VALIDATORS), 10) - await upgradeValidatorStaticIPs(celoEnv, prevValidatorNodeCount, newValidatorNodeCount) - - const proxiesPerValidator = getProxiesPerValidator() - // Iterate through all validators and check to see if there are changes in proxies - const higherValidatorCount = Math.max(prevValidatorNodeCount, newValidatorNodeCount) - for (let i = 0; i < higherValidatorCount; i++) { - const proxyCount = proxiesPerValidator[i] - await upgradeNodeTypeStaticIPs(celoEnv, `validators-${i}-proxy`, proxyCount) - } - - const newPrivateTxNodeCount = parseInt(fetchEnv(envVar.PRIVATE_TX_NODES), 10) - await upgradeNodeTypeStaticIPs(celoEnv, 'tx-nodes-private', newPrivateTxNodeCount) - } -} - -async function upgradeValidatorStaticIPs( - celoEnv: string, - prevValidatorNodeCount: number, - newValidatorNodeCount: number -) { - const proxiesPerValidator = getProxiesPerValidator() - - // Iterate through each validator and create or destroy - // IP addresses as necessary. If a validator has one or more proxies, - // it does not have a static IP. If the validator has - // no proxy, then it needs a static IP.
- const higherValidatorCount = Math.max(prevValidatorNodeCount, newValidatorNodeCount) - for (let i = 0; i < higherValidatorCount; i++) { - const ipName = `${celoEnv}-validators-${i}` - let ipExists - try { - await retrieveIPAddress(ipName) - ipExists = true - } catch (e) { - ipExists = false - } - const proxiedValidator = proxiesPerValidator[i] > 0 - if (ipExists && proxiedValidator) { - await deleteIPAddress(ipName) - } else if (!ipExists && !proxiedValidator) { - await registerIPAddress(ipName) - } - } -} - -async function upgradeNodeTypeStaticIPs(celoEnv: string, nodeType: string, newNodeCount: number) { - const existingAddresses = await retrieveIPAddresses(`${celoEnv}-${nodeType}`) - const desiredAddresses = range(0, newNodeCount).map((i) => `${celoEnv}-${nodeType}-${i}`) - const addressesToCreate = desiredAddresses.filter((a) => !existingAddresses.includes(a)) - const addressesToDelete = existingAddresses.filter((a) => !desiredAddresses.includes(a)) - - for (const address of addressesToCreate) { - if (address) { - await registerIPAddress(address) - } - } - - for (const address of addressesToDelete) { - if (address) { - await deleteIPAddress(address) - } - } -} - -export async function pollForBootnodeLoadBalancer(celoEnv: string) { - if (!useStaticIPsForGethNodes()) { - return - } - console.info(`Poll for bootnode load balancer`) - let totalTime = 0 - - while (true) { - const [rules] = await execCmdWithExitOnFailure( - `gcloud compute addresses describe ${celoEnv}-bootnode --region ${getKubernetesClusterRegion()} --format="value(users.len())"` - ) - - if (parseInt(rules, 10) > 0) { - break - } - - totalTime += LOAD_BALANCER_POLL_INTERVAL - if (totalTime > TIMEOUT_FOR_LOAD_BALANCER_POLL) { - console.error( - `\nCould not detect the bootnode's load balancer provisioning, which will likely leave peers on the network unable to connect` - ) - process.exit(1) - } - - process.stdout.write('.') - await sleep(LOAD_BALANCER_POLL_INTERVAL) - } - -
console.info('Sleeping 1 minute...') - await sleep(1000 * 60) // 1 minute - - console.info(`\nReset all pods now that the bootnode load balancer has provisioned`) - await execCmdWithExitOnFailure(`kubectl delete pod -n ${celoEnv} --selector=component=validators`) - await execCmdWithExitOnFailure(`kubectl delete pod -n ${celoEnv} --selector=component=tx_nodes`) - await execCmdWithExitOnFailure(`kubectl delete pod -n ${celoEnv} --selector=component=proxy`) - return -} - -export async function deleteStaticIPs(celoEnv: string) { - console.info(`Deleting static IPs for ${celoEnv}`) - - const numTxNodes = parseInt(fetchEnv(envVar.TX_NODES), 10) - await concurrentMap(5, range(numTxNodes), (i) => deleteIPAddress(`${celoEnv}-tx-nodes-${i}`)) - - await deleteIPAddress(`${celoEnv}-bootnode`) - - const numValidators = parseInt(fetchEnv(envVar.VALIDATORS), 10) - await concurrentMap(5, range(numValidators), (i) => deleteIPAddress(`${celoEnv}-validators-${i}`)) - - const proxiesPerValidator = getProxiesPerValidator() - for (let valIndex = 0; valIndex < numValidators; valIndex++) { - for (let proxyIndex = 0; proxyIndex < proxiesPerValidator[valIndex]; proxyIndex++) { - await deleteIPAddress(getProxyName(celoEnv, valIndex, proxyIndex)) - } - } - - const numPrivateTxNodes = parseInt(fetchEnv(envVar.PRIVATE_TX_NODES), 10) - await concurrentMap(5, range(numPrivateTxNodes), (i) => - deleteIPAddress(`${celoEnv}-tx-nodes-private-${i}`) - ) -} - -export async function deletePersistentVolumeClaims(celoEnv: string, componentLabels: string[]) { - for (const component of componentLabels) { - await deletePersistentVolumeClaimsCustomLabels(celoEnv, 'component', component) - } -} - -export async function deletePersistentVolumeClaimsCustomLabels( - namespace: string, - label: string, - value: string -) { - console.info( - `Deleting persistent volume claims for labels ${label}=${value} in namespace ${namespace}` - ) - try { - const [output] = await execCmd( - `kubectl delete pvc 
--selector='${label}=${value}' --namespace ${namespace}` - ) - console.info(output) - } catch (error: any) { - console.error(error) - if (!error.toString().includes('not found')) { - process.exit(1) - } - } -} - -async function helmIPParameters(celoEnv: string) { - const ipAddressParameters: string[] = [ - `--set geth.static_ips=${fetchEnv(envVar.STATIC_IPS_FOR_GETH_NODES)}`, - ] - - const numTxNodes = parseInt(fetchEnv(envVar.TX_NODES), 10) - - const txAddresses = await concurrentMap(5, range(numTxNodes), (i) => - retrieveIPAddress(`${celoEnv}-tx-nodes-${i}`) - ) - - // Tx-node IPs - const txNodeIpParams = setHelmArray('geth.txNodesIPAddressArray', txAddresses) - ipAddressParameters.push(...txNodeIpParams) - - if (useStaticIPsForGethNodes()) { - ipAddressParameters.push( - `--set geth.bootnodeIpAddress=${await retrieveBootnodeIPAddress(celoEnv)}` - ) - - // Validator IPs - const numValidators = parseInt(fetchEnv(envVar.VALIDATORS), 10) - const proxiesPerValidator = getProxiesPerValidator() - // This tracks validator IP addresses for each corresponding validator. If the validator - // is proxied, there is no public IP address, so it's set as an empty string - const validatorIpAddresses = [] - for (let i = 0; i < numValidators; i++) { - if (proxiesPerValidator[i] > 0) { - // Then this validator is proxied - validatorIpAddresses.push('') - } else { - validatorIpAddresses.push(await retrieveIPAddress(`${celoEnv}-validators-${i}`)) - } - } - const validatorIpParams = setHelmArray('geth.validatorsIPAddressArray', validatorIpAddresses) - ipAddressParameters.push(...validatorIpParams) - - // Proxy IPs - // Helm ran into issues when dealing with 2-d lists, - // so each index corresponds to a particular validator. 
- // Multiple proxy IPs for a single validator are separated by '/' - const proxyIpAddressesPerValidator = [] - let validatorIndex = 0 - for (const proxyCount of proxiesPerValidator) { - const proxyIpAddresses = [] - for (let i = 0; i < proxyCount; i++) { - proxyIpAddresses.push(await retrieveIPAddress(getProxyName(celoEnv, validatorIndex, i))) - } - const listOfProxyIpAddresses = proxyIpAddresses.join('/') - proxyIpAddressesPerValidator.push(listOfProxyIpAddresses) - - validatorIndex++ - } - - const proxyIpAddressesParams = setHelmArray( - 'geth.proxyIPAddressesPerValidatorArray', - proxyIpAddressesPerValidator - ) - ipAddressParameters.push(...proxyIpAddressesParams) - - const numPrivateTxNodes = parseInt(fetchEnv(envVar.PRIVATE_TX_NODES), 10) - const privateTxAddresses = await concurrentMap(5, range(numPrivateTxNodes), (i) => - retrieveIPAddress(`${celoEnv}-tx-nodes-private-${i}`) - ) - const privateTxAddressParameters = privateTxAddresses.map( - (address, i) => `--set geth.private_tx_nodes_${i}IpAddress=${address}` - ) - ipAddressParameters.push(...privateTxAddressParameters) - const listOfPrivateTxNodeAddresses = privateTxAddresses.join(',') - ipAddressParameters.push( - `--set geth.private_tx_node_ip_addresses='{${listOfPrivateTxNodeAddresses}}'` - ) - } - - return ipAddressParameters -} - -async function helmParameters(celoEnv: string, useExistingGenesis: boolean) { - const valueFilePath = `/tmp/${celoEnv}-testnet-values.yaml` - await saveHelmValuesFile(celoEnv, valueFilePath, useExistingGenesis, false) - - const gethMetricsOverrides = - fetchEnvOrFallback('GETH_ENABLE_METRICS', 'false') === 'true' - ? 
[ - `--set metrics="true"`, - `--set pprof.enabled="true"`, - `--set pprof.path="/debug/metrics/prometheus"`, - `--set pprof.port="6060"`, - ] - : [`--set metrics="false"`, `--set pprof.enabled="false"`] - - const useMyCelo = stringToBoolean(fetchEnvOrFallback(envVar.GETH_USE_MYCELO, 'false')) - await createAndPushGenesis(celoEnv, !useExistingGenesis, useMyCelo) - - const bootnodeOverwritePkey = - fetchEnvOrFallback(envVar.GETH_BOOTNODE_OVERWRITE_PKEY, '') !== '' - ? [ - `--set geth.overwriteBootnodePrivateKey="true"`, - `--set geth.bootnodePrivateKey="${fetchEnv(envVar.GETH_BOOTNODE_OVERWRITE_PKEY)}"`, - ] - : [`--set geth.overwriteBootnodePrivateKey="false"`] - - const defaultDiskSize = fetchEnvOrFallback(envVar.NODE_DISK_SIZE_GB, '10') - const privateTxNodeDiskSize = fetchEnvOrFallback( - envVar.PRIVATE_NODE_DISK_SIZE_GB, - defaultDiskSize - ) - - return [ - `-f ${valueFilePath}`, - `--set bootnode.image.repository=${fetchEnv('GETH_BOOTNODE_DOCKER_IMAGE_REPOSITORY')}`, - `--set bootnode.image.tag=${fetchEnv('GETH_BOOTNODE_DOCKER_IMAGE_TAG')}`, - `--set celotool.image.repository=${fetchEnv('CELOTOOL_DOCKER_IMAGE_REPOSITORY')}`, - `--set celotool.image.tag=${fetchEnv('CELOTOOL_DOCKER_IMAGE_TAG')}`, - `--set domain.name=${fetchEnv('CLUSTER_DOMAIN_NAME')}`, - `--set genesis.useGenesisFileBase64="false"`, - `--set genesis.network=${celoEnv}`, - `--set genesis.networkId=${fetchEnv(envVar.NETWORK_ID)}`, - `--set geth.verbosity=${fetchEnvOrFallback('GETH_VERBOSITY', '4')}`, - `--set geth.vmodule=${fetchEnvOrFallback('GETH_VMODULE', '')}`, - `--set geth.resources.requests.cpu=${fetchEnv('GETH_NODE_CPU_REQUEST')}`, - `--set geth.resources.requests.memory=${fetchEnv('GETH_NODE_MEMORY_REQUEST')}`, - `--set geth.image.repository=${fetchEnv('GETH_NODE_DOCKER_IMAGE_REPOSITORY')}`, - `--set geth.image.tag=${fetchEnv('GETH_NODE_DOCKER_IMAGE_TAG')}`, - `--set geth.validators="${fetchEnv('VALIDATORS')}"`, - `--set geth.secondaries="${fetchEnvOrFallback('SECONDARIES', '0')}"`, - 
`--set geth.use_gstorage_data=${fetchEnvOrFallback('USE_GSTORAGE_DATA', 'false')}`, - `--set geth.gstorage_data_bucket=${fetchEnvOrFallback('GSTORAGE_DATA_BUCKET', '')}`, - `--set geth.faultyValidators="${fetchEnvOrFallback('FAULTY_VALIDATORS', '0')}"`, - `--set geth.faultyValidatorType="${fetchEnvOrFallback('FAULTY_VALIDATOR_TYPE', '0')}"`, - `--set geth.tx_nodes="${fetchEnv('TX_NODES')}"`, - `--set geth.private_tx_nodes="${fetchEnv(envVar.PRIVATE_TX_NODES)}"`, - `--set geth.ssd_disks="${fetchEnvOrFallback(envVar.GETH_NODES_SSD_DISKS, 'true')}"`, - `--set geth.account.secret="${fetchEnv('GETH_ACCOUNT_SECRET')}"`, - `--set geth.ping_ip_from_packet=${fetchEnvOrFallback('PING_IP_FROM_PACKET', 'false')}`, - `--set geth.in_memory_discovery_table=${fetchEnvOrFallback( - 'IN_MEMORY_DISCOVERY_TABLE', - 'false' - )}`, - `--set geth.diskSizeGB=${defaultDiskSize}`, - `--set geth.privateTxNodediskSizeGB=${privateTxNodeDiskSize}`, - `--set mnemonic="${fetchEnv('MNEMONIC')}"`, - ...setHelmArray('geth.proxiesPerValidator', getProxiesPerValidator()), - ...gethMetricsOverrides, - ...bootnodeOverwritePkey, - ...rollingUpdateHelmVariables(), - ...(await helmIPParameters(celoEnv)), - ] -} - -async function helmCommand(command: string, pipeOutput = false) { - // "helm diff" is a plugin and doesn't support "--debug" - if (isCelotoolVerbose() && !command.startsWith('helm diff')) { - command += ' --debug' - } - - await execCmdWithExitOnFailure(command, {}, pipeOutput) -} - -function buildHelmChartDependencies(chartDir: string) { - console.info(`Building any chart dependencies...`) - return helmCommand(`helm dep build ${chartDir}`) -} - -export async function installHelmDiffPlugin() { - try { - await execCmd(`helm diff version`, {}, false) - } catch (error) { - console.info(`Installing helm-diff plugin...`) - await execCmdWithExitOnFailure(`helm plugin install https://github.com/databus23/helm-diff`) - } -} - -// Return the values file arg if the file exists. If a values file reference is
defined and the file is not found, -// throw an error. When chartDir is a remote chart, the values file is assumed to be an absolute path. -function valuesOverrideArg(chartDir: string, filename: string | undefined) { - if (filename === undefined) { - return '' - } else if (fs.existsSync(filename)) { - return `-f ${filename}` - } else if (fs.existsSync(path.join(chartDir, filename))) { - return `-f ${path.join(chartDir, filename)}` - } else { - console.error(`Values override file ${filename} not found`) - } -} - -// namespace: The namespace to install the chart into -// releaseName: The name of the release -// chartDir: The directory containing the chart or the values.yaml files. By default, it will try to use a custom values file -// at ${chartDir}/${valuesOverrideFile}.yaml -// parameters: The parameters to pass to the helm install command (e.g. --set geth.replicas=3) -// buildDependencies: Whether to build the chart dependencies before installing. When using a remote chart, this must be false. -// chartVersion: The version of the chart to install. Used only when chartRemoteReference is set -// valuesOverrideFile: The name of the values file to use. In the case of a remote chart, this is assumed to be an absolute path. -interface GenericHelmChartParameters { - namespace: string - releaseName: string - chartDir: string - parameters: string[] - buildDependencies?: boolean - chartVersion?: string - valuesOverrideFile?: string -} -// Install a Helm Chart. Look above for the parameters - -// When using a remote helm chart, buildDependencies must be false and valuesOverrideFile the absolute path to the values file -export async function installGenericHelmChart({ - namespace, - releaseName, - chartDir, - parameters, - buildDependencies = true, - chartVersion, - valuesOverrideFile, -}: GenericHelmChartParameters) { - if (buildDependencies) { - await buildHelmChartDependencies(chartDir) - } - - if (isCelotoolHelmDryRun()) { - const versionLog = chartVersion ? 
` version ${chartVersion}` : '' - const valuesOverrideLog = valuesOverrideFile - ? `, with values override: ${valuesOverrideFile}` - : '' - console.info( - `This would deploy chart ${chartDir}${versionLog} with release name ${releaseName} in namespace ${namespace}${valuesOverrideLog} with parameters:` - ) - console.info(parameters) - } else { - console.info(`Installing helm release ${releaseName}`) - const versionArg = chartVersion ? `--version=${chartVersion}` : '' - const valuesOverride = valuesOverrideArg(chartDir, valuesOverrideFile) - await helmCommand( - `helm upgrade --install ${valuesOverride} ${releaseName} ${chartDir} ${versionArg} --namespace ${namespace} ${parameters.join( - ' ' - )}` - ) - } -} - -// Upgrade a Helm Chart. chartDir can be the path to the Helm Chart or the name of a remote Helm Chart. -// If using a remote helm chart, the chart repository has to be added and updated in the local helm config -// When using a remote helm chart, buildDependencies must be false and valuesOverrideFile the absolute path to the values file -export async function upgradeGenericHelmChart({ - namespace, - releaseName, - chartDir, - parameters, - buildDependencies = true, - chartVersion, - valuesOverrideFile, -}: GenericHelmChartParameters) { - if (buildDependencies) { - await buildHelmChartDependencies(chartDir) - } - const valuesOverride = valuesOverrideArg(chartDir, valuesOverrideFile) - const versionArg = chartVersion ? `--version=${chartVersion}` : '' - - if (isCelotoolHelmDryRun()) { - console.info( - `Simulating the upgrade of helm release ${releaseName}. 
No output means no change in the helm release` - ) - await installHelmDiffPlugin() - await helmCommand( - `helm diff upgrade --install -C 5 ${valuesOverride} ${versionArg} ${releaseName} ${chartDir} --namespace ${namespace} ${parameters.join( - ' ' - )}`, - true - ) - } else { - console.info(`Upgrading helm release ${releaseName}`) - await helmCommand( - `helm upgrade --install ${valuesOverride} ${versionArg} ${releaseName} ${chartDir} --timeout 120h --namespace ${namespace} ${parameters.join( - ' ' - )}` - ) - console.info(`Upgraded helm release ${releaseName} successfully`) - } -} - -export async function getConfigMapHashes( - celoEnv: string, - releaseName: string, - chartDir: string, - parameters: string[], - action: HelmAction, - valuesOverrideFile?: string -): Promise<Record<string, string>> { - const valuesOverride = valuesOverrideArg(chartDir, valuesOverrideFile) - const [output] = await execCmd( - `helm ${action} -f ${chartDir}/values.yaml ${valuesOverride} ${releaseName} ${chartDir} --namespace ${celoEnv} ${parameters.join( - ' ' - )} --dry-run`, - {}, - false, - false - ) - - return output - .split('---') - .filter((section) => { - return /kind: ConfigMap/.exec(section) - }) - .reduce<Record<string, string>>((configHashes, section) => { - const matchSource = /Source: (.*)/.exec(section) - if (matchSource === null) { - throw new Error('Cannot extract Source from config section') - } - - configHashes[matchSource[1]] = stringHash(section).toString() - return configHashes - }, {}) -} - -export function isCelotoolVerbose() { - return process.env.CELOTOOL_VERBOSE === 'true' -} - -export function isCelotoolHelmDryRun() { - return process.env.CELOTOOL_HELM_DRY_RUN === 'true' -} - -export function exitIfCelotoolHelmDryRun() { - if (isCelotoolHelmDryRun()) { - console.error('Option --helmdryrun is not allowed for this command. 
Exiting.') - process.exit(1) - } -} - -export async function removeGenericHelmChart(releaseName: string, namespace: string) { - console.info(`Deleting helm chart ${releaseName} from namespace ${namespace}`) - try { - await execCmd(`helm uninstall --namespace ${namespace} ${releaseName}`) - } catch (error) { - console.error(error) - } -} - -function getExtraValuesFile(celoEnv: string) { - const extraValuesFile = fs.existsSync(`${TESTNET_CHART_DIR}/values-${celoEnv}.yaml`) - ? `values-${celoEnv}.yaml` - : undefined - return extraValuesFile -} - -export async function installHelmChart(celoEnv: string, useExistingGenesis: boolean) { - await failIfSecretMissing(BACKUP_GCS_SECRET_NAME, 'default') - await copySecret(BACKUP_GCS_SECRET_NAME, 'default', celoEnv) - const extraValuesFile = getExtraValuesFile(celoEnv) - return installGenericHelmChart({ - namespace: celoEnv, - releaseName: celoEnv, - chartDir: TESTNET_CHART_DIR, - parameters: await helmParameters(celoEnv, useExistingGenesis), - buildDependencies: true, - valuesOverrideFile: extraValuesFile, - }) -} - -export async function upgradeHelmChart(celoEnv: string, useExistingGenesis: boolean) { - console.info(`Upgrading helm release ${celoEnv}`) - const parameters = await helmParameters(celoEnv, useExistingGenesis) - const extraValuesFile = getExtraValuesFile(celoEnv) - await upgradeGenericHelmChart({ - namespace: celoEnv, - releaseName: celoEnv, - chartDir: TESTNET_CHART_DIR, - parameters, - buildDependencies: true, - valuesOverrideFile: extraValuesFile, - }) -} - -export async function resetAndUpgradeHelmChart(celoEnv: string, useExistingGenesis: boolean) { - const txNodesSetName = `${celoEnv}-tx-nodes` - const validatorsSetName = `${celoEnv}-validators` - const bootnodeName = `${celoEnv}-bootnode` - const privateTxNodesSetname = `${celoEnv}-tx-nodes-private` - const persistentVolumeClaimsLabels = ['validators', 'tx_nodes', 'proxy', 'tx_nodes_private'] - - if (isCelotoolHelmDryRun()) { - // If running dryrun we just 
want to simulate the helm changes - await upgradeHelmChart(celoEnv, useExistingGenesis) - } else { - // scale down nodes - await scaleResource(celoEnv, 'StatefulSet', txNodesSetName, 0) - await scaleResource(celoEnv, 'StatefulSet', validatorsSetName, 0) - // allow to fail for the cases where a testnet does not include the privatetxnode statefulset yet - await scaleResource(celoEnv, 'StatefulSet', privateTxNodesSetname, 0, true) - await scaleProxies(celoEnv, 0) - await scaleResource(celoEnv, 'Deployment', bootnodeName, 0) - - await deletePersistentVolumeClaims(celoEnv, persistentVolumeClaimsLabels) - await sleep(10000) - - await upgradeHelmChart(celoEnv, useExistingGenesis) - await sleep(10000) - - const numValidators = parseInt(fetchEnv(envVar.VALIDATORS), 10) - const numTxNodes = parseInt(fetchEnv(envVar.TX_NODES), 10) - const numPrivateTxNodes = parseInt(fetchEnv(envVar.PRIVATE_TX_NODES), 10) - - // Note(trevor): helm upgrade only compares the current chart to the - // previously deployed chart when deciding what needs changing, so we need - // to manually scale up to account for when a node count is the same - await scaleResource(celoEnv, 'StatefulSet', txNodesSetName, numTxNodes) - await scaleResource(celoEnv, 'StatefulSet', validatorsSetName, numValidators) - await scaleResource(celoEnv, 'StatefulSet', privateTxNodesSetname, numPrivateTxNodes) - await scaleProxies(celoEnv) - await scaleResource(celoEnv, 'Deployment', bootnodeName, 1) - } -} - -// scaleProxies scales all proxy statefulsets to have `replicas` replicas. 
-// If `replicas` is undefined, proxies will be scaled to their intended -// replica counts -async function scaleProxies(celoEnv: string, replicas?: number) { - if (replicas !== undefined) { - const statefulsetNames = await getProxyStatefulsets(celoEnv) - for (const name of statefulsetNames) { - await scaleResource(celoEnv, 'StatefulSet', name, replicas) - } - } else { - const proxiesPerValidator = getProxiesPerValidator() - let validatorIndex = 0 - for (const proxyCount of proxiesPerValidator) { - // allow to fail for the cases where a testnet does not include the proxy statefulset yet - await scaleResource( - celoEnv, - 'StatefulSet', - `${celoEnv}-validators-${validatorIndex}-proxy`, - proxyCount, - true - ) - validatorIndex++ - } - } -} - -async function getProxyStatefulsets(celoEnv: string) { - const [output] = await execCmd( - `kubectl get statefulsets --selector=component=proxy --no-headers -o custom-columns=":metadata.name" -n ${celoEnv}` - ) - if (!output) { - return [] - } - return output.split('\n').filter((name) => name) -} - -export async function removeHelmRelease(celoEnv: string) { - return removeGenericHelmChart(celoEnv, celoEnv) -} - -export function makeHelmParameters(map: { [key: string]: string }) { - return entries(map).map(([key, value]) => `--set ${key}=${value}`) -} - -export function setHelmArray(paramName: string, arr: any[]) { - return arr.map((value, i) => `--set ${paramName}[${i}]="${value}"`) -} - -export async function deleteFromCluster(celoEnv: string) { - await removeHelmRelease(celoEnv) - console.info(`Deleting namespace ${celoEnv}`) - await execCmdWithExitOnFailure(`kubectl delete namespace ${celoEnv}`) -} - -function useStaticIPsForGethNodes() { - return fetchEnv(envVar.STATIC_IPS_FOR_GETH_NODES) === 'true' -} - -export async function checkHelmVersion() { - const requiredMinHelmVersion = '3.8' - const helmVersionCmd = `helm version --template '{{ .Version }}'` - const localHelmVersion = (await 
execCmdWithExitOnFailure(helmVersionCmd))[0].replace(/^v/, '') - - const helmOK = compareVersions.compare(localHelmVersion, requiredMinHelmVersion, '>=') - if (helmOK) { - return true - } else { - console.error( - `Error checking local helm version. Minimum Helm version required ${requiredMinHelmVersion}` - ) - process.exit(1) - } -} - -function rollingUpdateHelmVariables() { - return [ - `--set updateStrategy.validators.rollingUpdate.partition=${fetchEnvOrFallback( - envVar.VALIDATORS_ROLLING_UPDATE_PARTITION, - '0' - )}`, - `--set updateStrategy.secondaries.rollingUpdate.partition=${fetchEnvOrFallback( - envVar.SECONDARIES_ROLLING_UPDATE_PARTITION, - '0' - )}`, - `--set updateStrategy.proxy.rollingUpdate.partition=${fetchEnvOrFallback( - envVar.PROXY_ROLLING_UPDATE_PARTITION, - '0' - )}`, - `--set updateStrategy.tx_nodes.rollingUpdate.partition=${fetchEnvOrFallback( - envVar.TX_NODES_ROLLING_UPDATE_PARTITION, - '0' - )}`, - `--set updateStrategy.tx_nodes_private.rollingUpdate.partition=${fetchEnvOrFallback( - envVar.TX_NODES_PRIVATE_ROLLING_UPDATE_PARTITION, - '0' - )}`, - ] -} - -export async function saveHelmValuesFile( - celoEnv: string, - valueFilePath: string, - useExistingGenesis: boolean, - skipGenesisValue = false -) { - const genesisContent = useExistingGenesis - ? 
await getGenesisBlockFromGoogleStorage(celoEnv) - : generateGenesisFromEnv() - - const enodes = await getEnodesWithExternalIPAddresses(celoEnv) - - let valueFileContent = ` -staticnodes: - staticnodesBase64: ${Buffer.from(JSON.stringify(enodes)).toString('base64')} -` - if (!skipGenesisValue) { - valueFileContent += ` - genesis: - genesisFileBase64: ${Buffer.from(genesisContent).toString('base64')} -` - } - fs.writeFileSync(valueFilePath, valueFileContent) -} - -const celoBlockchainDir: string = path.join(os.tmpdir(), 'celo-blockchain-celotool') - -export async function createAndPushGenesis(celoEnv: string, reset: boolean, useMyCelo: boolean) { - let genesis: string = '' - try { - genesis = await getGenesisBlockFromGoogleStorage(celoEnv) - } catch { - console.debug(`Genesis file not found in GCP. Creating a new one`) - } - if (genesis === '' || reset === true) { - genesis = useMyCelo ? await generateMyCeloGenesis() : generateGenesisFromEnv() - } - // Upload the new genesis file to gcp - if (!isCelotoolHelmDryRun()) { - await uploadGenesisBlockToGoogleStorage(celoEnv, genesis) - } -} - -async function generateMyCeloGenesis(): Promise<string> { - // Clean up the tmp dir - await spawnCmd('rm', ['-rf', celoBlockchainDir], { silent: true }) - fs.mkdirSync(celoBlockchainDir) - const gethTag = - fetchEnvOrFallback(envVar.GETH_MYCELO_COMMIT, '') !== '' - ? fetchEnv(envVar.GETH_MYCELO_COMMIT) - : fetchEnv(envVar.GETH_NODE_DOCKER_IMAGE_TAG) - const celoBlockchainVersion = gethTag.includes('.') ? 
`v${gethTag}` : gethTag - await checkoutGethRepo(celoBlockchainVersion, celoBlockchainDir) - await buildGethAll(celoBlockchainDir) - - // Generate genesis-config from template - const myceloBinary = path.join(celoBlockchainDir, 'build/bin/mycelo') - const myceloGenesisConfigArgs = [ - 'genesis-config', - '--template', - 'monorepo', - '--mnemonic', - fetchEnv(envVar.MNEMONIC), - '--validators', - fetchEnv(envVar.VALIDATORS), - '--dev.accounts', - fetchEnv(envVar.LOAD_TEST_CLIENTS), - '--blockperiod', - fetchEnv(envVar.BLOCK_TIME), - '--epoch', - fetchEnv(envVar.EPOCH), - '--blockgaslimit', - '20000000', - ] - await spawnCmdWithExitOnFailure(myceloBinary, myceloGenesisConfigArgs, { - silent: false, - cwd: celoBlockchainDir, - }) - - // TODO: Load config to customize migrations... - - // Generate genesis from config - - const myceloGenesisFromConfigArgs = [ - 'genesis-from-config', - celoBlockchainDir, - '--buildpath', - path.join(monorepoRoot, 'packages/protocol/build/contracts'), - ] - await spawnCmdWithExitOnFailure(myceloBinary, myceloGenesisFromConfigArgs, { - silent: false, - cwd: celoBlockchainDir, - }) - const genesisPath = path.join(celoBlockchainDir, 'genesis.json') - const genesisContent = fs.readFileSync(genesisPath).toString() - - // Clean up the tmp dir as it's no longer needed - await spawnCmd('rm', ['-rf', celoBlockchainDir], { silent: true }) - return genesisContent -} - -function useDefaultNetwork() { - return fetchEnv(envVar.KUBERNETES_CLUSTER_NAME) === 'celo-networks-dev' -} - -export function networkName(celoEnv: string) { - return useDefaultNetwork() ? 
'default' : `${celoEnv}-network` -} diff --git a/packages/celotool/src/lib/interfaces/genesis-config.ts b/packages/celotool/src/lib/interfaces/genesis-config.ts deleted file mode 100644 index b59ca86b1e5..00000000000 --- a/packages/celotool/src/lib/interfaces/genesis-config.ts +++ /dev/null @@ -1,19 +0,0 @@ -import { AccountAndBalance, ConsensusType, Validator } from '../generate_utils' - -export interface GenesisConfig { - validators?: Validator[] - consensusType?: ConsensusType - initialAccounts?: AccountAndBalance[] - blockTime?: number - epoch?: number - lookbackwindow?: number - chainId?: number - requestTimeout?: number - enablePetersburg?: boolean - timestamp?: number - // Activation block numbers for Celo hard forks, null for never activating - churritoBlock?: number | null - donutBlock?: number | null - espressoBlock?: number | null - gingerbreadBlock?: number | null -} diff --git a/packages/celotool/src/lib/interfaces/geth-instance-config.ts b/packages/celotool/src/lib/interfaces/geth-instance-config.ts deleted file mode 100644 index bb72e7c13ba..00000000000 --- a/packages/celotool/src/lib/interfaces/geth-instance-config.ts +++ /dev/null @@ -1,29 +0,0 @@ -import BigNumber from 'bignumber.js' - -export interface GethInstanceConfig { - name: string - validating?: boolean - replica?: boolean - validatingGasPrice?: number - syncmode: string - port: number - proxyport?: number - rpcport?: number - wsport?: number - lightserv?: boolean - gatewayFee?: BigNumber - privateKey?: string - minerValidator?: string - txFeeRecipient?: string - proxies?: Array - isProxied?: boolean - isProxy?: boolean - bootnodeEnode?: string - nodekey?: string - proxy?: string - proxiedValidatorAddress?: string - proxyAllowPrivateIp?: boolean - ethstats?: string - pid?: number - args?: string[] -} diff --git a/packages/celotool/src/lib/interfaces/geth-repository.ts b/packages/celotool/src/lib/interfaces/geth-repository.ts deleted file mode 100644 index 8107d4c24a8..00000000000 --- 
a/packages/celotool/src/lib/interfaces/geth-repository.ts +++ /dev/null @@ -1,5 +0,0 @@ -export interface GethRepository { - path: string - remote?: boolean - branch?: string -} diff --git a/packages/celotool/src/lib/interfaces/geth-run-config.ts b/packages/celotool/src/lib/interfaces/geth-run-config.ts deleted file mode 100644 index 710e91ea5f5..00000000000 --- a/packages/celotool/src/lib/interfaces/geth-run-config.ts +++ /dev/null @@ -1,26 +0,0 @@ -import { GenesisConfig } from './genesis-config' -import { GethInstanceConfig } from './geth-instance-config' -import { GethRepository } from './geth-repository' - -export interface GethRunConfig { - // migration - migrate?: boolean - migrateTo?: number - migrationOverrides?: any - keepData?: boolean - // Whether to use the mycelo tool to generate the genesis.json - useMycelo?: boolean - // Skip compiling the smart contracts (e.g. during dev if they're already compiled and you want to save 10 seconds) - myceloSkipCompilingContracts?: boolean - // genesis config - genesisConfig?: GenesisConfig - // network - network: string - networkId: number - // where to run - runPath: string - verbosity?: number - repository?: GethRepository - // running instances - instances: GethInstanceConfig[] -} diff --git a/packages/celotool/src/lib/interfaces/mycelo-genesis-config.ts b/packages/celotool/src/lib/interfaces/mycelo-genesis-config.ts deleted file mode 100644 index d7abe9dd68a..00000000000 --- a/packages/celotool/src/lib/interfaces/mycelo-genesis-config.ts +++ /dev/null @@ -1,10 +0,0 @@ -import { GenesisConfig } from 'src/lib/interfaces/genesis-config' - -export interface MyceloGenesisConfig { - verbose: boolean - genesisConfig: GenesisConfig - numValidators: number // used in place of genesisConfig.validators - mnemonic: string - gethRepoPath: string - migrationOverrides?: any -} diff --git a/packages/celotool/src/lib/k8s-cluster/aks.ts b/packages/celotool/src/lib/k8s-cluster/aks.ts deleted file mode 100644 index 
51b0c126d3c..00000000000 --- a/packages/celotool/src/lib/k8s-cluster/aks.ts +++ /dev/null @@ -1,81 +0,0 @@ -import { execCmd, execCmdWithExitOnFailure } from '../cmd-utils' -import { envVar, fetchEnv, fetchEnvOrFallback } from '../env-utils' -import { helmAddRepoAndUpdate, isCelotoolHelmDryRun } from '../helm_deploy' -import { outputIncludes } from '../utils' -import { BaseClusterConfig, BaseClusterManager, CloudProvider } from './base' - -export interface AksClusterConfig extends BaseClusterConfig { - tenantId: string - resourceGroup: string - subscriptionId: string - regionName: string -} - -export class AksClusterManager extends BaseClusterManager { - async switchToSubscription() { - let currentTenantId = null - try { - ;[currentTenantId] = await execCmd('az account show --query id -o tsv') - } catch (error) { - console.info('No azure account subscription currently set') - } - if (currentTenantId === null || currentTenantId.trim() !== this.clusterConfig.tenantId) { - await execCmdWithExitOnFailure( - `az account set --subscription ${this.clusterConfig.subscriptionId}` - ) - } - } - - async getAndSwitchToClusterContext() { - const kubeconfig = fetchEnvOrFallback(envVar.KUBECONFIG, '') - ? `--file ${fetchEnv(envVar.KUBECONFIG)}` - : '' - await execCmdWithExitOnFailure( - `az aks get-credentials --resource-group ${this.clusterConfig.resourceGroup} --name ${this.clusterConfig.clusterName} --subscription ${this.clusterConfig.subscriptionId} --overwrite-existing ${kubeconfig}` - ) - } - - async setupCluster(context?: string) { - await super.setupCluster(context) - await this.installAADPodIdentity() - } - - // installAADPodIdentity installs the resources necessary for AAD pod level identities - async installAADPodIdentity() { - // The helm chart maintained directly by AAD Pod Identity is not compatible with helm v2. 
- // Until we upgrade to helm v3, we rely on our own helm chart adapted from: - // https://raw.githubusercontent.com/Azure/aad-pod-identity/8a5f2ed5941496345592c42e1d6cbd12c32aeebf/deploy/infra/deployment-rbac.yaml - const aadPodIdentityExists = await outputIncludes( - `helm list -n default`, - `aad-pod-identity`, - `aad-pod-identity exists, skipping install` - ) - if (!aadPodIdentityExists) { - if (isCelotoolHelmDryRun()) { - console.info('Skipping aad-pod-identity deployment due to --helmdryrun') - } else { - console.info('Adding aad-pod-identity helm repository to local helm') - await helmAddRepoAndUpdate( - 'https://raw.githubusercontent.com/Azure/aad-pod-identity/master/charts', - 'aad-pod-identity' - ) - console.info('Installing aad-pod-identity') - await execCmdWithExitOnFailure( - `helm install aad-pod-identity aad-pod-identity/aad-pod-identity -n default` - ) - } - } - } - - get clusterConfig(): AksClusterConfig { - return this._clusterConfig as AksClusterConfig - } - - get kubernetesContextName(): string { - return this.clusterConfig.clusterName - } - - get cloudProvider(): CloudProvider { - return CloudProvider.AZURE - } -} diff --git a/packages/celotool/src/lib/k8s-cluster/base.ts b/packages/celotool/src/lib/k8s-cluster/base.ts deleted file mode 100644 index 0a18f49d6f6..00000000000 --- a/packages/celotool/src/lib/k8s-cluster/base.ts +++ /dev/null @@ -1,101 +0,0 @@ -import { createNamespaceIfNotExists } from '../cluster' -import { execCmd, execCmdWithExitOnFailure } from '../cmd-utils' -import { - installAndEnableMetricsDeps, - installCertManagerAndNginx, - isCelotoolHelmDryRun, -} from '../helm_deploy' - -export enum CloudProvider { - AZURE, - GCP, -} - -export interface BaseClusterConfig { - cloudProvider: CloudProvider - clusterName: string -} - -export abstract class BaseClusterManager { - protected _clusterConfig: BaseClusterConfig - private _celoEnv: string - - constructor(clusterConfig: BaseClusterConfig, celoEnv: string) { - this._clusterConfig 
= clusterConfig - this._celoEnv = celoEnv - } - - async switchToClusterContext(skipSetup: boolean, context?: string) { - const exists = await this.switchToClusterContextIfExists() - if (!exists) { - await this.getAndSwitchToClusterContext() - } - // Reset back to default namespace - await execCmdWithExitOnFailure(`kubectl config set-context --current --namespace default`) - if (!skipSetup) { - if (!isCelotoolHelmDryRun()) { - await this.setupCluster(context) - } else { - console.info(`Skipping cluster setup due to --helmdryrun`) - } - } - } - - /** - * This will set the current context to the listed cluster name. - * If a context with the cluster name does not exist, return false. - * @param clusterConfig - */ - async switchToClusterContextIfExists() { - await this.switchToSubscription() - - let currentContext = null - try { - ;[currentContext] = await execCmd('kubectl config current-context') - } catch (error) { - console.info('No context currently set') - } - - // We expect the context to be the cluster name. - if (currentContext === null || currentContext.trim() !== this.kubernetesContextName) { - const [existingContextsStr] = await execCmdWithExitOnFailure( - 'kubectl config get-contexts -o name' - ) - const existingContexts = existingContextsStr.trim().split('\n') - if (existingContexts.includes(this.clusterConfig.clusterName)) { - await execCmdWithExitOnFailure( - `kubectl config use-context ${this.clusterConfig.clusterName}` - ) - } else { - // If we don't already have the context, context set up is not complete. 
- // We would still need to retrieve credentials/contexts from the provider - return false - } - } - return true - } - - async setupCluster(context?: string) { - await createNamespaceIfNotExists(this.celoEnv) - if (!isCelotoolHelmDryRun()) { - console.info('Performing any cluster setup that needs to be done...') - - await installCertManagerAndNginx(this.celoEnv, this.clusterConfig) - await installAndEnableMetricsDeps(true, context, this.clusterConfig) - } - } - - abstract switchToSubscription(): Promise<void> - abstract getAndSwitchToClusterContext(): Promise<void> - - abstract get kubernetesContextName(): string - abstract get cloudProvider(): CloudProvider - - get clusterConfig(): BaseClusterConfig { - return this._clusterConfig - } - - get celoEnv(): string { - return this._celoEnv - } -} diff --git a/packages/celotool/src/lib/k8s-cluster/gcp.ts b/packages/celotool/src/lib/k8s-cluster/gcp.ts deleted file mode 100644 index bdce3936730..00000000000 --- a/packages/celotool/src/lib/k8s-cluster/gcp.ts +++ /dev/null @@ -1,39 +0,0 @@ -import { execCmdWithExitOnFailure } from '../cmd-utils' -import { installGCPSSDStorageClass } from '../helm_deploy' -import { switchToGCPProject } from '../utils' -import { BaseClusterConfig, BaseClusterManager, CloudProvider } from './base' - -export interface GCPClusterConfig extends BaseClusterConfig { - projectName: string - zone: string -} - -export class GCPClusterManager extends BaseClusterManager { - async switchToSubscription() { - await switchToGCPProject(this.clusterConfig.projectName) - } - - async getAndSwitchToClusterContext() { - const { clusterName, projectName, zone } = this.clusterConfig - await execCmdWithExitOnFailure( - `gcloud container clusters get-credentials ${clusterName} --project ${projectName} --zone ${zone}` - ) - } - - async setupCluster(context?: string) { - await super.setupCluster(context) - await installGCPSSDStorageClass() - } - - get clusterConfig(): GCPClusterConfig { - return this._clusterConfig as
GCPClusterConfig - } - - get kubernetesContextName(): string { - return `gke_${this.clusterConfig.projectName}_${this.clusterConfig.zone}_${this.clusterConfig.clusterName}` - } - - get cloudProvider(): CloudProvider { - return CloudProvider.GCP - } -} diff --git a/packages/celotool/src/lib/k8s-cluster/utils.ts b/packages/celotool/src/lib/k8s-cluster/utils.ts deleted file mode 100644 index a1584306a80..00000000000 --- a/packages/celotool/src/lib/k8s-cluster/utils.ts +++ /dev/null @@ -1,20 +0,0 @@ -import { AksClusterConfig, AksClusterManager } from './aks' -import { BaseClusterConfig, BaseClusterManager, CloudProvider } from './base' -import { GCPClusterConfig, GCPClusterManager } from './gcp' - -const clusterManagerByCloudProvider: { - [key in CloudProvider]: (clusterConfig: BaseClusterConfig, celoEnv: string) => BaseClusterManager -} = { - [CloudProvider.AZURE]: (clusterConfig: BaseClusterConfig, celoEnv: string) => - new AksClusterManager(clusterConfig as AksClusterConfig, celoEnv), - [CloudProvider.GCP]: (clusterConfig: BaseClusterConfig, celoEnv: string) => - new GCPClusterManager(clusterConfig as GCPClusterConfig, celoEnv), -} - -export function getClusterManager( - cloudProvider: CloudProvider, - celoEnv: string, - clusterConfig: BaseClusterConfig -) { - return clusterManagerByCloudProvider[cloudProvider](clusterConfig, celoEnv) -} diff --git a/packages/celotool/src/lib/k8s-fullnode/aks.ts b/packages/celotool/src/lib/k8s-fullnode/aks.ts deleted file mode 100644 index a74156fb197..00000000000 --- a/packages/celotool/src/lib/k8s-fullnode/aks.ts +++ /dev/null @@ -1,96 +0,0 @@ -import { range } from 'lodash' -import { - deallocateStaticIP, - getAKSNodeResourceGroup, - registerStaticIPIfNotRegistered, - waitForStaticIPDetachment, -} from '../azure' -import { execCmdWithExitOnFailure } from '../cmd-utils' -import { AksClusterConfig } from '../k8s-cluster/aks' -import { deleteResource } from '../kubernetes' -import { BaseFullNodeDeployer, 
BaseFullNodeDeploymentConfig } from './base' - -export interface AksFullNodeDeploymentConfig extends BaseFullNodeDeploymentConfig { - clusterConfig: AksClusterConfig -} - -export class AksFullNodeDeployer extends BaseFullNodeDeployer { - async additionalHelmParameters() { - const staticIps = (await this.allocateStaticIPs()).join(',') - return [ - `--set azure=true`, - `--set geth.public_ip_per_node='{${staticIps}}'`, - // Azure has a special annotation to expose TCP and UDP on the same service. - // Only TCP needs to be specified in that case. - `--set geth.service_protocols='{TCP}'`, - // Fix for LES server panic-- don't serve any LES clients! - `--set geth.maxpeers=150`, - `--set geth.light.maxpeers=0`, - `--set geth.light.serve=0`, - `--set geth.use_gstorage_data=false`, - ] - } - - async allocateStaticIPs() { - console.info(`Creating static IPs on Azure for ${this.celoEnv}`) - const resourceGroup = await getAKSNodeResourceGroup(this.deploymentConfig.clusterConfig) - const { replicas } = this.deploymentConfig - // Deallocate static ip if we are scaling down the replica count - const existingStaticIPsCount = await this.getAzureStaticIPsCount(resourceGroup) - for (let i = existingStaticIPsCount - 1; i > replicas - 1; i--) { - await deleteResource(this.celoEnv, 'service', `${this.celoEnv}-fullnodes-${i}`, false) - await waitForStaticIPDetachment(`${this.staticIPNamePrefix}-${i}`, resourceGroup) - await deallocateStaticIP(`${this.staticIPNamePrefix}-${i}`, resourceGroup) - } - - const addresses = await Promise.all( - range(replicas).map((i) => - registerStaticIPIfNotRegistered(`${this.staticIPNamePrefix}-${i}`, resourceGroup) - ) - ) - - return addresses - } - - async getAzureStaticIPsCount(resourceGroup: string) { - const [staticIPsCount] = await execCmdWithExitOnFailure( - `az network public-ip list --resource-group ${resourceGroup} --query "[?contains(name,'${this.staticIPNamePrefix}')].{Name:name, IPAddress:ipAddress}" -o tsv | wc -l` - ) - return 
parseInt(staticIPsCount.trim(), 10) - } - - async deallocateAllIPs() { - console.info(`Deallocating static IPs on Azure for ${this.celoEnv}`) - - const resourceGroup = await getAKSNodeResourceGroup(this.deploymentConfig.clusterConfig) - const replicaCount = await this.getAzureStaticIPsCount(resourceGroup) - - await this.waitForAllStaticIPDetachment() - - await Promise.all( - range(replicaCount).map((i) => - deallocateStaticIP(`${this.staticIPNamePrefix}-${i}`, resourceGroup) - ) - ) - } - - async waitForAllStaticIPDetachment() { - const resourceGroup = await getAKSNodeResourceGroup(this.deploymentConfig.clusterConfig) - - await Promise.all( - range(this.deploymentConfig.replicas).map((i) => - waitForStaticIPDetachment(`${this.staticIPNamePrefix}-${i}`, resourceGroup) - ) - ) - } - - async getFullNodeIP(index: number, resourceGroup?: string): Promise<string> { - resourceGroup = - resourceGroup || (await getAKSNodeResourceGroup(this.deploymentConfig.clusterConfig)) - return registerStaticIPIfNotRegistered(`${this.staticIPNamePrefix}-${index}`, resourceGroup) - } - - get deploymentConfig(): AksFullNodeDeploymentConfig { - return this._deploymentConfig as AksFullNodeDeploymentConfig - } -} diff --git a/packages/celotool/src/lib/k8s-fullnode/base-nodeport.ts b/packages/celotool/src/lib/k8s-fullnode/base-nodeport.ts deleted file mode 100644 index b7ac98a1752..00000000000 --- a/packages/celotool/src/lib/k8s-fullnode/base-nodeport.ts +++ /dev/null @@ -1,154 +0,0 @@ -import { range } from 'lodash' -import { getAllUsedNodePorts, getService } from '../kubernetes' -import { BaseFullNodeDeployer, BaseFullNodeDeploymentConfig } from './base' - -const NODE_PORT_MIN = 30000 -const NODE_PORT_MAX = 32767 - -export abstract class BaseNodePortFullNodeDeployer extends BaseFullNodeDeployer { - async additionalHelmParameters() { - const existingNodePortSet = await this.getExistingNodePortSet() - const newNodePortForEachFullNode = await this.getNodePortForEachFullNode() - const newNodePortSet =
new Set(newNodePortForEachFullNode) - // Essentially existingNodePortSet - newNodePortForEachFullNode - const nodePortsToRemove = new Set( - Array.from(existingNodePortSet).filter((existing) => !newNodePortSet.has(existing)) - ) - // Ensure all the new node ports have ingress rules set - await this.setIngressRulesTCPAndUDP(newNodePortForEachFullNode, true) - // Remove any removed node port ingress rules - await this.setIngressRulesTCPAndUDP(Array.from(nodePortsToRemove), false) - - const nodePortPerFullNodeStrs = newNodePortForEachFullNode.map( - (nodePort: number, index: number) => - `--set geth.service_node_port_per_full_node[${index}]=${nodePort}` - ) - return [...nodePortPerFullNodeStrs, `--set geth.service_type=NodePort`] - } - - async getNodePortForEachFullNode() { - // Get all node ports that are currently used on the entire cluster - const allUsedNodePorts: number[] = await getAllUsedNodePorts() - // Get the service for each full node. An element will be undefined if does not exist - const serviceForEachFullNode: any[] = await this.getServiceForEachFullNode() - - const NO_NODE_PORT = -1 - // Get the node port for each existing full node service. If none has been - // assigned, give `NO_KNOWN_NODE_PORT` - const nodePortForEachFullNode: number[] = serviceForEachFullNode.map((service: any) => { - if (!service) { - return NO_NODE_PORT - } - return service.spec.ports.reduce((existingNodePort: number, portsSpec: any) => { - if (!portsSpec.nodePort) { - return existingNodePort - } - if (existingNodePort !== NO_NODE_PORT && existingNodePort !== portsSpec.nodePort) { - throw Error( - `Expected all nodePorts to be the same in service, got ${existingNodePort} !== ${portsSpec.nodePort}` - ) - } - return portsSpec.nodePort - }, NO_NODE_PORT) - }) - - let potentialPort = NODE_PORT_MIN - let allUsedNodePortsIndex = 0 - // Assign node port to services that do not have one yet. 
Do so in a way to - // not assign a node port that has been assigned to another service on the - // cluster, and keep newly assigned node ports as close to the minPort as - // possible. Doing so makes reasoning about node ports and port ranges way easier. - for (let i = 0; i < nodePortForEachFullNode.length; i++) { - const nodePort = nodePortForEachFullNode[i] - if (nodePort === NO_NODE_PORT) { - for (; allUsedNodePortsIndex < allUsedNodePorts.length; allUsedNodePortsIndex++) { - if (potentialPort > NODE_PORT_MAX) { - throw Error(`No available node ports`) - } - const usedPort = allUsedNodePorts[allUsedNodePortsIndex] - if (potentialPort < usedPort) { - break - } - // Try the next port on the next iteration - potentialPort = usedPort + 1 - } - // Assign the port - nodePortForEachFullNode[i] = potentialPort - // Add the newly assigned port to allUsedNodePorts - allUsedNodePorts.splice(allUsedNodePortsIndex, 0, potentialPort) - // Increment potential port for a potential subsequent NodePort assignment - potentialPort++ - } - } - return nodePortForEachFullNode - } - - /** - * Not needed for NodePort services. Instead, shows a message to remove any - * now-unused ports from the security group whitelist - */ - async deallocateAllIPs() { - // Do nothing - } - - /** - * Returns an array with each element as the corresponding full node's service. - * An element will be undefined if the service doesn't exist. - */ - getServiceForEachFullNode() { - const replicas = this.deploymentConfig.replicas - return Promise.all( - range(replicas).map(async (i: number) => - getService(`${this.celoEnv}-fullnodes-${i}`, this.kubeNamespace) - ) - ) - } - - /** - * Returns an array of all services that currently exist for full nodes. - * Does so using a selector, and has no guarantees about the order of the services. 
- */ - async getExistingFullNodeServices() { - const response = await getService( - `--selector=component=celo-fullnode-protocol-traffic`, - this.kubeNamespace - ) - return response.items - } - - /** - * Looks at the existing full node services and returns which nodePorts are currently used. - */ - async getExistingNodePortSet(): Promise<Set<number>> { - const serviceForEachFullNode = await this.getExistingFullNodeServices() - return serviceForEachFullNode.reduce((set: Set<number>, service: any) => { - // If there is no service for a full node, it is undefined. Just ignore - if (!service) { - return set - } - for (const portSpec of service.spec.ports) { - if (portSpec.nodePort) { - set.add(portSpec.nodePort) - } - } - return set - }, new Set<number>()) - } - - async getFullNodeIP(_index: number): Promise<string> { - throw Error('Not supported for NodePort full nodes') - } - - /** - * Determines if a given port number is a valid node port. - */ - isNodePort(portNumber: number): boolean { - return portNumber >= NODE_PORT_MIN && portNumber <= NODE_PORT_MAX - } - - abstract setIngressRulesTCPAndUDP(nodePorts: number[], authorize: boolean): Promise<void> - - get deploymentConfig(): BaseFullNodeDeploymentConfig { - return this._deploymentConfig - } -} diff --git a/packages/celotool/src/lib/k8s-fullnode/base.ts b/packages/celotool/src/lib/k8s-fullnode/base.ts deleted file mode 100644 index cc107ea0804..00000000000 --- a/packages/celotool/src/lib/k8s-fullnode/base.ts +++ /dev/null @@ -1,191 +0,0 @@ -import fs from 'fs' -import { range } from 'lodash' -import { readableContext } from 'src/lib/context-utils' -import { createNamespaceIfNotExists } from '../cluster' -import { envVar, fetchEnv, fetchEnvOrFallback } from '../env-utils' -import { generatePrivateKeyWithDerivations, privateKeyToPublicKey } from '../generate_utils' -import { - deletePersistentVolumeClaims, - installGenericHelmChart, - isCelotoolHelmDryRun, - removeGenericHelmChart, - upgradeGenericHelmChart, -} from '../helm_deploy' -import { scaleResource
} from '../kubernetes' - -const helmChartPath = 'oci://us-west1-docker.pkg.dev/devopsre/clabs-public-oci/celo-fullnode' -const chartVersion = '0.2.0' - -export interface NodeKeyGenerationInfo { - mnemonic: string - // A derivation index to apply to the mnemonic. - // Each full node will then have its node key derived like: - // mnemonic.derive(derivationIndex).derive(fullNodeIndex) - derivationIndex: number -} - -export interface BaseFullNodeDeploymentConfig { - diskSizeGb: number - replicas: number - rollingUpdatePartition: number - rpcApis: string - gcMode: string - useGstoreData: string - wsPort: number - // If undefined, node keys will not be predetermined and will be random - nodeKeyGenerationInfo?: NodeKeyGenerationInfo -} - -export abstract class BaseFullNodeDeployer { - protected _deploymentConfig: BaseFullNodeDeploymentConfig - private _celoEnv: string - - constructor(deploymentConfig: BaseFullNodeDeploymentConfig, celoEnv: string) { - this._deploymentConfig = deploymentConfig - this._celoEnv = celoEnv - } - - // If the node key is generated, then a promise containing the enodes is returned. - // Otherwise, the enode cannot be calculated deterministically so a Promise<void> is returned. - async installChart(context: string): Promise<string[] | void> { - await createNamespaceIfNotExists(this.kubeNamespace) - - await installGenericHelmChart({ - namespace: this.kubeNamespace, - releaseName: this.releaseName, - chartDir: helmChartPath, - parameters: await this.helmParameters(context), - chartVersion, - buildDependencies: false, - }) - - if (this._deploymentConfig.nodeKeyGenerationInfo) { - return this.getEnodes() - } - } - - // If the node key is generated, then a promise containing the enodes is returned. - // Otherwise, the enode cannot be calculated deterministically so a Promise<void> is returned.
- async upgradeChart(context: string, reset: boolean): Promise<string[] | void> { - if (isCelotoolHelmDryRun()) { - await upgradeGenericHelmChart({ - namespace: this.kubeNamespace, - releaseName: this.releaseName, - chartDir: helmChartPath, - parameters: await this.helmParameters(context), - chartVersion, - buildDependencies: false, - }) - } else { - if (reset) { - await scaleResource(this.celoEnv, 'StatefulSet', `${this.celoEnv}-fullnodes`, 0) - await deletePersistentVolumeClaims(this.celoEnv, ['celo-fullnode']) - } - - await upgradeGenericHelmChart({ - namespace: this.kubeNamespace, - releaseName: this.releaseName, - chartDir: helmChartPath, - parameters: await this.helmParameters(context), - chartVersion, - buildDependencies: false, - }) - - await scaleResource( - this.celoEnv, - 'StatefulSet', - `${this.celoEnv}-fullnodes`, - this._deploymentConfig.replicas - ) - } - if (this._deploymentConfig.nodeKeyGenerationInfo) { - return this.getEnodes() - } - } - - async removeChart() { - await removeGenericHelmChart(this.releaseName, this.kubeNamespace) - await deletePersistentVolumeClaims(this.celoEnv, ['celo-fullnode']) - await this.deallocateAllIPs() - } - - async helmParameters(context: string) { - let nodeKeys: string[] | undefined - if (this._deploymentConfig.nodeKeyGenerationInfo) { - nodeKeys = range(this._deploymentConfig.replicas).map((index: number) => - this.getPrivateKey(index) - ) - } - - const rpcApis = this._deploymentConfig.rpcApis - ? this._deploymentConfig.rpcApis - : 'eth,net,rpc,web3' - const gcMode = this._deploymentConfig.gcMode ?
this._deploymentConfig.gcMode : 'full' - const customValuesFile = `${helmChartPath}/${this._celoEnv}-${readableContext( - context - )}-values.yaml` - return [ - `--set namespace=${this.kubeNamespace}`, - `--set replicaCount=${this._deploymentConfig.replicas}`, - `--set geth.updateStrategy.rollingUpdate.partition=${this._deploymentConfig.rollingUpdatePartition}`, - `--set storage.size=${this._deploymentConfig.diskSizeGb}Gi`, - `--set geth.expose_rpc_externally=false`, - `--set geth.gcmode=${gcMode}`, - `--set geth.image.repository=${fetchEnv(envVar.GETH_NODE_DOCKER_IMAGE_REPOSITORY)}`, - `--set geth.image.tag=${fetchEnv(envVar.GETH_NODE_DOCKER_IMAGE_TAG)}`, - `--set-string geth.rpc_apis='${rpcApis.split(',').join('\\,')}'`, - `--set geth.metrics=${fetchEnvOrFallback(envVar.GETH_ENABLE_METRICS, 'false')}`, - `--set genesis.networkId=${fetchEnv(envVar.NETWORK_ID)}`, - `--set genesis.network=${this.celoEnv}`, - `--set geth.use_gstorage_data=${this._deploymentConfig.useGstoreData}`, - `--set geth.ws_port=${this._deploymentConfig.wsPort}`, - `--set geth.gstorage_data_bucket=${fetchEnvOrFallback('GSTORAGE_DATA_BUCKET', '')}`, - ...(await this.additionalHelmParameters()), - nodeKeys ? `--set geth.node_keys='{${nodeKeys.join(',')}}'` : '', - fs.existsSync(customValuesFile) ? 
`-f ${customValuesFile}` : '', - ] - } - - async getEnodes() { - return Promise.all( - range(this._deploymentConfig.replicas).map(async (index: number) => { - const publicKey = privateKeyToPublicKey(this.getPrivateKey(index)) - const ip = await this.getFullNodeIP(index) - // Assumes 30303 is the port - return `enode://${publicKey}@${ip}:30303` - }) - ) - } - - getPrivateKey(index: number) { - if (!this._deploymentConfig.nodeKeyGenerationInfo) { - throw Error( - 'The deployment config property nodeKeyGenerationInfo must be defined to get a full node private key' - ) - } - return generatePrivateKeyWithDerivations( - this._deploymentConfig.nodeKeyGenerationInfo!.mnemonic, - [this._deploymentConfig.nodeKeyGenerationInfo!.derivationIndex, index] - ) - } - - abstract additionalHelmParameters(): Promise<string[]> - abstract deallocateAllIPs(): Promise<void> - abstract getFullNodeIP(index: number): Promise<string> - - get releaseName() { - return `${this.celoEnv}-fullnodes` - } - - get kubeNamespace() { - return this.celoEnv - } - - get staticIPNamePrefix() { - return `${this.celoEnv}-fullnodes` - } - - get celoEnv(): string { - return this._celoEnv - } -} diff --git a/packages/celotool/src/lib/k8s-fullnode/gcp.ts b/packages/celotool/src/lib/k8s-fullnode/gcp.ts deleted file mode 100644 index 26c3c65d057..00000000000 --- a/packages/celotool/src/lib/k8s-fullnode/gcp.ts +++ /dev/null @@ -1,82 +0,0 @@ -import { concurrentMap } from '@celo/base' -import { range } from 'lodash' -import { execCmd } from '../cmd-utils' -import { deleteIPAddress, registerIPAddress, retrieveIPAddress } from '../helm_deploy' -import { GCPClusterConfig } from '../k8s-cluster/gcp' -import { BaseFullNodeDeployer, BaseFullNodeDeploymentConfig } from './base' - -export interface GCPFullNodeDeploymentConfig extends BaseFullNodeDeploymentConfig { - clusterConfig: GCPClusterConfig - createNEG: boolean -} - -export class GCPFullNodeDeployer extends BaseFullNodeDeployer { - async additionalHelmParameters() { - const staticIps =
(await this.allocateStaticIPs()).join(',') - return [ - `--set gcp=true`, - `--set storage.storageClass=ssd`, - `--set geth.public_ip_per_node='{${staticIps}}'`, - `--set geth.create_network_endpoint_group=${this.deploymentConfig.createNEG}`, - `--set geth.flags='--txpool.nolocals'`, - ] - } - - async allocateStaticIPs() { - await concurrentMap(5, range(this.deploymentConfig.replicas), (i) => - registerIPAddress(this.getIPAddressName(i), this.deploymentConfig.clusterConfig.zone) - ) - await Promise.all([this.deallocateIPsWithNames(await this.ipAddressNamesToDeallocate())]) - return Promise.all( - range(this.deploymentConfig.replicas).map((index: number) => this.getFullNodeIP(index)) - ) - } - - async getFullNodeIP(index: number): Promise<string> { - return retrieveIPAddress(this.getIPAddressName(index), this.deploymentConfig.clusterConfig.zone) - } - - async ipAddressNamesToDeallocate(intendedReplicas: number = this.deploymentConfig.replicas) { - const [allMatchesRaw] = await execCmd( - `gcloud compute addresses list --filter="name~'${this.ipAddressPrefix}-[0-9]+'" --format json` - ) - const allMatches = JSON.parse(allMatchesRaw) - const getReplicaFromIPName = (ipName: string) => { - const regex = new RegExp(`${this.ipAddressPrefix}-([0-9]+)`, 'g') - const matches = regex.exec(ipName) - if (matches == null) { - return null - } - return parseInt(matches[1], 10) - } - return allMatches - .filter((ipDescription: any) => { - const replica = getReplicaFromIPName(ipDescription.name) - return replica != null && replica >= intendedReplicas - }) - .map((ipDescription: any) => ipDescription.name) - } - - async deallocateIPsWithNames(names: string[]) { - await Promise.all( - names.map((name: string) => deleteIPAddress(name, this.deploymentConfig.clusterConfig.zone)) - ) - } - - async deallocateAllIPs() { - const ipNamesToDeallocate = await this.ipAddressNamesToDeallocate(0) - await this.deallocateIPsWithNames(ipNamesToDeallocate) - } - - getIPAddressName(index: number) { - return
`${this.ipAddressPrefix}-${index}` - } - - get ipAddressPrefix() { - return `${this.celoEnv}-${this.deploymentConfig.clusterConfig.clusterName}` - } - - get deploymentConfig(): GCPFullNodeDeploymentConfig { - return this._deploymentConfig as GCPFullNodeDeploymentConfig - } -} diff --git a/packages/celotool/src/lib/k8s-fullnode/utils.ts b/packages/celotool/src/lib/k8s-fullnode/utils.ts deleted file mode 100644 index 67d6e839d65..00000000000 --- a/packages/celotool/src/lib/k8s-fullnode/utils.ts +++ /dev/null @@ -1,24 +0,0 @@ -import { CloudProvider } from '../k8s-cluster/base' -import { AksFullNodeDeployer, AksFullNodeDeploymentConfig } from './aks' -import { BaseFullNodeDeployer, BaseFullNodeDeploymentConfig } from './base' -import { GCPFullNodeDeployer, GCPFullNodeDeploymentConfig } from './gcp' - -const fullNodeDeployerByCloudProvider: { - [key in CloudProvider]: ( - deploymentConfig: BaseFullNodeDeploymentConfig, - celoEnv: string - ) => BaseFullNodeDeployer -} = { - [CloudProvider.AZURE]: (deploymentConfig: BaseFullNodeDeploymentConfig, celoEnv: string) => - new AksFullNodeDeployer(deploymentConfig as AksFullNodeDeploymentConfig, celoEnv), - [CloudProvider.GCP]: (deploymentConfig: BaseFullNodeDeploymentConfig, celoEnv: string) => - new GCPFullNodeDeployer(deploymentConfig as GCPFullNodeDeploymentConfig, celoEnv), -} - -export function getFullNodeDeployer( - cloudProvider: CloudProvider, - celoEnv: string, - deploymentConfig: BaseFullNodeDeploymentConfig -) { - return fullNodeDeployerByCloudProvider[cloudProvider](deploymentConfig, celoEnv) -} diff --git a/packages/celotool/src/lib/k8s-oracle/aks-hsm.ts b/packages/celotool/src/lib/k8s-oracle/aks-hsm.ts deleted file mode 100644 index b634b0c4253..00000000000 --- a/packages/celotool/src/lib/k8s-oracle/aks-hsm.ts +++ /dev/null @@ -1,179 +0,0 @@ -import { - assignRoleIdempotent, - createIdentityIdempotent, - deleteIdentity, - getAKSManagedServiceIdentityObjectId, - getAKSServicePrincipalObjectId, - getIdentity, -} 
from '../azure' -import { execCmdWithExitOnFailure } from '../cmd-utils' -import { AksClusterConfig } from '../k8s-cluster/aks' -import { BaseOracleDeploymentConfig, OracleIdentity } from './base' -import { RbacOracleDeployer } from './rbac' - -/** - * Contains information needed when using Azure HSM signing - */ -export interface AksHsmOracleIdentity extends OracleIdentity { - keyVaultName: string - resourceGroup: string -} - -export interface AksHsmOracleDeploymentConfig extends BaseOracleDeploymentConfig { - clusterConfig: AksClusterConfig - identities: AksHsmOracleIdentity[] -} - -/** - * AksHsmOracleDeployer manages deployments for HSM-based oracles on AKS - */ -export class AksHsmOracleDeployer extends RbacOracleDeployer { - // Explicitly specify this so we enforce AksHsmOracleDeploymentConfig - constructor(deploymentConfig: AksHsmOracleDeploymentConfig, celoEnv: string) { - super(deploymentConfig, celoEnv) - } - - async removeChart() { - await super.removeChart() - for (const identity of this.deploymentConfig.identities) { - await this.deleteAzureHsmIdentity(identity) - } - } - - async helmParameters() { - return [ - ...(await super.helmParameters()), - `--set kube.cloudProvider=azure`, - `--set oracle.azureHsm.initTryCount=5`, - `--set oracle.azureHsm.initMaxRetryBackoffMs=30000`, - `--set oracle.walletType=AZURE_HSM`, - ] - } - - async oracleIdentityHelmParameters() { - let params = await super.oracleIdentityHelmParameters() - for (let i = 0; i < this.replicas; i++) { - const oracleIdentity = this.deploymentConfig.identities[i] - const prefix = `--set oracle.identities[${i}]` - const azureIdentity = await this.createOracleAzureIdentityIdempotent(oracleIdentity) - params = params.concat([ - `${prefix}.azure.id=${azureIdentity.id}`, - `${prefix}.azure.clientId=${azureIdentity.clientId}`, - `${prefix}.azure.keyVaultName=${oracleIdentity.keyVaultName}`, - ]) - } - return params - } - - /** - * Creates an Azure identity for a specific oracle identity with the - 
* appropriate permissions to use its HSM. - * Idempotent. - */ - async createOracleAzureIdentityIdempotent(oracleHsmIdentity: AksHsmOracleIdentity) { - const identity = await createIdentityIdempotent( - this.clusterConfig, - this.azureHsmIdentityName(oracleHsmIdentity) - ) - // We want to grant the identity for the cluster permission to manage the oracle identity. - // Get the correct object ID depending on the cluster configuration, either - // the service principal or the managed service identity. - // See https://github.com/Azure/aad-pod-identity/blob/b547ba86ab9b16d238db8a714aaec59a046afdc5/docs/readmes/README.role-assignment.md#obtaining-the-id-of-the-managed-identity--service-principal - let assigneeObjectId = await getAKSServicePrincipalObjectId(this.clusterConfig) - let assigneePrincipalType = 'ServicePrincipal' - if (!assigneeObjectId) { - assigneeObjectId = await getAKSManagedServiceIdentityObjectId(this.clusterConfig) - assigneePrincipalType = 'MSI' - } - await assignRoleIdempotent( - assigneeObjectId, - assigneePrincipalType, - identity.id, - 'Managed Identity Operator' - ) - // Allow the oracle identity to access the correct key vault - await this.setOracleKeyVaultPolicyIdempotent(oracleHsmIdentity, identity) - return identity - } - - /** - * Ensures an Azure identity has the appropriate permissions to use its HSM - * via a key vault policy. - * Idempotent. 
- */ - async setOracleKeyVaultPolicyIdempotent( - oracleHsmIdentity: AksHsmOracleIdentity, - azureIdentity: any - ) { - const keyPermissions = ['get', 'list', 'sign'] - const keyVaultResourceGroup = oracleHsmIdentity.resourceGroup - const [keyVaultPoliciesStr] = await execCmdWithExitOnFailure( - `az keyvault show --name ${ - oracleHsmIdentity.keyVaultName - } -g ${keyVaultResourceGroup} --query "properties.accessPolicies[?objectId == '${ - azureIdentity.principalId - }' && sort(permissions.keys) == [${keyPermissions.map((perm) => `'${perm}'`).join(', ')}]]"` - ) - const keyVaultPolicies = JSON.parse(keyVaultPoliciesStr) - if (keyVaultPolicies.length) { - console.info( - `Skipping setting key permissions, ${keyPermissions.join(' ')} already set for vault ${ - oracleHsmIdentity.keyVaultName - } and identity objectId ${azureIdentity.principalId}` - ) - return - } - console.info( - `Setting key permissions ${keyPermissions.join(' ')} for vault ${ - oracleHsmIdentity.keyVaultName - } and identity objectId ${azureIdentity.principalId}` - ) - return execCmdWithExitOnFailure( - `az keyvault set-policy --name ${ - oracleHsmIdentity.keyVaultName - } --key-permissions ${keyPermissions.join(' ')} --object-id ${ - azureIdentity.principalId - } -g ${keyVaultResourceGroup}` - ) - } - - /** - * Deletes the key vault policy and the oracle's managed identity - */ - async deleteAzureHsmIdentity(oracleHsmIdentity: AksHsmOracleIdentity) { - const identityName = this.azureHsmIdentityName(oracleHsmIdentity) - console.info(`Deleting Azure identity ${identityName}`) - await this.deleteOracleKeyVaultPolicy(oracleHsmIdentity) - return deleteIdentity(this.clusterConfig, identityName) - } - - /** - * Deletes the key vault policy that allows an Azure managed identity to use - * its HSM. 
- */ - async deleteOracleKeyVaultPolicy(oracleHsmIdentity: AksHsmOracleIdentity) { - const azureIdentity = await getIdentity( - this.clusterConfig, - this.azureHsmIdentityName(oracleHsmIdentity) - ) - return execCmdWithExitOnFailure( - `az keyvault delete-policy --name ${oracleHsmIdentity.keyVaultName} --object-id ${azureIdentity.principalId} -g ${oracleHsmIdentity.resourceGroup}` - ) - } - - /** - * @return the intended name of an azure identity - */ - azureHsmIdentityName(identity: AksHsmOracleIdentity) { - // Max length from https://docs.microsoft.com/en-us/azure/azure-resource-manager/management/resource-name-rules#microsoftmanagedidentity - return `${identity.keyVaultName}-${identity.currencyPair}-${identity.address}`.substring(0, 128) - } - - get deploymentConfig(): AksHsmOracleDeploymentConfig { - return this._deploymentConfig as AksHsmOracleDeploymentConfig - } - - get clusterConfig(): AksClusterConfig { - return this.deploymentConfig.clusterConfig - } -} diff --git a/packages/celotool/src/lib/k8s-oracle/base.ts b/packages/celotool/src/lib/k8s-oracle/base.ts deleted file mode 100644 index e6ef12c8a0d..00000000000 --- a/packages/celotool/src/lib/k8s-oracle/base.ts +++ /dev/null @@ -1,132 +0,0 @@ -import { - getFornoUrl, - getFornoWebSocketUrl, - getLightNodeHttpRpcInternalUrl, - getLightNodeWebSocketRpcInternalUrl, -} from 'src/lib/endpoints' -import { envVar, fetchEnv, fetchEnvOrFallback } from 'src/lib/env-utils' -import { - installGenericHelmChart, - removeGenericHelmChart, - upgradeGenericHelmChart, -} from 'src/lib/helm_deploy' - -const helmChartPath = '../helm-charts/oracle' - -export type CurrencyPair = 'CELOUSD' | 'CELOEUR' | 'CELOBTC' - -/** - * Represents the identity of a single oracle - */ -export interface OracleIdentity { - address: string - currencyPair: CurrencyPair -} - -export interface BaseOracleDeploymentConfig { - context: string - currencyPair: CurrencyPair - identities: OracleIdentity[] - useForno: boolean -} - -export abstract class 
BaseOracleDeployer { - protected _deploymentConfig: BaseOracleDeploymentConfig - private _celoEnv: string - - constructor(deploymentConfig: BaseOracleDeploymentConfig, celoEnv: string) { - this._deploymentConfig = deploymentConfig - this._celoEnv = celoEnv - } - - async installChart() { - return installGenericHelmChart({ - namespace: this.celoEnv, - releaseName: this.releaseName, - chartDir: helmChartPath, - parameters: await this.helmParameters(), - buildDependencies: true, - valuesOverrideFile: `${this.currencyPair}.yaml`, - }) - } - - async upgradeChart() { - return upgradeGenericHelmChart({ - namespace: this.celoEnv, - releaseName: this.releaseName, - chartDir: helmChartPath, - parameters: await this.helmParameters(), - buildDependencies: true, - valuesOverrideFile: `${this.currencyPair}.yaml`, - }) - } - - async removeChart() { - await removeGenericHelmChart(this.releaseName, this.celoEnv) - } - - async helmParameters() { - const httpRpcProviderUrl = this.deploymentConfig.useForno - ? getFornoUrl(this.celoEnv) - : getLightNodeHttpRpcInternalUrl(this.celoEnv) - const wsRpcProviderUrl = this.deploymentConfig.useForno - ? getFornoWebSocketUrl(this.celoEnv) - : getLightNodeWebSocketRpcInternalUrl(this.celoEnv) - return [ - `--set-literal oracle.api_keys=${fetchEnv(envVar.ORACLE_FX_ADAPTERS_API_KEYS)}`, - `--set environment.name=${this.celoEnv}`, - `--set image.repository=${fetchEnv(envVar.ORACLE_DOCKER_IMAGE_REPOSITORY)}`, - `--set image.tag=${fetchEnv(envVar.ORACLE_DOCKER_IMAGE_TAG)}`, - `--set oracle.replicas=${this.replicas}`, - `--set oracle.rpcProviderUrls.http=${httpRpcProviderUrl}`, - `--set oracle.rpcProviderUrls.ws=${wsRpcProviderUrl}`, - `--set-string oracle.unusedOracleAddresses='${fetchEnvOrFallback( - envVar.ORACLE_UNUSED_ORACLE_ADDRESSES, - '' - ) - .split(',') - .join('\\,')}'`, - ].concat(await this.oracleIdentityHelmParameters()) - } - - /** - * Returns an array of helm command line parameters for the oracle identities. 
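The `helmParameters` above escapes commas in `oracle.unusedOracleAddresses` because Helm treats unescaped commas in `--set` values as list separators. A small sketch of that escaping, with an illustrative helper name:

```typescript
// Helm parses `--set key=a,b` as two values, so literal commas in a
// single value must be escaped as `\,`.
function escapeHelmSetValue(value: string): string {
  return value.split(',').join('\\,')
}

const addresses = '0xabc,0xdef'
const flag = `--set-string oracle.unusedOracleAddresses='${escapeHelmSetValue(addresses)}'`
// flag contains: --set-string oracle.unusedOracleAddresses='0xabc\,0xdef'
```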
- */ - async oracleIdentityHelmParameters() { - const params: string[] = [] - for (let i = 0; i < this.replicas; i++) { - const oracleIdentity = this.deploymentConfig.identities[i] - const prefix = `--set oracle.identities[${i}]` - params.push(`${prefix}.address=${oracleIdentity.address}`) - } - return params - } - - get deploymentConfig() { - return this._deploymentConfig - } - - get releaseName() { - return `${this.celoEnv}-${this.currencyPair.toLocaleLowerCase()}-oracle` - } - - get kubeNamespace() { - return this.celoEnv - } - - get celoEnv(): string { - return this._celoEnv - } - - get replicas(): number { - return this.deploymentConfig.identities.length - } - - get context(): string { - return this.deploymentConfig.context - } - - get currencyPair(): CurrencyPair { - return this.deploymentConfig.currencyPair - } -} diff --git a/packages/celotool/src/lib/k8s-oracle/pkey.ts b/packages/celotool/src/lib/k8s-oracle/pkey.ts deleted file mode 100644 index 1b9e0cae79d..00000000000 --- a/packages/celotool/src/lib/k8s-oracle/pkey.ts +++ /dev/null @@ -1,37 +0,0 @@ -import { BaseOracleDeployer, BaseOracleDeploymentConfig, OracleIdentity } from './base' - -export interface PrivateKeyOracleIdentity extends OracleIdentity { - privateKey: string -} - -export interface PrivateKeyOracleDeploymentConfig extends BaseOracleDeploymentConfig { - identities: PrivateKeyOracleIdentity[] -} - -/** - * PrivateKeyOracleDeployer cloud-agnostically manages deployments for oracles - * that are using in-memory signing via private keys - */ -export class PrivateKeyOracleDeployer extends BaseOracleDeployer { - constructor(deploymentConfig: PrivateKeyOracleDeploymentConfig, celoEnv: string) { - super(deploymentConfig, celoEnv) - } - - async helmParameters() { - return [...(await super.helmParameters()), `--set oracle.walletType=PRIVATE_KEY`] - } - - async oracleIdentityHelmParameters() { - const params: string[] = await super.oracleIdentityHelmParameters() - for (let i = 0; i < this.replicas; 
i++) { - const oracleIdentity = this.deploymentConfig.identities[i] - const prefix = `--set oracle.identities[${i}]` - params.push(`${prefix}.privateKey=${oracleIdentity.privateKey}`) - } - return params - } - - get deploymentConfig(): PrivateKeyOracleDeploymentConfig { - return this._deploymentConfig as PrivateKeyOracleDeploymentConfig - } -} diff --git a/packages/celotool/src/lib/k8s-oracle/rbac.ts b/packages/celotool/src/lib/k8s-oracle/rbac.ts deleted file mode 100644 index 7d45f97b42b..00000000000 --- a/packages/celotool/src/lib/k8s-oracle/rbac.ts +++ /dev/null @@ -1,73 +0,0 @@ -import { - installGenericHelmChart, - removeGenericHelmChart, - upgradeGenericHelmChart, -} from 'src/lib/helm_deploy' -import { BaseOracleDeployer } from './base' - -// Oracle RBAC------ -// We need the oracle pods to be able to change their label to accommodate -// limitations in aad-pod-identity & statefulsets (see https://github.com/Azure/aad-pod-identity/issues/237#issuecomment-611672987) -// To do this, we use an auth token that we get using the resources in the `oracle-rbac` chart - -const rbacHelmChartPath = '../helm-charts/oracle-rbac' - -/** - * RbacOracleDeployer cloud-agnostically manages deployments for oracles - * whose pods must change their metadata in order to accommodate limitations - * in pod identity solutions (like Azure's aad-pod-identity). - * This will create a k8s service account for each oracle pod that can modify - * pod metadata, and will ensure each SA's credentials make their way to the helm chart.
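The RBAC deployer below derives one service-account secret name per oracle replica, indexed from zero. A self-contained sketch of that naming scheme (the release name shown is illustrative):

```typescript
// One Kubernetes service-account secret per oracle replica, named by
// index, mirroring rbacServiceAccountSecretNames in the chart below.
function secretNames(rbacReleaseName: string, replicas: number): string[] {
  return [...Array(replicas).keys()].map((i) => `${rbacReleaseName}-secret-${i}`)
}

const names = secretNames('mainnet-celousd-oracle-rbac', 2)
// names → ['mainnet-celousd-oracle-rbac-secret-0', 'mainnet-celousd-oracle-rbac-secret-1']
```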
- */ -export abstract class RbacOracleDeployer extends BaseOracleDeployer { - async installChart() { - await installGenericHelmChart({ - namespace: this.celoEnv, - releaseName: this.rbacReleaseName(), - chartDir: rbacHelmChartPath, - parameters: this.rbacHelmParameters(), - }) - return super.installChart() - } - - async upgradeChart() { - await upgradeGenericHelmChart({ - namespace: this.celoEnv, - releaseName: this.rbacReleaseName(), - chartDir: rbacHelmChartPath, - parameters: this.rbacHelmParameters(), - }) - return super.upgradeChart() - } - - async removeChart() { - await removeGenericHelmChart(this.rbacReleaseName(), this.celoEnv) - return super.removeChart() - } - - async helmParameters() { - const kubeServiceAccountSecretNames = await this.rbacServiceAccountSecretNames() - return [ - ...(await super.helmParameters()), - `--set kube.serviceAccountSecretNames='{${kubeServiceAccountSecretNames.join(',')}}'`, - ] - } - - rbacHelmParameters() { - return [ - `--set environment.name=${this.celoEnv}`, - `--set environment.currencyPair=${this.currencyPair}`, - `--set oracle.replicas=${this.replicas}`, - ] - } - - async rbacServiceAccountSecretNames() { - return [...Array(this.replicas).keys()].map((i) => { - return `${this.rbacReleaseName()}-secret-${i}` - }) - } - - rbacReleaseName() { - return `${this.celoEnv}-${this.currencyPair.toLocaleLowerCase()}-oracle-rbac` - } -} diff --git a/packages/celotool/src/lib/kubernetes.ts b/packages/celotool/src/lib/kubernetes.ts deleted file mode 100644 index ce4f462b6f8..00000000000 --- a/packages/celotool/src/lib/kubernetes.ts +++ /dev/null @@ -1,110 +0,0 @@ -import { execCmd, execCmdWithExitOnFailure } from './cmd-utils' -import { envVar, fetchEnv } from './env-utils' - -export async function scaleResource( - namespace: string, - type: string, - resourceName: string, - replicaCount: number, - allowFail: boolean = false -) { - const execFn = allowFail ? 
execCmd : execCmdWithExitOnFailure - const run = () => - execFn( - `kubectl scale ${type} ${resourceName} --replicas=${replicaCount} --namespace ${namespace}` - ) - if (allowFail) { - try { - return run() - } catch (e) { - console.info('Error scaling resource, not failing', e) - return Promise.resolve() - } - } - return run() -} - -export async function getStatefulSetReplicas(namespace: string, resourceName: string) { - const [replicas] = await execCmd( - `kubectl get statefulset ${resourceName} --namespace ${namespace} -o jsonpath={.status.replicas}` - ) - return parseInt(replicas, 10) -} - -export async function getRandomTxNodeIP(namespace: string) { - const txNodes = parseInt(fetchEnv(envVar.TX_NODES), 10) - const randomNumber = Math.floor(Math.random() * txNodes) - const [address] = await execCmdWithExitOnFailure( - `kubectl get service/${namespace}-service-${randomNumber} --namespace ${namespace} -o jsonpath='{.status.loadBalancer.ingress[0].ip}'` - ) - return address -} - -export async function deleteResource( - namespace: string, - type: string, - resourceName: string, - allowFail: boolean = false -) { - const execFn = allowFail ? execCmd : execCmdWithExitOnFailure - const run = () => execFn(`kubectl delete ${type} ${resourceName} --namespace ${namespace}`) - if (allowFail) { - try { - // By awaiting here, we ensure that a rejected promise gets caught - return await run() - } catch (e) { - console.info('Error deleting resource, not failing', e) - return Promise.resolve() - } - } - return run() -} - -/** - * Returns a sorted array of used node ports - */ -export async function getAllUsedNodePorts( - namespace?: string, - cmdFlags?: { [key: string]: string } -) { - const namespaceFlag = namespace ? `--namespace ${namespace}` : `--all-namespaces` - const cmdFlagStrs = cmdFlags - ? 
Object.entries(cmdFlags).map(([flag, value]) => `--${flag} ${value}`) - : [] - const [output] = await execCmd( - `kubectl get svc ${namespaceFlag} ${cmdFlagStrs.join( - ' ' - )} -o go-template='{{range .items}}{{range .spec.ports}}{{if .nodePort}}{{.nodePort}}{{"\\n"}}{{end}}{{end}}{{end}}'` - ) - const nodePorts = output - .trim() - .split('\n') - .filter((portStr: string) => portStr.length) - .map((portStr: string) => parseInt(portStr, 10)) - // Remove duplicates and sort low -> high - return Array.from(new Set(nodePorts)).sort((a: number, b: number) => a - b) -} - -export async function getService(serviceName: string, namespace: string) { - try { - const [output] = await execCmd( - `kubectl get svc ${serviceName} --namespace ${namespace} -o json` - ) - return JSON.parse(output) - } catch (e) { - return undefined - } -} - -export async function getServerVersion() { - const [output] = await execCmd(`kubectl version -o json`) - const jsonOutput = JSON.parse(output) - const [minorNumberStr] = jsonOutput.serverVersion.minor.match(/^([0-9]+)/g) - if (!minorNumberStr) { - throw Error('Could not get minor version') - } - return { - major: parseInt(jsonOutput.serverVersion.major, 10), - minor: parseInt(minorNumberStr, 10), - } -} diff --git a/packages/celotool/src/lib/leaderboard.ts b/packages/celotool/src/lib/leaderboard.ts deleted file mode 100644 index 9ddab031615..00000000000 --- a/packages/celotool/src/lib/leaderboard.ts +++ /dev/null @@ -1,56 +0,0 @@ -import { execCmd } from 'src/lib/cmd-utils' -import { envVar, fetchEnv } from 'src/lib/env-utils' -import { - installGenericHelmChart, - removeGenericHelmChart, - upgradeGenericHelmChart, -} from 'src/lib/helm_deploy' -const yaml = require('js-yaml') -const helmChartPath = '../helm-charts/leaderboard' - -export async function installHelmChart(celoEnv: string) { - return installGenericHelmChart({ - namespace: celoEnv, - releaseName: releaseName(celoEnv), - chartDir: helmChartPath, - parameters: await 
helmParameters(celoEnv), - }) -} - -export async function removeHelmRelease(celoEnv: string) { - await removeGenericHelmChart(releaseName(celoEnv), celoEnv) -} - -export async function upgradeHelmChart(celoEnv: string) { - await upgradeGenericHelmChart({ - namespace: celoEnv, - releaseName: releaseName(celoEnv), - chartDir: helmChartPath, - parameters: await helmParameters(celoEnv), - }) -} - -export async function helmParameters(celoEnv: string) { - const dbValues = await getBlockscoutHelmValues(celoEnv) - return [ - `--set leaderboard.db.connection_name=${dbValues.connection_name}`, - `--set leaderboard.db.username=${dbValues.username}`, - `--set leaderboard.db.password=${dbValues.password}`, - `--set leaderboard.image.repository=${fetchEnv(envVar.LEADERBOARD_DOCKER_IMAGE_REPOSITORY)}`, - `--set leaderboard.image.tag=${fetchEnv(envVar.LEADERBOARD_DOCKER_IMAGE_TAG)}`, - `--set leaderboard.sheet=${fetchEnv(envVar.LEADERBOARD_SHEET)}`, - `--set leaderboard.token=${fetchEnv(envVar.LEADERBOARD_TOKEN)}`, - `--set leaderboard.credentials=${fetchEnv(envVar.LEADERBOARD_CREDENTIALS)}`, - `--set leaderboard.web3=https://${celoEnv}-forno.${fetchEnv(envVar.CLUSTER_DOMAIN_NAME)}.org`, - ] -} - -function releaseName(celoEnv: string) { - return `${celoEnv}-leaderboard` -} - -export async function getBlockscoutHelmValues(celoEnv: string) { - const [output] = await execCmd(`helm get values ${celoEnv}-blockscout`) - const blockscoutValues: any = yaml.safeLoad(output) - return blockscoutValues.blockscout.db -} diff --git a/packages/celotool/src/lib/load-test-logs-collector.ts b/packages/celotool/src/lib/load-test-logs-collector.ts deleted file mode 100644 index 2c89decddca..00000000000 --- a/packages/celotool/src/lib/load-test-logs-collector.ts +++ /dev/null @@ -1,231 +0,0 @@ -import { - LOG_TAG_BLOCKSCOUT_TIME_MEASUREMENT, - LOG_TAG_BLOCKSCOUT_TIMEOUT, - LOG_TAG_BLOCKSCOUT_VALIDATION_ERROR, - LOG_TAG_GETH_RPC_ERROR, - LOG_TAG_TRANSACTION_ERROR, - 
LOG_TAG_TRANSACTION_VALIDATION_ERROR, - LOG_TAG_TX_TIME_MEASUREMENT, -} from 'src/lib/geth' - -export interface TimeStatsBase { - count: number - totalTime: number - maximalTime: number - minimalTime: number -} - -export interface TimeStatsAccumulated extends TimeStatsBase { - durations: number[] -} - -export interface BlockscoutStats extends TimeStatsAccumulated { - timeouts: number - validationErrors: any[] -} - -export interface ErrorsStatsBase { - count: number - errors: any[] -} - -export interface TransactionsErrorsState extends ErrorsStatsBase { - validationCount: number - validationErrors: any[] -} - -export interface LogsAggregator { - /* State storage */ - podsStats: any - blockscoutStats: BlockscoutStats - transactionsStats: { [key: string]: TimeStatsAccumulated } - transactionsErrors: { [key: string]: TransactionsErrorsState } - gethRPCErrors: any[] - - /* Public handler for mutating the state */ - handleNewEntry: (json: any) => void - - handleMultipleEntries: (messages: any[]) => void - - /* Returns summary in JSON */ - getSummary: () => string - - /* Helper functions */ - _initTransactionsStatsForToken: (token: string) => void - _initTransactionsErrorsForToken: (token: string) => void - _updateTimeStats: (json: any, statsStorage: TimeStatsAccumulated) => void - - /* State mutators depending on the event type */ - _handleTxTimeMeasurement: (json: any, token: string) => void - _handleTxError: (json: any, token: string) => void - _handleTxValidationError: (json: any, token: string) => void - _handleBlockscoutTimeout: () => void - _handleBlockscoutTimeMeasurement: (json: any) => void - _handleBlockscoutValidationError: (json: any) => void - _handleGethRPCError: (json: any) => void -} - -export const createLogsAggregator = () => { - const allowedTags = [ - LOG_TAG_TX_TIME_MEASUREMENT, - LOG_TAG_BLOCKSCOUT_TIME_MEASUREMENT, - LOG_TAG_TRANSACTION_ERROR, - LOG_TAG_TRANSACTION_VALIDATION_ERROR, - LOG_TAG_BLOCKSCOUT_TIMEOUT, -
LOG_TAG_BLOCKSCOUT_VALIDATION_ERROR, - LOG_TAG_GETH_RPC_ERROR, - ] - - const logsAggregator: LogsAggregator = { - podsStats: {}, - blockscoutStats: { - count: 0, - totalTime: 0, - maximalTime: 0, - minimalTime: Number.MAX_SAFE_INTEGER, - timeouts: 0, - validationErrors: [], - durations: [], - }, - transactionsStats: {}, - transactionsErrors: {}, - gethRPCErrors: [], - - _initTransactionsStatsForToken(token: string) { - if (!this.transactionsStats[token]) { - this.transactionsStats[token] = { - count: 0, - totalTime: 0, - maximalTime: 0, - minimalTime: Number.MAX_SAFE_INTEGER, - durations: [], - } - } - }, - _initTransactionsErrorsForToken(token: string) { - if (!this.transactionsErrors[token]) { - this.transactionsErrors[token] = { - count: 0, - errors: [], - validationCount: 0, - validationErrors: [], - } - } - }, - _updateTimeStats(json: any, statsStorage: TimeStatsAccumulated) { - try { - const timeTaken = parseInt(json.p_time, 10) - statsStorage.count += 1 - statsStorage.totalTime += timeTaken - statsStorage.maximalTime = Math.max(statsStorage.maximalTime, timeTaken) - statsStorage.minimalTime = Math.min(statsStorage.minimalTime, timeTaken) - statsStorage.durations.push(timeTaken) - } catch (ignored) { - // ignore errors - } - }, - - _handleTxTimeMeasurement(json: any, token: string) { - this._initTransactionsStatsForToken(token) - this._updateTimeStats(json, this.transactionsStats[token]) - }, - _handleTxError(json: any, token: string) { - this._initTransactionsErrorsForToken(token) - if (json.error) { - this.transactionsErrors[token].count += 1 - this.transactionsErrors[token].errors.push(json.error) - } - }, - _handleTxValidationError(json: any, token: string) { - this._initTransactionsErrorsForToken(token) - if (json.error) { - this.transactionsErrors[token].validationCount += 1 - this.transactionsErrors[token].validationErrors.push(json.error) - } - }, - _handleBlockscoutTimeout() { - this.blockscoutStats.timeouts += 1 - }, - 
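The aggregator's `_updateTimeStats` keeps running count/total/min/max as each measurement arrives, seeding `minimalTime` with `Number.MAX_SAFE_INTEGER` so the first sample always replaces it. A minimal standalone sketch of that accumulator:

```typescript
// Running time statistics: update count, total, max and min per sample.
interface TimeStats {
  count: number
  totalTime: number
  maximalTime: number
  minimalTime: number
}

function updateTimeStats(stats: TimeStats, timeTaken: number): void {
  stats.count += 1
  stats.totalTime += timeTaken
  stats.maximalTime = Math.max(stats.maximalTime, timeTaken)
  stats.minimalTime = Math.min(stats.minimalTime, timeTaken)
}

const stats: TimeStats = {
  count: 0,
  totalTime: 0,
  maximalTime: 0,
  minimalTime: Number.MAX_SAFE_INTEGER, // any real sample is smaller
}
for (const t of [120, 80, 200]) updateTimeStats(stats, t)
// stats → { count: 3, totalTime: 400, maximalTime: 200, minimalTime: 80 }
```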
_handleBlockscoutTimeMeasurement(json: any) { - this._updateTimeStats(json, this.blockscoutStats) - }, - _handleBlockscoutValidationError(json: any) { - if (json.error) { - this.blockscoutStats.validationErrors.push(json.error) - } - }, - _handleGethRPCError(json: any) { - if (json.error) { - this.gethRPCErrors.push(json.error) - } - }, - - handleMultipleEntries(messages: any[]) { - messages.forEach((json) => { - this.handleNewEntry(json) - }) - }, - - handleNewEntry(json: any) { - if (allowedTags.indexOf(json.tag) < 0) { - return - } - - if (!this.podsStats[json.podID]) { - this.podsStats[json.podID] = 1 - } else { - this.podsStats[json.podID] += 1 - } - - const token = json.token - switch (json.tag) { - case LOG_TAG_TX_TIME_MEASUREMENT: - this._handleTxTimeMeasurement(json, token) - break - - case LOG_TAG_TRANSACTION_ERROR: - this._handleTxError(json, token) - break - - case LOG_TAG_TRANSACTION_VALIDATION_ERROR: - this._handleTxValidationError(json, token) - break - - case LOG_TAG_BLOCKSCOUT_TIME_MEASUREMENT: - this._handleBlockscoutTimeMeasurement(json) - break - - case LOG_TAG_BLOCKSCOUT_TIMEOUT: - this._handleBlockscoutTimeout() - break - - case LOG_TAG_BLOCKSCOUT_VALIDATION_ERROR: - this._handleBlockscoutValidationError(json) - break - - case LOG_TAG_GETH_RPC_ERROR: - this._handleGethRPCError(json) - break - - default: - break - } - }, - - getSummary() { - const collectedLogsSummary = { - podsStats: this.podsStats, - blockscoutStats: this.blockscoutStats, - transactionsStats: this.transactionsStats, - transactionsErrors: this.transactionsErrors, - gethRPCErrors: this.gethRPCErrors, - } - - const summaryJSON = JSON.stringify(collectedLogsSummary, null, 2) - - return summaryJSON - }, - } - - return logsAggregator -} diff --git a/packages/celotool/src/lib/load-test.ts b/packages/celotool/src/lib/load-test.ts deleted file mode 100644 index 052cb451bf8..00000000000 --- a/packages/celotool/src/lib/load-test.ts +++ /dev/null @@ -1,128 +0,0 @@ -import sleep from 
'sleep-promise' -import { LoadTestArgv } from 'src/cmds/deploy/initial/load-test' -import { getBlockscoutUrl } from 'src/lib/endpoints' -import { envVar, fetchEnv } from 'src/lib/env-utils' -import { getEnodesWithExternalIPAddresses } from 'src/lib/geth' -import { - installGenericHelmChart, - removeGenericHelmChart, - saveHelmValuesFile, - upgradeGenericHelmChart, -} from 'src/lib/helm_deploy' -import { scaleResource } from 'src/lib/kubernetes' - -const chartDir = '../helm-charts/load-test/' - -function releaseName(celoEnv: string) { - return `${celoEnv}-load-test` -} - -export async function installHelmChart( - celoEnv: string, - blockscoutProb: number, - delayMs: number, - replicas: number, - threads: number -) { - const params = await helmParameters(celoEnv, blockscoutProb, delayMs, replicas, threads) - return installGenericHelmChart({ - namespace: celoEnv, - releaseName: releaseName(celoEnv), - chartDir, - parameters: params, - }) -} - -export async function upgradeHelmChart( - celoEnv: string, - blockscoutProb: number, - delayMs: number, - replicas: number, - threads: number -) { - const params = await helmParameters(celoEnv, blockscoutProb, delayMs, replicas, threads) - await upgradeGenericHelmChart({ - namespace: celoEnv, - releaseName: releaseName(celoEnv), - chartDir, - parameters: params, - }) -} - -// scales down all pods, upgrades, then scales back up -export async function resetAndUpgrade( - celoEnv: string, - blockscoutProb: number, - delayMs: number, - replicas: number, - threads: number -) { - const loadTestStatefulSetName = `${celoEnv}-load-test` - - console.info('Scaling load-test StatefulSet down to 0...') - await scaleResource(celoEnv, 'StatefulSet', loadTestStatefulSetName, 0) - - await sleep(3000) - - await upgradeHelmChart(celoEnv, blockscoutProb, delayMs, replicas, threads) - - await sleep(3000) - - console.info(`Scaling load-test StatefulSet back up to ${replicas}...`) - await scaleResource(celoEnv, 'StatefulSet', loadTestStatefulSetName, 
replicas) -} - -export function setArgvDefaults(argv: LoadTestArgv) { - // Variables from the .env file are not set as environment variables - // by the time the builder is run, so we set the default here - if (argv.delay < 0) { - argv.delay = parseInt(fetchEnv(envVar.LOAD_TEST_TX_DELAY_MS), 10) - } - if (argv.replicas < 0) { - argv.replicas = parseInt(fetchEnv(envVar.LOAD_TEST_CLIENTS), 10) - } - if (argv.threads < 0) { - argv.threads = parseInt(fetchEnv(envVar.LOAD_TEST_THREADS), 10) - } -} - -export async function removeHelmRelease(celoEnv: string) { - return removeGenericHelmChart(`${celoEnv}-load-test`, celoEnv) -} - -async function helmParameters( - celoEnv: string, - blockscoutProb: number, - delayMs: number, - replicas: number, - threads: number -) { - const enodes = await getEnodesWithExternalIPAddresses(celoEnv) - const staticNodesJsonB64 = Buffer.from(JSON.stringify(enodes)).toString('base64') - // Uses the genesis file from google storage to ensure it's the correct genesis for the network - const valueFilePath = `/tmp/${celoEnv}-testnet-values.yaml` - await saveHelmValuesFile(celoEnv, valueFilePath, true, true) - - return [ - `-f ${valueFilePath}`, - `--set geth.accountSecret="${fetchEnv(envVar.GETH_ACCOUNT_SECRET)}"`, - `--set blockscout.measurePercent=${blockscoutProb}`, - `--set blockscout.url=${getBlockscoutUrl(celoEnv)}`, - `--set celotool.image.repository=${fetchEnv(envVar.CELOTOOL_DOCKER_IMAGE_REPOSITORY)}`, - `--set celotool.image.tag=${fetchEnv(envVar.CELOTOOL_DOCKER_IMAGE_TAG)}`, - `--set delay=${delayMs}`, // delay in ms between sending txs - `--set environment=${celoEnv}`, - `--set geth.image.repository=${fetchEnv(envVar.GETH_NODE_DOCKER_IMAGE_REPOSITORY)}`, - `--set geth.image.tag=${fetchEnv(envVar.GETH_NODE_DOCKER_IMAGE_TAG)}`, - `--set geth.networkID=${fetchEnv(envVar.NETWORK_ID)}`, - `--set geth.staticNodes="${staticNodesJsonB64}"`, - `--set geth.verbosity=${fetchEnv('GETH_VERBOSITY')}`, - `--set mnemonic="${fetchEnv(envVar.MNEMONIC)}"`, -
`--set replicas=${replicas}`, - `--set threads=${threads}`, - `--set genesis.useGenesisFileBase64=false`, - `--set genesis.network=${celoEnv}`, - `--set use_random_recipient=${fetchEnv(envVar.LOAD_TEST_USE_RANDOM_RECIPIENT)}`, - `--set reuse_light_clients=true`, - ] -} diff --git a/packages/celotool/src/lib/migration-utils.ts b/packages/celotool/src/lib/migration-utils.ts deleted file mode 100644 index ae841e2692f..00000000000 --- a/packages/celotool/src/lib/migration-utils.ts +++ /dev/null @@ -1,124 +0,0 @@ -import { generateKeys } from '@celo/cryptographic-utils/lib/account' -import { envVar, fetchEnv, fetchEnvOrFallback } from './env-utils' -import { - AccountType, - generatePrivateKey, - getAddressesFor, - getFaucetedAccounts, - getPrivateKeysFor, - privateKeyToAddress, -} from './generate_utils' -import { ensure0x } from './utils' - -const DEFAULT_FAUCET_CUSD_WEI = '60000000000000000000000' /* 60k Celo Dollars */ - -export async function getKey(mnemonic: string, account: TestAccounts) { - const key = await generateKeys(mnemonic, undefined, 0, account) - return { ...key, address: privateKeyToAddress(key.privateKey) } -} - -// From env-tests package -export enum TestAccounts { - Root, - TransferFrom, - TransferTo, - Exchange, - Oracle, - GovernanceApprover, - ReserveSpender, - ReserveCustodian, -} - -export function minerForEnv() { - return privateKeyToAddress( - generatePrivateKey(fetchEnv(envVar.MNEMONIC), AccountType.VALIDATOR, 0) - ) -} - -export function validatorKeys() { - return getPrivateKeysFor( - AccountType.VALIDATOR, - fetchEnv(envVar.MNEMONIC), - parseInt(fetchEnv(envVar.VALIDATORS), 10) - ).map(ensure0x) -} - -function getAttestationKeys() { - return getPrivateKeysFor( - AccountType.ATTESTATION, - fetchEnv(envVar.MNEMONIC), - parseInt(fetchEnv(envVar.VALIDATORS), 10) - ).map(ensure0x) -} - -export async function migrationOverrides(faucet: boolean) { - let overrides = {} - if (faucet) { - const mnemonic = fetchEnv(envVar.MNEMONIC) - const 
faucetedAccountAddresses = getFaucetedAccounts(mnemonic).map((account) => account.address) - const attestationBotAddresses = getAddressesFor(AccountType.ATTESTATION_BOT, mnemonic, 10) - const validatorAddresses = getAddressesFor(AccountType.VALIDATOR, mnemonic, 1) - const envTestRoot = await getKey(mnemonic, TestAccounts.Root) - const envTestReserveCustodian = await getKey(mnemonic, TestAccounts.ReserveCustodian) - const envTestOracle = await getKey(mnemonic, TestAccounts.Oracle) - const envTestGovernanceApprover = await getKey(mnemonic, TestAccounts.GovernanceApprover) - const envTestReserveSpender = await getKey(mnemonic, TestAccounts.ReserveSpender) - const initialAddresses = [ - ...faucetedAccountAddresses, - ...attestationBotAddresses, - ...validatorAddresses, - envTestRoot.address, - envTestOracle.address, - ] - - const initialBalance = fetchEnvOrFallback(envVar.FAUCET_CUSD_WEI, DEFAULT_FAUCET_CUSD_WEI) - - overrides = { - ...overrides, - stableToken: { - initialBalances: { - addresses: initialAddresses, - values: initialAddresses.map(() => initialBalance), - }, - oracles: [ - ...getAddressesFor(AccountType.PRICE_ORACLE, mnemonic, 1), - minerForEnv(), - envTestOracle.address, - ], - }, - // from migrationsConfig - governanceApproverMultiSig: { - signatories: [minerForEnv(), envTestGovernanceApprover.address], - numRequiredConfirmations: 1, - numInternalRequiredConfirmations: 1, - }, - // from migrationsConfig: - reserve: { - initialBalance: 100000000, // CELO - frozenAssetsStartBalance: 80000000, // Matches Mainnet after CGP-6 - frozenAssetsDays: 182, // 3x Mainnet thawing rate - otherAddresses: [envTestReserveCustodian.address], - }, - // from migrationsConfig - reserveSpenderMultiSig: { - signatories: [minerForEnv(), envTestReserveSpender.address], - numRequiredConfirmations: 1, - numInternalRequiredConfirmations: 1, - }, - } - } - - return { - ...overrides, - validators: { - validatorKeys: validatorKeys(), - attestationKeys: getAttestationKeys(), - }, - } 
-} - -export function truffleOverrides() { - return { - from: minerForEnv(), - } -} diff --git a/packages/celotool/src/lib/mock-oracle.ts b/packages/celotool/src/lib/mock-oracle.ts deleted file mode 100644 index 778bb206cf6..00000000000 --- a/packages/celotool/src/lib/mock-oracle.ts +++ /dev/null @@ -1,37 +0,0 @@ -import { envVar, fetchEnv } from 'src/lib/env-utils' -import { getPrivateTxNodeClusterIP } from 'src/lib/geth' -import { installGenericHelmChart, removeGenericHelmChart } from 'src/lib/helm_deploy' - -const helmChartPath = '../helm-charts/mock-oracle' - -export async function installHelmChart(celoEnv: string) { - return installGenericHelmChart({ - namespace: celoEnv, - releaseName: releaseName(celoEnv), - chartDir: helmChartPath, - parameters: await helmParameters(celoEnv), - }) -} -export async function removeHelmRelease(celoEnv: string) { - await removeGenericHelmChart(releaseName(celoEnv), celoEnv) -} - -async function helmParameters(celoEnv: string) { - const nodeIp = await getPrivateTxNodeClusterIP(celoEnv) - const nodeUrl = `http://${nodeIp}:8545` - return [ - `--set celotool.image.repository=${fetchEnv(envVar.CELOTOOL_DOCKER_IMAGE_REPOSITORY)}`, - `--set celotool.image.tag=${fetchEnv(envVar.CELOTOOL_DOCKER_IMAGE_TAG)}`, - `--set mnemonic="${fetchEnv(envVar.MNEMONIC)}"`, - `--set oracle.cronSchedule="${fetchEnv(envVar.MOCK_ORACLE_CRON_SCHEDULE)}"`, - `--set oracle.image.repository=${fetchEnv(envVar.MOCK_ORACLE_DOCKER_IMAGE_REPOSITORY)}`, - `--set oracle.image.tag=${fetchEnv(envVar.MOCK_ORACLE_DOCKER_IMAGE_TAG)}`, - `--set celocli.nodeUrl=${nodeUrl}`, - `--set celocli.image.repository=${fetchEnv(envVar.CELOCLI_STANDALONE_IMAGE_REPOSITORY)}`, - `--set celocli.image.tag=${fetchEnv(envVar.CELOCLI_STANDALONE_IMAGE_TAG)}`, - ] -} - -function releaseName(celoEnv: string) { - return `${celoEnv}-mock-oracle` -} diff --git a/packages/celotool/src/lib/odis.ts b/packages/celotool/src/lib/odis.ts deleted file mode 100644 index 8a27e704738..00000000000 --- 
a/packages/celotool/src/lib/odis.ts +++ /dev/null @@ -1,201 +0,0 @@ -import { DynamicEnvVar, envVar, fetchEnv } from 'src/lib/env-utils' -import { - installGenericHelmChart, - removeGenericHelmChart, - upgradeGenericHelmChart, -} from 'src/lib/helm_deploy' -import { - createKeyVaultIdentityIfNotExists, - deleteAzureKeyVaultIdentity, - getAzureKeyVaultIdentityName, -} from './azure' -import { getAksClusterConfig, getContextDynamicEnvVarValues } from './context-utils' - -const helmChartPath = '../helm-charts/odis' - -/** - * Information for the Azure Key Vault - */ -interface ODISSignerKeyVaultConfig { - vaultName: string - pnpKeyNameBase: string - pnpKeyLatestVersion: string - domainsKeyNameBase: string - domainsKeyLatestVersion: string -} - -/** - * Information for the ODIS postgres db - */ -interface ODISSignerDatabaseConfig { - host: string - port: string - username: string - password: string -} - -/** - * Information for the Blockchain provider connection - */ -interface ODISSignerBlockchainConfig { - blockchainApiKey: string -} - -/** - * Information for the ODIS logging - */ -interface ODISSignerLoggingConfig { - level: string - format: string -} - -/* - * Prefix for the cluster's identity name - */ -const identityNamePrefix = 'ODISSIGNERID' - -/** - * Env vars corresponding to each value for the ODISSignerKeyVaultConfig for a particular context - */ -const contextODISSignerKeyVaultConfigDynamicEnvVars: { - [k in keyof ODISSignerKeyVaultConfig]: DynamicEnvVar -} = { - vaultName: DynamicEnvVar.ODIS_SIGNER_AZURE_KEYVAULT_NAME, - pnpKeyNameBase: DynamicEnvVar.ODIS_SIGNER_AZURE_KEYVAULT_PNP_KEY_NAME_BASE, - pnpKeyLatestVersion: DynamicEnvVar.ODIS_SIGNER_AZURE_KEYVAULT_PNP_KEY_LATEST_VERSION, - domainsKeyNameBase: DynamicEnvVar.ODIS_SIGNER_AZURE_KEYVAULT_DOMAINS_KEY_NAME_BASE, - domainsKeyLatestVersion: DynamicEnvVar.ODIS_SIGNER_AZURE_KEYVAULT_DOMAINS_KEY_LATEST_VERSION, -} - -/** - * Env vars corresponding to each value for the ODISSignerDatabaseConfig for a 
particular context - */ -const contextDatabaseConfigDynamicEnvVars: { - [k in keyof ODISSignerDatabaseConfig]: DynamicEnvVar -} = { - host: DynamicEnvVar.ODIS_SIGNER_DB_HOST, - port: DynamicEnvVar.ODIS_SIGNER_DB_PORT, - username: DynamicEnvVar.ODIS_SIGNER_DB_USERNAME, - password: DynamicEnvVar.ODIS_SIGNER_DB_PASSWORD, -} - -/** - * Env vars corresponding to each value for the ODISSignerBlockchainConfig for a particular context - */ -const contextBlockchainConfigDynamicEnvVars: { - [k in keyof ODISSignerBlockchainConfig]: DynamicEnvVar -} = { - blockchainApiKey: DynamicEnvVar.ODIS_SIGNER_BLOCKCHAIN_API_KEY, -} - -/** - * Env vars corresponding to each value for the logging for a particular context - */ -const contextLoggingConfigDynamicEnvVars: { - [k in keyof ODISSignerLoggingConfig]: DynamicEnvVar -} = { - level: DynamicEnvVar.ODIS_SIGNER_LOG_LEVEL, - format: DynamicEnvVar.ODIS_SIGNER_LOG_FORMAT, -} - -function releaseName(celoEnv: string, context: string) { - const contextK8sFriendly = context.toLowerCase().replace(/_/g, '-') - return `${celoEnv}--${contextK8sFriendly}--odissigner` -} - -export async function installODISHelmChart(celoEnv: string, context: string) { - return installGenericHelmChart({ - namespace: celoEnv, - releaseName: releaseName(celoEnv, context), - chartDir: helmChartPath, - parameters: await helmParameters(celoEnv, context), - }) -} - -export async function upgradeODISChart(celoEnv: string, context: string) { - return upgradeGenericHelmChart({ - namespace: celoEnv, - releaseName: releaseName(celoEnv, context), - chartDir: helmChartPath, - parameters: await helmParameters(celoEnv, context), - }) -} - -export async function removeHelmRelease(celoEnv: string, context: string) { - await removeGenericHelmChart(releaseName(celoEnv, context), celoEnv) - const keyVaultConfig = getContextDynamicEnvVarValues( - contextODISSignerKeyVaultConfigDynamicEnvVars, - context - ) - - await deleteAzureKeyVaultIdentity( - context, - 
getAzureKeyVaultIdentityName(context, identityNamePrefix, keyVaultConfig.vaultName), - keyVaultConfig.vaultName - ) -} - -async function helmParameters(celoEnv: string, context: string) { - const blockchainConfig = getContextDynamicEnvVarValues( - contextBlockchainConfigDynamicEnvVars, - context - ) - const databaseConfig = getContextDynamicEnvVarValues(contextDatabaseConfigDynamicEnvVars, context) - const keyVaultConfig = getContextDynamicEnvVarValues( - contextODISSignerKeyVaultConfigDynamicEnvVars, - context - ) - - const loggingConfig = getContextDynamicEnvVarValues(contextLoggingConfigDynamicEnvVars, context, { - level: 'trace', - format: 'stackdriver', - }) - - const clusterConfig = getAksClusterConfig(context) - - return [ - `--set environment.name=${celoEnv}`, - `--set environment.cluster.name=${clusterConfig.clusterName}`, - `--set environment.cluster.location=${clusterConfig.regionName}`, - `--set image.repository=${fetchEnv(envVar.ODIS_SIGNER_DOCKER_IMAGE_REPOSITORY)}`, - `--set image.tag=${fetchEnv(envVar.ODIS_SIGNER_DOCKER_IMAGE_TAG)}`, - `--set db.host=${databaseConfig.host}`, - `--set db.port=${databaseConfig.port}`, - `--set db.username=${databaseConfig.username}`, - `--set db.password='${databaseConfig.password}'`, - `--set keystore.vaultName=${keyVaultConfig.vaultName}`, - `--set keystore.pnpKeyNameBase=${keyVaultConfig.pnpKeyNameBase}`, - `--set keystore.domainsKeyNameBase=${keyVaultConfig.domainsKeyNameBase}`, - `--set keystore.pnpKeyLatestVersion=${keyVaultConfig.pnpKeyLatestVersion}`, - `--set keystore.domainsKeyLatestVersion=${keyVaultConfig.domainsKeyLatestVersion}`, - `--set api.pnpAPIEnabled=${fetchEnv(envVar.ODIS_SIGNER_PNP_API_ENABLED)}`, - `--set api.domainsAPIEnabled=${fetchEnv(envVar.ODIS_SIGNER_DOMAINS_API_ENABLED)}`, - `--set blockchainProvider=${fetchEnv(envVar.ODIS_SIGNER_BLOCKCHAIN_PROVIDER)}`, - `--set blockchainApiKey=${blockchainConfig.blockchainApiKey}`, - `--set log.level=${loggingConfig.level}`, - `--set 
log.format=${loggingConfig.format}`, - ].concat(await ODISSignerKeyVaultIdentityHelmParameters(context, keyVaultConfig)) -} - -/** - * Returns an array of helm command line parameters for the ODIS Signer key vault identity. - */ -async function ODISSignerKeyVaultIdentityHelmParameters( - context: string, - keyVaultConfig: ODISSignerKeyVaultConfig -) { - const azureKVIdentity = await createKeyVaultIdentityIfNotExists( - context, - getAzureKeyVaultIdentityName(context, identityNamePrefix, keyVaultConfig.vaultName), - keyVaultConfig.vaultName, - null, - null, - ['get'] - ) - const params: string[] = [ - `--set azureKVIdentity.id=${azureKVIdentity.id}`, - `--set azureKVIdentity.clientId=${azureKVIdentity.clientId}`, - ] - - return params -} diff --git a/packages/celotool/src/lib/oracle.ts b/packages/celotool/src/lib/oracle.ts deleted file mode 100644 index f2d84845771..00000000000 --- a/packages/celotool/src/lib/oracle.ts +++ /dev/null @@ -1,255 +0,0 @@ -import { ensureLeading0x } from '@celo/utils/lib/address' -import { DynamicEnvVar, envVar, fetchEnv } from 'src/lib/env-utils' -import yargs from 'yargs' -import { getCloudProviderFromContext, getDynamicEnvVarValues } from './context-utils' -import { getOraclePrivateKeysFor, privateKeyToAddress } from './generate_utils' -import { AksClusterConfig } from './k8s-cluster/aks' -import { BaseClusterManager, CloudProvider } from './k8s-cluster/base' -import { - AksHsmOracleDeployer, - AksHsmOracleDeploymentConfig, - AksHsmOracleIdentity, -} from './k8s-oracle/aks-hsm' -import { BaseOracleDeployer, CurrencyPair } from './k8s-oracle/base' -import { - PrivateKeyOracleDeployer, - PrivateKeyOracleDeploymentConfig, - PrivateKeyOracleIdentity, -} from './k8s-oracle/pkey' - -/** - * Maps each cloud provider to the correct function to get the appropriate - * HSM-based oracle deployer. 
- */ -const hsmOracleDeployerGetterByCloudProvider: { - [key in CloudProvider]?: ( - celoEnv: string, - context: string, - currencyPair: CurrencyPair, - useForno: boolean, - clusterManager: BaseClusterManager - ) => BaseOracleDeployer -} = { - [CloudProvider.AZURE]: getAksHsmOracleDeployer, -} - -/** - * Gets the appropriate oracle deployer for the given context. If the env vars - * specify that the oracle addresses should be generated from the mnemonic, - * then the cloud-provider agnostic deployer PrivateKeyOracleDeployer is used. - */ -export function getOracleDeployerForContext( - celoEnv: string, - context: string, - currencyPair: CurrencyPair, - useForno: boolean, - clusterManager: BaseClusterManager -) { - // If the mnemonic-based oracle address env var has a value, we should be using - // the private key oracle deployer - const { addressesFromMnemonicCount } = getDynamicEnvVarValues( - mnemonicBasedOracleIdentityConfigDynamicEnvVars, - { context, currencyPair }, - { - addressesFromMnemonicCount: '', - } - ) - - if (addressesFromMnemonicCount) { - const addressesFromMnemonicCountNum = parseInt(addressesFromMnemonicCount, 10) - // This is a cloud-provider agnostic deployer because it doesn't rely - // on cloud-specific HSMs - return getPrivateKeyOracleDeployer( - celoEnv, - context, - currencyPair, - useForno, - addressesFromMnemonicCountNum - ) - } - // If we've gotten this far, we should be using an HSM-based oracle deployer - const cloudProvider: CloudProvider = getCloudProviderFromContext(context) - const getDeployer = hsmOracleDeployerGetterByCloudProvider[cloudProvider] - if (getDeployer === undefined) { - throw new Error( - `Deployer not defined for CloudProvider: ${cloudProvider}. 
` + - `Expecting one of: ${Object.keys(hsmOracleDeployerGetterByCloudProvider)}` - ) - } - return getDeployer(celoEnv, context, currencyPair, useForno, clusterManager) -} - -/** - * ----------- AksHsmOracleDeployer helpers ----------- - */ - -/** - * Gets an AksHsmOracleDeployer by looking at env var values - */ -function getAksHsmOracleDeployer( - celoEnv: string, - context: string, - currencyPair: CurrencyPair, - useForno: boolean, - clusterManager: BaseClusterManager -) { - const { addressKeyVaults } = getDynamicEnvVarValues( - aksHsmOracleIdentityConfigDynamicEnvVars, - { context, currencyPair }, - { - addressKeyVaults: '', - } - ) - const aksClusterConfig = clusterManager.clusterConfig as AksClusterConfig - const identities = getAksHsmOracleIdentities( - addressKeyVaults, - aksClusterConfig.resourceGroup, - currencyPair - ) - const deploymentConfig: AksHsmOracleDeploymentConfig = { - context, - clusterConfig: aksClusterConfig, - currencyPair, - identities, - useForno, - } - return new AksHsmOracleDeployer(deploymentConfig, celoEnv) -} - -/** - * Given a string addressAzureKeyVaults containing comma separated info of the form: - *
:: - * eg: 0x0000000000000000000000000000000000000000:keyVault0,0x0000000000000000000000000000000000000001:keyVault1:resourceGroup1 - * returns an array of AksHsmOracleIdentity in the same order - */ -export function getAksHsmOracleIdentities( - addressAzureKeyVaults: string, - defaultResourceGroup: string, - currencyPair: CurrencyPair -): AksHsmOracleIdentity[] { - const identityStrings = addressAzureKeyVaults.split(',') - const identities = [] - for (const identityStr of identityStrings) { - const [address, keyVaultName, resourceGroup] = identityStr.split(':') - // resourceGroup can be undefined - if (!address || !keyVaultName) { - throw Error( - `Address or key vault name is invalid. Address: ${address} Key Vault Name: ${keyVaultName}` - ) - } - identities.push({ - address, - currencyPair, - keyVaultName, - resourceGroup: resourceGroup || defaultResourceGroup, - }) - } - return identities -} - -/** - * Config values pulled from env vars used for generating an AksHsmOracleIdentity - */ -interface AksHsmOracleIdentityConfig { - addressKeyVaults: string -} - -/** - * Env vars corresponding to each value for the AksHsmOracleIdentityConfig for a particular context - */ -const aksHsmOracleIdentityConfigDynamicEnvVars: { - [k in keyof AksHsmOracleIdentityConfig]: DynamicEnvVar -} = { - addressKeyVaults: DynamicEnvVar.ORACLE_ADDRESS_AZURE_KEY_VAULTS, -} - -/** - * ----------- PrivateKeyOracleDeployer helpers ----------- - */ - -/** - * Gets an PrivateKeyOracleDeployer by looking at env var values and generating private keys - * from the mnemonic - */ -function getPrivateKeyOracleDeployer( - celoEnv: string, - context: string, - currencyPair: CurrencyPair, - useForno: boolean, - count: number -): PrivateKeyOracleDeployer { - const identities: PrivateKeyOracleIdentity[] = getOraclePrivateKeysFor( - currencyPair, - fetchEnv(envVar.MNEMONIC), - count - ).map((pkey) => ({ - address: privateKeyToAddress(pkey), - currencyPair, - privateKey: ensureLeading0x(pkey), - })) - const 
deploymentConfig: PrivateKeyOracleDeploymentConfig = { - context, - currencyPair, - identities, - useForno, - } - return new PrivateKeyOracleDeployer(deploymentConfig, celoEnv) -} - -/** - * Config values pulled from env vars used for generating a PrivateKeyOracleIdentity - * from a mnemonic - */ -interface MnemonicBasedOracleIdentityConfig { - addressesFromMnemonicCount: string -} - -/** - * Env vars corresponding to each value for the MnemonicBasedOracleIdentityConfig for a particular context - */ -const mnemonicBasedOracleIdentityConfigDynamicEnvVars: { - [k in keyof MnemonicBasedOracleIdentityConfig]: DynamicEnvVar -} = { - addressesFromMnemonicCount: DynamicEnvVar.ORACLE_ADDRESSES_FROM_MNEMONIC_COUNT, -} - -/** - * Add currencyPair to command arguments - * @param argv the yargs arguments list to add to - */ -export function addCurrencyPairMiddleware(argv: yargs.Argv) { - return argv.option('currencyPair', { - choices: [ - 'CELOUSD', - 'CELOEUR', - 'CELOBRL', - 'USDCUSD', - 'USDCEUR', - 'USDCBRL', - 'CELOXOF', - 'XOFEUR', - 'EUROCEUR', - 'EURXOF', - 'EUROCXOF', - 'KESUSD', - 'COPUSD', - 'CELOKES', - 'USDTUSD', - ], - description: 'Oracle deployment to target based on currency pair', - demandOption: true, - type: 'string', - }) -} - -/** - * Add useForno to command arguments - * @param argv the yargs arguments list to add to - */ -export function addUseFornoMiddleware(argv: yargs.Argv) { - return argv.option('useForno', { - description: 'Uses forno for RPCs from the oracle clients', - default: false, - type: 'boolean', - }) -} diff --git a/packages/celotool/src/lib/port-utils.ts b/packages/celotool/src/lib/port-utils.ts deleted file mode 100644 index 480b875cd19..00000000000 --- a/packages/celotool/src/lib/port-utils.ts +++ /dev/null @@ -1,32 +0,0 @@ -import { spawn, SpawnOptions } from 'child_process' - -export async function waitForPortOpen(host: string, port: number, seconds: number) { - const deadline = Date.now() + seconds * 1000 - do { - if (await 
isPortOpen(host, port)) { - return true - } - } while (Date.now() < deadline) - return false -} - -export async function isPortOpen(host: string, port: number) { - return (await execCmd('nc', ['-z', host, port.toString()], { silent: true })) === 0 -} - -async function execCmd(cmd: string, args: string[], options?: SpawnOptions & { silent?: boolean }) { - return new Promise((resolve, reject) => { - const { silent, ...spawnOptions } = options || { silent: false } - if (!silent) { - console.debug('$ ' + [cmd].concat(args).join(' ')) - } - const process = spawn(cmd, args, { ...spawnOptions, stdio: silent ? 'ignore' : 'inherit' }) - process.on('close', (code) => { - try { - resolve(code) - } catch (error) { - reject(error) - } - }) - }) -} diff --git a/packages/celotool/src/lib/port_forward.ts b/packages/celotool/src/lib/port_forward.ts deleted file mode 100644 index 74ca21daaca..00000000000 --- a/packages/celotool/src/lib/port_forward.ts +++ /dev/null @@ -1,97 +0,0 @@ -/* tslint:disable: no-console */ -import { ChildProcess, spawnSync } from 'child_process' -import { execBackgroundCmd, execCmd } from './cmd-utils' - -function sleep(ms: number) { - return new Promise((resolve) => setTimeout(resolve, ms)) -} - -export const defaultPortsString = '8545:8545 8546:8546 9200:9200' - -const PORT_CONTROL_CMD = 'nc -z 127.0.0.1 8545' -const DEFAULT_COMPONENT = 'validators' - -async function getPortForwardCmd(celoEnv: string, component?: string, ports = defaultPortsString) { - return getKubernetesPortForwardCmd(celoEnv, component, ports) -} - -async function getKubernetesPortForwardCmd( - celoEnv: string, - component?: string, - ports = defaultPortsString -) { - if (!component) { - component = DEFAULT_COMPONENT - } - console.info(`Port-forwarding to ${celoEnv} ${component} ${ports}`) - const portForwardArgs = await getPortForwardArgs(celoEnv, component, ports) - return `kubectl ${portForwardArgs.join(' ')}` -} - -async function getPortForwardArgs(celoEnv: string, component?: 
string, ports = defaultPortsString) { - if (!component) { - component = DEFAULT_COMPONENT - } - console.info(`Port-forwarding to ${celoEnv} ${component} ${ports}`) - // The testnet helm chart used to have the label app=ethereum, but this was changed - // to app=testnet. To preserve backward compatibility, we search for both labels. - // It's not expected to ever have a situation where a namespace has pods with - // both labels. - const podName = await execCmd( - `kubectl get pods --namespace ${celoEnv} -l "app in (ethereum,testnet), component=${component}, release=${celoEnv}" --field-selector=status.phase=Running -o jsonpath="{.items[0].metadata.name}"` - ) - return ['port-forward', `--namespace=${celoEnv}`, podName[0], ...ports.split(' ')] -} - -export async function portForward(celoEnv: string, component?: string, ports?: string) { - try { - const portForwardCmd = await getPortForwardCmd(celoEnv, component, ports) - const splitCmd = portForwardCmd.split(' ') - console.info(`Port-forwarding to celoEnv ${celoEnv} ports ${ports}`) - console.info(`\t$ ${portForwardCmd}`) - await spawnSync(splitCmd[0], splitCmd.slice(1), { - stdio: 'inherit', - }) - } catch (error) { - console.error(`Unable to port-forward to ${celoEnv}`) - console.error(error) - process.exit(1) - } -} - -export async function portForwardAnd( - celoEnv: string, - cb: () => void, - component?: string, - ports?: string -) { - let childProcess: ChildProcess - - try { - childProcess = execBackgroundCmd(await getPortForwardCmd(celoEnv, component, ports)) - } catch (error) { - console.error(error) - process.exit(1) - throw new Error() // unreachable, but to fix typescript - } - - try { - let isConnected = false - while (!isConnected) { - if (process.env.CELOTOOL_VERBOSE === 'true') { - console.debug('Port Forward not ready yet...') - } - isConnected = await execCmd(PORT_CONTROL_CMD) - .then(() => true) - .catch(() => false) - await sleep(2000) - } - await cb() - childProcess.kill('SIGINT') - } catch (error) 
{ - childProcess.kill('SIGINT') - - console.error(error) - process.exit(1) - } -} diff --git a/packages/celotool/src/lib/prometheus.ts b/packages/celotool/src/lib/prometheus.ts deleted file mode 100644 index 704b6caa2e3..00000000000 --- a/packages/celotool/src/lib/prometheus.ts +++ /dev/null @@ -1,394 +0,0 @@ -import fs from 'fs' -import { createNamespaceIfNotExists } from './cluster' -import { execCmd, execCmdWithExitOnFailure } from './cmd-utils' -import { - DynamicEnvVar, - envVar, - fetchEnv, - fetchEnvOrFallback, - getDynamicEnvVarValue, -} from './env-utils' -import { - helmAddRepoAndUpdate, - installGenericHelmChart, - removeGenericHelmChart, - setHelmArray, - upgradeGenericHelmChart, -} from './helm_deploy' -import { BaseClusterConfig, CloudProvider } from './k8s-cluster/base' -import { - createServiceAccountIfNotExists, - getServiceAccountEmail, - getServiceAccountKey, - setupGKEWorkloadIdentities, -} from './service-account-utils' -import { outputIncludes, switchToGCPProject } from './utils' -const yaml = require('js-yaml') - -const helmChartPath = '../helm-charts/prometheus-stackdriver' -const releaseName = 'prometheus-stackdriver' -const kubeNamespace = 'prometheus' -const kubeServiceAccountName = releaseName -// stackdriver-prometheus-sidecar relevant links: -// GitHub: https://github.com/Stackdriver/stackdriver-prometheus-sidecar -// Container registry with latest tags: https://console.cloud.google.com/gcr/images/stackdriver-prometheus/GLOBAL/stackdriver-prometheus-sidecar?gcrImageListsize=30 -const sidecarImageTag = '0.8.2' -// Prometheus container registry with latest tags: https://hub.docker.com/r/prom/prometheus/tags -const prometheusImageTag = 'v2.38.0' - -const grafanaHelmRepo = 'grafana/grafana' -const grafanaChartVersion = '6.32.3' -const grafanaReleaseName = 'grafana' - -export async function installPrometheusIfNotExists( - context?: string, - clusterConfig?: BaseClusterConfig -) { - const prometheusExists = await outputIncludes( - `helm list 
-n prometheus`, - releaseName, - `prometheus-stackdriver exists, skipping install` - ) - if (!prometheusExists) { - console.info('Installing prometheus-stackdriver') - await installPrometheus(context, clusterConfig) - } -} - -async function installPrometheus(context?: string, clusterConfig?: BaseClusterConfig) { - await createNamespaceIfNotExists(kubeNamespace) - return installGenericHelmChart({ - namespace: kubeNamespace, - releaseName, - chartDir: helmChartPath, - parameters: await helmParameters(context, clusterConfig), - }) -} - -export async function removePrometheus() { - await removeGenericHelmChart(releaseName, kubeNamespace) -} - -export async function upgradePrometheus(context?: string, clusterConfig?: BaseClusterConfig) { - await createNamespaceIfNotExists(kubeNamespace) - return upgradeGenericHelmChart({ - namespace: kubeNamespace, - releaseName, - chartDir: helmChartPath, - parameters: await helmParameters(context, clusterConfig), - }) -} - -function getK8sContextVars( - clusterConfig?: BaseClusterConfig, - context?: string -): [string, string, string, string, string, boolean] { - const cloudProvider = clusterConfig ? getCloudProviderPrefix(clusterConfig!) : 'gcp' - const usingGCP = !clusterConfig || clusterConfig.cloudProvider === CloudProvider.GCP - let clusterName = usingGCP ? 
fetchEnv(envVar.KUBERNETES_CLUSTER_NAME) : clusterConfig!.clusterName - let gcloudProject, gcloudRegion, stackdriverDisabled - - if (context) { - gcloudProject = getDynamicEnvVarValue( - DynamicEnvVar.PROM_SIDECAR_GCP_PROJECT, - { context }, - fetchEnv(envVar.TESTNET_PROJECT_NAME) - ) - gcloudRegion = getDynamicEnvVarValue( - DynamicEnvVar.PROM_SIDECAR_GCP_REGION, - { context }, - fetchEnv(envVar.KUBERNETES_CLUSTER_ZONE) - ) - clusterName = getDynamicEnvVarValue( - DynamicEnvVar.KUBERNETES_CLUSTER_NAME, - { context }, - clusterName - ) - stackdriverDisabled = getDynamicEnvVarValue( - DynamicEnvVar.PROM_SIDECAR_DISABLED, - { context }, - clusterName - ) - } else { - gcloudProject = fetchEnv(envVar.TESTNET_PROJECT_NAME) - gcloudRegion = fetchEnv(envVar.KUBERNETES_CLUSTER_ZONE) - stackdriverDisabled = fetchEnvOrFallback(envVar.PROMETHEUS_DISABLE_STACKDRIVER_SIDECAR, 'false') - } - - return [cloudProvider, clusterName, gcloudProject, gcloudRegion, stackdriverDisabled, usingGCP] -} - -function getRemoteWriteParameters(context?: string): string[] { - const remoteWriteUrl = getDynamicEnvVarValue( - DynamicEnvVar.PROM_REMOTE_WRITE_URL, - { context }, - fetchEnv(envVar.PROMETHEUS_REMOTE_WRITE_URL) - ) - const remoteWriteUser = getDynamicEnvVarValue( - DynamicEnvVar.PROM_REMOTE_WRITE_USERNAME, - { context }, - fetchEnv(envVar.PROMETHEUS_REMOTE_WRITE_USERNAME) - ) - const remoteWritePassword = getDynamicEnvVarValue( - DynamicEnvVar.PROM_REMOTE_WRITE_PASSWORD, - { context }, - fetchEnv(envVar.PROMETHEUS_REMOTE_WRITE_PASSWORD) - ) - return [ - `--set remote_write.url='${remoteWriteUrl}'`, - `--set remote_write.basic_auth.username='${remoteWriteUser}'`, - `--set remote_write.basic_auth.password='${remoteWritePassword}'`, - ] -} - -async function helmParameters(context?: string, clusterConfig?: BaseClusterConfig) { - const [cloudProvider, clusterName, gcloudProject, gcloudRegion, stackdriverDisabled, usingGCP] = - getK8sContextVars(clusterConfig, context) - - const params = [ - 
`--set namespace=${kubeNamespace}`, - `--set gcloud.project=${gcloudProject}`, - `--set gcloud.region=${gcloudRegion}`, - `--set prometheus.imageTag=${prometheusImageTag}`, - `--set serviceAccount.name=${kubeServiceAccountName}`, - `--set cluster=${clusterName}`, - ] - - // Remote write to Grafana Cloud - if (fetchEnvOrFallback(envVar.PROMETHEUS_REMOTE_WRITE_URL, '') !== '') { - params.push(...getRemoteWriteParameters(context)) - } - - if (usingGCP) { - // Note: ssd is not the default storageClass in GCP clusters - params.push(`--set storageClassName=ssd`) - } else if (context?.startsWith('AZURE_ODIS')) { - params.push(`--set storageClassName=default`) - } - - if (stackdriverDisabled.toLowerCase() === 'false') { - params.push( - // Disable stackdriver sidecar env wide. TODO: Update to a contexted variable if needed - `--set stackdriver.disabled=false`, - `--set stackdriver.sidecar.imageTag=${sidecarImageTag}`, - `--set stackdriver.gcloudServiceAccountKeyBase64=${await getPrometheusGcloudServiceAccountKeyBase64( - clusterName, - cloudProvider, - gcloudProject - )}` - ) - - // Metrics prefix for non-ODIS clusters. 
- if (!context?.startsWith('AZURE_ODIS')) { - params.push( - `--set stackdriver.metricsPrefix=external.googleapis.com/prometheus/${clusterName}` - ) - } - - if (usingGCP) { - const serviceAccountName = getServiceAccountName(clusterName, cloudProvider) - await createPrometheusGcloudServiceAccount(serviceAccountName, gcloudProject) - console.info(serviceAccountName) - const serviceAccountEmail = await getServiceAccountEmail(serviceAccountName) - params.push( - `--set serviceAccount.annotations.'iam\\\.gke\\\.io/gcp-service-account'=${serviceAccountEmail}` - ) - } - } else { - // Stackdriver disabled - params.push(`--set stackdriver.disabled=true`) - } - - // Set scrape job if set for the context - if (context) { - const scrapeJobName = getDynamicEnvVarValue(DynamicEnvVar.PROM_SCRAPE_JOB_NAME, { context }, '') - const scrapeTargets = getDynamicEnvVarValue(DynamicEnvVar.PROM_SCRAPE_TARGETS, { context }, '') - const scrapeLabels = getDynamicEnvVarValue(DynamicEnvVar.PROM_SCRAPE_LABELS, { context }, '') - - if (scrapeJobName !== '') { - params.push(`--set scrapeJob.Name=${scrapeJobName}`) - } - - if (scrapeTargets !== '') { - const targetParams = setHelmArray('scrapeJob.Targets', scrapeTargets.split(',')) - params.push(...targetParams) - } - - if (scrapeLabels !== '') { - const labelParams = setHelmArray('scrapeJob.Labels', scrapeLabels.split(',')) - params.push(...labelParams) - } - } - - return params -} - -async function getPrometheusGcloudServiceAccountKeyBase64( - clusterName: string, - cloudProvider: string, - gcloudProjectName: string -) { - // First check if value already exist in helm release. 
If so we pass the same value - // and we avoid creating a new key for the service account - const gcloudServiceAccountKeyBase64 = await getPrometheusGcloudServiceAccountKeyBase64FromHelm() - if (gcloudServiceAccountKeyBase64) { - return gcloudServiceAccountKeyBase64 - } else { - // We do not have the service account key in helm so we need to create the SA (if it does not exist) - // and create a new key for the service account in any case - await switchToGCPProject(gcloudProjectName) - const serviceAccountName = getServiceAccountName(clusterName, cloudProvider) - await createPrometheusGcloudServiceAccount(serviceAccountName, gcloudProjectName) - const serviceAccountEmail = await getServiceAccountEmail(serviceAccountName) - const serviceAccountKeyPath = `/tmp/gcloud-key-${serviceAccountName}.json` - await getServiceAccountKey(serviceAccountEmail, serviceAccountKeyPath) - return fs.readFileSync(serviceAccountKeyPath).toString('base64') - } -} - -async function getPrometheusGcloudServiceAccountKeyBase64FromHelm() { - const prometheusInstalled = await outputIncludes( - `helm list -n ${kubeNamespace}`, - `${releaseName}` - ) - if (prometheusInstalled) { - const [output] = await execCmd(`helm get values -n ${kubeNamespace} ${releaseName}`) - const prometheusValues: any = yaml.safeLoad(output) - return prometheusValues.gcloudServiceAccountKeyBase64 - } -} - -// createPrometheusGcloudServiceAccount creates a gcloud service account with a given -// name and the proper permissions for writing metrics to stackdriver -async function createPrometheusGcloudServiceAccount( - serviceAccountName: string, - gcloudProjectName: string -) { - await execCmdWithExitOnFailure(`gcloud config set project ${gcloudProjectName}`) - const accountCreated = await createServiceAccountIfNotExists( - serviceAccountName, - gcloudProjectName - ) - if (accountCreated) { - let serviceAccountEmail = await getServiceAccountEmail(serviceAccountName) - while (!serviceAccountEmail) { - serviceAccountEmail = 
await getServiceAccountEmail(serviceAccountName) - } - await execCmdWithExitOnFailure( - `gcloud projects add-iam-policy-binding ${gcloudProjectName} --role roles/monitoring.metricWriter --member serviceAccount:${serviceAccountEmail}` - ) - - // Setup workload identity IAM permissions - await setupGKEWorkloadIdentities( - serviceAccountName, - gcloudProjectName, - kubeNamespace, - kubeServiceAccountName - ) - } -} - -function getCloudProviderPrefix(clusterConfig: BaseClusterConfig) { - const prefixByCloudProvider: { [key in CloudProvider]: string } = { - [CloudProvider.AZURE]: 'aks', - [CloudProvider.GCP]: 'gcp', - } - return prefixByCloudProvider[clusterConfig.cloudProvider] -} - -function getServiceAccountName(clusterName: string, cloudProvider: string) { - // Ensure the service account name is within the length restriction - // and ends with an alphanumeric character - return `prometheus-${cloudProvider}-${clusterName}` - .substring(0, 30) - .replace(/[^a-zA-Z0-9]+$/g, '') -} - -export async function installGrafanaIfNotExists( - context?: string, - clusterConfig?: BaseClusterConfig -) { - const grafanaExists = await outputIncludes( - `helm list -A`, - grafanaReleaseName, - `grafana exists, skipping install` - ) - if (!grafanaExists) { - console.info('Installing grafana') - await installGrafana(context, clusterConfig) - } -} - -async function installGrafana(context?: string, clusterConfig?: BaseClusterConfig) { - await helmAddRepoAndUpdate('https://grafana.github.io/helm-charts', 'grafana') - await createNamespaceIfNotExists(kubeNamespace) - return installGenericHelmChart({ - namespace: kubeNamespace, - releaseName: grafanaReleaseName, - chartDir: grafanaHelmRepo, - parameters: await grafanaHelmParameters(context, clusterConfig), - buildDependencies: false, - valuesOverrideFile: '../helm-charts/grafana/values-clabs.yaml', - }) -} - -export async function upgradeGrafana(context?: string, clusterConfig?: BaseClusterConfig) { - await 
helmAddRepoAndUpdate('https://grafana.github.io/helm-charts', 'grafana') - await createNamespaceIfNotExists(kubeNamespace) - return upgradeGenericHelmChart({ - namespace: kubeNamespace, - releaseName: grafanaReleaseName, - chartDir: grafanaHelmRepo, - parameters: await grafanaHelmParameters(context, clusterConfig), - buildDependencies: false, - // Adding this file and clabs' default values file. - valuesOverrideFile: '../helm-charts/grafana/values-clabs.yaml', - }) -} - -export async function removeGrafanaHelmRelease() { - const grafanaExists = await outputIncludes(`helm list -A`, grafanaReleaseName) - if (grafanaExists) { - console.info('Removing grafana') - await removeGenericHelmChart(grafanaReleaseName, kubeNamespace) - } -} - -async function grafanaHelmParameters(context?: string, clusterConfig?: BaseClusterConfig) { - // Grafana chart is a copy from source. No changes done directly on the chart. - const [_, k8sClusterName] = getK8sContextVars(clusterConfig, context) - const k8sDomainName = fetchEnv(envVar.CLUSTER_DOMAIN_NAME) - // Rename baklavastaging -> baklava - const grafanaUrl = - k8sClusterName !== 'baklavastaging' - ? `${k8sClusterName}-grafana.${k8sDomainName}.org` - : `baklava-grafana.${k8sDomainName}.org` - const values = { - adminPassword: fetchEnv(envVar.GRAFANA_LOCAL_ADMIN_PASSWORD), - 'grafana.ini': { - server: { - root_url: `https://${grafanaUrl}`, - }, - 'auth.google': { - client_id: fetchEnv(envVar.GRAFANA_LOCAL_OAUTH2_CLIENT_ID), - client_secret: fetchEnv(envVar.GRAFANA_LOCAL_OAUTH2_CLIENT_SECRET), - }, - }, - ingress: { - hosts: [grafanaUrl], - tls: [ - { - secretName: `${k8sClusterName}-grafana-tls`, - hosts: [grafanaUrl], - }, - ], - }, - } - - const valuesFile = '/tmp/grafana-values.yaml' - fs.writeFileSync(valuesFile, yaml.safeDump(values)) - - // Adding this file and clabs' default values file. 
- const params = [`-f ${valuesFile} --version ${grafanaChartVersion}`] - return params -} diff --git a/packages/celotool/src/lib/promtail.ts b/packages/celotool/src/lib/promtail.ts deleted file mode 100644 index 3a4144684ff..00000000000 --- a/packages/celotool/src/lib/promtail.ts +++ /dev/null @@ -1,155 +0,0 @@ -import { createNamespaceIfNotExists } from 'src/lib/cluster' -import { execCmdWithExitOnFailure } from 'src/lib/cmd-utils' -import { envVar, fetchEnv } from 'src/lib/env-utils' -import { - helmAddAndUpdateRepos, - installHelmDiffPlugin, - isCelotoolHelmDryRun, - isCelotoolVerbose, - removeGenericHelmChart, -} from 'src/lib/helm_deploy' -import { BaseClusterConfig, CloudProvider } from 'src/lib/k8s-cluster/base' -import { GCPClusterConfig } from 'src/lib/k8s-cluster/gcp' -import { - createServiceAccountIfNotExists, - getServiceAccountEmail, - setupGKEWorkloadIdentities, -} from 'src/lib/service-account-utils' -import { outputIncludes } from 'src/lib/utils' - -// https://artifacthub.io/packages/helm/grafana/promtail -const helmChartPath = 'grafana/promtail' -const valuesFilePath = '../helm-charts/promtail/values.yaml' -const releaseName = 'promtail' -const kubeNamespace = 'prometheus' -const kubeServiceAccountName = 'gcp-promtail-loki-grafana' -const promtailImageTag = '2.3.0' -const chartVersion = '3.8.2' - -export async function installPromtailIfNotExists(clusterConfig?: BaseClusterConfig) { - const promtailExists = await outputIncludes( - `helm list -n ${kubeNamespace}`, - releaseName, - `${releaseName} exists, skipping install` - ) - if (!promtailExists) { - console.info(`Installing ${releaseName}`) - await installPromtail(clusterConfig) - } -} - -async function installPromtail(clusterConfig?: BaseClusterConfig) { - await helmAddAndUpdateRepos() - await createNamespaceIfNotExists(kubeNamespace) - - const cmd = await buildHelmUpgradeCmd(await helmParameters(clusterConfig)) - await execCmdWithExitOnFailure(cmd, {}, isCelotoolVerbose()) -} - -export async 
function removePromtail() { - await removeGenericHelmChart(releaseName, kubeNamespace) -} - -export async function upgradePromtail(clusterConfig?: BaseClusterConfig) { - const cmd = await buildHelmUpgradeCmd(await helmParameters(clusterConfig)) - await execCmdWithExitOnFailure(cmd, {}, isCelotoolVerbose()) -} - -async function helmParameters(clusterConfig?: BaseClusterConfig) { - const params = [] - - // Find which cloud provider is in use - const cloudProvider = clusterConfig ? clusterConfig.cloudProvider : CloudProvider.GCP - - switch (cloudProvider) { - case CloudProvider.GCP: - let gcpProjectName, clusterName - if (clusterConfig) { - const configGCP = clusterConfig as GCPClusterConfig - gcpProjectName = configGCP!.projectName - clusterName = configGCP!.clusterName - } else { - gcpProjectName = fetchEnv(envVar.TESTNET_PROJECT_NAME) - clusterName = fetchEnv(envVar.KUBERNETES_CLUSTER_NAME) - } - - const serviceAccountName = await createPromtailGcloudServiceAccount( - gcpProjectName, - clusterName - ) - const serviceAccountEmail = await getServiceAccountEmail(serviceAccountName) - - params.push( - `--set serviceAccount.annotations.'iam\\\.gke\\\.io/gcp-service-account'=${serviceAccountEmail}` - ) - params.push(`--set extraArgs[0]='-client.external-labels=cluster_name=${clusterName}'`) - break - - case CloudProvider.AZURE: - // Adding cluster_name label - params.push( - `--set extraArgs[0]='-client.external-labels=cluster_name=${clusterConfig?.clusterName}'` - ) - break - } - - const user = fetchEnv(envVar.LOKI_USERNAME) - const key = fetchEnv(envVar.LOKI_KEY) - const url = fetchEnv(envVar.LOKI_URL) - params.push(`--set config.lokiAddress=https://${user}:${key}@${url}`) - params.push(`--set promtail.imageTag=${promtailImageTag}`) - params.push(`--version=${chartVersion}`) - - return params -} - -async function createPromtailGcloudServiceAccount(gcpProjectName: string, clusterName: string) { - // Create a new GCP Service Account - const serviceAccountName = 
getServiceAccountName(clusterName, 'gcp') - - const accountCreated = await createServiceAccountIfNotExists( - serviceAccountName, - gcpProjectName, - 'Loki/Promtail service account to push logs to Grafana Cloud' - ) - - if (accountCreated) { - // Setup workload identity IAM permissions - await setupGKEWorkloadIdentities( - serviceAccountName, - gcpProjectName, - kubeNamespace, - kubeServiceAccountName - ) - } - - return serviceAccountName -} - -// TODO: refactor with the function in prometheus -function getServiceAccountName(clusterName: string, cloudProvider: string) { - return `promtail-${cloudProvider}-${clusterName}`.substring(0, 30).replace(/[^a-zA-Z0-9]+$/g, '') -} - -async function buildHelmUpgradeCmd(params: string[]) { - if (isCelotoolHelmDryRun()) { - await installHelmDiffPlugin() - } - - let cmd = `helm ${ - isCelotoolHelmDryRun() ? 'diff -C 5' : '' - } upgrade --install ${releaseName} ${helmChartPath} \ - -n ${kubeNamespace} \ - -f ${valuesFilePath} \ - ${params.join(' ')}` - - if (isCelotoolVerbose()) { - cmd += ' --debug' - if (isCelotoolHelmDryRun()) { - // The promtail config is a k8s secret. 
- cmd += ' --show-secrets' - } - } - - return cmd -} diff --git a/packages/celotool/src/lib/pubsub.ts b/packages/celotool/src/lib/pubsub.ts deleted file mode 100644 index 18170ddd56d..00000000000 --- a/packages/celotool/src/lib/pubsub.ts +++ /dev/null @@ -1,56 +0,0 @@ -import PubSub from '@google-cloud/pubsub' - -export const createClient = (credentials?: any) => { - // @ts-ignore-next-line - return new PubSub.v1.SubscriberClient({ credentials }) -} - -export const buildSubscriptionName = (envName: string, purpose: string) => { - return `${envName}-${purpose}` -} - -export const createSubscription = async ( - client: any, - projectID: string, - topic: string, - subscriptionName: string -) => { - const formattedName = client.subscriptionPath(projectID, subscriptionName) - const formattedTopic = client.topicPath(projectID, topic) - - const request = { - name: formattedName, - topic: formattedTopic, - } - const [subscriptionInfo] = await client.createSubscription(request) - - return subscriptionInfo -} - -export const deleteSubscription = async ( - client: any, - projectID: string, - subscriptionName: string -) => { - const formattedName = client.subscriptionPath(projectID, subscriptionName) - - await client.deleteSubscription({ subscription: formattedName }) - - return true -} - -export const createStreamingPull = ( - client: any, - projectID: string, - subscriptionName: string, - // eslint-disable-next-line: ban-types - handler: Function -) => { - const stream = client.streamingPull().on('data', handler) - const formattedName = client.subscriptionPath(projectID, subscriptionName) - const request = { - subscription: formattedName, - streamAckDeadlineSeconds: 10, - } - stream.write(request) -} diff --git a/packages/celotool/src/lib/pumba.ts b/packages/celotool/src/lib/pumba.ts deleted file mode 100644 index 4ecd35145f4..00000000000 --- a/packages/celotool/src/lib/pumba.ts +++ /dev/null @@ -1,19 +0,0 @@ -import { makeHelmParameters } from 'src/lib/helm_deploy' -import 
{ envVar, fetchEnv } from './env-utils' - -export function helmReleaseName(celoEnv: string) { - return celoEnv + '-pumba' -} - -export const helmChartDir = '../helm-charts/pumba' - -export function helmParameters() { - return makeHelmParameters({ - 'pumba.interval': fetchEnv(envVar.CHAOS_TEST_INTERVAL), - 'pumba.duration': fetchEnv(envVar.CHAOS_TEST_DURATION), - 'pumba.networkDelay': fetchEnv(envVar.CHAOS_TEST_NETWORK_DELAY), - 'pumba.networkJitter': fetchEnv(envVar.CHAOS_TEST_NETWORK_JITTER), - 'pumba.networkLoss': fetchEnv(envVar.CHAOS_TEST_NETWORK_LOSS), - 'pumba.networkRate': fetchEnv(envVar.CHAOS_TEST_NETWORK_RATE), - }) -} diff --git a/packages/celotool/src/lib/service-account-utils.ts b/packages/celotool/src/lib/service-account-utils.ts deleted file mode 100644 index d475d1637a7..00000000000 --- a/packages/celotool/src/lib/service-account-utils.ts +++ /dev/null @@ -1,77 +0,0 @@ -import { execCmdAndParseJson, execCmdWithExitOnFailure } from './cmd-utils' -import { isCelotoolHelmDryRun } from './helm_deploy' -import { switchToGCPProject, switchToProjectFromEnv } from './utils' - -// createServiceAccountIfNotExists creates a service account with the given name -// if it does not exist. Returns if the account was created. 
-export async function createServiceAccountIfNotExists( - name: string, - gcloudProject?: string, - description?: string -) { - if (gcloudProject !== undefined) { - await switchToGCPProject(gcloudProject) - } else { - await switchToProjectFromEnv() - } - // TODO: add permissions for cloudsql editor to service account - const serviceAccounts = await execCmdAndParseJson( - `gcloud iam service-accounts list --filter "displayName:${name}" --quiet --format json` - ) - const serviceAccountExists = serviceAccounts.some((account: any) => account.displayName === name) - if (!serviceAccountExists) { - let cmd = `gcloud iam service-accounts create ${name} --display-name="${name}" ` - if (description) { - cmd = cmd.concat(`--description="${description}"`) - } - if (isCelotoolHelmDryRun()) { - console.info(`This would run the following command:\n${cmd}\n`) - } else { - await execCmdWithExitOnFailure(cmd) - } - } - return !serviceAccountExists -} - -// getServiceAccountEmail returns the email of the service account with the -// given name -export async function getServiceAccountEmail(serviceAccountName: string) { - const [output] = await execCmdWithExitOnFailure( - `gcloud iam service-accounts list --filter="displayName<=${serviceAccountName} AND displayName>=${serviceAccountName}" --format='value[terminator=""](email)'` - ) - return output -} - -export function getServiceAccountKey(serviceAccountEmail: string, keyPath: string) { - return execCmdWithExitOnFailure( - `gcloud iam service-accounts keys create ${keyPath} --iam-account ${serviceAccountEmail}` - ) -} - -// Used for Prometheus and Promtail/Loki -// https://cloud.google.com/kubernetes-engine/docs/how-to/workload-identity -export async function setupGKEWorkloadIdentities( - serviceAccountName: string, - gcloudProjectName: string, - kubeNamespace: string, - kubeServiceAccountName: string -) { - // Only grant access to GCE API to Prometheus or Promtail SA deployed in GKE - if (!serviceAccountName.includes('gcp')) { - 
return - } - - const serviceAccountEmail = await getServiceAccountEmail(serviceAccountName) - - // Allow the Kubernetes service account to impersonate the Google service account - const roleIamWorkloadIdentityUserCmd = `gcloud iam --project ${gcloudProjectName} service-accounts add-iam-policy-binding \ - --role roles/iam.workloadIdentityUser \ - --member "serviceAccount:${gcloudProjectName}.svc.id.goog[${kubeNamespace}/${kubeServiceAccountName}]" \ - ${serviceAccountEmail}` - - if (isCelotoolHelmDryRun()) { - console.info(`This would run the following: ${roleIamWorkloadIdentityUserCmd}\n`) - } else { - await execCmdWithExitOnFailure(roleIamWorkloadIdentityUserCmd) - } -} diff --git a/packages/celotool/src/lib/testnet-utils.ts b/packages/celotool/src/lib/testnet-utils.ts deleted file mode 100644 index 74bd7e36f15..00000000000 --- a/packages/celotool/src/lib/testnet-utils.ts +++ /dev/null @@ -1,216 +0,0 @@ -import { StaticNodeUtils } from '@celo/network-utils' -import { GenesisBlocksGoogleStorageBucketName } from '@celo/network-utils/lib/genesis-block-utils' -import { Storage } from '@google-cloud/storage' -import * as fs from 'fs' -import fetch from 'node-fetch' -import * as path from 'path' -import { retryCmd } from '../lib/utils' -import { execCmdWithExitOnFailure } from './cmd-utils' -import { getGenesisGoogleStorageUrl } from './endpoints' -import { envVar, fetchEnvOrFallback, getEnvFile } from './env-utils' -import { ensureAuthenticatedGcloudAccount } from './gcloud_utils' -import { getBootnodeEnode, getEnodesWithExternalIPAddresses } from './geth' -const genesisBlocksBucketName = GenesisBlocksGoogleStorageBucketName -const staticNodesBucketName = StaticNodeUtils.getStaticNodesGoogleStorageBucketName() -// Someone has taken env_files and I don't even have permission to modify it :/ -// See files in this bucket using `$ gsutil ls gs://env_config_files` -const envBucketName = 'env_config_files' -const bootnodesBucketName = 'env_bootnodes' - -// uploads genesis 
block, static nodes, env file, and bootnode to GCS -export async function uploadTestnetInfoToGoogleStorage(networkName: string) { - await uploadTestnetStaticNodesToGoogleStorage(networkName) - await uploadBootnodeToGoogleStorage(networkName) - await uploadEnvFileToGoogleStorage(networkName) -} - -export async function uploadGenesisBlockToGoogleStorage(networkName: string, genesis: string) { - console.info(`\nUploading genesis block for ${networkName} to Google cloud storage`) - console.debug(`Genesis block is ${genesis} \n`) - await uploadDataToGoogleStorage( - genesis, - genesisBlocksBucketName, - networkName, - true, - 'application/json' - ) -} - -export async function getGenesisBlockFromGoogleStorage(networkName: string) { - const resp = await fetch(getGenesisGoogleStorageUrl(networkName)) - return JSON.stringify(await resp.json()) -} - -// This will throw an error if it fails to upload. -// Intended to be used for deploying testnets, not forno full nodes. -export async function uploadTestnetStaticNodesToGoogleStorage(networkName: string) { - console.info(`\nUploading static nodes for ${networkName} to Google cloud storage...`) - // Get node json file - const nodesData: string[] | null = await retryCmd(() => - getEnodesWithExternalIPAddresses(networkName) - ) - if (nodesData === null) { - throw new Error('Failed to get static nodes information') - } - return uploadStaticNodesToGoogleStorage(networkName, nodesData) -} - -export async function uploadStaticNodesToGoogleStorage(fileName: string, enodes: string[]) { - const json = JSON.stringify(enodes) - console.debug(`${fileName} static nodes are ${json}\n`) - await uploadDataToGoogleStorage(json, staticNodesBucketName, fileName, true, 'application/json') -} - -export async function uploadBootnodeToGoogleStorage(networkName: string) { - console.info(`\nUploading bootnode for ${networkName} to Google Cloud Storage...`) - const [bootnodeEnode] = await retryCmd(() => getBootnodeEnode(networkName)) - if (!bootnodeEnode) 
{ - throw new Error('Failed to get bootnode enode') - } - // for now there is always only one bootnode - console.info('Bootnode enode:', bootnodeEnode) - await uploadDataToGoogleStorage( - bootnodeEnode, - bootnodesBucketName, - networkName, - true, // make it public - 'text/plain' - ) -} - -export async function uploadEnvFileToGoogleStorage(networkName: string) { - const envFileName = getEnvFile(networkName) - const userInfo = `${await getGoogleCloudUserInfo()}` - const repo = await getGitRepoName() - const commitHash = await getCommitHash() - - console.info( - `\nUploading Env file ${envFileName} for network ${networkName} to Google cloud storage: ` + - `gs://${envBucketName}/${networkName}` - ) - const envFileData = fs.readFileSync(getEnvFile(networkName)).toString() - const metaData = - `# .env file for network "${networkName}"\n` + - `# Last modified by "${userInfo}"\n` + - `# Last modified on ${Date()}\n` + - `# Base commit: "https://github.com/${repo}/commit/${commitHash}"\n` - const fullData = metaData + '\n' + envFileData - await uploadDataToGoogleStorage( - fullData, - envBucketName, - networkName, - false /* keep file private */, - 'text/plain' - ) -} - -async function getGoogleCloudUserInfo(): Promise<string> { - const cmd = 'gcloud config get-value account' - const stdout = (await execCmdWithExitOnFailure(cmd))[0] - return stdout.trim() -} - -async function getGitRepoName(): Promise<string> { - const cmd = 'git config --get remote.origin.url' - let stdout = '' - try { - stdout = (await execCmdWithExitOnFailure(cmd))[0].trim() - stdout = stdout.split(':')[1] - if (stdout.endsWith('.git')) { - stdout = stdout.substring(0, stdout.length - '.git'.length) - } - } catch (error) { - // Not running from a git folder - stdout = 'celo-monorepo' - } - - return stdout -} - -async function getCommitHash(): Promise<string> { - try { - const cmd = 'git show | head -n 1' - const stdout = (await execCmdWithExitOnFailure(cmd))[0] - return stdout.split(' ')[1].trim() - } catch (error) { - // 
Not running from a git folder - return 'no-commit-hash' - } -} - -// Writes data to a temporary file & uploads it to GCS -export function uploadDataToGoogleStorage( - data: any, - googleStorageBucketName: string, - googleStorageFileName: string, - makeFileWorldReadable: boolean, - contentType: string -) { - const localTmpFilePath = `/tmp/${googleStorageBucketName}-${googleStorageFileName}` - // @ts-ignore The expected type of this is not accurate - fs.mkdirSync(path.dirname(localTmpFilePath), { - recursive: true, - }) - fs.writeFileSync(localTmpFilePath, data) - return uploadFileToGoogleStorage( - localTmpFilePath, - googleStorageBucketName, - googleStorageFileName, - makeFileWorldReadable, - contentType - ) -} - -// TODO(yerdua): make this communicate or handle auth issues reasonably. Ideally, -// it should catch an auth error and tell the user to login with `gcloud auth login`. -// So, if you run into an error that says something about being unauthorized, -// copy and paste this into your terminal: gcloud auth login -// One can browse these files at https://console.cloud.google.com/storage/browser -export async function uploadFileToGoogleStorage( - localFilePath: string, - googleStorageBucketName: string, - googleStorageFileName: string, - makeFileWorldReadable: boolean, - contentType: string -) { - await ensureAuthenticatedGcloudAccount() - const storage = new Storage() - await storage.bucket(googleStorageBucketName).upload(localFilePath, { - destination: googleStorageFileName, - contentType, - metadata: { - cacheControl: 'private', - }, - }) - - if (makeFileWorldReadable) { - // set the permission to be world-readable - await storage.bucket(googleStorageBucketName).file(googleStorageFileName).acl.add({ - entity: 'allUsers', - role: storage.acl.READER_ROLE, - }) - } -} - -// Reads the envVar VALIDATOR_PROXY_COUNTS, which indicates how many validators -// have a certain number of proxies in the format: -// <# of validators>:<# of proxies>;<# of validators>:<# of proxies>;... 
-// For example, VALIDATOR_PROXY_COUNTS='1:0,2:1,3:2' will give [0,1,1,2,2,2] -export function getProxiesPerValidator() { - const arr = [] - const valProxyCountsStr = fetchEnvOrFallback(envVar.VALIDATOR_PROXY_COUNTS, '') - const splitValProxyCountStrs = valProxyCountsStr.split(',').filter((counts) => counts) - for (const valProxyCount of splitValProxyCountStrs) { - const [valCountStr, proxyCountStr] = valProxyCount.split(':') - const valCount = parseInt(valCountStr, 10) - const proxyCount = parseInt(proxyCountStr, 10) - for (let i = 0; i < valCount; i++) { - arr.push(proxyCount) - } - } - return arr -} - -export function getProxyName(celoEnv: string, validatorIndex: number, proxyIndex: number) { - return `${celoEnv}-validators-${validatorIndex}-proxy-${proxyIndex}` -} diff --git a/packages/celotool/src/lib/tracer-tool.ts b/packages/celotool/src/lib/tracer-tool.ts deleted file mode 100644 index 987f78f2328..00000000000 --- a/packages/celotool/src/lib/tracer-tool.ts +++ /dev/null @@ -1,48 +0,0 @@ -import { getEnodesAddresses } from 'src/lib/geth' -import { - installGenericHelmChart, - removeGenericHelmChart, - upgradeGenericHelmChart, -} from 'src/lib/helm_deploy' -import { envVar, fetchEnv } from './env-utils' - -const chartDir = '../helm-charts/tracer-tool/' - -function releaseName(celoEnv: string) { - return `${celoEnv}-tracer-tool` -} - -export async function installHelmChart(celoEnv: string) { - await installGenericHelmChart({ - namespace: celoEnv, - releaseName: releaseName(celoEnv), - chartDir, - parameters: await helmParameters(celoEnv), - }) -} - -export async function upgradeHelmChart(celoEnv: string) { - await upgradeGenericHelmChart({ - namespace: celoEnv, - releaseName: releaseName(celoEnv), - chartDir, - parameters: await helmParameters(celoEnv), - }) -} - -export async function removeHelmRelease(celoEnv: string) { - await removeGenericHelmChart(releaseName(celoEnv), celoEnv) -} - -async function helmParameters(celoEnv: string) { - const enodes = await 
getEnodesAddresses(celoEnv) - const b64EnodesJSON = Buffer.from(JSON.stringify(enodes, null, 0)).toString('base64') - - return [ - `--namespace ${celoEnv}`, - `--set imageRepository=${fetchEnv(envVar.CELOTOOL_DOCKER_IMAGE_REPOSITORY)}`, - `--set imageTag=${fetchEnv(envVar.CELOTOOL_DOCKER_IMAGE_TAG)}`, - `--set environment=${celoEnv}`, - `--set enodes="${b64EnodesJSON}"`, - ] -} diff --git a/packages/celotool/src/lib/transaction-metrics-exporter.ts b/packages/celotool/src/lib/transaction-metrics-exporter.ts deleted file mode 100644 index 4b79a2325eb..00000000000 --- a/packages/celotool/src/lib/transaction-metrics-exporter.ts +++ /dev/null @@ -1,61 +0,0 @@ -import { envVar, fetchEnv, fetchEnvOrFallback } from 'src/lib/env-utils' -import { - installGenericHelmChart, - removeGenericHelmChart, - upgradeGenericHelmChart, -} from 'src/lib/helm_deploy' - -const chartDir = '../helm-charts/transaction-metrics-exporter/' - -function releaseName(celoEnv: string, suffix: string) { - return `${celoEnv}-transaction-metrics-exporter-${suffix}` -} - -export async function installHelmChart(celoEnv: string) { - const suffix = fetchEnvOrFallback(envVar.TRANSACTION_METRICS_EXPORTER_SUFFIX, '1') - await installGenericHelmChart({ - namespace: celoEnv, - releaseName: releaseName(celoEnv, suffix), - chartDir, - parameters: await helmParameters(celoEnv), - }) -} - -export async function upgradeHelmChart(celoEnv: string) { - const suffix = fetchEnvOrFallback(envVar.TRANSACTION_METRICS_EXPORTER_SUFFIX, '1') - await upgradeGenericHelmChart({ - namespace: celoEnv, - releaseName: releaseName(celoEnv, suffix), - chartDir, - parameters: await helmParameters(celoEnv), - }) -} - -export async function removeHelmRelease(celoEnv: string) { - const suffix = fetchEnvOrFallback(envVar.TRANSACTION_METRICS_EXPORTER_SUFFIX, '1') - await removeGenericHelmChart(releaseName(celoEnv, suffix), celoEnv) -} - -async function helmParameters(celoEnv: string) { - const suffix = 
fetchEnvOrFallback(envVar.TRANSACTION_METRICS_EXPORTER_SUFFIX, '1') - const params = [ - `--namespace ${celoEnv}`, - `--set environment="${celoEnv}"`, - `--set imageRepository="${fetchEnv( - envVar.TRANSACTION_METRICS_EXPORTER_DOCKER_IMAGE_REPOSITORY - )}"`, - `--set imageTag="${fetchEnv(envVar.TRANSACTION_METRICS_EXPORTER_DOCKER_IMAGE_TAG)}"`, - `--set deploymentSuffix=${suffix}`, - `--set fromBlock=${fetchEnvOrFallback(envVar.TRANSACTION_METRICS_EXPORTER_FROM_BLOCK, '0')}`, - `--set toBlock=${fetchEnvOrFallback(envVar.TRANSACTION_METRICS_EXPORTER_FROM_BLOCK, '')}`, - `--set blockInterval=${fetchEnvOrFallback( - envVar.TRANSACTION_METRICS_EXPORTER_BLOCK_INTERVAL, - '1' - )}`, - `--set watchAddress=${fetchEnvOrFallback( - envVar.TRANSACTION_METRICS_EXPORTER_WATCH_ADDRESS, - '' - )}`, - ] - return params -} diff --git a/packages/celotool/src/lib/utils.ts b/packages/celotool/src/lib/utils.ts deleted file mode 100644 index 40e4bf74beb..00000000000 --- a/packages/celotool/src/lib/utils.ts +++ /dev/null @@ -1,96 +0,0 @@ -import sleep from 'sleep-promise' -import yargs from 'yargs' -import { switchToClusterFromEnv } from './cluster' -import { execCmdWithExitOnFailure } from './cmd-utils' -import { envVar, fetchEnv } from './env-utils' -import { retrieveIPAddress } from './helm_deploy' - -export async function outputIncludes(cmd: string, matchString: string, matchMessage?: string) { - const [stdout] = await execCmdWithExitOnFailure(cmd) - if (stdout.includes(matchString)) { - if (matchMessage) { - console.info(matchMessage) - } - return true - } - return false -} - -export async function retrieveTxNodeIpAddress(celoEnv: string, txNodeIndex: number) { - return retrieveIPAddress(`${celoEnv}-tx-nodes-${txNodeIndex}`) -} - -export async function getVerificationPoolConfig(celoEnv: string) { - await switchToClusterFromEnv(celoEnv) - - const ip = await retrieveTxNodeIpAddress(celoEnv, 0) - - return { - testnetId: fetchEnv('NETWORK_ID'), - txIP: ip, - txPort: '8545', - } -} - 
-export async function switchToGCPProject(projectName: string) { - const [currentProject] = await execCmdWithExitOnFailure('gcloud config get-value project') - - if (currentProject !== projectName) { - await execCmdWithExitOnFailure(`gcloud config set project ${projectName}`) - } -} - -export async function switchToProjectFromEnv() { - const expectedProject = fetchEnv(envVar.TESTNET_PROJECT_NAME) - await switchToGCPProject(expectedProject) -} - -export function addCeloGethMiddleware(argv: yargs.Argv) { - return argv - .option('geth-dir', { - type: 'string', - description: 'path to geth repository', - demand: 'Please, specify the path to geth directory, where the binary could be found', - }) - .option('data-dir', { - type: 'string', - description: 'path to datadir', - demand: 'Please, specify geth datadir', - }) -} - -// Some tools require hex address to be preceeded by 0x, some don't. -// Therefore, we try to be conservative and accept only the addresses prefixed by 0x as valid. -export const validateAccountAddress = (address: string) => { - return address !== null && address.toLowerCase().startsWith('0x') && address.length === 42 // 0x followed by 40 hex-chars -} - -export const ensure0x = (hexstr: string) => (hexstr.startsWith('0x') ? hexstr : '0x' + hexstr) -export const strip0x = (hexstr: string) => (hexstr.startsWith('0x') ? 
hexstr.slice(2) : hexstr) - -export async function retryCmd( - cmd: () => Promise<any>, - numAttempts: number = 100, - maxTimeoutMs: number = 15000 -) { - for (let i = 1; i <= numAttempts; i++) { - try { - const result = await cmd() - return result - } catch (error) { - const sleepTimeBasisInMs = 1000 - const sleepTimeInMs = Math.min(sleepTimeBasisInMs * Math.pow(2, i), maxTimeoutMs) - console.warn( - `${new Date().toLocaleTimeString()} Retry attempt: ${i}/${numAttempts}, ` + - `retry after sleeping for ${sleepTimeInMs} milli-seconds`, - error - ) - await sleep(sleepTimeInMs) - } - } - throw Error(`Retried ${numAttempts} without any successes`) -} - -export const stringToBoolean = (myString: string) => { - return myString.toLowerCase() === 'true' -} diff --git a/packages/celotool/src/lib/voting-bot.ts b/packages/celotool/src/lib/voting-bot.ts deleted file mode 100644 index e16cb125013..00000000000 --- a/packages/celotool/src/lib/voting-bot.ts +++ /dev/null @@ -1,99 +0,0 @@ -import { ContractKit, newKitFromWeb3 } from '@celo/contractkit' -import { getFornoUrl } from 'src/lib/endpoints' -import { envVar, fetchEnv } from 'src/lib/env-utils' -import { AccountType, getPrivateKeysFor } from 'src/lib/generate_utils' -import { installGenericHelmChart, removeGenericHelmChart } from 'src/lib/helm_deploy' -import { ensure0x } from 'src/lib/utils' -import Web3 from 'web3' - -const web3 = new Web3() - -const helmChartPath = '../helm-charts/voting-bot' - -export async function installHelmChart(celoEnv: string, excludedGroups?: string[]) { - const params = await helmParameters(celoEnv, excludedGroups) - console.info(params) - return installGenericHelmChart({ - namespace: celoEnv, - releaseName: releaseName(celoEnv), - chartDir: helmChartPath, - parameters: params, - }) -} -export async function removeHelmRelease(celoEnv: string) { - await removeGenericHelmChart(releaseName(celoEnv), celoEnv) -} - -export async function setupVotingBotAccounts(celoEnv: string) { - const fornoUrl = 
getFornoUrl(celoEnv) - const mnemonic = fetchEnv(envVar.MNEMONIC) - const numBotAccounts = parseInt(fetchEnv(envVar.VOTING_BOTS), 10) - - const kit: ContractKit = newKitFromWeb3(new Web3(fornoUrl)) - const goldToken = await kit.contracts.getGoldToken() - const lockedGold = await kit.contracts.getLockedGold() - const accounts = await kit.contracts.getAccounts() - - const botsWithoutGold: string[] = [] - - for (const key of getPrivateKeysFor(AccountType.VOTING_BOT, mnemonic, numBotAccounts)) { - const botAccount = ensure0x(web3.eth.accounts.privateKeyToAccount(key).address) - const goldBalance = await goldToken.balanceOf(botAccount) - if (goldBalance.isZero()) { - botsWithoutGold.push(botAccount) - continue - } - - kit.connection.addAccount(key) - - if (!(await accounts.isAccount(botAccount))) { - const registerTx = await accounts.createAccount() - await registerTx.sendAndWaitForReceipt({ from: botAccount }) - } - - const amountLocked = await lockedGold.getAccountTotalLockedGold(botAccount) - if (amountLocked.isZero()) { - const tx = await lockedGold.lock() - const amountToLock = goldBalance.multipliedBy(0.99).toFixed(0) - - await tx.sendAndWaitForReceipt({ - to: lockedGold.address, - value: amountToLock, - from: botAccount, - }) - console.info(`Locked gold for ${botAccount}`) - } - } - if (botsWithoutGold.length > 0) { - throw new Error(`These bot accounts have no gold. 
Faucet them, and retry: ${botsWithoutGold}`) - } - console.info('Finished/confirmed setup of voting bot accounts') - - kit.connection.stop() -} - -function helmParameters(celoEnv: string, excludedGroups?: string[]) { - const params = [ - `--set celoProvider=${getFornoUrl(celoEnv)}`, - `--set cronSchedule="${fetchEnv(envVar.VOTING_BOT_CRON_SCHEDULE)}"`, - `--set domain.name=${fetchEnv(envVar.CLUSTER_DOMAIN_NAME)}`, - `--set environment=${celoEnv}`, - `--set imageRepository=${fetchEnv(envVar.CELOTOOL_DOCKER_IMAGE_REPOSITORY)}`, - `--set imageTag=${fetchEnv(envVar.CELOTOOL_DOCKER_IMAGE_TAG)}`, - `--set mnemonic="${fetchEnv(envVar.MNEMONIC)}"`, - `--set votingBot.changeBaseline="${fetchEnv(envVar.VOTING_BOT_CHANGE_BASELINE)}"`, - `--set votingBot.count=${fetchEnv(envVar.VOTING_BOTS)}`, - `--set votingBot.exploreProbability="${fetchEnv(envVar.VOTING_BOT_EXPLORE_PROBABILITY)}"`, - `--set votingBot.scoreSensitivity="${fetchEnv(envVar.VOTING_BOT_SCORE_SENSITIVITY)}"`, - `--set votingBot.wakeProbability="${fetchEnv(envVar.VOTING_BOT_WAKE_PROBABILITY)}"`, - ] - - if (excludedGroups && excludedGroups.length > 0) { - params.push(`--set votingBot.excludedGroups="${excludedGroups.join('\\,')}"`) - } - return params -} - -function releaseName(celoEnv: string) { - return `${celoEnv}-voting-bot` -} diff --git a/packages/celotool/src/lib/wallet-connect.ts b/packages/celotool/src/lib/wallet-connect.ts deleted file mode 100644 index 83564d1516e..00000000000 --- a/packages/celotool/src/lib/wallet-connect.ts +++ /dev/null @@ -1,46 +0,0 @@ -import { createNamespaceIfNotExists } from 'src/lib/cluster' -import { - installGenericHelmChart, - makeHelmParameters, - removeGenericHelmChart, - upgradeGenericHelmChart, -} from 'src/lib/helm_deploy' -import { envVar, fetchEnv } from './env-utils' - -const releaseName = 'walletconnect' -const releaseNamespace = 'walletconnect' - -export const helmChartDir = '../helm-charts/wallet-connect' - -export async function installWalletConnect() { - await 
createNamespaceIfNotExists(releaseNamespace) - await installGenericHelmChart({ - namespace: releaseNamespace, - releaseName, - chartDir: helmChartDir, - parameters: helmParameters(), - }) -} - -export async function upgradeWalletConnect() { - await upgradeGenericHelmChart({ - namespace: releaseNamespace, - releaseName, - chartDir: helmChartDir, - parameters: helmParameters(), - }) -} - -export async function removeWalletConnect() { - await removeGenericHelmChart(releaseName, releaseNamespace) -} - -export function helmParameters() { - return makeHelmParameters({ - 'domain.name': fetchEnv(envVar.CLUSTER_DOMAIN_NAME), - 'walletconnect.image.repository': fetchEnv(envVar.WALLET_CONNECT_IMAGE_REPOSITORY), - 'walletconnect.image.tag': fetchEnv(envVar.WALLET_CONNECT_IMAGE_TAG), - 'redis.cluster.enabled': fetchEnv(envVar.WALLET_CONNECT_REDIS_CLUSTER_ENABLED), - 'redis.cluster.usePassword': fetchEnv(envVar.WALLET_CONNECT_REDIS_CLUSTER_USEPASSWORD), - }) -} diff --git a/packages/celotool/src/types.d.ts b/packages/celotool/src/types.d.ts deleted file mode 100644 index cb40e33c989..00000000000 --- a/packages/celotool/src/types.d.ts +++ /dev/null @@ -1,11 +0,0 @@ -declare module 'web3-utils' -declare module 'country-data' -declare module 'bip39' { - function mnemonicToSeedSync(mnemonic: string): Buffer -} -declare module 'read-last-lines' { - namespace readLastLines { - function read(inputFilePath: string, maxLineCount: number, encoding?: string): Promise<string> - } - export = readLastLines -} diff --git a/packages/celotool/stakeoff_grants.json b/packages/celotool/stakeoff_grants.json deleted file mode 100644 index 67cb8cf382e..00000000000 --- a/packages/celotool/stakeoff_grants.json +++ /dev/null @@ -1,2162 +0,0 @@ -[ - { - "identifier": "0x0f0b8b43084d3aa8a05d60922F8a2791FB603970", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": 
"0x0f0b8b43084d3aa8a05d60922F8a2791FB603970",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x57a5fddadb1d26d0558a55bd11e17f16ebcfbbcb",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x57a5fddadb1d26d0558a55bd11e17f16ebcfbbcb",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x6bd60aa42b900f7a5608d4dff5b406e503924c24",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x6bd60aa42b900f7a5608d4dff5b406e503924c24",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x2E5b1E0ebbAbBf74b2D13876e027a42939892610",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10702,
-    "revocable": false,
-    "beneficiary": "0x2E5b1E0ebbAbBf74b2D13876e027a42939892610",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x4a2859423ad40C9d2E2C4CC677c28b04eF471578",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x4a2859423ad40C9d2E2C4CC677c28b04eF471578",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xc7040c4DA590E8F0F4d46454AC7c3C82603BaA66",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0xc7040c4DA590E8F0F4d46454AC7c3C82603BaA66",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xa087388376f1ebe368e3ae22d630146319410082",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10402,
-    "revocable": false,
-    "beneficiary": "0xa087388376f1ebe368e3ae22d630146319410082",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x1223e6fd91058c49264f597ca33e25e99d6de957",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x1223e6fd91058c49264f597ca33e25e99d6de957",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x73dfb47feb4fa1536eb7d54386add730b8fe0235",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x73dfb47feb4fa1536eb7d54386add730b8fe0235",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xf588c86d6555cb264733b97001d28e07279912e2",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0xf588c86d6555cb264733b97001d28e07279912e2",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x5d4aeaf81a25a590fac7215851a5541e6239faf7",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 20202,
-    "revocable": false,
-    "beneficiary": "0x5d4aeaf81a25a590fac7215851a5541e6239faf7",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xb3Ba92Ab0E7F52E931D9773ab755bF01a0ba3074",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 20002,
-    "revocable": false,
-    "beneficiary": "0xb3Ba92Ab0E7F52E931D9773ab755bF01a0ba3074",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xf2e7D6f720B266Bceef81E65388Aebc7D29d1951",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0xf2e7D6f720B266Bceef81E65388Aebc7D29d1951",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x584a380b07851A6cf53Fb9F449616dC8A840062b",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x584a380b07851A6cf53Fb9F449616dC8A840062b",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xE2269F2973c11Cd6D9976B04445390Ae32BCa483",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0xE2269F2973c11Cd6D9976B04445390Ae32BCa483",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xDE84076a698635151C8237e6d1d6Dc1133703263",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0xDE84076a698635151C8237e6d1d6Dc1133703263",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xA588dB154AeDB5D15d765c27b359417255478c72",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0xA588dB154AeDB5D15d765c27b359417255478c72",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xE267D978037B89db06C6a5FcF82fAd8297E290ff",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0xE267D978037B89db06C6a5FcF82fAd8297E290ff",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x4d82BfC8823a4F3AF82B0AdE52ff3e2d74A04757",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 30002,
-    "revocable": false,
-    "beneficiary": "0x4d82BfC8823a4F3AF82B0AdE52ff3e2d74A04757",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x221370e066ab3de1f6130731a1a0e992954ed1a4",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x221370e066ab3de1f6130731a1a0e992954ed1a4",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x4EA01c7F19F64bE2771Ad91568FB01b745637fc9",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x4EA01c7F19F64bE2771Ad91568FB01b745637fc9",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x17787195Df7605290AbB400B19Aa562F3e9E5719",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x17787195Df7605290AbB400B19Aa562F3e9E5719",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x0F96b40F89E6b4690Ddb94dEc4b428Ac31538561",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x0F96b40F89E6b4690Ddb94dEc4b428Ac31538561",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xde8804E1022b7AcE305fCaB28F1ff85DfDc2730e",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0xde8804E1022b7AcE305fCaB28F1ff85DfDc2730e",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x861d6466a08BF0E948c57B762DB0dE240E083672",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 20002,
-    "revocable": false,
-    "beneficiary": "0x861d6466a08BF0E948c57B762DB0dE240E083672",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x22f22c38C5148f58086b937238c101a0CAEF9DA3",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x22f22c38C5148f58086b937238c101a0CAEF9DA3",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x3Da0f75d7Cc237EDFfBc57fBe2B319e10258A19e",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x3Da0f75d7Cc237EDFfBc57fBe2B319e10258A19e",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xC842D30Ef5e354cb2F6f15a7Da524f4A968C2911",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0xC842D30Ef5e354cb2F6f15a7Da524f4A968C2911",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x2273eDFc040019BfA9198A8007F4880b4E18E4C0",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x2273eDFc040019BfA9198A8007F4880b4E18E4C0",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x4e1a3c4246faaefc2987cf19b5b0d958e4e42a0e",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x4e1a3c4246faaefc2987cf19b5b0d958e4e42a0e",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x5863e292fbdccb8bb3c8dd356c201a5e1524e419",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x5863e292fbdccb8bb3c8dd356c201a5e1524e419",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x1332a671a2437a33E07C08A99B067e8b2Ea7f4E2",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x1332a671a2437a33E07C08A99B067e8b2Ea7f4E2",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x6dffE761ee1b9Ec3a92deD9aA037DB85543Bb8de",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x6dffE761ee1b9Ec3a92deD9aA037DB85543Bb8de",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "cDe0c50fF41cDD3dd22457853D9C0b4c0C35dFE3",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "cDe0c50fF41cDD3dd22457853D9C0b4c0C35dFE3",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0B04675D9e11F9aD0280f7DE3E53bE01a90f8749",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0B04675D9e11F9aD0280f7DE3E53bE01a90f8749",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xB1dE485616f06cB6E244932cdEfe6c4678187684",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0xB1dE485616f06cB6E244932cdEfe6c4678187684",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x5E55d7bf906f8b35adF4b73462C1a45030a2E5c9",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x5E55d7bf906f8b35adF4b73462C1a45030a2E5c9",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x0C8eDeffEfE778287978175Fdc9Eaa4c03f0Ed17",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x0C8eDeffEfE778287978175Fdc9Eaa4c03f0Ed17",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x20Ec8B699AE4BED7C31e72DA2cA638D3CaEed871",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x20Ec8B699AE4BED7C31e72DA2cA638D3CaEed871",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x42149d79bEEbb0C9d62f2436256f83707Afa09F7",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x42149d79bEEbb0C9d62f2436256f83707Afa09F7",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xe33E1E211bdF68147b72b91c176f90C7677AaF6B",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0xe33E1E211bdF68147b72b91c176f90C7677AaF6B",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x5ACBbaC2C3130a347282dF5bB536b8E744B66A82",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x5ACBbaC2C3130a347282dF5bB536b8E744B66A82",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x3352daf507dc597ac47f227f7e27c71d684c3fde",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x3352daf507dc597ac47f227f7e27c71d684c3fde",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xd5c94Eba1431F8E481D1621402dABf204C1d12D3",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0xd5c94Eba1431F8E481D1621402dABf204C1d12D3",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x15868524D44f60d9F2aC492A3A803a19d4E0AA5C",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x15868524D44f60d9F2aC492A3A803a19d4E0AA5C",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xCFed50e6F87605E9c834b75C22a666EFcD31566C",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0xCFed50e6F87605E9c834b75C22a666EFcD31566C",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x000aAEE22EB8491e72fDff1fF3cDeB1cF071eE76",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x000aAEE22EB8491e72fDff1fF3cDeB1cF071eE76",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xF490274c5731B75F300560A5D4C0c40e28bD8669",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0xF490274c5731B75F300560A5D4C0c40e28bD8669",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x24a8A579f2c1613780ea029e6Fc85b7d2b586c6b",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x24a8A579f2c1613780ea029e6Fc85b7d2b586c6b",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xA5b05c58054A3863DADb67950aBa007624DaAE1f",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 20002,
-    "revocable": false,
-    "beneficiary": "0xA5b05c58054A3863DADb67950aBa007624DaAE1f",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x7Ff03c3216573710a62831d5249D7A7CE4D017dE",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x7Ff03c3216573710a62831d5249D7A7CE4D017dE",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xCbBDa4307091f287a6C4BbE641D5C84014ed70d5",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 20002,
-    "revocable": false,
-    "beneficiary": "0xCbBDa4307091f287a6C4BbE641D5C84014ed70d5",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xc8269fb652e55d2ae6f379718d5ec5a49921607b",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0xc8269fb652e55d2ae6f379718d5ec5a49921607b",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x44deceba63d392d8a178f566677021e8d2695765",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x44deceba63d392d8a178f566677021e8d2695765",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xfB32cab6De0BcC462917C8194DfC3A6C71e63d94",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0xfB32cab6De0BcC462917C8194DfC3A6C71e63d94",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x39F641c7adeDB946d0670860E45CbcB9bEC4f4dA",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x39F641c7adeDB946d0670860E45CbcB9bEC4f4dA",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x23db1e5daa277d3b643ad9b44c8045ca21949bef",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x23db1e5daa277d3b643ad9b44c8045ca21949bef",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xd51247163529d327322712239e65c7e6aff0de82",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0xd51247163529d327322712239e65c7e6aff0de82",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x4C7Cd95d47858D9B28C314D5b70149e2Ab1076C2",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x4C7Cd95d47858D9B28C314D5b70149e2Ab1076C2",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x88996e9484DB8f9398c8DFaC953585Bc7277eFE5",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 20002,
-    "revocable": false,
-    "beneficiary": "0x88996e9484DB8f9398c8DFaC953585Bc7277eFE5",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xa8AA9c53b6535656eb3dA9624105e6F4d5cb7bF4",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 20002,
-    "revocable": false,
-    "beneficiary": "0xa8AA9c53b6535656eb3dA9624105e6F4d5cb7bF4",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xd1687731C3661167706e53e4cfF43B37c8673189",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0xd1687731C3661167706e53e4cfF43B37c8673189",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x93D3dC5eB8Bc021cF51ab4eD67050F094c5233f9",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x93D3dC5eB8Bc021cF51ab4eD67050F094c5233f9",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x0Dd1d9D813684bcbF127CCA52eCfa686fef1Db0a",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x0Dd1d9D813684bcbF127CCA52eCfa686fef1Db0a",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x580e765D0FF80d8a74Dd37b5a2E9FA4214A7D2Db",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x580e765D0FF80d8a74Dd37b5a2E9FA4214A7D2Db",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xBecc041a5090cD08AbD3940ab338d4CC94d2Ed3c",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0xBecc041a5090cD08AbD3940ab338d4CC94d2Ed3c",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x231b10a19F5215BF218B7604114C0f2dF9Dc17a6",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 12502,
-    "revocable": false,
-    "beneficiary": "0x231b10a19F5215BF218B7604114C0f2dF9Dc17a6",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "Fb2D3069e3F36E04347174Cf01F6FBf43163DE6D",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "Fb2D3069e3F36E04347174Cf01F6FBf43163DE6D",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0xAC271CC079A734C363f082A431b2270f56C49038",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10252,
-    "revocable": false,
-    "beneficiary": "0xAC271CC079A734C363f082A431b2270f56C49038",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "08cd011A0cE3F139d586c8a403Ea0b2610f1b5Ed",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "08cd011A0cE3F139d586c8a403Ea0b2610f1b5Ed",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "eDBd3d6E7077Ef2205230107259a851A78bb6Eb1",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 23302,
-    "revocable": false,
-    "beneficiary": "eDBd3d6E7077Ef2205230107259a851A78bb6Eb1",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x26994A9545C77EaE9E17A90E444eb228b8D192E4",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x26994A9545C77EaE9E17A90E444eb228b8D192E4",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x25C7731B5448D3002DCC0DFc256BC657e40deD33",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x25C7731B5448D3002DCC0DFc256BC657e40deD33",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x493881b5133f53f902c92c02C67EE35c0d50158D",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 30102,
-    "revocable": false,
-    "beneficiary": "0x493881b5133f53f902c92c02C67EE35c0d50158D",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x3d841f0b39e2daca569c09ade3ea3ca0e4359641",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-    "beneficiary": "0x3d841f0b39e2daca569c09ade3ea3ca0e4359641",
-    "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e",
-    "refundAddress": "0x0000000000000000000000000000000000000000",
-    "subjectToLiquidityProvision": false,
-    "initialDistributionRatio": 1000,
-    "canVote": true,
-    "canValidate": true
-  },
-  {
-    "identifier": "0x1d420010b8917fe71cb635abad88398a0e89b2a5",
-    "releaseStartTime": "MAINNET",
-    "releaseCliffTime": 31536000,
-    "numReleasePeriods": 1,
-    "releasePeriod": 31536000,
-    "amountReleasedPerPeriod": 10002,
-    "revocable": false,
-
"beneficiary": "0x1d420010b8917fe71cb635abad88398a0e89b2a5", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x3D3E223B0f16e3D09f6B5b320422F52ad157bd1D", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0x3D3E223B0f16e3D09f6B5b320422F52ad157bd1D", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x87f394e0deabb94d844ae62ff2aea730f75bd33d", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 11252, - "revocable": false, - "beneficiary": "0x87f394e0deabb94d844ae62ff2aea730f75bd33d", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x6eF5CB03FaaF615b0284a994386B930362217DE1", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0x6eF5CB03FaaF615b0284a994386B930362217DE1", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - 
"identifier": "0x8072D7f3A3A6634cab850a83039223e22f20B04B", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0x8072D7f3A3A6634cab850a83039223e22f20B04B", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x0F0CEAe30aD490B47e000449803F5e926D66dB0e", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0x0F0CEAe30aD490B47e000449803F5e926D66dB0e", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0xd19249cc30c84b068263Ba0e173FD19fAcD78f19", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 40902, - "revocable": false, - "beneficiary": "0xd19249cc30c84b068263Ba0e173FD19fAcD78f19", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x351aebcded9240ae77dd0a3987e2110b15e0f038", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 30002, - "revocable": false, - "beneficiary": "0x351aebcded9240ae77dd0a3987e2110b15e0f038", - "releaseOwner": 
"0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x42ec39634217b708aa87befa85d6e87ddd335292", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0x42ec39634217b708aa87befa85d6e87ddd335292", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x435b4b17f213b0cac1ebc17e3d08c800485866e9", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0x435b4b17f213b0cac1ebc17e3d08c800485866e9", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0xe4bece11dfb62a8a63569710302ef8bd70325e10", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0xe4bece11dfb62a8a63569710302ef8bd70325e10", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0xfec5e94e4a230b425ca6e19d82dbd8df7c65c680", - "releaseStartTime": 
"MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0xfec5e94e4a230b425ca6e19d82dbd8df7c65c680", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "b6364907f4e5f7b3d35891144f9dbf9c571759d6", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 30002, - "revocable": false, - "beneficiary": "b6364907f4e5f7b3d35891144f9dbf9c571759d6", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "686d6f5215b79252a42b4a41ea5257009bec44a8", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "686d6f5215b79252a42b4a41ea5257009bec44a8", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "b056768bb484e29d90c09e330565ad8e1c24cc6a", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "b056768bb484e29d90c09e330565ad8e1c24cc6a", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": 
"0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x138976a2AE14ecC940F3D7E46C4a647c5dc2e3E8", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 33002, - "revocable": false, - "beneficiary": "0x138976a2AE14ecC940F3D7E46C4a647c5dc2e3E8", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x131cD6c660094B0967640E098E1280D6618c3742", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0x131cD6c660094B0967640E098E1280D6618c3742", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0xce2a38595a416e7f0f6eC9F86E4C5c3d28400C20", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0xce2a38595a416e7f0f6eC9F86E4C5c3d28400C20", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x76A39Aa16276d30c137ae338EA99E2F3B8AC3CDA", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, 
- "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0x76A39Aa16276d30c137ae338EA99E2F3B8AC3CDA", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0xD38935223966BA19Ea73792b1dC92b71420B46fc", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0xD38935223966BA19Ea73792b1dC92b71420B46fc", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x1C5fecC60f4F288e82354111681b2A466340bca8", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 20602, - "revocable": false, - "beneficiary": "0x1C5fecC60f4F288e82354111681b2A466340bca8", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0efcbf280ac18ef18884ac4624ada88fbfe43134", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0efcbf280ac18ef18884ac4624ada88fbfe43134", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - 
"initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "bb161c8a7128c20cee5e5e36038213e57d9bbc64", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 20002, - "revocable": false, - "beneficiary": "bb161c8a7128c20cee5e5e36038213e57d9bbc64", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0xFD8cCe689A04E356AE13071C0c41Faf7552402Fb", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0xFD8cCe689A04E356AE13071C0c41Faf7552402Fb", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0xc2064926AFb167ffEcB8997235CAd1CCd5773228", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0xc2064926AFb167ffEcB8997235CAd1CCd5773228", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0xACd31B9a109c5227B165e4C9324723ae614cf903", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - 
"beneficiary": "0xACd31B9a109c5227B165e4C9324723ae614cf903", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x0627E5f7E0D1A3749764F8417133A77E7b2eE707", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0x0627E5f7E0D1A3749764F8417133A77E7b2eE707", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x811fCDCA76977Aa7236f48dF1A37446fa13eC8cc", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 20002, - "revocable": false, - "beneficiary": "0x811fCDCA76977Aa7236f48dF1A37446fa13eC8cc", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x10e8a187b50fcda52adba67f98ff780aaba18104", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0x10e8a187b50fcda52adba67f98ff780aaba18104", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - 
"identifier": "0x43d218fe26aab09ebf27b550f248d152d4ca9220", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0x43d218fe26aab09ebf27b550f248d152d4ca9220", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0xf02e1fB0Fe7F3962818Cf3a1301d766A511fF17F", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0xf02e1fB0Fe7F3962818Cf3a1301d766A511fF17F", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x44EEcfc3eCa752254dBbc44ac037c7641f206d17", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 12102, - "revocable": false, - "beneficiary": "0x44EEcfc3eCa752254dBbc44ac037c7641f206d17", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x1a9e8e7f8a7b6400e5c70344842e38ff09e68131", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0x1a9e8e7f8a7b6400e5c70344842e38ff09e68131", - "releaseOwner": 
"0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x8b87de691f8bf0040ff585326eeb612de81f0d53", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0x8b87de691f8bf0040ff585326eeb612de81f0d53", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0xF215beB7F6d29aDba0aeFC2d2d1bE66748e1ACEe", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0xF215beB7F6d29aDba0aeFC2d2d1bE66748e1ACEe", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x23265941b3248211f186034d2DD7aC25dCFBE8C7", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10202, - "revocable": false, - "beneficiary": "0x23265941b3248211f186034d2DD7aC25dCFBE8C7", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "64d8ba94b639a2c752dd6cb87ee641d2e6ce787e", - "releaseStartTime": 
"MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "64d8ba94b639a2c752dd6cb87ee641d2e6ce787e", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "80b2c7f9e297928764c493882a15b2f0292ad4d1", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "80b2c7f9e297928764c493882a15b2f0292ad4d1", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "7C99a549E5021aC25E756d81224f30b9ed086322", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "7C99a549E5021aC25E756d81224f30b9ed086322", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "2252fCdEa40B90cf0eD1b09C22914c7ffBb17Bf0", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "2252fCdEa40B90cf0eD1b09C22914c7ffBb17Bf0", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": 
"0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0xD4A6dd7e815175fd5232dC32CF1c0dD926D81255", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0xD4A6dd7e815175fd5232dC32CF1c0dD926D81255", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0xD3ab7Ee9219EFA53298F7639fBEa54A3460F7240", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0xD3ab7Ee9219EFA53298F7639fBEa54A3460F7240", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "36F669B1C347CB52Df8fa9e8FDE0b3061170669e", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "36F669B1C347CB52Df8fa9e8FDE0b3061170669e", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "4fC0e6c1AC6eD1c600e57baf304212dc8d98F7Fd", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - 
"releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "4fC0e6c1AC6eD1c600e57baf304212dc8d98F7Fd", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0xC05153EaAEAb67D5BAc1C25B1e5675Fb85B75a08", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0xC05153EaAEAb67D5BAc1C25B1e5675Fb85B75a08", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0xf1014E08113bd8729D3D1657D883fe7EF4EA8897", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0xf1014E08113bd8729D3D1657D883fe7EF4EA8897", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "658b5Df4E9deac0f3dabDe671CaEA3c77582f222", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "658b5Df4E9deac0f3dabDe671CaEA3c77582f222", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - 
"initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "FB5A0cf1baAeb160ef99E1Ac1a3794e5cCF20b9E", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "FB5A0cf1baAeb160ef99E1Ac1a3794e5cCF20b9E", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0xeca46aFE86BD68FAcc70484Be8Bc2a826c63A2DC", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0xeca46aFE86BD68FAcc70484Be8Bc2a826c63A2DC", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0xFE917317e641774739AD4594536F67aD8e33c02F", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0xFE917317e641774739AD4594536F67aD8e33c02F", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x3Ca536b532D00529d1B37f68137F011F2D1d2CeE", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - 
"beneficiary": "0x3Ca536b532D00529d1B37f68137F011F2D1d2CeE", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0xC9298cBCe08d47323EDBfd2cf28eF26Dccbb21bA", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0xC9298cBCe08d47323EDBfd2cf28eF26Dccbb21bA", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x111E7C6a17Ad27f45032C75BF6426958Dcbbc03A", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0x111E7C6a17Ad27f45032C75BF6426958Dcbbc03A", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0xF0eB19960A230129Ca5f6C25f3F5cfEb6887D099", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0xF0eB19960A230129Ca5f6C25f3F5cfEb6887D099", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - 
"identifier": "0xB72c361EF13B3BCf7Fbf8209AE67AC298c75aA7C", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0xB72c361EF13B3BCf7Fbf8209AE67AC298c75aA7C", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x8267809F2f76D53f0a85486ccE9c2b8CBD8960aa", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0x8267809F2f76D53f0a85486ccE9c2b8CBD8960aa", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x81a9cc4103e2c66a652b2c17baaf134fdaeafca9", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 11252, - "revocable": false, - "beneficiary": "0x81a9cc4103e2c66a652b2c17baaf134fdaeafca9", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x7F5bb79b6756afE35CC71b3850C829B5EC13e6d7", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0x7F5bb79b6756afE35CC71b3850C829B5EC13e6d7", - "releaseOwner": 
"0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0xf1cC0D8f2322AE86252Cf1D1F2308EE51BE336A6", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10202, - "revocable": false, - "beneficiary": "0xf1cC0D8f2322AE86252Cf1D1F2308EE51BE336A6", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - }, - { - "identifier": "0x008105af3e4a32ce8476cb8927f482bc3830a714", - "releaseStartTime": "MAINNET", - "releaseCliffTime": 31536000, - "numReleasePeriods": 1, - "releasePeriod": 31536000, - "amountReleasedPerPeriod": 10002, - "revocable": false, - "beneficiary": "0x008105af3e4a32ce8476cb8927f482bc3830a714", - "releaseOwner": "0xf772F744f0152b6B85095a39f1d5541839710C6e", - "refundAddress": "0x0000000000000000000000000000000000000000", - "subjectToLiquidityProvision": false, - "initialDistributionRatio": 1000, - "canVote": true, - "canValidate": true - } -] \ No newline at end of file diff --git a/packages/celotool/tsconfig.json b/packages/celotool/tsconfig.json deleted file mode 100644 index 79762e77a86..00000000000 --- a/packages/celotool/tsconfig.json +++ /dev/null @@ -1,18 +0,0 @@ -{ - "extends": "@tsconfig/recommended/tsconfig.json", - "compilerOptions": { - "outDir": "lib", - "rootDir": "src", - "baseUrl": ".", - "strict": false, - "lib": ["es7", "es2017", "es2020"], - "target": "es2020", - "resolveJsonModule": true, - "esModuleInterop": true, - "paths": { - "@google-cloud/monitoring": ["types/monitoring"] - } - }, - "include": ["src"], - "exclude": ["node_modules/"], -} diff --git 
a/packages/env-tests/.gitignore b/packages/env-tests/.gitignore deleted file mode 100644 index 592a8d4a357..00000000000 --- a/packages/env-tests/.gitignore +++ /dev/null @@ -1,3 +0,0 @@ -*.js -!jest.config.js -lib \ No newline at end of file diff --git a/packages/env-tests/CHANGELOG.md b/packages/env-tests/CHANGELOG.md deleted file mode 100644 index 89bd1d22d7f..00000000000 --- a/packages/env-tests/CHANGELOG.md +++ /dev/null @@ -1,87 +0,0 @@ -# @celo/env-tests - -## 1.0.3 - -### Patch Changes - -- Updated dependencies [9ab9d00eb] -- Updated dependencies [1c9c844cf] -- Updated dependencies [9ab9d00eb] - - @celo/contractkit@6.0.0 - -## 1.0.3-beta.0 - -### Patch Changes - -- Updated dependencies [1c9c844cf] - - @celo/contractkit@6.0.0-beta.0 - - -## 1.0.2 - -### Patch Changes - -- 22ea7f691: Remove moment.js dependency -- Updated dependencies -- Updated dependencies [679ef0c60] -- Updated dependencies [97d5ccf43] -- Updated dependencies [32face3d8] -- Updated dependencies [97d5ccf43] -- Updated dependencies [87647b46b] - - @celo/contractkit@5.2.0 - - @celo/connect@5.1.1 - - @celo/phone-utils@6.0.0 - - @celo/base@6.0.0 - - @celo/cryptographic-utils@5.0.6 - - @celo/utils@5.0.6 - -## 1.0.2-beta.0 - -### Patch Changes - -- 22ea7f691: Remove moment.js dependency -- Updated dependencies -- Updated dependencies [97d5ccf43] -- Updated dependencies [32face3d8] -- Updated dependencies [97d5ccf43] -- Updated dependencies [87647b46b] - - @celo/contractkit@5.2.0-beta.0 - - @celo/phone-utils@6.0.0-beta.0 - - @celo/base@6.0.0-beta.0 - - @celo/connect@5.1.1-beta.0 - - @celo/cryptographic-utils@5.0.6-beta.0 - - @celo/utils@5.0.6-beta.0 - -## 1.0.1 - -### Patch Changes - -- Updated dependencies [d48c68afc] -- Updated dependencies [d48c68afc] -- Updated dependencies [53bbd4958] -- Updated dependencies [d48c68afc] -- Updated dependencies [53bbd4958] -- Updated dependencies [d48c68afc] - - @celo/contractkit@5.1.0 - - @celo/connect@5.1.0 - - @celo/cryptographic-utils@5.0.5 - - 
@celo/phone-utils@5.0.5 - - @celo/utils@5.0.5 - - @celo/base@5.0.5 - -## 1.0.1-beta.0 - -### Patch Changes - -- Updated dependencies [d48c68afc] -- Updated dependencies [d48c68afc] -- Updated dependencies [53bbd4958] -- Updated dependencies [d48c68afc] -- Updated dependencies [53bbd4958] -- Updated dependencies [d48c68afc] - - @celo/contractkit@5.1.0-beta.0 - - @celo/connect@5.1.0-beta.0 - - @celo/cryptographic-utils@5.0.5-beta.0 - - @celo/phone-utils@5.0.5-beta.0 - - @celo/utils@5.0.5-beta.0 - - @celo/base@5.0.5-beta.0 diff --git a/packages/env-tests/README.md b/packages/env-tests/README.md deleted file mode 100644 index 620e7f17715..00000000000 --- a/packages/env-tests/README.md +++ /dev/null @@ -1,22 +0,0 @@ -# Env-tests - -The `env-tests` package is a set of tests that is designed to be run against CELO blockchains and assert that various platform interactions work as intended. It currently has tests for the following: - -1. Exchange: Does an exchange on Mento -2. Oracle: Reports an exchange rate -3. Reserve: Tests that reserve spenders can move funds to reserve custodians -4. Transfer: Does simple ERC20 transfers - -## Setup - -`env-tests` work by deriving keys from a single mnemonic. When run in the context of the monorepo, it will pull the relevant environment mnemonic, otherwise it should be passed to the `context` of the test setup. - -All keys derive funds from the "root key" which should be funded. From it, all test keys are funded in the test setup, increase verbosity with the `LOG_LEVEL` env var to `info` or `debug` to see more information. - -By default, transfer and exchange tests are performed for cUSD. By setting the env variable `STABLETOKENS` other stabletokens can be included in testing. `STABLETOKENS` can be set to a comma-separated string of stabletokens to test (e.g. `‘cEUR’` for only testing cEUR or `‘cUSD,cEUR’` for testing both cUSD and cEUR). 
- -As part of the testnet contract deploys in `celotool`, privileged keys like reserve spender or oracles can be authorized directly in the migrations. Hence, the relevant tests will pass on environments like `staging` while failing on public environments as the keys are not yet authorized. - -## Running the test - -Since all the keys are derived from a single mnemonic, the `env-tests` just need a node for chain interactions and not key management. Theoretically, running against forno would work and the embedded `yarn` commands set that up. However, since Forno currently does not have sticky sessions everywhere, tests can appear flaky. Instead, consider using a local lightest client or port-forwarding with `celotool port-forward -e ${ENVIRONMENT_NAME}`. diff --git a/packages/env-tests/index.d.ts b/packages/env-tests/index.d.ts deleted file mode 100644 index 102dc17cf17..00000000000 --- a/packages/env-tests/index.d.ts +++ /dev/null @@ -1 +0,0 @@ -declare module 'bunyan-debug-stream' diff --git a/packages/env-tests/jest.config.js b/packages/env-tests/jest.config.js deleted file mode 100644 index 54dfe8710b4..00000000000 --- a/packages/env-tests/jest.config.js +++ /dev/null @@ -1,5 +0,0 @@ -module.exports = { - preset: 'ts-jest', - testMatch: ['/src/**/?(*.)+(spec|test).ts?(x)', '/src/monorepoRun.ts'], - verbose: true, -} diff --git a/packages/env-tests/package.json b/packages/env-tests/package.json deleted file mode 100644 index ab4c6e84233..00000000000 --- a/packages/env-tests/package.json +++ /dev/null @@ -1,37 +0,0 @@ -{ - "name": "@celo/env-tests", - "private": true, - "version": "1.0.3", - "description": "Environment tests", - "main": "index.js", - "license": "MIT", - "dependencies": { - "@celo/contractkit": "^7.0.0", - "@celo/utils": "^5.0.6", - "@celo/base": "^6.0.0", - "@celo/connect": "^5.1.2", - "@celo/cryptographic-utils": "^5.0.7", - "bignumber.js": "^9.0.0", - "bunyan": "1.8.12", - "bunyan-gke-stackdriver": "0.1.2", - "bunyan-debug-stream": "2.0.0", - 
"dotenv": "8.2.0", - "jest": "^29.0.2", - "web3": "1.10.0" - }, - "scripts": { - "clean": "tsc -b . --clean", - "build": "tsc -b .", - "lint": "yarn run --top-level eslint .", - "test": "jest --runInBand", - "baklava-test": "CELO_ENV=baklava CELO_PROVIDER=https://baklava-forno.celo-testnet.org jest --runInBand", - "alfajores-test": "CELO_ENV=alfajores CELO_PROVIDER=https://alfajores-forno.celo-testnet.org jest --runInBand", - "mainnet-test": "CELO_ENV=rc1 CELO_PROVIDER=https://forno.celo.org jest --runInBand", - "staging-test": "CELO_ENV=staging CELO_PROVIDER=https://staging-forno.celo-networks-dev.org jest --runInBand" - }, - "devDependencies": { - "@tsconfig/recommended": "^1.0.3", - "@jest/globals": "^29.5.0", - "typescript": "^5.3.3" - } -} \ No newline at end of file diff --git a/packages/env-tests/src/context.ts b/packages/env-tests/src/context.ts deleted file mode 100644 index 07db35fec49..00000000000 --- a/packages/env-tests/src/context.ts +++ /dev/null @@ -1,9 +0,0 @@ -import { Address, ContractKit } from '@celo/contractkit' -import Logger from 'bunyan' - -export interface EnvTestContext { - kit: ContractKit - mnemonic: string - reserveSpenderMultiSigAddress: Address | undefined - logger: Logger -} diff --git a/packages/env-tests/src/env.ts b/packages/env-tests/src/env.ts deleted file mode 100644 index 948effaa28b..00000000000 --- a/packages/env-tests/src/env.ts +++ /dev/null @@ -1,63 +0,0 @@ -import * as dotenv from 'dotenv' -import { existsSync } from 'fs' -import path from 'path' - -if (process.env.CONFIG) { - dotenv.config({ path: process.env.CONFIG }) -} - -export function fetchEnv(name: string): string { - if (process.env[name] === undefined || process.env[name] === '') { - console.error(`ENV var '${name}' was not defined`) - throw new Error(`ENV var '${name}' was not defined`) - } - return process.env[name] as string -} - -export function fetchEnvOrDefault(name: string, defaultValue: string): string { - return process.env[name] === undefined || 
process.env[name] === '' - ? defaultValue - : (process.env[name] as string) -} - -export function isYes(value: string) { - switch (value.toLowerCase().trim()) { - case '1': - case 'y': - case 'yes': - case 't': - case 'true': - return true - default: - return false - } -} - -// Only use this if in monorepo and env files are as expected and in dev -export function loadFromEnvFile() { - const envName = process.env.CELO_ENV - - if (!envName) { - return - } - - const envFile = getEnvFile(envName) - dotenv.config({ path: envFile }) - - const envFileMnemonic = getEnvFile(envName, '.mnemonic') - dotenv.config({ path: envFileMnemonic }) - - return envName -} - -export const monorepoRoot = path.resolve(process.cwd(), './../..') -export const genericEnvFilePath = path.resolve(monorepoRoot, '.env') - -export function getEnvFile(celoEnv: string, envBegining: string = '') { - const filePath: string = path.resolve(monorepoRoot, `.env${envBegining}.${celoEnv}`) - if (existsSync(filePath)) { - return filePath - } else { - return `${genericEnvFilePath}${envBegining}` - } -} diff --git a/packages/env-tests/src/logger.ts b/packages/env-tests/src/logger.ts deleted file mode 100644 index 1cf1a1cef8b..00000000000 --- a/packages/env-tests/src/logger.ts +++ /dev/null @@ -1,30 +0,0 @@ -import Logger, { createLogger, levelFromName, LogLevelString, stdSerializers } from 'bunyan' -import bunyanDebugStream from 'bunyan-debug-stream' -import { createStream } from 'bunyan-gke-stackdriver' -import { fetchEnvOrDefault } from './env' - -const logLevel = fetchEnvOrDefault('LOG_LEVEL', 'info') as LogLevelString -const logFormat = fetchEnvOrDefault('LOG_FORMAT', 'human') - -const streams: Logger.LoggerOptions['streams'] = [] -switch (logFormat) { - case 'stackdriver': - streams.push(createStream(levelFromName[logLevel])) - break - case 'json': - streams.push({ stream: process.stdout, level: logLevel }) - break - default: - // eslint-disable-next-line @typescript-eslint/no-unsafe-call - streams.push({ 
- level: logLevel, - stream: bunyanDebugStream() as unknown as NodeJS.WritableStream, - }) - break -} - -export const rootLogger: Logger = createLogger({ - name: 'env-tests', - serializers: stdSerializers, - streams, -}) diff --git a/packages/env-tests/src/monorepoRun.ts b/packages/env-tests/src/monorepoRun.ts deleted file mode 100644 index 81362a33c64..00000000000 --- a/packages/env-tests/src/monorepoRun.ts +++ /dev/null @@ -1,49 +0,0 @@ -import { newKitFromWeb3, StableToken } from '@celo/contractkit' -import Web3 from 'web3' -import { loadFromEnvFile } from './env' -import { rootLogger } from './logger' -import { clearAllFundsToRoot, parseStableTokensList } from './scaffold' -import { runOracleTest } from './tests/oracle' -import { runReserveTest } from './tests/reserve' -import { runTransfersTest } from './tests/transfer' - -const DEFAULT_TOKENS_TO_TEST = [StableToken.cUSD] - -jest.setTimeout(120000) - -function runTests() { - const envName = loadFromEnvFile() - - if (!process.env.MNEMONIC) { - throw new Error('No MNEMONIC was set, envName was parsed as ' + envName) - } - const kit = newKitFromWeb3(new Web3(process.env.CELO_PROVIDER || 'http://localhost:8545')) - const mnemonic: string = process.env.MNEMONIC - const reserveSpenderMultiSigAddress = process.env.RESERVE_SPENDER_MULTISIG_ADDRESS - - const stableTokensToTest = process.env.STABLETOKENS - ? 
parseStableTokensList(process.env.STABLETOKENS) - : DEFAULT_TOKENS_TO_TEST - - describe('Run tests in context of monorepo', () => { - const context = { - kit, - mnemonic, - logger: rootLogger, - reserveSpenderMultiSigAddress, - } - - // TODO: Assert maximum loss after test - runTransfersTest(context, stableTokensToTest) - runOracleTest(context) - runReserveTest(context) - // TODO: Governance Proposals - // TODO: Validator election + Slashing - - afterAll(async () => { - await clearAllFundsToRoot(context, stableTokensToTest) - }) - }) -} - -runTests() diff --git a/packages/env-tests/src/scaffold.ts b/packages/env-tests/src/scaffold.ts deleted file mode 100644 index 00fd07bea59..00000000000 --- a/packages/env-tests/src/scaffold.ts +++ /dev/null @@ -1,186 +0,0 @@ -import { concurrentMap } from '@celo/base' -import { CeloTokenType, StableToken, Token } from '@celo/contractkit' -import { generateKeys } from '@celo/cryptographic-utils/lib/account' -import { privateKeyToAddress } from '@celo/utils/lib/address' -import BigNumber from 'bignumber.js' -import { EnvTestContext } from './context' - -BigNumber.config({ EXPONENTIAL_AT: 1e9 }) - -interface KeyInfo { - address: string - privateKey: string - publicKey: string -} - -export async function fundAccountWithCELO( - context: EnvTestContext, - account: TestAccounts, - value: BigNumber -) { - return fundAccount(context, account, value, Token.CELO) -} - -export async function fundAccountWithcUSD( - context: EnvTestContext, - account: TestAccounts, - value: BigNumber -) { - await fundAccountWithStableToken(context, account, value, StableToken.cUSD) -} - -export async function fundAccountWithStableToken( - context: EnvTestContext, - account: TestAccounts, - value: BigNumber, - stableToken: StableToken -) { - return fundAccount(context, account, value, stableToken) -} - -async function fundAccount( - context: EnvTestContext, - account: TestAccounts, - value: BigNumber, - token: CeloTokenType -) { - const tokenWrapper = await 
context.kit.celoTokens.getWrapper(token) - - const root = await getKey(context.mnemonic, TestAccounts.Root) - context.kit.connection.addAccount(root.privateKey) - - const recipient = await getKey(context.mnemonic, account) - const logger = context.logger.child({ - token, - index: account, - root: root.address, - value: value.toString(), - recipient: recipient.address, - }) - - const rootBalance = await tokenWrapper.balanceOf(root.address) - if (rootBalance.lte(value)) { - logger.error({ rootBalance: rootBalance.toString() }, 'Error funding test account') - throw new Error( - `Root account ${root.address}'s ${token} balance (${rootBalance.toPrecision( - 4 - )}) is not enough for transferring ${value.toPrecision(4)}` - ) - } - const receipt = await tokenWrapper - .transfer(recipient.address, value.toString()) - .sendAndWaitForReceipt({ - from: root.address, - feeCurrency: token === Token.CELO ? undefined : tokenWrapper.address, - }) - logger.info({ rootFundingReceipt: receipt, value }, `Root funded recipient`) -} - -export async function getValidatorKey(mnemonic: string, index: number): Promise { - return getKey(mnemonic, index, '') -} - -export async function getKey( - mnemonic: string, - account: TestAccounts, - derivationPath?: string -): Promise { - const key = await generateKeys(mnemonic, undefined, 0, account, undefined, derivationPath) - return { ...key, address: privateKeyToAddress(key.privateKey) } -} - -export enum TestAccounts { - Root, - GrandaMentoExchanger, - TransferFrom, - TransferTo, - Exchange, - Oracle, - GovernanceApprover, - ReserveSpender, - ReserveCustodian, -} - -export const ONE = new BigNumber('1000000000000000000') - -export async function clearAllFundsToRoot( - context: EnvTestContext, - stableTokensToClear: StableToken[] -) { - const accounts = Array.from( - new Array(Object.keys(TestAccounts).length / 2), - (_val, index) => index - ) - // Refund all to root - const root = await getKey(context.mnemonic, TestAccounts.Root) - 
context.logger.debug({ root: root.address }, 'Clearing funds of test accounts back to root') - const goldToken = await context.kit.contracts.getGoldToken() - await concurrentMap(5, accounts, async (_val, index) => { - if (index === 0) { - return - } - const account = await getKey(context.mnemonic, index) - context.kit.connection.addAccount(account.privateKey) - - const celoBalance = await goldToken.balanceOf(account.address) - // Exchange and transfer tests move ~0.5, so setting the threshold slightly below - const maxBalanceBeforeCollecting = ONE.times(0.4) - if (celoBalance.gt(maxBalanceBeforeCollecting)) { - await goldToken - .transfer( - root.address, - celoBalance - .minus(maxBalanceBeforeCollecting) - .integerValue(BigNumber.ROUND_DOWN) - .toString() - ) - .sendAndWaitForReceipt({ from: account.address, feeCurrency: undefined }) - context.logger.debug( - { - index, - value: celoBalance.toString(), - address: account.address, - }, - 'cleared CELO' - ) - } - for (const stableToken of stableTokensToClear) { - const stableTokenInstance = await context.kit.celoTokens.getWrapper(stableToken) - const balance = await stableTokenInstance.balanceOf(account.address) - if (balance.gt(maxBalanceBeforeCollecting)) { - await stableTokenInstance - .transfer( - root.address, - balance.minus(maxBalanceBeforeCollecting).integerValue(BigNumber.ROUND_DOWN).toString() - ) - .sendAndWaitForReceipt({ - feeCurrency: stableTokenInstance.address, - from: account.address, - }) - const balanceAfter = await stableTokenInstance.balanceOf(account.address) - context.logger.debug( - { - index, - stabletoken: stableToken, - balanceBefore: balance.toString(), - address: account.address, - BalanceAfter: balanceAfter.toString(), - }, - `cleared ${stableToken}` - ) - } - } - }) -} - -export function parseStableTokensList(stableTokenList: string): StableToken[] { - const stableTokenStrs = stableTokenList.split(',') - const validStableTokens = Object.values(StableToken) - - for (const stableTokenStr 
of stableTokenStrs) { - if (!validStableTokens.includes(stableTokenStr as StableToken)) { - throw Error(`String ${stableTokenStr} not a valid StableToken`) - } - } - return stableTokenStrs as StableToken[] -} diff --git a/packages/env-tests/src/tests/oracle.ts b/packages/env-tests/src/tests/oracle.ts deleted file mode 100644 index 6e18bc9cfe1..00000000000 --- a/packages/env-tests/src/tests/oracle.ts +++ /dev/null @@ -1,61 +0,0 @@ -import { CeloContract } from '@celo/contractkit' -// eslint-disable-next-line import/no-extraneous-dependencies -import { describe, expect, test } from '@jest/globals' -import BigNumber from 'bignumber.js' -import { EnvTestContext } from '../context' -import { fundAccountWithcUSD, getKey, ONE, TestAccounts } from '../scaffold' - -export function runOracleTest(context: EnvTestContext) { - describe('Oracle Test', () => { - const logger = context.logger.child({ test: 'exchange' }) - beforeAll(async () => { - await fundAccountWithcUSD(context, TestAccounts.Exchange, ONE.times(2)) - }) - - // TODO: Check if oracle account is authorized - test('report a rate', async () => { - const from = await getKey(context.mnemonic, TestAccounts.Oracle) - context.kit.connection.addAccount(from.privateKey) - context.kit.defaultAccount = from.address - const stableToken = await context.kit.contracts.getStableToken() - context.kit.defaultFeeCurrency = stableToken.address - - const oracles = await context.kit.contracts.getSortedOracles() - - const isOracle = await oracles.isOracle(CeloContract.StableToken, from.address) - - expect(isOracle).toBeTruthy() - - const oracleRates = await oracles.getReports(CeloContract.StableToken) - const ourRate = oracleRates.find((_) => _.address === from.address) - - let rateToReport: BigNumber - if (!ourRate) { - const currentMedianRate = await oracles.medianRate(CeloContract.StableToken) - rateToReport = currentMedianRate.rate - logger.debug( - { - rate: currentMedianRate.rate.toString(), - }, - 'no existing rate, using the 
median' - ) - } else { - rateToReport = ourRate.rate - logger.debug({ rate: ourRate.rate.toString() }, 'fetched existing oracle report') - } - - // Move the rate in one direction or another - rateToReport = rateToReport.times(0.95 + Math.random() * 0.1).decimalPlaces(10) - - const reportTx = await oracles.report(CeloContract.StableToken, rateToReport, from.address) - const reportTxReceipt = await reportTx.sendAndWaitForReceipt({ from: from.address }) - logger.debug({ receipt: reportTxReceipt }, 'rate reported') - - const newOracleRates = await oracles.getReports(CeloContract.StableToken) - const ourNewRate = newOracleRates.find((_) => _.address === from.address) - - logger.debug({ rate: ourNewRate?.rate.toString() }, 'our new rate') - expect(ourNewRate?.rate).toEqual(rateToReport) - }) - }) -} diff --git a/packages/env-tests/src/tests/reserve.ts b/packages/env-tests/src/tests/reserve.ts deleted file mode 100644 index f3a8cafd4d6..00000000000 --- a/packages/env-tests/src/tests/reserve.ts +++ /dev/null @@ -1,77 +0,0 @@ -import { CeloContract } from '@celo/contractkit' -// eslint-disable-next-line import/no-extraneous-dependencies -import { describe, test } from '@jest/globals' -import BigNumber from 'bignumber.js' -import { EnvTestContext } from '../context' -import { fundAccountWithcUSD, getKey, ONE, TestAccounts } from '../scaffold' -export function runReserveTest(context: EnvTestContext) { - describe('Reserve Test', () => { - const logger = context.logger.child({ test: 'reserve' }) - beforeAll(async () => { - await fundAccountWithcUSD(context, TestAccounts.ReserveSpender, ONE.times(2)) - await fundAccountWithcUSD(context, TestAccounts.ReserveCustodian, ONE.times(2)) - }) - - // TODO: Check if reserve account is authorized - test('move funds from the Reserve to a custodian and back', async () => { - const spender = await getKey(context.mnemonic, TestAccounts.ReserveSpender) - const custodian = await getKey(context.mnemonic, TestAccounts.ReserveCustodian) - 
context.kit.connection.addAccount(spender.privateKey) - context.kit.connection.addAccount(custodian.privateKey) - const reserve = await context.kit.contracts.getReserve() - const goldToken = await context.kit.contracts.getGoldToken() - - // Find an alternate way to get the reserve spender address - let spenderMultiSigAddress = context.reserveSpenderMultiSigAddress - - if (!spenderMultiSigAddress) { - context.logger.debug('have to get reserve spender multisig address') - const spenders = await reserve.getSpenders() - expect(spenders).toHaveLength(1) - spenderMultiSigAddress = spenders[0] - context.logger.debug({ spenderMultiSigAddress }, 'got reserve spender address') - } - - const custodians = await reserve.getOtherReserveAddresses() - expect(custodians).toContain(custodian.address) - - const spenderMultiSig = await context.kit.contracts.getMultiSig(spenderMultiSigAddress) - const isOwner = await spenderMultiSig.isowner(spender.address) - expect(isOwner).toBeTruthy() - - const reserveValue = await reserve.getReserveGoldBalance() - // Fetch from contract when added to CK wrapper - const dailySpendingRatio = 0.05 - const transferRatio = 0.01 - - const valueToTransfer = reserveValue - .times(dailySpendingRatio) - .times(transferRatio) - .integerValue(BigNumber.ROUND_DOWN) - const spenderTx = reserve.transferGold(custodian.address, valueToTransfer.toString()) - const multiSigTx = await spenderMultiSig.submitOrConfirmTransaction( - reserve.address, - spenderTx.txo - ) - logger.debug( - { - data: spenderTx.txo.encodeABI(), - reserve: reserve.address, - from: spenderMultiSigAddress, - }, - 'submitting via multisig' - ) - const multiSigTxReceipt = await multiSigTx.sendAndWaitForReceipt({ - from: spender.address, - feeCurrency: await context.kit.registry.addressFor(CeloContract.StableToken), - }) - - logger.debug({ receipt: multiSigTxReceipt }, 'funds moved to custodian via spender') - - const returnTx = goldToken.transfer(reserve.address, valueToTransfer.toString()) - 
const returnTxReceipt = await returnTx.sendAndWaitForReceipt({ from: custodian.address }) - - logger.debug({ receipt: returnTxReceipt }, 'funds moved back to reserve') - }) - }) -} diff --git a/packages/env-tests/src/tests/transfer.ts b/packages/env-tests/src/tests/transfer.ts deleted file mode 100644 index 318044e3dad..00000000000 --- a/packages/env-tests/src/tests/transfer.ts +++ /dev/null @@ -1,67 +0,0 @@ -import { StableToken } from '@celo/contractkit' -// eslint-disable-next-line import/no-extraneous-dependencies -import { describe, expect, test } from '@jest/globals' -import BigNumber from 'bignumber.js' -import { EnvTestContext } from '../context' -import { ONE, TestAccounts, fundAccountWithStableToken, getKey } from '../scaffold' - -export function runTransfersTest(context: EnvTestContext, stableTokensToTest: StableToken[]) { - describe('Transfer Test', () => { - const logger = context.logger.child({ test: 'transfer' }) - - for (const stableToken of stableTokensToTest) { - test(`transfer ${stableToken}`, async () => { - const stableTokenAmountToFund = ONE - await fundAccountWithStableToken( - context, - TestAccounts.TransferFrom, - stableTokenAmountToFund, - stableToken - ) - const stableTokenInstance = await context.kit.celoTokens.getWrapper(stableToken) - - const from = await getKey(context.mnemonic, TestAccounts.TransferFrom) - const to = await getKey(context.mnemonic, TestAccounts.TransferTo) - context.kit.connection.addAccount(from.privateKey) - context.kit.connection.addAccount(to.privateKey) - context.kit.connection.defaultFeeCurrency = stableTokenInstance.address - - const toBalanceBefore = await stableTokenInstance.balanceOf(to.address) - const fromBalanceBefore = await stableTokenInstance.balanceOf(from.address) - logger.debug( - { stabletoken: stableToken, balance: toBalanceBefore.toString(), account: to.address }, - `Get ${stableToken} Balance Before` - ) - - const stableTokenAmountToTransfer = ONE.times(0.5) - const receipt = await 
-        stableTokenInstance
-          .transfer(to.address, stableTokenAmountToTransfer.toString())
-          .sendAndWaitForReceipt({ from: from.address })
-
-      logger.debug({ stabletoken: stableToken, receipt }, `Transferred ${stableToken}`)
-      const transaction = await context.kit.web3.eth.getTransaction(receipt.transactionHash)
-      const gasPrice = new BigNumber(transaction.gasPrice)
-      const gasUsed = new BigNumber(context.kit.web3.utils.toDecimal(receipt.gasUsed).toString())
-      const transactionFee = gasPrice.times(gasUsed)
-
-      const toBalanceAfter = await stableTokenInstance.balanceOf(to.address)
-      const fromBalanceAfter = await stableTokenInstance.balanceOf(from.address)
-      logger.debug(
-        { stabletoken: stableToken, balance: toBalanceAfter.toString(), account: to.address },
-        `Get ${stableToken} Balance After`
-      )
-      expect(
-        toBalanceAfter.minus(toBalanceBefore).isEqualTo(stableTokenAmountToTransfer)
-      ).toBeTruthy()
-      // check whether difference of balance of 'from' account before/after - transfer amount
-      // is equal to transaction fee
-      expect(
-        fromBalanceBefore
-          .minus(fromBalanceAfter)
-          .minus(stableTokenAmountToTransfer)
-          .isEqualTo(transactionFee)
-      ).toBeTruthy()
-    })
-  }
-  })
-}
diff --git a/packages/env-tests/tsconfig.json b/packages/env-tests/tsconfig.json
deleted file mode 100644
index ae4909ad2cb..00000000000
--- a/packages/env-tests/tsconfig.json
+++ /dev/null
@@ -1,9 +0,0 @@
-{
-  "extends": "@tsconfig/recommended/tsconfig.json",
-  "compilerOptions": {
-    "rootDir": "src",
-    "outDir": "lib",
-    "resolveJsonModule": true
-  },
-  "include": ["src/**/*", "types/**/*", "../../node_modules/@celo/contractkit/types", "index.d.ts"],
-}
diff --git a/packages/helm-charts/.gitignore b/packages/helm-charts/.gitignore
deleted file mode 100644
index 71cbcd94cf0..00000000000
--- a/packages/helm-charts/.gitignore
+++ /dev/null
@@ -1,4 +0,0 @@
-tracer-tool/staticnodes
-*/charts
-**/requirements.lock
-
diff --git a/packages/helm-charts/aad-pod-identity/Chart.yaml b/packages/helm-charts/aad-pod-identity/Chart.yaml
deleted file mode 100644
index 8609750d434..00000000000
--- a/packages/helm-charts/aad-pod-identity/Chart.yaml
+++ /dev/null
@@ -1,6 +0,0 @@
-name: aad-pod-identity
-version: 0.1.1
-description: Helm v2 compatible helm chart for deploying aad-pod-identity
-keywords:
-- aad-pod-identity
-appVersion: ""
diff --git a/packages/helm-charts/aad-pod-identity/templates/azure-assigned-identity-crd.yaml b/packages/helm-charts/aad-pod-identity/templates/azure-assigned-identity-crd.yaml
deleted file mode 100644
index f5887162a2f..00000000000
--- a/packages/helm-charts/aad-pod-identity/templates/azure-assigned-identity-crd.yaml
+++ /dev/null
@@ -1,11 +0,0 @@
-apiVersion: apiextensions.k8s.io/v1beta1
-kind: CustomResourceDefinition
-metadata:
-  name: azureassignedidentities.aadpodidentity.k8s.io
-spec:
-  group: aadpodidentity.k8s.io
-  version: v1
-  names:
-    kind: AzureAssignedIdentity
-    plural: azureassignedidentities
-  scope: Namespaced
diff --git a/packages/helm-charts/aad-pod-identity/templates/azure-identity-binding-crd.yaml b/packages/helm-charts/aad-pod-identity/templates/azure-identity-binding-crd.yaml
deleted file mode 100644
index 5f31afcd2d2..00000000000
--- a/packages/helm-charts/aad-pod-identity/templates/azure-identity-binding-crd.yaml
+++ /dev/null
@@ -1,11 +0,0 @@
-apiVersion: apiextensions.k8s.io/v1beta1
-kind: CustomResourceDefinition
-metadata:
-  name: azureidentitybindings.aadpodidentity.k8s.io
-spec:
-  group: aadpodidentity.k8s.io
-  version: v1
-  names:
-    kind: AzureIdentityBinding
-    plural: azureidentitybindings
-  scope: Namespaced
diff --git a/packages/helm-charts/aad-pod-identity/templates/azure-identity-crd.yaml b/packages/helm-charts/aad-pod-identity/templates/azure-identity-crd.yaml
deleted file mode 100644
index ff5e2f2f0ca..00000000000
--- a/packages/helm-charts/aad-pod-identity/templates/azure-identity-crd.yaml
+++ /dev/null
@@ -1,12 +0,0 @@
-apiVersion: apiextensions.k8s.io/v1beta1
-kind: CustomResourceDefinition
-metadata:
-  name: azureidentities.aadpodidentity.k8s.io
-spec:
-  group: aadpodidentity.k8s.io
-  version: v1
-  names:
-    kind: AzureIdentity
-    singular: azureidentity
-    plural: azureidentities
-  scope: Namespaced
diff --git a/packages/helm-charts/aad-pod-identity/templates/azure-pod-identity-exception-crd.yaml b/packages/helm-charts/aad-pod-identity/templates/azure-pod-identity-exception-crd.yaml
deleted file mode 100644
index 72bf549483e..00000000000
--- a/packages/helm-charts/aad-pod-identity/templates/azure-pod-identity-exception-crd.yaml
+++ /dev/null
@@ -1,12 +0,0 @@
-apiVersion: apiextensions.k8s.io/v1beta1
-kind: CustomResourceDefinition
-metadata:
-  name: azurepodidentityexceptions.aadpodidentity.k8s.io
-spec:
-  group: aadpodidentity.k8s.io
-  version: v1
-  names:
-    kind: AzurePodIdentityException
-    singular: azurepodidentityexception
-    plural: azurepodidentityexceptions
-  scope: Namespaced
diff --git a/packages/helm-charts/aad-pod-identity/templates/mic-cluster-role-binding.yaml b/packages/helm-charts/aad-pod-identity/templates/mic-cluster-role-binding.yaml
deleted file mode 100644
index c5823c78fd2..00000000000
--- a/packages/helm-charts/aad-pod-identity/templates/mic-cluster-role-binding.yaml
+++ /dev/null
@@ -1,14 +0,0 @@
-apiVersion: rbac.authorization.k8s.io/v1
-kind: ClusterRoleBinding
-metadata:
-  name: aad-pod-id-mic-binding
-  labels:
-    k8s-app: aad-pod-id-mic-binding
-subjects:
-- kind: ServiceAccount
-  name: aad-pod-id-mic-service-account
-  namespace: default
-roleRef:
-  kind: ClusterRole
-  name: aad-pod-id-mic-role
-  apiGroup: rbac.authorization.k8s.io
diff --git a/packages/helm-charts/aad-pod-identity/templates/mic-cluster-role.yaml b/packages/helm-charts/aad-pod-identity/templates/mic-cluster-role.yaml
deleted file mode 100644
index c6ed7dbedee..00000000000
--- a/packages/helm-charts/aad-pod-identity/templates/mic-cluster-role.yaml
+++ /dev/null
@@ -1,29 +0,0 @@
-apiVersion: rbac.authorization.k8s.io/v1
-kind: ClusterRole
-metadata:
-  name: aad-pod-id-mic-role
-rules:
-- apiGroups: ["apiextensions.k8s.io"]
-  resources: ["customresourcedefinitions"]
-  verbs: ["*"]
-- apiGroups: [""]
-  resources: ["pods", "nodes"]
-  verbs: [ "list", "watch" ]
-- apiGroups: [""]
-  resources: ["events"]
-  verbs: ["create", "patch"]
-- apiGroups: [""]
-  resources: ["configmaps"]
-  verbs: ["get", "create", "update"]
-- apiGroups: [""]
-  resources: ["endpoints"]
-  verbs: ["create", "get","update"]
-- apiGroups: ["aadpodidentity.k8s.io"]
-  resources: ["azureidentitybindings", "azureidentities"]
-  verbs: ["get", "list", "watch", "post", "update"]
-- apiGroups: ["aadpodidentity.k8s.io"]
-  resources: ["azurepodidentityexceptions"]
-  verbs: ["list", "update"]
-- apiGroups: ["aadpodidentity.k8s.io"]
-  resources: ["azureassignedidentities"]
-  verbs: ["*"]
diff --git a/packages/helm-charts/aad-pod-identity/templates/mic-deployment.yaml b/packages/helm-charts/aad-pod-identity/templates/mic-deployment.yaml
deleted file mode 100644
index 7e4ee9d3c62..00000000000
--- a/packages/helm-charts/aad-pod-identity/templates/mic-deployment.yaml
+++ /dev/null
@@ -1,60 +0,0 @@
-apiVersion: apps/v1
-kind: Deployment
-metadata:
-  labels:
-    component: mic
-    k8s-app: aad-pod-id
-  name: mic
-spec:
-  replicas: 2
-  selector:
-    matchLabels:
-      component: mic
-      app: mic
-  template:
-    metadata:
-      labels:
-        component: mic
-        app: mic
-      annotations:
-        prometheus.io/scrape: "true"
-        prometheus.io/port: "{{ .Values.nmi.prometheusPort }}"
-    spec:
-      serviceAccountName: aad-pod-id-mic-service-account
-      containers:
-      - name: mic
-        image: {{ .Values.mic.image.repo }}:{{ .Values.mic.image.tag }}
-        imagePullPolicy: Always
-        args:
-        - "--syncRetryDuration={{ .Values.mic.syncRetryDuration }}"
-        - "--cloudconfig=/etc/kubernetes/azure.json"
-        - "--logtostderr"
-        - "--prometheus-port={{ .Values.mic.prometheusPort }}"
-        env:
-        - name: MIC_POD_NAMESPACE
-          valueFrom:
-            fieldRef:
-              fieldPath: metadata.namespace
-        - name: FORCENAMESPACED
-          value: "{{ .Values.mic.forceNamespaced }}"
-        ports:
-        - name: prometheus
-          containerPort: {{ .Values.mic.prometheusPort }}
-        resources:
-{{ toYaml .Values.mic.resources | indent 10 }}
-        volumeMounts:
-        - name: k8s-azure-file
-          mountPath: /etc/kubernetes/azure.json
-          readOnly: true
-        livenessProbe:
-          httpGet:
-            path: /healthz
-            port: 8080
-          initialDelaySeconds: 10
-          periodSeconds: 5
-      volumes:
-      - name: k8s-azure-file
-        hostPath:
-          path: /etc/kubernetes/azure.json
-      nodeSelector:
-        beta.kubernetes.io/os: linux
diff --git a/packages/helm-charts/aad-pod-identity/templates/mic-service-account.yaml b/packages/helm-charts/aad-pod-identity/templates/mic-service-account.yaml
deleted file mode 100644
index ff5cf047ce3..00000000000
--- a/packages/helm-charts/aad-pod-identity/templates/mic-service-account.yaml
+++ /dev/null
@@ -1,5 +0,0 @@
-apiVersion: v1
-kind: ServiceAccount
-metadata:
-  name: aad-pod-id-mic-service-account
-  namespace: {{ .Values.namespace }}
diff --git a/packages/helm-charts/aad-pod-identity/templates/nmi-cluster-role-binding.yaml b/packages/helm-charts/aad-pod-identity/templates/nmi-cluster-role-binding.yaml
deleted file mode 100644
index 380e227b3f7..00000000000
--- a/packages/helm-charts/aad-pod-identity/templates/nmi-cluster-role-binding.yaml
+++ /dev/null
@@ -1,14 +0,0 @@
-apiVersion: rbac.authorization.k8s.io/v1
-kind: ClusterRoleBinding
-metadata:
-  name: aad-pod-id-nmi-binding
-  labels:
-    k8s-app: aad-pod-id-nmi-binding
-subjects:
-- kind: ServiceAccount
-  name: aad-pod-id-nmi-service-account
-  namespace: default
-roleRef:
-  kind: ClusterRole
-  name: aad-pod-id-nmi-role
-  apiGroup: rbac.authorization.k8s.io
diff --git a/packages/helm-charts/aad-pod-identity/templates/nmi-cluster-role.yaml b/packages/helm-charts/aad-pod-identity/templates/nmi-cluster-role.yaml
deleted file mode 100644
index 7145aec03db..00000000000
--- a/packages/helm-charts/aad-pod-identity/templates/nmi-cluster-role.yaml
+++ /dev/null
@@ -1,20 +0,0 @@
-apiVersion: rbac.authorization.k8s.io/v1
-kind: ClusterRole
-metadata:
-  name: aad-pod-id-nmi-role
-rules:
-- apiGroups: ["apiextensions.k8s.io"]
-  resources: ["customresourcedefinitions"]
-  verbs: ["get", "list"]
-- apiGroups: [""]
-  resources: ["pods"]
-  verbs: ["get", "list", "watch"]
-- apiGroups: [""]
-  resources: ["secrets"]
-  verbs: ["get"]
-- apiGroups: ["aadpodidentity.k8s.io"]
-  resources: ["azureidentitybindings", "azureidentities", "azurepodidentityexceptions"]
-  verbs: ["get", "list", "watch"]
-- apiGroups: ["aadpodidentity.k8s.io"]
-  resources: ["azureassignedidentities"]
-  verbs: ["get", "list", "watch"]
diff --git a/packages/helm-charts/aad-pod-identity/templates/nmi-daemonset.yaml b/packages/helm-charts/aad-pod-identity/templates/nmi-daemonset.yaml
deleted file mode 100644
index c28d055c80a..00000000000
--- a/packages/helm-charts/aad-pod-identity/templates/nmi-daemonset.yaml
+++ /dev/null
@@ -1,78 +0,0 @@
-apiVersion: apps/v1
-kind: DaemonSet
-metadata:
-  labels:
-    component: nmi
-    tier: node
-    k8s-app: aad-pod-id
-  name: nmi
-spec:
-  updateStrategy:
-    type: RollingUpdate
-  selector:
-    matchLabels:
-      component: nmi
-      tier: node
-  template:
-    metadata:
-      labels:
-        component: nmi
-        tier: node
-      annotations:
-        prometheus.io/scrape: "true"
-        prometheus.io/port: "{{ .Values.nmi.prometheusPort }}"
-    spec:
-      serviceAccountName: aad-pod-id-nmi-service-account
-      hostNetwork: true
-      volumes:
-      - hostPath:
-          path: /run/xtables.lock
-          type: FileOrCreate
-        name: iptableslock
-      containers:
-      - name: nmi
-        image: {{ .Values.nmi.image.repo }}:{{ .Values.nmi.image.tag }}
-        imagePullPolicy: Always
-        args:
-        {{- if semverCompare "<= 1.6.1-0" .Values.nmi.image.tag }}
-        - "--host-ip=$(HOST_IP)"
-        {{- end }}
-        - "--node=$(NODE_NAME)"
-        - "--prometheus-port={{ .Values.nmi.prometheusPort }}"
-        {{- if .Values.nmi.probePort }}
-        - --http-probe-port={{ .Values.nmi.probePort }}
-        {{- end }}
-        env:
-        {{- if semverCompare "<= 1.6.1-0" .Values.nmi.image.tag }}
-        - name: HOST_IP
-          valueFrom:
-            fieldRef:
-              fieldPath: status.podIP
-        {{- end }}
-        - name: NODE_NAME
-          valueFrom:
-            fieldRef:
-              fieldPath: spec.nodeName
-        - name: FORCENAMESPACED
-          value: "{{ .Values.nmi.forceNamespaced }}"
-        ports:
-        - name: prometheus
-          containerPort: {{ .Values.nmi.prometheusPort }}
-        resources:
-{{ toYaml .Values.nmi.resources | indent 10 }}
-        securityContext:
-          privileged: true
-          capabilities:
-            add:
-            - NET_ADMIN
-        volumeMounts:
-        - mountPath: /run/xtables.lock
-          name: iptableslock
-        livenessProbe:
-          httpGet:
-            path: /healthz
-            port: 8080
-          initialDelaySeconds: 10
-          periodSeconds: 5
-      nodeSelector:
-        beta.kubernetes.io/os: linux
diff --git a/packages/helm-charts/aad-pod-identity/templates/nmi-service-account.yaml b/packages/helm-charts/aad-pod-identity/templates/nmi-service-account.yaml
deleted file mode 100644
index c6e28537b33..00000000000
--- a/packages/helm-charts/aad-pod-identity/templates/nmi-service-account.yaml
+++ /dev/null
@@ -1,4 +0,0 @@
-apiVersion: v1
-kind: ServiceAccount
-metadata:
-  name: aad-pod-id-nmi-service-account
diff --git a/packages/helm-charts/aad-pod-identity/values.yaml b/packages/helm-charts/aad-pod-identity/values.yaml
deleted file mode 100644
index caa43c2dfe2..00000000000
--- a/packages/helm-charts/aad-pod-identity/values.yaml
+++ /dev/null
@@ -1,30 +0,0 @@
-namespace: default
-
-nmi:
-  forceNamespaced: true
-  prometheusPort: 9090
-  probePort: 8080
-  image:
-    repo: mcr.microsoft.com/k8s/aad-pod-identity/nmi
-    tag: 1.6.2
-  resources:
-    limits:
-      cpu: 200m
-      memory: 512Mi
-    requests:
-      cpu: 100m
-      memory: 256Mi
-mic:
-  syncRetryDuration: 1m
-  forceNamespaced: true
-  prometheusPort: 9090
-  image:
-    repo: mcr.microsoft.com/k8s/aad-pod-identity/mic
-    tag: 1.6.2
-  resources:
-    limits:
-      cpu: 200m
-      memory: 1024Mi
-    requests:
-      cpu: 100m
-      memory: 256Mi
diff --git a/packages/helm-charts/celo-fullnode/README.md b/packages/helm-charts/celo-fullnode/README.md
deleted file mode 100644
index c7c79f993b7..00000000000
--- a/packages/helm-charts/celo-fullnode/README.md
+++ /dev/null
@@ -1,3 +0,0 @@
-# celo-fullnode
-
-Moved to https://github.com/celo-org/charts/tree/main/charts/celo-fullnode
diff --git a/packages/helm-charts/celo-fullnode/baklava-gcp-forno-europe-west1-values.yaml b/packages/helm-charts/celo-fullnode/baklava-gcp-forno-europe-west1-values.yaml
deleted file mode 100644
index 7888c91e7cb..00000000000
--- a/packages/helm-charts/celo-fullnode/baklava-gcp-forno-europe-west1-values.yaml
+++ /dev/null
@@ -1,5 +0,0 @@
-geth:
-  resources:
-    requests:
-      memory: "2Gi"
-      cpu: "1"
diff --git a/packages/helm-charts/celo-fullnode/rc1-blockscout-archive-nodes.yaml b/packages/helm-charts/celo-fullnode/rc1-blockscout-archive-nodes.yaml
deleted file mode 100644
index a1c9dc21792..00000000000
--- a/packages/helm-charts/celo-fullnode/rc1-blockscout-archive-nodes.yaml
+++ /dev/null
@@ -1,117 +0,0 @@
-extraPodLabels:
-  stack: blockscout
-  mode: archive
-gcp: true
-genesis:
-  network: mainnet
-  networkId: 42220
-  useGenesisFileBase64: false
-geth:
-  autoscaling:
-    enabled: true
-  flags: --txpool.nolocals
-  gcmode: archive
-  image:
-    repository: us.gcr.io/celo-org/geth
-    tag: 1.6.0
-  light:
-    maxpeers: 100
-    serve: 10
-  maxpeers: 300
-  node_keys:
-  - f4486a87798a9126713c146cd509254facc106dda552bd40962a7082e26d4460
-  - 5b21d503b04d249adf9a4e6ce21d44989b2db354f159efee5fd33d561092a002
-  - fc97615d43b667355ee2f552f8f948b1ed5a9d7c30200373da718d336a3a26da
-  - a15274c6dca70162871ef29a39b3a62256ce0bbf99ebbae389e1377f5c30f08c
-  - dc485cf3ff51fab4c06ec9bcb4a603d3695bf05e03520fdbab623265d9b63052
-  - ef250753e59d89aab92c1f27ff3752b3aca89a88042792fbba58b88c2c4e66f4
-  - 34097470523dec7fad6b1c410bed2db8bc7230defb0f59a9273f1f77526f4758
-  - 3d043585fc9e17231234f44072c96b635d65f30791a8176fc5a80e889aa0903d
-  - cc711146c6b5894994f8765ffa42a83c4ca1656e81c654c8345f76c2635c6f4e
-  - ad76caee7ef9e42f61507e2c8529319f40a61a5b455a9d1031692a92ce415965
-  - ef15f6d4279f2b2bc3a4c9a7a452678f5fa8c8b07d03ec76bc6bd50190aaee1d
-  - dd66a303cbbf1c98b5bcd56243ae4e24e60dec5174b11f87fab549c4f80fd198
-  - d7a242bf77ab1bd12f5442c932686b000d99cd2ee4bc923ca01b398c6b57db6b
-  - 4062c9fa13f2ffec078505af8248062215347032ce2b741c470a1f36774273b8
-  - d264a62bfe34d53cd2c9dc58898ab19147a13538bbfb24d31e02214d7a927c3b
-  - 78b97813b660129f12366da15ff9e81c586ea354468c652fc65dfcee40abb814
-  - 44a8b95c09f4ccc3f415f7a11824a82fb6288224efa773f3091b750d52b0cbe0
-  - 8f5cbf90cf7cf071557ff637804fa11550199555899f6253533e23151590e5ed
-  - f9c35ab423b6db764e756d5f77ce5132cf5b820a4be88bc278c0754f174728ec
-  - 7ad8d862501c4c7de1654a96bcf23eedcfdc04b8a95195da4089d185abdf60ef
-  - 8aef854cfac6841c199bc767abe7d2eadfda6073a61f34894b3da00e036ad5f0
-  - cd67d0045352cd7fd51bbc34a6af187326e325db0c0ad6cdcce51a33a38cffd8
-  - ab30ca2331d3d6a9ff455ff9a3f077e143fe01e6593b8e42cd4b0aa2350f9a9e
-  - 8408dcd47dcb987c4c89033f7b8506901c66e181a10458ebcc91eeef982c2582
-  - 202b60fdb9353776f8f4e00225ff02115d666540a56cf549d30818e299cd4d59
-  - 8d29fa939f8941c49b4f07b0ad0a97c6eb239e5880692a072d4e1be2c463b4b0
-  - 9f803e572b044fdfae9c7f3e74dee7c2162d233ba423693334008125a648b274
-  - 0386a0924d55c55e7ede22f2e9e97d6d35f5701131d8731fd0da151126806ffd
-  - 0b2fd5138854399a8219b0dd1784f244b9f9fc296d181d298a32867187b16a67
-  - 0182756023478f62374dd687afdccfd51851486377accc96f4a4c1db25744f99
-  - 3fe6b563d9716770e0299118fca2ba1ccc7614790d34ab2edb1641a5746985df
-  - f2a08b997813997b0bdda27fba307febf289d74b447fc2249034fe871a6f49c7
-  - 22c0c78eb636490dd207314752e1688c54904d1f0f30fdc3a4def0546e4901ed
-  - f1c92539640b122db86c32bf746174bcaa54c9ee0824ea68aef1d625721b3f68
-  - b8674e3f79f79790209c4e75040e31ce321184085c8d49b71341a45c86b050bc
-  - f2939dbec209f51467e406a4e80884e074c533bafe378cced62a19f14f454e73
-  - 8bad5dbd251fd52499ded9ce5e4944dbd1f9d6208546537763998051c49f8f91
-  - 71d8c4e8ac3611f208e0f01391738d3f6146be3f1f535aae0d54aa66de83caad
-  - 069163e28ee56776b26d354831586422ca84e01c500f158709b413c76fb2a68e
-  - 1c83157e1c8dd52ea9f826c15acc8c8a4de3338746ab411386879135440b3085
-  - 5eba03f3b36e92b711d58b6fa9a66c8342c86cabe11ab748f4a66fa504c3e9ff
-  - 04851878b38b9509de08f7c02bc49f949937cd5dabb6aea71c552f4f86734d1f
-  - 00c3206352cb7b855ed72675e62192f07cef72717b8cacf1add709d46acd7a89
-  - c37112d3e511aff3113512b46a1433ebe473c4bbf55aeca33c321c64bf075dd8
-  - 9584d2a83be292abd8de0332af9c5fc327e98ee4f190340f30a9cd7a65ae606d
-  - cd81f5f300e8fe070e7c5b867a1c2a531f5d4c3d1b7a1b1b49258c6e4289be13
-  - 82bfe2e82c3ee2331cbbe14ca03a286b19c691785d7db8d4213919b4a970e5c4
-  - 08508d726d3d592e922324eacdcf622480a479c61a0414e47b59e219b7c3c245
-  - 0176d2e8d1a3fc60f2cc55265ebe2bfb04074521d52b1fb844e2fe28646f604b
-  - 53cc64a704ac5712d1990efd511571d4b22a566c1c184731dda57c04acc815c9
-  public_ip_per_node:
-  - 34.168.166.173
-  - 34.168.53.31
-  - 34.168.142.153
-  - 104.196.248.95
-  - 34.168.14.34
-  - 35.199.167.145
-  - 34.168.58.236
-  - 34.168.208.61
-  - 35.230.20.151
-  - 35.203.178.255
-  resources:
-    requests:
-      cpu: "7"
-      memory: 24Gi
-  rpc_apis: eth,net,rpc,web3
-  service_protocols:
-  - TCP
-  - UDP
-  service_type: LoadBalancer
-  use_gstorage_data: false
-  verbosity: 2
-  ws_port: 8545
-ingress:
-  enabled: false
-namespace: blockscout
-
-metrics: true
-prometheus: true
-replicaCount: 2
-storage:
-  accessModes: ReadWriteOnce
-  enable: true
-  size: 1500Gi
-  snapshot:
-    enabled: true
-    kind: VolumeSnapshot
-    name: snapshot-archive-node-blockscout
-  storageClass: premium-rwo
-nodeSelector:
-  pool: t2d-standard-8-spot
-tolerations:
-- effect: NoSchedule
-  key: pool
-  operator: Equal
-  value: t2d-standard-8-spot
diff --git a/packages/helm-charts/celo-fullnode/rc1-gcp-private-txnodes-values.yaml b/packages/helm-charts/celo-fullnode/rc1-gcp-private-txnodes-values.yaml
deleted file mode 100644
index fd64892197c..00000000000
--- a/packages/helm-charts/celo-fullnode/rc1-gcp-private-txnodes-values.yaml
+++ /dev/null
@@ -1,19 +0,0 @@
-geth:
-  resources:
-    requests:
-      memory: "21Gi"
-      cpu: "7"
-  service_session_affinity: None
-  node_keys:
-  - "28406f9c37274e163f5f3335fec2a35e0d3bf3895c28f7a54dd8ecd614d1437c"
-  - "7c0b1c0518bdd3e1a0db8b0ed6999e4404b344a950462fda42ab53ca7ccea271"
-  - "e658b12507dc91948440e15dc5f85d670b9c55a6ec0ad6e11f87a2eacebf07c5"
-  - "3d9a4d83ac67bef79558fe49993ac0436fa15a64b9d383a77c808d0653911722"
-  - "eb8a93871d8ae965484e00d6381cd3756dc13381a98bf715456f0a7dd940ad9e"
-  - "87d3a7ce70fc43db4a070a0db1b69ac78f8db0e10163e34248347966b3a8b072"
-  - "e7565adbdfab09dab769148a34da35168ababecbba6a468d21ccd9fcff9b5946"
-  - "09e2c8c304c5019c5569f27ceea3f1d6d97facbdf8fe410831cd3c615e5df82f"
-
-extraPodLabels:
-  stack: blockscout
-  mode: archive
diff --git a/packages/helm-charts/celo-fullnode/rc1staging-archivenodes-values.yaml b/packages/helm-charts/celo-fullnode/rc1staging-archivenodes-values.yaml
deleted file mode 100644
index ff67901e704..00000000000
--- a/packages/helm-charts/celo-fullnode/rc1staging-archivenodes-values.yaml
+++ /dev/null
@@ -1,36 +0,0 @@
-fullnameOverride: rc1staging-archivenodes
-gcp: true
-genesis:
-  network: rc1
-  networkId: 42220
-geth:
-  gcmode: archive
-  image:
-    imagePullPolicy: IfNotPresent
-    repository: us.gcr.io/celo-org/geth
-    tag: 1.5.6
-  light:
-    maxpeers: 1000
-    serve: 90
-  maxpeers: 1100
-  node_keys:
-  - 5781152a2ab09ae18dd0a48baacc743c9b05f7542d6207615c86dd9bc21b5c94
-  - 609a06841bbf4e7579ee804d9422094f2401d455b8834336a3faea3545950fa8
-  public_ip_per_node: [] # needs to be provided
-  resources:
-    requests:
-      cpu: "7"
-      memory: 21Gi
-  rpc_apis: eth,net,rpc,web3,txpool,debug
-  service_type: None
-  use_gstorage_data: false
-  ws_port: 8545
-  increase_timeouts: true
-namespace: rc1staging
-replicaCount: 2
-storage:
-  size: 1500Gi
-  storageClass: ssd
-extraPodLabels:
-  stack: blockscout
-  mode: archive
diff --git a/packages/helm-charts/celostats/Chart.yaml b/packages/helm-charts/celostats/Chart.yaml
deleted file mode 100644
index 89630c7b92f..00000000000
--- a/packages/helm-charts/celostats/Chart.yaml
+++ /dev/null
@@ -1,11 +0,0 @@
----
-name: celostats
-apiVersion: v1
-version: 0.1.0
-description: Chart which is used to deploy an celostats setup for a celo testnet
-keywords:
-  - ethereum
-  - blockchain
-  - celostats
-  - ethstats
-appVersion: ""
diff --git a/packages/helm-charts/celostats/README.md b/packages/helm-charts/celostats/README.md
deleted file mode 100644
index a9b153a2ef3..00000000000
--- a/packages/helm-charts/celostats/README.md
+++ /dev/null
@@ -1,17 +0,0 @@
-# celostats
-
-## Deploying on existing testnet
-
-This helm chart is an evolution from the former `ethstats` chart.
-This chart includes artifacts for deploying the artifacts of
-`celostats-server` (https://github.com/celo-org/celostats-server/) and
-`celostats-frontend` (https://github.com/celo-org/celostats-frontend/).
-Also, for compatibility reasons, include an ingress resource serving
-at DNS `https://ethstats-${env}.${celo-domain}`, so the old-configured
-clients can report/connect with that endpoint.
-
-To upgrade from an exisiting `ethstats` package, the easiest way is:
-
-1. Remove the old `ethstats` package: `helm uninstall --purge ${env}-ethstats`
-
-2. Deploy the new package: `celotool deploy initial celostats -e ${env}`
diff --git a/packages/helm-charts/celostats/templates/celostats-frontend.deployment.yaml b/packages/helm-charts/celostats/templates/celostats-frontend.deployment.yaml
deleted file mode 100644
index 3fdc2fc190a..00000000000
--- a/packages/helm-charts/celostats/templates/celostats-frontend.deployment.yaml
+++ /dev/null
@@ -1,57 +0,0 @@
-apiVersion: apps/v1
-kind: Deployment
-metadata:
-  name: {{ .Release.Namespace }}-celostats-frontend
-  labels:
-    app: celostats
-    chart: celostats
-    release: {{ .Release.Name }}
-    heritage: {{ .Release.Service }}
-    component: celostats-frontend
-spec:
-  replicas: 1
-  selector:
-    matchLabels:
-      app: celostats
-      release: {{ .Release.Name }}
-      component: celostats-frontend
-  template:
-    metadata:
-      labels:
-        app: celostats
-        release: {{ .Release.Name }}
-        component: celostats-frontend
-    spec:
-      containers:
-      - name: celostats-frontend
-        image: {{ .Values.celostats.image.frontend.repository }}:{{ .Values.celostats.image.frontend.tag }}
-        imagePullPolicy: {{ .Values.imagePullPolicy }}
-        env:
-        - name: ETHSTATS_SERVICE
-          value: https://{{ .Release.Namespace }}-celostats-server.{{ .Values.domain.name }}.org
-        - name: BLOCKSCOUT_URL
-          value: https://{{ .Release.Namespace }}-blockscout.{{ .Values.domain.name }}.org
-        - name: SUBMENU_BLOCKSCOUT
-          value: https://{{ .Release.Namespace }}-blockscout.{{ .Values.domain.name }}.org
-        - name: GRAPHQL_BLOCKSCOUT_URL
-          value: https://{{ .Release.Namespace }}-blockscout.{{ .Values.domain.name }}.org/graphiql
-        command:
-        - /bin/sh
-        - -c
-        args:
-        - |
-          /var/www/scripts/set-env-variables.js /var/www/app
-          ngsw-config /var/www/app/ /var/www/ngsw-config.json
-          exec nginx -g "daemon off;"
-        ports:
-        - name: http
-          containerPort: 80
-          protocol: TCP
-        {{- with .Values.resources.frontend }}
-        resources:
-          {{- toYaml . | nindent 10 }}
-        {{- end }}
-      {{- with .Values.nodeSelector }}
-      nodeSelector:
-        {{- toYaml . | nindent 8 }}
-      {{- end }}
diff --git a/packages/helm-charts/celostats/templates/celostats-frontend.ingress.yaml b/packages/helm-charts/celostats/templates/celostats-frontend.ingress.yaml
deleted file mode 100644
index f84fc7964e0..00000000000
--- a/packages/helm-charts/celostats/templates/celostats-frontend.ingress.yaml
+++ /dev/null
@@ -1,29 +0,0 @@
-apiVersion: networking.k8s.io/v1
-kind: Ingress
-metadata:
-  name: {{ .Release.Namespace }}-celostats-frontend
-  labels:
-    app: celostats
-    chart: celostats
-    release: {{ .Release.Name }}
-    heritage: {{ .Release.Service }}
-    component: celostats-frontend
-  annotations:
-    kubernetes.io/tls-acme: "true"
-spec:
-  ingressClassName: {{ default "nginx" .Values.ingressClassName }}
-  tls:
-  - hosts:
-    - {{ .Release.Namespace }}-celostats.{{ .Values.domain.name }}.org
-    secretName: {{ .Release.Namespace }}-celostats-frontend-tls
-  rules:
-  - host: {{ .Release.Namespace }}-celostats.{{ .Values.domain.name }}.org
-    http:
-      paths:
-      - path: /
-        pathType: Prefix
-        backend:
-          service:
-            name: {{ .Release.Namespace }}-celostats-frontend
-            port:
-              number: 80
diff --git a/packages/helm-charts/celostats/templates/celostats-frontend.service.yaml b/packages/helm-charts/celostats/templates/celostats-frontend.service.yaml
deleted file mode 100644
index d54dc5c79a8..00000000000
--- a/packages/helm-charts/celostats/templates/celostats-frontend.service.yaml
+++ /dev/null
@@ -1,19 +0,0 @@
-kind: Service
-apiVersion: v1
-metadata:
-  name: {{ .Release.Namespace }}-celostats-frontend
-  labels:
-    app: celostats
-    chart: celostats
-    release: {{ .Release.Name }}
-    heritage: {{ .Release.Service }}
-    component: celostats-frontend
-spec:
-  selector:
-    app: celostats
-    release: {{ .Release.Name }}
-    component: celostats-frontend
-  type: {{ .Values.celostats.service.type }}
-  ports:
-  - port: 80
-    targetPort: http
diff --git a/packages/helm-charts/celostats/templates/celostats-server.deployment.yaml b/packages/helm-charts/celostats/templates/celostats-server.deployment.yaml
deleted file mode 100644
index fb3bc22bdbe..00000000000
--- a/packages/helm-charts/celostats/templates/celostats-server.deployment.yaml
+++ /dev/null
@@ -1,56 +0,0 @@
-apiVersion: apps/v1
-kind: Deployment
-metadata:
-  name: {{ .Release.Namespace }}-celostats-server
-  labels:
-    app: celostats
-    chart: celostats
-    release: {{ .Release.Name }}
-    heritage: {{ .Release.Service }}
-    component: celostats-server
-spec:
-  replicas: 1
-  selector:
-    matchLabels:
-      app: celostats
-      release: {{ .Release.Name }}
-      component: celostats-server
-  template:
-    metadata:
-      labels:
-        app: celostats
-        release: {{ .Release.Name }}
-        component: celostats-server
-    spec:
-      containers:
-      - name: celostats-server
-        image: {{ .Values.celostats.image.server.repository }}:{{ .Values.celostats.image.server.tag }}
-        imagePullPolicy: {{ .Values.imagePullPolicy }}
-        env:
-        - name: TRUSTED_ADDRESSES
-          value: {{ .Values.celostats.trusted_addresses }}
-        - name: BANNED_ADDRESSES
-          value: {{ .Values.celostats.banned_addresses }}
-        - name: RESERVED_ADDRESSES
-          value: {{ .Values.celostats.reserved_addresses }}
-        - name: JSONRPC
-          value: {{ .Values.celostats.jsonrpc }}
-        command:
-        - /bin/sh
-        - -c
-        args:
-        - |
-          sed -i "s%###NETWORK_NAME###%{{ .Values.celostats.network_name }}%g" /celostats-server/dist/js/netstats.min.js
-          sed -i "s%###BLOCKSCOUT_URL###%{{ .Values.celostats.blockscout_url }}%g" /celostats-server/dist/js/netstats.min.js
-          exec npm start
-        ports:
-        - name: http
-          containerPort: 3000
-        {{- with .Values.resources.server }}
-        resources:
-          {{- toYaml . | nindent 10 }}
-        {{- end }}
-      {{- with .Values.nodeSelector }}
-      nodeSelector:
-        {{- toYaml . | nindent 8 }}
-      {{- end }}
diff --git a/packages/helm-charts/celostats/templates/celostats-server.ingress.yaml b/packages/helm-charts/celostats/templates/celostats-server.ingress.yaml
deleted file mode 100644
index 820ba805f06..00000000000
--- a/packages/helm-charts/celostats/templates/celostats-server.ingress.yaml
+++ /dev/null
@@ -1,29 +0,0 @@
-apiVersion: networking.k8s.io/v1
-kind: Ingress
-metadata:
-  name: {{ .Release.Namespace }}-celostats-server
-  labels:
-    app: celostats
-    chart: celostats
-    release: {{ .Release.Name }}
-    heritage: {{ .Release.Service }}
-    component: celostats-server
-  annotations:
-    kubernetes.io/tls-acme: "true"
-spec:
-  ingressClassName: {{ default "nginx" .Values.ingressClassName }}
-  tls:
-  - hosts:
-    - {{ .Release.Namespace }}-celostats-server.{{ .Values.domain.name }}.org
-    secretName: {{ .Release.Namespace }}-celostats-tls
-  rules:
-  - host: {{ .Release.Namespace }}-celostats-server.{{ .Values.domain.name }}.org
-    http:
-      paths:
-      - path: /
-        pathType: Prefix
-        backend:
-          service:
-            name: {{ .Release.Namespace }}-celostats-server
-            port:
-              number: 80
diff --git a/packages/helm-charts/celostats/templates/celostats-server.service.yaml b/packages/helm-charts/celostats/templates/celostats-server.service.yaml
deleted file mode 100644
index b03e537372c..00000000000
--- a/packages/helm-charts/celostats/templates/celostats-server.service.yaml
+++ /dev/null
@@ -1,19 +0,0 @@
-kind: Service
-apiVersion: v1
-metadata:
-  name: {{ .Release.Namespace }}-celostats-server
-  labels:
-    app: celostats
-    chart: celostats
-    release: {{ .Release.Name }}
-    heritage: {{ .Release.Service }}
-    component: celostats-server
-spec:
-  selector:
-    app: celostats
-    release: {{ .Release.Name }}
-    component: celostats-server
-  type: {{ .Values.celostats.service.type }}
-  ports:
-  - port: 80
-    targetPort: http
diff --git a/packages/helm-charts/celostats/templates/ethstats.ingress.yaml b/packages/helm-charts/celostats/templates/ethstats.ingress.yaml
deleted file mode 100644
index b1359bc86e6..00000000000
--- a/packages/helm-charts/celostats/templates/ethstats.ingress.yaml
+++ /dev/null
@@ -1,33 +0,0 @@
-apiVersion: networking.k8s.io/v1
-kind: Ingress
-metadata:
-  name: {{ .Release.Namespace }}-ethstats-ingress
-  labels:
-    app: celostats
-    chart: celostats
-    release: {{ .Release.Name }}
-    heritage: {{ .Release.Service }}
-    component: ethstats
-  annotations:
-    kubernetes.io/tls-acme: "true"
-    nginx.ingress.kubernetes.io/configuration-snippet: |
-      if ($http_upgrade != "websocket") {
-        return 301 https://{{ .Release.Namespace }}-celostats.{{ .Values.domain.name }}.org/;
-      }
-spec:
-  ingressClassName: {{ default "nginx" .Values.ingressClassName }}
-  tls:
-  - hosts:
-    - {{ .Release.Namespace }}-ethstats.{{ .Values.domain.name }}.org
-    secretName: {{ .Release.Namespace }}-ethstats-tls
-  rules:
-  - host: {{ .Release.Namespace }}-ethstats.{{ .Values.domain.name }}.org
-    http:
-      paths:
-      - path: /
-        pathType: Prefix
-        backend:
-          service:
-            name: {{ .Release.Namespace }}-celostats-server
-            port:
-              number: 80
diff --git a/packages/helm-charts/celostats/templates/ethstats.service.yaml b/packages/helm-charts/celostats/templates/ethstats.service.yaml
deleted file mode 100644
index d7fc6d67e23..00000000000
--- a/packages/helm-charts/celostats/templates/ethstats.service.yaml
+++ /dev/null
@@ -1,19 +0,0 @@
-kind: Service
-apiVersion: v1
-metadata:
-  name: {{ .Release.Namespace }}-ethstats
-  labels:
-    app: celostats
-    chart: celostats
-    release: {{ .Release.Name }}
-    heritage: {{ .Release.Service }}
-    component: ethstats
-spec:
-  selector:
-    app: celostats
-    release: {{ .Release.Name }}
-    component: celostats-server
-  type: {{ .Values.celostats.service.type }}
-  ports:
-  - port: 80
-    targetPort: http
diff --git a/packages/helm-charts/celostats/values.yaml b/packages/helm-charts/celostats/values.yaml
deleted file mode 100644
index ad82aa6aa4c..00000000000
--- a/packages/helm-charts/celostats/values.yaml
+++ /dev/null
@@ -1,35 +0,0 @@
-imagePullPolicy: IfNotPresent
-
-# Node labels for pod assignment
-# ref: https://kubernetes.io/docs/user-guide/node-selection/
-nodeSelector: {}
-
-celostats:
-  image:
-    server:
-      repository: gcr.io/celo-testnet/celostats-server
-      tag: latest
-    frontend:
-      repository: gcr.io/celo-testnet/celostats-frontend
-      tag: latest
-  service:
-    type: NodePort
-  trusted_addresses: []
-  banned_addresses: []
-
-domain:
-  name: celo-testnet
-
-ingressClassName: nginx
-
-resources:
-  server:
-    requests:
-      cpu: 15m
-      memory: 110Mi
-    limits: {}
-  frontend:
-    requests:
-      cpu: 1m
-      memory: 5Mi
-    limits: {}
diff --git a/packages/helm-charts/cert-manager-cluster-issuers/Chart.yaml b/packages/helm-charts/cert-manager-cluster-issuers/Chart.yaml
deleted file mode 100644
index 8efa04fffe0..00000000000
--- a/packages/helm-charts/cert-manager-cluster-issuers/Chart.yaml
+++ /dev/null
@@ -1,12 +0,0 @@
-name: cert-manager-issuers
-apiVersion: v2
-version: 0.2.0
-description: Chart which is used to deploy let's encrypt issuers
-keywords:
-- "let's encrypt"
-- cert-manager
-appVersion: v1.7.3
-dependencies:
-  - name: cert-manager
-    version: v1.9.1
-    repository: https://charts.jetstack.io
diff --git a/packages/helm-charts/cert-manager-cluster-issuers/README.md b/packages/helm-charts/cert-manager-cluster-issuers/README.md
deleted file mode 100644
index d64c1bd61ca..00000000000
--- a/packages/helm-charts/cert-manager-cluster-issuers/README.md
+++ /dev/null
@@ -1,4 +0,0 @@
-# cert-manager-cluster-issuers
-
-This is the newer version of kube-lego that automatically gets SSL certificates.
-This specifies staging & production ClusterIssuers.
diff --git a/packages/helm-charts/cert-manager-cluster-issuers/templates/prod.clusterissuer.yaml b/packages/helm-charts/cert-manager-cluster-issuers/templates/prod.clusterissuer.yaml
deleted file mode 100644
index a6d42aaf627..00000000000
--- a/packages/helm-charts/cert-manager-cluster-issuers/templates/prod.clusterissuer.yaml
+++ /dev/null
@@ -1,18 +0,0 @@
-apiVersion: cert-manager.io/v1
-kind: ClusterIssuer
-metadata:
-  name: letsencrypt-prod
-spec:
-  acme:
-    # The ACME server URL
-    server: https://acme-v02.api.letsencrypt.org/directory
-    # Email address used for ACME registration
-    email: n@celo.org
-    # Name of a secret used to store the ACME account private key
-    privateKeySecretRef:
-      name: letsencrypt-prod
-    # Enable the HTTP-01 challenge provider
-    solvers:
-    - http01:
-        ingress:
-          class: nginx
diff --git a/packages/helm-charts/cert-manager-cluster-issuers/templates/staging.clusterissuer.yaml b/packages/helm-charts/cert-manager-cluster-issuers/templates/staging.clusterissuer.yaml
deleted file mode 100644
index c86e91b3a90..00000000000
--- a/packages/helm-charts/cert-manager-cluster-issuers/templates/staging.clusterissuer.yaml
+++ /dev/null
@@ -1,18 +0,0 @@
-apiVersion: cert-manager.io/v1
-kind: ClusterIssuer
-metadata:
-  name: letsencrypt-staging
-spec:
-  acme:
-    # The ACME server URL
-    server: https://acme-staging-v02.api.letsencrypt.org/directory
-    # Email address used for ACME registration
-    email: n@celo.org
-    # Name of a secret used to store the ACME account private key
-    privateKeySecretRef:
-      name: letsencrypt-staging
-    # Enable the HTTP-01 challenge provider
-    solvers:
-    - http01:
-        ingress:
-          class: nginx
diff --git a/packages/helm-charts/cert-manager-cluster-issuers/values.yaml b/packages/helm-charts/cert-manager-cluster-issuers/values.yaml
deleted file mode 100644
index d03351f2be0..00000000000
--- a/packages/helm-charts/cert-manager-cluster-issuers/values.yaml
+++ /dev/null
@@ -1,9 +0,0 @@
-imagePullPolicy: IfNotPresent
-
-# Values that are used for the dependency `cert-manager`
-cert-manager:
-  ingressShim:
-    defaultIssuerKind: ClusterIssuer
-    defaultIssuerName: letsencrypt-prod
-  webhook:
-    enabled: false
diff --git a/packages/helm-charts/common/README.md b/packages/helm-charts/common/README.md
deleted file mode 100644
index e61ab2c9657..00000000000
--- a/packages/helm-charts/common/README.md
+++ /dev/null
@@ -1,3 +0,0 @@
-# common
-
-Moved to https://github.com/celo-org/charts/tree/main/charts/common
diff --git a/packages/helm-charts/gcp-ssd/Chart.yaml b/packages/helm-charts/gcp-ssd/Chart.yaml
deleted file mode 100644
index 736e3893d62..00000000000
--- a/packages/helm-charts/gcp-ssd/Chart.yaml
+++ /dev/null
@@ -1,8 +0,0 @@
-name: gcp-ssd
-version: 0.0.1
-description: Chart to deploy a GCP SSD storage class
-keywords:
-- gcp
-- ssd
-- storage-class
-appVersion: v1.7.3
diff --git a/packages/helm-charts/gcp-ssd/templates/storage-class.yaml b/packages/helm-charts/gcp-ssd/templates/storage-class.yaml
deleted file mode 100644
index 62500d87f9d..00000000000
--- a/packages/helm-charts/gcp-ssd/templates/storage-class.yaml
+++ /dev/null
@@ -1,7 +0,0 @@
-apiVersion: storage.k8s.io/v1
-kind: StorageClass
-metadata:
-  name: ssd
-provisioner: kubernetes.io/gce-pd
-parameters:
-  type: pd-ssd
diff --git a/packages/helm-charts/grafana/values-clabs.yaml b/packages/helm-charts/grafana/values-clabs.yaml
deleted file mode 100644
index 8f27dc22680..00000000000
--- a/packages/helm-charts/grafana/values-clabs.yaml
+++ /dev/null
@@ -1,46 +0,0 @@
-annotations:
-  prometheus.io/path: /metrics
-  prometheus.io/port: "3000"
-  prometheus.io/scrape: "false"
-datasources:
-  datasources.yaml:
-    apiVersion: 1
-    datasources:
-    # Local prometheus instance
-    - access: proxy
-      isDefault: true
-      name: Prometheus
-      type: prometheus
-      url: http://prometheus-server.prometheus:9090
-    # Adding a default loki as datasource because Loki is installed on some cluster (i.e: forno) for local storage of
-    # high-volume logs. If Loki server is not available it won't cause problems on grafana (requests to that DS won't work)
-    - access: proxy
-      name: Loki
-      type: loki
-      url: http://local-loki:3100
-deploymentStrategy:
-  type: Recreate
-grafana.ini:
-  auth.google:
-    allow_sign_up: "true"
-    allowed_domains: clabs.co
-    auth_url: https://accounts.google.com/o/oauth2/auth
-    enabled: true
-    scopes: https://www.googleapis.com/auth/userinfo.profile https://www.googleapis.com/auth/userinfo.email
-    token_url: https://accounts.google.com/o/oauth2/token
-ingress:
-  annotations:
-    kubernetes.io/tls-acme: "true"
-  enabled: true
-  path: /
-persistence:
-  enabled: true
-  size: 10Gi
-  storageClassName: ssd
-sidecar:
-  dashboards:
-    enabled: true
-  datasources:
-    enabled: false
-  notifiers:
-    enabled: false
diff --git a/packages/helm-charts/load-test/Chart.yaml b/packages/helm-charts/load-test/Chart.yaml
deleted file mode 100644
index 31b1b7efd5c..00000000000
--- a/packages/helm-charts/load-test/Chart.yaml
+++ /dev/null
@@ -1,12 +0,0 @@
-name: load-test
-version: 0.0.1
-description: Chart which is used to run load test
-keywords:
-- ethereum
-- blockchain
-- load-test
-appVersion: v1.7.3
-dependencies:
-  - name: common
-    repository: oci://us-west1-docker.pkg.dev/devopsre/clabs-public-oci
-    version: 0.2.0
\ No newline at end of file
diff --git a/packages/helm-charts/load-test/templates/load-test.configmap.yaml b/packages/helm-charts/load-test/templates/load-test.configmap.yaml
deleted file mode 100644
index 81b2e3aa247..00000000000
--- a/packages/helm-charts/load-test/templates/load-test.configmap.yaml
+++ /dev/null
@@ -1,12 +0,0 @@
-apiVersion: v1
-kind: ConfigMap
-metadata:
-  name: {{ .Values.environment }}-load-test-config
-  labels:
-    app: load-test
-    chart: load-test
-    release: {{ .Release.Name }}
-    heritage: {{ .Release.Service }}
-    component: load-test
-data:
-  static-nodes.json: {{ .Values.geth.staticNodes | b64dec | quote }}
diff --git
a/packages/helm-charts/load-test/templates/load-test.secret.yaml b/packages/helm-charts/load-test/templates/load-test.secret.yaml deleted file mode 100644 index 976edd2abb6..00000000000 --- a/packages/helm-charts/load-test/templates/load-test.secret.yaml +++ /dev/null @@ -1,14 +0,0 @@ -apiVersion: v1 -kind: Secret -metadata: - name: {{ .Values.environment }}-load-test - labels: - app: load-test - chart: load-test - release: {{ .Release.Name }} - heritage: {{ .Release.Service }} - component: load-test -type: Opaque -data: - accountSecret: {{ .Values.geth.accountSecret | b64enc | quote }} - mnemonic: {{ .Values.mnemonic | b64enc | quote }} diff --git a/packages/helm-charts/load-test/templates/load-test.statefulset.yaml b/packages/helm-charts/load-test/templates/load-test.statefulset.yaml deleted file mode 100644 index eeebf0da679..00000000000 --- a/packages/helm-charts/load-test/templates/load-test.statefulset.yaml +++ /dev/null @@ -1,252 +0,0 @@ -{{- $reuseClient := .Values.reuse_light_clients | default false -}} -apiVersion: v1 -kind: Service -metadata: - name: load-test - labels: - component: load-test -spec: - ports: - - port: 80 - name: web - clusterIP: None - selector: - component: load-test ---- -apiVersion: apps/v1 -kind: StatefulSet -metadata: - name: {{ .Values.environment }}-load-test - labels: - app: load-test - chart: load-test - release: {{ .Release.Name }} - heritage: {{ .Release.Service }} - component: load-test -spec: - podManagementPolicy: Parallel - serviceName: load-test - replicas: {{ .Values.replicas }} - selector: - matchLabels: - app: load-test - release: {{ .Release.Name }} - component: load-test - template: - metadata: - labels: - app: load-test - release: {{ .Release.Name }} - component: load-test - spec: - initContainers: - - name: generate-keys - image: {{ .Values.celotool.image.repository }}:{{ .Values.celotool.image.tag }} - imagePullPolicy: {{ .Values.imagePullPolicy }} - args: - - | - [[ $REPLICA_NAME =~ -([0-9]+)$ ]] || exit 1 - 
RID=${BASH_REMATCH[1]} - echo $RID > /root/.celo/rid - celotooljs.sh generate public-key --mnemonic "$MNEMONIC" --accountType bootnode --index 0 > /root/.celo/bootnodeEnodeAddress - echo -n "Generating Bootnode enode address for the validator: " - cat /root/.celo/bootnodeEnodeAddress - - BOOTNODE_IP_ADDRESS=${{ .Release.Namespace | upper }}_BOOTNODE_SERVICE_HOST - echo `cat /root/.celo/bootnodeEnodeAddress`@$BOOTNODE_IP_ADDRESS:30301 > /root/.celo/bootnodeEnode - echo -n "Generating Bootnode enode for the validator: " - cat /root/.celo/bootnodeEnode - - celotooljs.sh generate prepare-load-test \ - --mnemonic "$MNEMONIC" \ - --threads {{ .Values.threads | default "1" }} \ - --index $RID - command: - - bash - - -c - env: - - name: REPLICA_NAME - valueFrom: - fieldRef: - fieldPath: metadata.name - - name: MNEMONIC - valueFrom: - secretKeyRef: - name: {{ .Values.environment }}-load-test - key: mnemonic - volumeMounts: - - name: data - mountPath: /root/.celo -{{ include "common.conditional-init-genesis-container" . 
| indent 6 }} - - name: import-geth-account - image: {{ .Values.geth.image.repository }}:{{ .Values.geth.image.tag }} - imagePullPolicy: {{ .Values.imagePullPolicy }} - command: ["/bin/sh"] - args: - - "-c" - - | - for thread in $(seq 0 {{ sub .Values.threads 1 | default "0" }}); do - geth --nousb account import --password /root/.celo/account/accountSecret /root/.celo/pkey$thread || true - done - volumeMounts: - - name: data - mountPath: /root/.celo - - name: account - mountPath: "/root/.celo/account" - readOnly: true - containers: -{{- if $reuseClient }} - - name: geth - image: {{ $.Values.geth.image.repository }}:{{ $.Values.geth.image.tag }} - imagePullPolicy: {{ $.Values.imagePullPolicy }} - command: ["/bin/sh"] - args: - - "-c" - - |- - set -euo pipefail - cp /var/geth/static-nodes.json /root/.celo/static-nodes.json - - ACCOUNT_ADDRESSES=$(cat /root/.celo/address | tr '\n' ',') - ACCOUNT_ADDRESSES=${ACCOUNT_ADDRESSES::-1} - - ADDITIONAL_FLAGS='--allow-insecure-unlock' - -{{ include "common.geth-http-ws-flags" (dict "Values" $.Values "rpc_apis" "eth,web3,debug,admin,personal,net" "ws_port" "8545" "listen_address" "0.0.0.0") | indent 10 }} - - exec geth \ - --datadir /root/.celo \ - --ipcpath=geth.ipc \ - --nousb \ - --networkid={{ $.Values.geth.networkID }} \ - --nodekey=/root/.celo/pkey0 \ - --syncmode=fast \ - --consoleformat=json \ - --consoleoutput=stdout \ - --verbosity=1 \ - --unlock=$ACCOUNT_ADDRESSES \ - --password=/root/.celo/account/accountSecret \ - ${ADDITIONAL_FLAGS} \ - --port 30303 - resources: - requests: - memory: 4Gi - cpu: 2 - volumeMounts: - - name: data - mountPath: /root/.celo - - name: config - mountPath: /var/geth - - name: account - mountPath: "/root/.celo/account" - readOnly: true -{{- else }} -{{- range $index, $e := until (.Values.threads | int) }} - - name: geth-{{ $index }} - image: {{ $.Values.geth.image.repository }}:{{ $.Values.geth.image.tag }} - imagePullPolicy: {{ $.Values.imagePullPolicy }} - command: ["/bin/sh"] - args: - - 
"-c" - - |- - set -euo pipefail - cp -rp /root/.celo_share /root/.celo - cp /var/geth/static-nodes.json /root/.celo/static-nodes.json - - ACCOUNT_ADDRESS=$(awk 'NR=={{ add $index 1 }}' /root/.celo/address) - -{{ include "common.geth-http-ws-flags" (dict "Values" $.Values "rpc_apis" "eth,web3,debug,admin,personal,net" "ws_port" "8545" "listen_address" "0.0.0.0") | indent 10 }} - - exec geth \ - --nousb \ - --networkid={{ $.Values.geth.networkID }} \ - --nodekey=/root/.celo/pkey{{ $index }} \ - --syncmode=lightest \ - --consoleformat=json \ - --consoleoutput=stdout \ - --verbosity=1 \ - --unlock=$ACCOUNT_ADDRESS \ - --password=/root/.celo/account/accountSecret \ - --port {{ add 30303 $index }} \ - --http.port {{ add 8545 $index }} - resources: - requests: - memory: 200Mi - cpu: 100m - volumeMounts: - - name: data - mountPath: /root/.celo_share - readOnly: true - - name: config - mountPath: /var/geth - - name: account - mountPath: "/root/.celo_share/account" - readOnly: true -{{- end }} -{{- end }} - - name: simulate-client - image: {{ .Values.celotool.image.repository }}:{{ .Values.celotool.image.tag }} - imagePullPolicy: {{ .Values.imagePullPolicy }} - securityContext: - runAsUser: 0 - command: - - bash - - "-c" - - | - RID=`cat /root/.celo/rid` - - # Send the txs to the next load test client - RECIPIENT_INDEX=$(( ($RID + 1) % {{ .Values.replicas }} )) - - exec celotooljs.sh geth simulate-client \ -{{- if $reuseClient }} - --reuse-client \ -{{- end }} - --index $RID \ - --recipient-index $RECIPIENT_INDEX \ - --delay {{ .Values.delay }} \ - --mnemonic "$MNEMONIC" \ - --blockscout-url {{ .Values.blockscout.url }} \ - --blockscoutMeasurePercent {{ .Values.blockscout.measurePercent }} \ - --client-count {{ .Values.threads | default "1" }} - resources: - requests: - memory: 4Gi - cpu: 2 - env: - - name: LOAD_TEST_USE_RANDOM_RECIPIENT - value: "{{ default "true" .Values.use_random_recipient }}" - - name: MNEMONIC - valueFrom: - secretKeyRef: - name: {{ .Values.environment 
}}-load-test - key: mnemonic - - name: PASSWORD - valueFrom: - secretKeyRef: - name: {{ .Values.environment }}-load-test - key: accountSecret - - name: LOAD_TEST_USE_RANDOM_RECIPIENT - value: "true" - volumeMounts: - - name: data - mountPath: /root/.celo - volumes: - - name: data - emptyDir: {} - - name: config - configMap: - name: {{ .Values.environment }}-load-test-config - - name: account - secret: - secretName: {{ .Values.environment }}-load-test -{{- if $reuseClient }} - volumeClaimTemplates: - - metadata: - name: data - spec: - storageClassName: ssd - accessModes: [ "ReadWriteOnce" ] - resources: - requests: - storage: {{ .Values.geth.diskSize | default 10 }}Gi -{{- end }} diff --git a/packages/helm-charts/load-test/values.yaml b/packages/helm-charts/load-test/values.yaml deleted file mode 100644 index 77e40c04353..00000000000 --- a/packages/helm-charts/load-test/values.yaml +++ /dev/null @@ -1 +0,0 @@ -imagePullPolicy: Always diff --git a/packages/helm-charts/mock-oracle/.helmignore b/packages/helm-charts/mock-oracle/.helmignore deleted file mode 100644 index 50af0317254..00000000000 --- a/packages/helm-charts/mock-oracle/.helmignore +++ /dev/null @@ -1,22 +0,0 @@ -# Patterns to ignore when building packages. -# This supports shell glob matching, relative path matching, and -# negation (prefixed with !). Only one pattern per line. 
-.DS_Store -# Common VCS dirs -.git/ -.gitignore -.bzr/ -.bzrignore -.hg/ -.hgignore -.svn/ -# Common backup files -*.swp -*.bak -*.tmp -*~ -# Various IDEs -.project -.idea/ -*.tmproj -.vscode/ diff --git a/packages/helm-charts/mock-oracle/Chart.yaml b/packages/helm-charts/mock-oracle/Chart.yaml deleted file mode 100644 index a122682109a..00000000000 --- a/packages/helm-charts/mock-oracle/Chart.yaml +++ /dev/null @@ -1,5 +0,0 @@ -apiVersion: v1 -appVersion: "1.0" -description: A Helm chart for the mock oracle -name: mock-oracle -version: 0.1.0 diff --git a/packages/helm-charts/mock-oracle/templates/oracle.cronjob.yaml b/packages/helm-charts/mock-oracle/templates/oracle.cronjob.yaml deleted file mode 100644 index f962a6dfbe9..00000000000 --- a/packages/helm-charts/mock-oracle/templates/oracle.cronjob.yaml +++ /dev/null @@ -1,75 +0,0 @@ -apiVersion: batch/v1beta1 -kind: CronJob -metadata: - name: {{ .Release.Name }} - labels: - app: oracle - chart: oracle - release: {{ .Release.Service }} - component: oracle -spec: - schedule: "{{ .Values.oracle.cronSchedule }}" - concurrencyPolicy: Forbid - jobTemplate: - spec: - backoffLimit: 1 - template: - spec: - initContainers: - - name: get-current-price - image: {{ .Values.oracle.image.repository }}:{{ .Values.oracle.image.tag }} - imagePullPolicy: IfNotPresent - command: - - sh - - "-c" - - | - ./current_rate.sh > /celo/.celo/current_price - volumeMounts: - - name: data - mountPath: /celo/.celo - - name: get-account - image: {{ .Values.celotool.image.repository }}:{{ .Values.celotool.image.tag }} - imagePullPolicy: IfNotPresent - command: ["/bin/sh"] - args: - - "-c" - - | - celotooljs.sh generate bip32 --mnemonic "$MNEMONIC" --accountType price_oracle --index 0 > /celo/.celo/pkey - celotooljs.sh generate account-address --private-key `cat /celo/.celo/pkey` > /celo/.celo/account - volumeMounts: - - name: data - mountPath: /celo/.celo - env: - - name: MNEMONIC - valueFrom: - secretKeyRef: - name: {{ .Release.Name }} - key: 
MNEMONIC - containers: - - name: report-price - image: {{ .Values.celocli.image.repository }}:{{ .Values.celocli.image.tag }} - imagePullPolicy: IfNotPresent - command: ["/bin/sh"] - args: - - "-c" - - | - PRICE=`cat /celo/.celo/current_price` - echo 'current price:' - echo $PRICE - PK=`cat /celo/.celo/pkey` - ACCOUNT=`cat /celo/.celo/account` - celocli config:set --node {{ .Values.celocli.nodeUrl }} - celocli oracle:report StableToken --numerator $PRICE --privateKey $PK --from $ACCOUNT - volumeMounts: - - name: data - mountPath: /celo/.celo - env: - - name: MNEMONIC - valueFrom: - secretKeyRef: - name: {{ .Release.Name }} - key: MNEMONIC - restartPolicy: Never - volumes: - - name: data - emptyDir: {} diff --git a/packages/helm-charts/mock-oracle/templates/oracle.secret.yaml b/packages/helm-charts/mock-oracle/templates/oracle.secret.yaml deleted file mode 100644 index dac777fb212..00000000000 --- a/packages/helm-charts/mock-oracle/templates/oracle.secret.yaml +++ /dev/null @@ -1,12 +0,0 @@ -apiVersion: v1 -kind: Secret -metadata: - name: {{ .Release.Name }} - labels: - app: oracle - chart: oracle - release: {{ .Release.Name }} - heritage: {{ .Release.Service }} -type: Opaque -data: - MNEMONIC: {{ .Values.mnemonic | b64enc | quote }} \ No newline at end of file diff --git a/packages/helm-charts/odis/Chart.yaml b/packages/helm-charts/odis/Chart.yaml deleted file mode 100644 index b0eb29da3b8..00000000000 --- a/packages/helm-charts/odis/Chart.yaml +++ /dev/null @@ -1,9 +0,0 @@ -apiVersion: v1 -appVersion: "1.1.4" -description: A Helm chart for the ODIS app -name: odis -version: 0.1.0 -dependencies: - - name: common - repository: oci://us-west1-docker.pkg.dev/devopsre/clabs-public-oci - version: 0.2.0 \ No newline at end of file diff --git a/packages/helm-charts/odis/templates/_helpers.tpl b/packages/helm-charts/odis/templates/_helpers.tpl deleted file mode 100644 index fd37b001a17..00000000000 --- a/packages/helm-charts/odis/templates/_helpers.tpl +++ /dev/null @@ 
-1,45 +0,0 @@ -{{/* -The name of the deployment -*/}} -{{- define "name" -}} -{{- .Values.environment.cluster.name -}} -{{- end -}} - -{{/* -Common labels that are recommended to be used by Helm and Kubernetes -*/}} -{{- define "labels" -}} -app.kubernetes.io/name: {{ template "name" . }} -helm.sh/chart: {{ .Chart.Name }}-{{ .Chart.Version | replace "+" "_" }} -app.kubernetes.io/managed-by: {{ .Release.Service }} -app.kubernetes.io/instance: {{ .Release.Name }} -{{- end -}} - -{{/* -Annotations to indicate to the prometheus server that this node should be scraped for metrics -*/}} -{{- define "metric-annotations" -}} -prometheus.io/scrape: "true" -prometheus.io/port: "{{ .Values.relayer.metrics.prometheusPort }}" -{{- end -}} - -{{/* -Label specific to the odis signer component -*/}} -{{- define "odis-signer-component-label" -}} -app.kubernetes.io/component: odis-signer -{{- end -}} - -{{/* -The name of the azure identity binding for the odis signer -*/}} -{{- define "azure-identity-binding-name" -}} -{{- template "name" . -}}-identity-binding -{{- end -}} - -{{/* -The name of the azure identity for the odis signer -*/}} -{{- define "azure-identity-name" -}} -{{- template "name" . -}}-identity -{{- end -}} \ No newline at end of file diff --git a/packages/helm-charts/odis/templates/azure-identity-binding.yaml b/packages/helm-charts/odis/templates/azure-identity-binding.yaml deleted file mode 100644 index 8b4f4b44207..00000000000 --- a/packages/helm-charts/odis/templates/azure-identity-binding.yaml +++ /dev/null @@ -1,7 +0,0 @@ -apiVersion: "aadpodidentity.k8s.io/v1" -kind: AzureIdentityBinding -metadata: - name: {{ template "azure-identity-binding-name" . }} -spec: - azureIdentity: {{ template "azure-identity-name" . }} - selector: {{ template "azure-identity-binding-name" . 
}} diff --git a/packages/helm-charts/odis/templates/azure-identity.yaml b/packages/helm-charts/odis/templates/azure-identity.yaml deleted file mode 100644 index 4855368bd78..00000000000 --- a/packages/helm-charts/odis/templates/azure-identity.yaml +++ /dev/null @@ -1,10 +0,0 @@ -apiVersion: aadpodidentity.k8s.io/v1 -kind: AzureIdentity -metadata: - name: {{ template "azure-identity-name" . }} - annotations: - aadpodidentity.k8s.io/Behavior: namespaced -spec: - type: 0 - resourceID: {{ .Values.azureKVIdentity.id }} - clientID: {{ .Values.azureKVIdentity.clientId }} diff --git a/packages/helm-charts/odis/templates/dbpassword-secret.yaml b/packages/helm-charts/odis/templates/dbpassword-secret.yaml deleted file mode 100644 index fac4d8f6cf6..00000000000 --- a/packages/helm-charts/odis/templates/dbpassword-secret.yaml +++ /dev/null @@ -1,9 +0,0 @@ -apiVersion: v1 -kind: Secret -metadata: - name: db-password - labels: -{{ include "labels" . | indent 4 }} -type: Opaque -stringData: - db-password: {{ .Values.db.password }} diff --git a/packages/helm-charts/odis/templates/signer-deployment.yaml b/packages/helm-charts/odis/templates/signer-deployment.yaml deleted file mode 100644 index b6eb4dde19c..00000000000 --- a/packages/helm-charts/odis/templates/signer-deployment.yaml +++ /dev/null @@ -1,59 +0,0 @@ -apiVersion: apps/v1 -kind: Deployment -metadata: - name: {{ include "name" . }} - labels: -{{- include "odis-signer-component-label" . | nindent 4 }} -spec: - replicas: 1 - selector: - matchLabels: - {{- include "odis-signer-component-label" . | nindent 6 }} - template: - metadata: - labels: -{{- include "odis-signer-component-label" . | nindent 8 }} - aadpodidbinding: {{ template "azure-identity-binding-name" . 
}} - spec: - containers: - - name: odis-signer - securityContext: - {{- toYaml .Values.securityContext | nindent 12 }} - image: {{ .Values.image.repository }}:{{ .Values.image.tag }} - imagePullPolicy: Always - ports: - - name: http - containerPort: 3000 - command: - - bash - - "-c" - - | - sleep 60; yarn start:docker - env: - - name: SERVER_PORT - value: "3000" - - name: DB_TYPE - value: "postgres" - - name: DB_DATABASE - value: "phoneNumberPrivacy" - - name: KEYSTORE_TYPE - value: "AzureKeyVault" -{{ include "common.env-var" (dict "name" "LOG_LEVEL" "dict" .Values.log "value_name" "level") | indent 12 }} -{{ include "common.env-var" (dict "name" "LOG_FORMAT" "dict" .Values.log "value_name" "format") | indent 12 }} -{{ include "common.env-var" (dict "name" "BLOCKCHAIN_PROVIDER" "dict" .Values "value_name" "blockchainProvider") | indent 12 }} -{{ include "common.env-var" (dict "name" "BLOCKCHAIN_API_KEY" "dict" .Values "value_name" "blockchainApiKey") | indent 12 }} -{{ include "common.env-var" (dict "name" "DB_HOST" "dict" .Values.db "value_name" "host") | indent 12 }} -{{ include "common.env-var" (dict "name" "DB_PORT" "dict" .Values.db "value_name" "port") | indent 12 }} -{{ include "common.env-var" (dict "name" "DB_USERNAME" "dict" .Values.db "value_name" "username") | indent 12 }} -{{ include "common.env-var" (dict "name" "KEYSTORE_AZURE_VAULT_NAME" "dict" .Values.keystore "value_name" "vaultName") | indent 12 }} -{{ include "common.env-var" (dict "name" "PHONE_NUMBER_PRIVACY_KEY_NAME_BASE" "dict" .Values.keystore "value_name" "pnpKeyNameBase") | indent 12 }} -{{ include "common.env-var" (dict "name" "DOMAINS_KEY_NAME_BASE" "dict" .Values.keystore "value_name" "domainsKeyNameBase") | indent 12 }} -{{ include "common.env-var" (dict "name" "PHONE_NUMBER_PRIVACY_LATEST_KEY_VERSION" "dict" .Values.keystore "value_name" "pnpKeyLatestVersion") | indent 12 }} -{{ include "common.env-var" (dict "name" "DOMAINS_LATEST_KEY_VERSION" "dict" .Values.keystore "value_name" 
"domainsKeyLatestVersion") | indent 12 }} -{{ include "common.env-var" (dict "name" "DOMAINS_API_ENABLED" "dict" .Values.api "value_name" "domainsAPIEnabled") | indent 12 }} -{{ include "common.env-var" (dict "name" "PHONE_NUMBER_PRIVACY_API_ENABLED" "dict" .Values.api "value_name" "pnpAPIEnabled") | indent 12 }} - - name: DB_PASSWORD - valueFrom: - secretKeyRef: - name: db-password - key: db-password diff --git a/packages/helm-charts/odis/templates/signer-ingress.yaml b/packages/helm-charts/odis/templates/signer-ingress.yaml deleted file mode 100644 index b80dac8da6a..00000000000 --- a/packages/helm-charts/odis/templates/signer-ingress.yaml +++ /dev/null @@ -1,20 +0,0 @@ -apiVersion: networking.k8s.io/v1 -kind: Ingress -metadata: - name: odis-signer-ingress - annotations: - kubernetes.io/tls-acme: "true" -spec: - ingressClassName: {{ default "nginx" .Values.ingress.ingressClassName }} - tls: - - secretName: {{ .Release.Namespace }}-web-tls - rules: - - http: - paths: - - path: / - pathType: Prefix - backend: - service: - name: {{ include "name" . }} - port: - number: 3000 diff --git a/packages/helm-charts/odis/templates/signer-service.yaml b/packages/helm-charts/odis/templates/signer-service.yaml deleted file mode 100644 index 3cd4a69c404..00000000000 --- a/packages/helm-charts/odis/templates/signer-service.yaml +++ /dev/null @@ -1,13 +0,0 @@ -apiVersion: v1 -kind: Service -metadata: - name: {{ include "name" . }} - labels: -{{ include "labels" . | indent 4 }} -spec: - clusterIP: None - selector: -{{ include "odis-signer-component-label" . | indent 4 }} - ports: - - name: http - port: 3000 \ No newline at end of file diff --git a/packages/helm-charts/odis/values.yaml b/packages/helm-charts/odis/values.yaml deleted file mode 100644 index 2acba1ad469..00000000000 --- a/packages/helm-charts/odis/values.yaml +++ /dev/null @@ -1,80 +0,0 @@ -# Default values for odis. -# This is a YAML-formatted file. -# Declare variables to be passed into your templates. 
- -replicaCount: 1 - -image: - repository: nginx - pullPolicy: IfNotPresent - # Overrides the image tag whose default is the chart appVersion. - tag: "" - -imagePullSecrets: [] -nameOverride: "" -fullnameOverride: "" - -serviceAccount: - # Specifies whether a service account should be created - create: true - # Annotations to add to the service account - annotations: {} - # The name of the service account to use. - # If not set and create is true, a name is generated using the fullname template - name: "" - -podAnnotations: {} - -podSecurityContext: {} - # fsGroup: 2000 - -securityContext: {} - # capabilities: - # drop: - # - ALL - # readOnlyRootFilesystem: true - # runAsNonRoot: true - # runAsUser: 1000 - -service: - type: ClusterIP - port: 80 - -ingress: - enabled: false - ingressClassName: nginx - annotations: {} - # kubernetes.io/ingress.class: nginx - # kubernetes.io/tls-acme: "true" - hosts: - - host: chart-example.local - paths: [] - tls: [] - # - secretName: chart-example-tls - # hosts: - # - chart-example.local - -resources: {} - # We usually recommend not to specify default resources and to leave this as a conscious - # choice for the user. This also increases chances charts run on environments with little - # resources, such as Minikube. If you do want to specify resources, uncomment the following - # lines, adjust them as necessary, and remove the curly braces after 'resources:'. 
- # limits: - # cpu: 100m - # memory: 128Mi - # requests: - # cpu: 100m - # memory: 128Mi - -autoscaling: - enabled: false - minReplicas: 1 - maxReplicas: 100 - targetCPUUtilizationPercentage: 80 - # targetMemoryUtilizationPercentage: 80 - -nodeSelector: {} - -tolerations: [] - -affinity: {} diff --git a/packages/helm-charts/oracle-rbac/Chart.yaml b/packages/helm-charts/oracle-rbac/Chart.yaml deleted file mode 100644 index 4f10fc0d1a3..00000000000 --- a/packages/helm-charts/oracle-rbac/Chart.yaml +++ /dev/null @@ -1,5 +0,0 @@ -apiVersion: v1 -appVersion: '1.0' -description: A Helm chart to get the RBAC token needed by the oracle to reach the K8s API server -name: oracle-rbac -version: 0.3.0 diff --git a/packages/helm-charts/oracle-rbac/templates/_helper.tpl b/packages/helm-charts/oracle-rbac/templates/_helper.tpl deleted file mode 100644 index 75e25d060b7..00000000000 --- a/packages/helm-charts/oracle-rbac/templates/_helper.tpl +++ /dev/null @@ -1,11 +0,0 @@ -{{- define "name" -}} -{{- .Values.environment.name -}}-{{- .Values.environment.currencyPair | lower -}}-oracle-rbac-{{- .index -}} -{{- end -}} - -{{- define "secret-name" -}} -{{- .Values.environment.name -}}-{{- .Values.environment.currencyPair | lower -}}-oracle-rbac-secret-{{- .index -}} -{{- end -}} - -{{- define "oracle-pod-name" -}} -{{- .Values.environment.name -}}-{{- .Values.environment.currencyPair | lower -}}-oracle-{{- .index -}} -{{- end -}} diff --git a/packages/helm-charts/oracle-rbac/templates/role.yaml b/packages/helm-charts/oracle-rbac/templates/role.yaml deleted file mode 100644 index f5ede1a1b1b..00000000000 --- a/packages/helm-charts/oracle-rbac/templates/role.yaml +++ /dev/null @@ -1,13 +0,0 @@ -{{ range $index, $e := until (.Values.oracle.replicas | int) }} -{{- $index_counter := (dict "Values" $.Values "index" $index) -}} -apiVersion: rbac.authorization.k8s.io/v1 -kind: Role -metadata: - name: {{ template "name" $index_counter }} -rules: -- apiGroups: [""] - resources: ["pods"] - 
resourceNames: ["{{ template "oracle-pod-name" $index_counter }}"] - verbs: ["get", "patch"] ---- -{{ end }} diff --git a/packages/helm-charts/oracle-rbac/templates/rolebinding.yaml b/packages/helm-charts/oracle-rbac/templates/rolebinding.yaml deleted file mode 100644 index 787908aaa01..00000000000 --- a/packages/helm-charts/oracle-rbac/templates/rolebinding.yaml +++ /dev/null @@ -1,15 +0,0 @@ -{{ range $index, $e := until (.Values.oracle.replicas | int) }} -{{- $index_counter := (dict "Values" $.Values "index" $index) -}} -apiVersion: rbac.authorization.k8s.io/v1 -kind: RoleBinding -metadata: - name: {{ template "name" $index_counter }} -roleRef: - apiGroup: rbac.authorization.k8s.io - kind: Role - name: {{ template "name" $index_counter }} -subjects: -- kind: ServiceAccount - name: {{ template "name" $index_counter }} ---- -{{ end }} diff --git a/packages/helm-charts/oracle-rbac/templates/secret.yaml b/packages/helm-charts/oracle-rbac/templates/secret.yaml deleted file mode 100644 index b2bbda2cbad..00000000000 --- a/packages/helm-charts/oracle-rbac/templates/secret.yaml +++ /dev/null @@ -1,11 +0,0 @@ -{{ range $index, $e := until (.Values.oracle.replicas | int) }} -{{- $index_counter := (dict "Values" $.Values "index" $index) -}} -apiVersion: v1 -kind: Secret -type: kubernetes.io/service-account-token -metadata: - name: {{ template "secret-name" $index_counter }} - annotations: - kubernetes.io/service-account.name: {{ template "name" $index_counter }} ---- -{{ end }} diff --git a/packages/helm-charts/oracle-rbac/templates/service-account.yaml b/packages/helm-charts/oracle-rbac/templates/service-account.yaml deleted file mode 100644 index 004c0246758..00000000000 --- a/packages/helm-charts/oracle-rbac/templates/service-account.yaml +++ /dev/null @@ -1,8 +0,0 @@ -{{ range $index, $e := until (.Values.oracle.replicas | int) }} -{{- $index_counter := (dict "Values" $.Values "index" $index) -}} -apiVersion: v1 -kind: ServiceAccount -metadata: - name: {{ template 
"name" $index_counter}}
----
-{{ end }}
diff --git a/packages/helm-charts/oracle-rbac/values.yaml b/packages/helm-charts/oracle-rbac/values.yaml
deleted file mode 100644
index a7e48cd124c..00000000000
--- a/packages/helm-charts/oracle-rbac/values.yaml
+++ /dev/null
@@ -1,6 +0,0 @@
-environment:
-  name: default
-  currencyPair: CELOUSD
-
-oracle:
-  replicas: 1
diff --git a/packages/helm-charts/oracle/CELOBRL.yaml b/packages/helm-charts/oracle/CELOBRL.yaml
deleted file mode 100644
index 21d64348f50..00000000000
--- a/packages/helm-charts/oracle/CELOBRL.yaml
+++ /dev/null
@@ -1,53 +0,0 @@
-oracle:
-  currencyPair: CELOBRL
-  aggregation:
-    mid:
-      maxExchangeVolumeShare: 1
-      maxPercentageDeviation: 0.025
-      maxPercentageBidAskSpread: 0.015
-  metrics:
-    enabled: true
-    prometheusPort: 9090
-  apiRequestTimeoutMs: 5000
-  circuitBreakerPriceChangeThreshold: 0.25
-  gasPriceMultiplier: 1.5
-  priceSources: "[
-    [
-      { exchange: 'BINANCE', symbol: 'CELOBUSD', toInvert: false },
-      { exchange: 'BINANCE', symbol: 'BUSDBRL', toInvert: false }
-    ],
-    [
-      { exchange: 'BINANCE', symbol: 'CELOUSDT', toInvert: false },
-      { exchange: 'BINANCE', symbol: 'USDTBRL', toInvert: false }
-    ],
-    [
-      { exchange: 'BINANCE', symbol: 'CELOBTC', toInvert: false },
-      { exchange: 'MERCADO', symbol: 'BTCBRL', toInvert: false }
-    ],
-    [
-      { exchange: 'BINANCEUS', symbol: 'CELOUSD', toInvert: false },
-      { exchange: 'BITSO', symbol: 'USDBRL', toInvert: false }
-    ],
-    [
-      { exchange: 'COINBASE', symbol: 'CELOUSD', toInvert: false},
-      { exchange: 'BITSO', symbol: 'USDBRL', toInvert: false }
-    ],
-    [
-      { exchange: 'COINBASE', symbol: 'CELOBTC', toInvert: false },
-      { exchange: 'NOVADAX', symbol: 'BTCBRL', toInvert: false }
-    ],
-    [
-      { exchange: 'OKX', symbol: 'CELOUSDT', toInvert: false },
-      { exchange: 'OKX', symbol: 'BTCUSDT', toInvert: true },
-      { exchange: 'NOVADAX', symbol: 'BTCBRL', toInvert: false }
-    ],
-    [
-      {exchange: 'KUCOIN', symbol: 'CELOUSDT', toInvert: false},
-      { exchange: 'BITSO', symbol: 'USDTBRL', toInvert: false }
-    ]
-  ]"
-  minPriceSourceCount: 2
-  reportStrategy: BLOCK_BASED
-  reporter:
-    blockBased:
-      minReportPriceChangeThreshold: 0.005
diff --git a/packages/helm-charts/oracle/CELOEUR.yaml b/packages/helm-charts/oracle/CELOEUR.yaml
deleted file mode 100644
index ec0ac1b9e85..00000000000
--- a/packages/helm-charts/oracle/CELOEUR.yaml
+++ /dev/null
@@ -1,47 +0,0 @@
-oracle:
-  currencyPair: CELOEUR
-  aggregation:
-    mid:
-      maxExchangeVolumeShare: 1
-      maxPercentageDeviation: 0.025
-      maxPercentageBidAskSpread: 0.015
-  metrics:
-    enabled: true
-    prometheusPort: 9090
-  apiRequestTimeoutMs: 5000
-  circuitBreakerPriceChangeThreshold: 0.25
-  gasPriceMultiplier: 1.5
-  priceSources: "[
-    [
-      {exchange: 'COINBASE', symbol: 'CELOEUR', toInvert: false}],
-    [
-      {exchange: 'COINBASE', symbol: 'CELOUSD', toInvert: false},
-      {exchange: 'COINBASE', symbol: 'USDTUSD', toInvert: true},
-      {exchange: 'COINBASE', symbol: 'USDTEUR', toInvert: false}
-    ],
-    [
-      {exchange: 'COINBASE', symbol: 'CELOBTC', toInvert: false},
-      {exchange: 'COINBASE', symbol: 'BTCEUR', toInvert: false}
-    ],
-    [
-      {exchange: 'BINANCE', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'BINANCE', symbol: 'EURUSDT', toInvert: true}
-    ],
-    [
-      {exchange: 'BINANCE', symbol: 'CELOBTC', toInvert: false},
-      {exchange: 'BINANCE', symbol: 'BTCEUR', toInvert: false}
-    ],
-    [
-      {exchange: 'OKX', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'COINBASE', symbol: 'USDTEUR', toInvert: false}
-    ],
-    [
-      {exchange: 'KUCOIN', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'KRAKEN', symbol: 'USDTEUR', toInvert: false}
-    ],
-  ]"
-  minPriceSourceCount: 2
-  reportStrategy: BLOCK_BASED
-  reporter:
-    blockBased:
-      minReportPriceChangeThreshold: 0.005
diff --git a/packages/helm-charts/oracle/CELOKES.yaml b/packages/helm-charts/oracle/CELOKES.yaml
deleted file mode 100644
index d10c50f4359..00000000000
--- a/packages/helm-charts/oracle/CELOKES.yaml
+++ /dev/null
@@ -1,95 +0,0 @@
-oracle:
-  currencyPair: CELOKES
-  overrideOracleCount: 12 # At 5s block time, every client reports once per minute
-  aggregation:
-    mid:
-      maxExchangeVolumeShare: 1
-      maxPercentageDeviation: 0.01
-      maxPercentageBidAskSpread: 0.03
-  metrics:
-    enabled: true
-    prometheusPort: 9090
-  apiRequestTimeoutMs: 5000
-  circuitBreakerPriceChangeThreshold: 0.25
-  gasPriceMultiplier: 1.5
-  priceSources: "[
-    [
-      {exchange: 'BINANCE', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'KRAKEN', symbol: 'USDTUSD', toInvert: false},
-      {exchange: 'ALPHAVANTAGE', symbol: 'USDKES', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'BINANCE', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'KRAKEN', symbol: 'USDTUSD', toInvert: false},
-      {exchange: 'XIGNITE', symbol: 'USDKES', toInvert: false, ignoreVolume: true}
-    ],
-
-    [
-      {exchange: 'COINBASE', symbol: 'CELOUSD', toInvert: false},
-      {exchange: 'ALPHAVANTAGE', symbol: 'USDKES', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'COINBASE', symbol: 'CELOUSD', toInvert: false},
-      {exchange: 'XIGNITE', symbol: 'USDKES', toInvert: false, ignoreVolume: true}
-    ],
-
-    [
-      {exchange: 'OKX', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'BITSTAMP', symbol: 'USDTUSD', toInvert: false},
-      {exchange: 'ALPHAVANTAGE', symbol: 'USDKES', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'OKX', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'BITSTAMP', symbol: 'USDTUSD', toInvert: false},
-      {exchange: 'XIGNITE', symbol: 'USDKES', toInvert: false, ignoreVolume: true}
-    ],
-
-    [
-      {exchange: 'KUCOIN', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'COINBASE', symbol: 'USDTUSD', toInvert: false},
-      {exchange: 'ALPHAVANTAGE', symbol: 'USDKES', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'KUCOIN', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'COINBASE', symbol: 'USDTUSD', toInvert: false},
-      {exchange: 'XIGNITE', symbol: 'USDKES', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'BITMART', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'COINBASE', symbol: 'USDTUSD', toInvert: false},
-      {exchange: 'ALPHAVANTAGE', symbol: 'USDKES', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'BITMART', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'COINBASE', symbol: 'USDTUSD', toInvert: false},
-      {exchange: 'XIGNITE', symbol: 'USDKES', toInvert: false, ignoreVolume: true}
-    ],
-
-
-
-  ]"
-  # Additional sources missing adapters [
-  # {exchange: 'GATEIO', symbol: 'CELOUSDT', toInvert: false},
-  # {exchange: 'BITSTAMP', symbol: 'USDTUSD', toInvert: false},
-  # {exchange: 'ALPHAVANTAGE', symbol: 'USDKES', toInvert: false, ignoreVolume: true}
-  # ],
-  # [
-  # {exchange: 'GATEIO', symbol: 'CELOUSDT', toInvert: false},
-  # {exchange: 'BITSTAMP', symbol: 'USDTUSD', toInvert: false},
-  # {exchange: 'XIGNITE', symbol: 'USDKES', toInvert: false, ignoreVolume: true}
-  # ],
-  # {exchange: 'BYBIT', symbol: 'CELOUSDT', toInvert: false},
-  # {exchange: 'BITSTAMP', symbol: 'USDTUSD', toInvert: false},
-  # {exchange: 'ALPHAVANTAGE', symbol: 'USDKES', toInvert: false, ignoreVolume: true}
-  # ],
-  # [
-  # {exchange: 'BYBIT', symbol: 'CELOUSDT', toInvert: false},
-  # {exchange: 'BITSTAMP', symbol: 'USDTUSD', toInvert: false},
-  # {exchange: 'XIGNITE', symbol: 'USDKES', toInvert: false, ignoreVolume: true}
-  # ],
-
-  minPriceSourceCount: 7
-  reportStrategy: BLOCK_BASED
-  reporter:
-    blockBased:
-      minReportPriceChangeThreshold: 0.005
diff --git a/packages/helm-charts/oracle/CELOUSD.yaml b/packages/helm-charts/oracle/CELOUSD.yaml
deleted file mode 100644
index c57e8bb17f7..00000000000
--- a/packages/helm-charts/oracle/CELOUSD.yaml
+++ /dev/null
@@ -1,46 +0,0 @@
-oracle:
-  currencyPair: CELOUSD
-  aggregation:
-    mid:
-      maxExchangeVolumeShare: 1
-      maxPercentageDeviation: 0.025
-      maxPercentageBidAskSpread: 0.015
-  metrics:
-    enabled: true
-    prometheusPort: 9090
-  apiRequestTimeoutMs: 5000
-  circuitBreakerPriceChangeThreshold: 0.25
-  gasPriceMultiplier: 1.5
-  priceSources: "[
-    [
-      {exchange: 'COINBASE', symbol: 'CELOUSD', toInvert: false}
-    ],
-    [
-      {exchange: 'COINBASE', symbol: 'CELOBTC', toInvert: false},
-      {exchange: 'COINBASE', symbol: 'BTCUSD', toInvert: false}
-    ],
-    [
-      {exchange: 'BINANCE', symbol: 'CELOBUSD', toInvert: false},
-      {exchange: 'COINBASE', symbol: 'BUSDUSD', toInvert: false}
-    ],
-    [
-      {exchange: 'BINANCE', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'KRAKEN', symbol: 'USDTUSD', toInvert: false}
-    ],
-    [
-      {exchange: 'BINANCEUS', symbol: 'CELOUSD', toInvert: false }
-    ],
-    [
-      {exchange: 'OKX', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'COINBASE', symbol: 'USDTUSD', toInvert: false}
-    ],
-    [
-      {exchange: 'KUCOIN', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'KRAKEN', symbol: 'USDTUSD', toInvert: false}
-    ],
-  ]"
-  minPriceSourceCount: 2
-  reportStrategy: BLOCK_BASED
-  reporter:
-    blockBased:
-      minReportPriceChangeThreshold: 0.005
diff --git a/packages/helm-charts/oracle/CELOXOF.yaml b/packages/helm-charts/oracle/CELOXOF.yaml
deleted file mode 100644
index e719e990267..00000000000
--- a/packages/helm-charts/oracle/CELOXOF.yaml
+++ /dev/null
@@ -1,109 +0,0 @@
-oracle:
-  currencyPair: CELOXOF
-  overrideOracleCount: 12 # At 5s block time, every client reports once per minute
-  aggregation:
-    mid:
-      maxExchangeVolumeShare: 1
-      maxPercentageDeviation: 0.025
-      maxPercentageBidAskSpread: 0.015
-  metrics:
-    enabled: true
-    prometheusPort: 9090
-  apiRequestTimeoutMs: 5000
-  circuitBreakerPriceChangeThreshold: 0.25
-  gasPriceMultiplier: 1.5
-  priceSources: "[
-    [
-      {exchange: 'BINANCE', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'BINANCE', symbol: 'EURUSDT', toInvert: true},
-      {exchange: 'ALPHAVANTAGE', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'BINANCE', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'BINANCE', symbol: 'EURUSDT', toInvert: true},
-      {exchange: 'CURRENCYAPI', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'BINANCE', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'BINANCE', symbol: 'EURUSDT', toInvert: true},
-      {exchange: 'XIGNITE', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-
-    [
-      {exchange: 'COINBASE', symbol: 'CELOUSD', toInvert: false},
-      {exchange: 'COINBASE', symbol: 'USDTUSD', toInvert: true},
-      {exchange: 'COINBASE', symbol: 'USDTEUR', toInvert: false},
-      {exchange: 'ALPHAVANTAGE', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'COINBASE', symbol: 'CELOUSD', toInvert: false},
-      {exchange: 'COINBASE', symbol: 'USDTUSD', toInvert: true},
-      {exchange: 'COINBASE', symbol: 'USDTEUR', toInvert: false},
-      {exchange: 'CURRENCYAPI', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'COINBASE', symbol: 'CELOUSD', toInvert: false},
-      {exchange: 'COINBASE', symbol: 'USDTUSD', toInvert: true},
-      {exchange: 'COINBASE', symbol: 'USDTEUR', toInvert: false},
-      {exchange: 'XIGNITE', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-
-    [
-      {exchange: 'OKX', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'BITSTAMP', symbol: 'USDTEUR', toInvert: false},
-      {exchange: 'ALPHAVANTAGE', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'OKX', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'BITSTAMP', symbol: 'USDTEUR', toInvert: false},
-      {exchange: 'CURRENCYAPI', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'OKX', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'BITSTAMP', symbol: 'USDTEUR', toInvert: false},
-      {exchange: 'XIGNITE', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-
-    [
-      {exchange: 'KUCOIN', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'KRAKEN', symbol: 'USDTEUR', toInvert: false},
-      {exchange: 'ALPHAVANTAGE', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'KUCOIN', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'KRAKEN', symbol: 'USDTEUR', toInvert: false},
-      {exchange: 'CURRENCYAPI', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'KUCOIN', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'KRAKEN', symbol: 'USDTEUR', toInvert: false},
-      {exchange: 'XIGNITE', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-
-    [
-      {exchange: 'BITGET', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'BITGET', symbol: 'USDTEUR', toInvert: false},
-      {exchange: 'ALPHAVANTAGE', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'BITGET', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'BITGET', symbol: 'USDTEUR', toInvert: false},
-      {exchange: 'CURRENCYAPI', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'BITGET', symbol: 'CELOUSDT', toInvert: false},
-      {exchange: 'BITGET', symbol: 'USDTEUR', toInvert: false},
-      {exchange: 'XIGNITE', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ]
-  ]"
-  # Additional sources missing adapters [
-  # {exchange: 'UPBIT', symbol: 'CELOKRW', toInvert: false},
-  # {exchange: 'UPBIT', symbol: 'BTCKRW', toInvert: true},
-  # {exchange: 'KRAKEN', symbol: 'BTCEUR', toInvert: false},
-  # {exchange: 'ALPHAVANTAGE', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-  # ]
-  minPriceSourceCount: 9
-  reportStrategy: BLOCK_BASED
-  reporter:
-    blockBased:
-      minReportPriceChangeThreshold: 0.005
diff --git a/packages/helm-charts/oracle/COPUSD.yaml b/packages/helm-charts/oracle/COPUSD.yaml
deleted file mode 100644
index 76cf08c8c86..00000000000
--- a/packages/helm-charts/oracle/COPUSD.yaml
+++ /dev/null
@@ -1,27 +0,0 @@
-oracle:
-  currencyPair: COPUSD
-  overrideOracleCount: 12 # At 5s block time, every client reports once per minute
-  aggregation:
-    mid:
-      maxExchangeVolumeShare: 1
-      maxPercentageDeviation: 0.01
-      maxPercentageBidAskSpread: 0.03
-  metrics:
-    enabled: true
-    prometheusPort: 9090
-  apiRequestTimeoutMs: 5000
-  circuitBreakerPriceChangeThreshold: 0.25
-  gasPriceMultiplier: 1.5
-  priceSources: "[
-    [
-      {exchange: 'ALPHAVANTAGE', symbol: 'USDCOP', toInvert: true}
-    ],
-    [
-      {exchange: 'XIGNITE', symbol: 'COPUSD', toInvert: false}
-    ]
-  ]"
-  minPriceSourceCount: 2
-  reportStrategy: BLOCK_BASED
-  reporter:
-    blockBased:
-      minReportPriceChangeThreshold: 0.0005 # 0.05%
diff --git a/packages/helm-charts/oracle/Chart.yaml b/packages/helm-charts/oracle/Chart.yaml
deleted file mode 100644
index 01ca063f2e0..00000000000
--- a/packages/helm-charts/oracle/Chart.yaml
+++ /dev/null
@@ -1,9 +0,0 @@
-apiVersion: v1
-appVersion: '1.0'
-description: A Helm chart for the oracle client
-name: oracle
-version: 0.2.1
-dependencies:
-  - name: common
-    repository: oci://us-west1-docker.pkg.dev/devopsre/clabs-public-oci
-    version: 0.2.0
diff --git a/packages/helm-charts/oracle/EUROCEUR.yaml b/packages/helm-charts/oracle/EUROCEUR.yaml
deleted file mode 100644
index 7ead0c93ec6..00000000000
--- a/packages/helm-charts/oracle/EUROCEUR.yaml
+++ /dev/null
@@ -1,39 +0,0 @@
-oracle:
-  currencyPair: EUROCEUR
-  aggregation:
-    mid:
-      maxExchangeVolumeShare: 1
-      maxPercentageDeviation: 0.01
-      maxPercentageBidAskSpread: 0.0125
-  metrics:
-    enabled: true
-    prometheusPort: 9090
-  apiRequestTimeoutMs: 5000
-  circuitBreakerPriceChangeThreshold: 0.25
-  gasPriceMultiplier: 1.5
-  priceSources: "[
-    [
-      {exchange: 'COINBASE', symbol: 'EUROCEUR', toInvert: false}
-    ],
-    [
-      {exchange: 'COINBASE', symbol: 'EUROCUSD', toInvert: false},
-      {exchange: 'COINBASE', symbol: 'USDTUSD', toInvert: true},
-      {exchange: 'COINBASE', symbol: 'USDTEUR', toInvert: false}
-    ],
-    [
-      {exchange: 'BITSTAMP', symbol: 'EUROCEUR', toInvert: false},
-    ],
-    [
-      {exchange: 'BITMART', symbol: 'EUROCUSDT', toInvert: false},
-      {exchange: 'KRAKEN', symbol: 'USDTEUR', toInvert: false}
-    ],
-    [
-      {exchange: 'BITMART', symbol: 'EUROCUSDC', toInvert: false},
-      {exchange: 'KRAKEN', symbol: 'USDCEUR', toInvert: false}
-    ]
-  ]"
-  minPriceSourceCount: 2
-  reportStrategy: BLOCK_BASED
-  reporter:
-    blockBased:
-      minReportPriceChangeThreshold: 0.0005 # 0.05%
diff --git a/packages/helm-charts/oracle/EUROCXOF.yaml b/packages/helm-charts/oracle/EUROCXOF.yaml
deleted file mode 100644
index 9d5483bd9ee..00000000000
--- a/packages/helm-charts/oracle/EUROCXOF.yaml
+++ /dev/null
@@ -1,97 +0,0 @@
-oracle:
-  currencyPair: EUROCXOF
-  overrideOracleCount: 12 # At 5s block time, every client reports once per minute
-  aggregation:
-    mid:
-      maxExchangeVolumeShare: 1
-      maxPercentageDeviation: 0.01
-      maxPercentageBidAskSpread: 0.0075
-  metrics:
-    enabled: true
-    prometheusPort: 9090
-  apiRequestTimeoutMs: 5000
-  circuitBreakerPriceChangeThreshold: 0.25
-  gasPriceMultiplier: 1.5
-  priceSources: "[
-    [
-      {exchange: 'COINBASE', symbol: 'EUROCEUR', toInvert: false},
-      {exchange: 'ALPHAVANTAGE', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'COINBASE', symbol: 'EUROCEUR', toInvert: false},
-      {exchange: 'CURRENCYAPI', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'COINBASE', symbol: 'EUROCEUR', toInvert: false},
-      {exchange: 'XIGNITE', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-
-    [
-      {exchange: 'COINBASE', symbol: 'EUROCUSD', toInvert: false},
-      {exchange: 'COINBASE', symbol: 'USDTUSD', toInvert: true},
-      {exchange: 'COINBASE', symbol: 'USDTEUR', toInvert: false},
-      {exchange: 'ALPHAVANTAGE', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'COINBASE', symbol: 'EUROCUSD', toInvert: false},
-      {exchange: 'COINBASE', symbol: 'USDTUSD', toInvert: true},
-      {exchange: 'COINBASE', symbol: 'USDTEUR', toInvert: false},
-      {exchange: 'CURRENCYAPI', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'COINBASE', symbol: 'EUROCUSD', toInvert: false},
-      {exchange: 'COINBASE', symbol: 'USDTUSD', toInvert: true},
-      {exchange: 'COINBASE', symbol: 'USDTEUR', toInvert: false},
-      {exchange: 'XIGNITE', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-
-    [
-      {exchange: 'BITSTAMP', symbol: 'EUROCEUR', toInvert: false},
-      {exchange: 'ALPHAVANTAGE', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'BITSTAMP', symbol: 'EUROCEUR', toInvert: false},
-      {exchange: 'CURRENCYAPI', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'BITSTAMP', symbol: 'EUROCEUR', toInvert: false},
-      {exchange: 'XIGNITE', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-
-    [
-      {exchange: 'BITMART', symbol: 'EUROCUSDT', toInvert: false},
-      {exchange: 'KRAKEN', symbol: 'USDTEUR', toInvert: false},
-      {exchange: 'ALPHAVANTAGE', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'BITMART', symbol: 'EUROCUSDT', toInvert: false},
-      {exchange: 'KRAKEN', symbol: 'USDTEUR', toInvert: false},
-      {exchange: 'CURRENCYAPI', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'BITMART', symbol: 'EUROCUSDT', toInvert: false},
-      {exchange: 'KRAKEN', symbol: 'USDTEUR', toInvert: false},
-      {exchange: 'XIGNITE', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-
-    [
-      {exchange: 'BITMART', symbol: 'EUROCUSDC', toInvert: false},
-      {exchange: 'KRAKEN', symbol: 'USDCEUR', toInvert: false},
-      {exchange: 'ALPHAVANTAGE', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'BITMART', symbol: 'EUROCUSDC', toInvert: false},
-      {exchange: 'KRAKEN', symbol: 'USDCEUR', toInvert: false},
-      {exchange: 'CURRENCYAPI', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ],
-    [
-      {exchange: 'BITMART', symbol: 'EUROCUSDC', toInvert: false},
-      {exchange: 'KRAKEN', symbol: 'USDCEUR', toInvert: false},
-      {exchange: 'XIGNITE', symbol: 'EURXOF', toInvert: false, ignoreVolume: true}
-    ]
-  ]"
-  minPriceSourceCount: 6
-  reportStrategy: BLOCK_BASED
-  reporter:
-    blockBased:
-      minReportPriceChangeThreshold: 0.0005 # 0.05%
diff --git a/packages/helm-charts/oracle/EURXOF.yaml b/packages/helm-charts/oracle/EURXOF.yaml
deleted file mode 100644
index 6a4e9367f40..00000000000
--- a/packages/helm-charts/oracle/EURXOF.yaml
+++ /dev/null
@@ -1,30 +0,0 @@
-oracle:
-  currencyPair: EURXOF
-  overrideOracleCount: 12 # At 5s block time, every client reports once per minute
-  aggregation:
-    mid:
-      maxExchangeVolumeShare: 1
-      maxPercentageDeviation: 0.01
-      maxPercentageBidAskSpread: 0.005
-  metrics:
-    enabled: true
-    prometheusPort: 9090
-  apiRequestTimeoutMs: 5000
-  circuitBreakerPriceChangeThreshold: 0.25
-  gasPriceMultiplier: 1.5
-  priceSources: "[
-    [
-      {exchange: 'ALPHAVANTAGE', symbol: 'EURXOF', toInvert: false}
-    ],
-    [
-      {exchange: 'CURRENCYAPI', symbol: 'EURXOF', toInvert: false}
-    ],
-    [
-      {exchange: 'XIGNITE', symbol: 'EURXOF', toInvert: false}
-    ]
-  ]"
-  minPriceSourceCount: 3
-  reportStrategy: BLOCK_BASED
-  reporter:
-    blockBased:
-      minReportPriceChangeThreshold: 0.0005 # 0.05%
diff --git a/packages/helm-charts/oracle/KESUSD.yaml b/packages/helm-charts/oracle/KESUSD.yaml
deleted file mode 100644
index 731c76316be..00000000000
--- a/packages/helm-charts/oracle/KESUSD.yaml
+++ /dev/null
@@ -1,31 +0,0 @@
-oracle:
-  currencyPair: KESUSD
-  overrideOracleCount: 12 # At 5s block time, every client reports once per minute
-  aggregation:
-    mid:
-      maxExchangeVolumeShare: 1
-      maxPercentageDeviation: 0.01
-      maxPercentageBidAskSpread: 0.03
-  metrics:
-    enabled: true
-    prometheusPort: 9090
-  apiRequestTimeoutMs: 5000
-  circuitBreakerPriceChangeThreshold: 0.25
-  gasPriceMultiplier: 1.5
-  priceSources: "[
-    [
-      {exchange: 'ALPHAVANTAGE', symbol: 'KESUSD', toInvert: false}
-    ],
-    [
-      {exchange: 'XIGNITE', symbol: 'KESUSD', toInvert: false}
-    ]
-  ]"
-  # Additional sources missing adapters [
-  # [
-  # {exchange: 'OPENEXCHANGERATES', symbol: 'KESUSD', toInvert: false}
-  # ],
-  minPriceSourceCount: 2
-  reportStrategy: BLOCK_BASED
-  reporter:
-    blockBased:
-      minReportPriceChangeThreshold: 0.0005 # 0.05%
diff --git a/packages/helm-charts/oracle/USDCBRL.yaml b/packages/helm-charts/oracle/USDCBRL.yaml
deleted file mode 100644
index 94080e764f5..00000000000
--- a/packages/helm-charts/oracle/USDCBRL.yaml
+++ /dev/null
@@ -1,47 +0,0 @@
-oracle:
-  currencyPair: USDCBRL
-  aggregation:
-    mid:
-      maxExchangeVolumeShare: 1
-      maxPercentageDeviation: 0.01
-      maxPercentageBidAskSpread: 0.005
-  metrics:
-    enabled: true
-    prometheusPort: 9090
-  apiRequestTimeoutMs: 5000
-  circuitBreakerPriceChangeThreshold: 0.25
-  gasPriceMultiplier: 1.5
-  priceSources: "[
-    [
-      { exchange: 'MERCADO', symbol: 'USDCBRL', toInvert: false }
-    ],
-    [
-      {exchange: 'KRAKEN', symbol: 'USDCUSD', toInvert: false},
-      {exchange: 'BITSO', symbol: 'USDBRL', toInvert: false }
-    ],
-    [
-      {exchange: 'COINBASE', symbol: 'USDTUSDC', toInvert: true},
-      {exchange: 'BINANCE', symbol: 'USDTBRL', toInvert: false }
-    ],
-    [
-      {exchange: 'WHITEBIT', symbol: 'USDCUSDT', toInvert: false},
-      {exchange: 'BINANCE', symbol: 'USDTBRL', toInvert: false }
-    ],
-    [
-      {exchange: 'KRAKEN', symbol: 'BTCUSDC', toInvert: true},
-      {exchange: 'BINANCE', symbol: 'BTCBRL', toInvert: false }
-    ],
-    [
-      {exchange: 'COINBASE', symbol: 'USDTUSDC', toInvert: true},
-      {exchange: 'BITGET', symbol: 'USDTBRL', toInvert: false }
-    ],
-    [
-      {exchange: 'WHITEBIT', symbol: 'USDCUSDT', toInvert: false},
-      {exchange: 'BITGET', symbol: 'USDTBRL', toInvert: false }
-    ],
-  ]"
-  minPriceSourceCount: 2
-  reportStrategy: BLOCK_BASED
-  reporter:
-    blockBased:
-      minReportPriceChangeThreshold: 0.0005 # 0.05%
diff --git a/packages/helm-charts/oracle/USDCEUR.yaml b/packages/helm-charts/oracle/USDCEUR.yaml
deleted file mode 100644
index 112008c9edb..00000000000
--- a/packages/helm-charts/oracle/USDCEUR.yaml
+++ /dev/null
@@ -1,35 +0,0 @@
-oracle:
-  currencyPair: USDCEUR
-  aggregation:
-    mid:
-      maxExchangeVolumeShare: 1
-      maxPercentageDeviation: 0.01
-      maxPercentageBidAskSpread: 0.005
-  metrics:
-    enabled: true
-    prometheusPort: 9090
-  apiRequestTimeoutMs: 5000
-  circuitBreakerPriceChangeThreshold: 0.25
-  gasPriceMultiplier: 1.5
-  priceSources: "[
-    [
-      {exchange: 'KRAKEN', symbol: 'USDCEUR', toInvert: false}
-    ],
-    [
-      {exchange: 'COINBASE', symbol: 'USDTUSDC', toInvert: true},
-      {exchange: 'COINBASE', symbol: 'USDTEUR', toInvert: false}
-    ],
-    [
-      {exchange: 'WHITEBIT', symbol: 'USDCUSDT', toInvert: false},
-      {exchange: 'COINBASE', symbol: 'USDTEUR', toInvert: false}
-    ],
-    [
-      {exchange: 'KRAKEN', symbol: 'USDCUSD', toInvert: false},
-      {exchange: 'KRAKEN', symbol: 'EURUSD', toInvert: true}
-    ],
-  ]"
-  minPriceSourceCount: 2
-  reportStrategy: BLOCK_BASED
-  reporter:
-    blockBased:
-      minReportPriceChangeThreshold: 0.0005 # 0.05%
diff --git a/packages/helm-charts/oracle/USDCUSD.yaml b/packages/helm-charts/oracle/USDCUSD.yaml
deleted file mode 100644
index 929976120ee..00000000000
--- a/packages/helm-charts/oracle/USDCUSD.yaml
+++ /dev/null
@@ -1,51 +0,0 @@
-oracle:
-  currencyPair: USDCUSD
-  aggregation:
-    mid:
-      maxExchangeVolumeShare: 1
-      maxPercentageDeviation: 0.005
-      maxPercentageBidAskSpread: 0.005
-  metrics:
-    enabled: true
-    prometheusPort: 9090
-  apiRequestTimeoutMs: 5000
-  circuitBreakerPriceChangeThreshold: 0.25
-  gasPriceMultiplier: 1.5
-  priceSources: "[
-    [
-      {exchange: 'BINANCE', symbol: 'USDCUSDT', toInvert: false},
-      {exchange: 'KRAKEN', symbol: 'USDTUSD', toInvert: false }
-    ],
-    [
-      {exchange: 'KRAKEN', symbol: 'USDCUSD', toInvert: false}
-    ],
-    [
-      {exchange: 'BITSTAMP', symbol: 'USDCUSD', toInvert: false}
-    ],
-    [
-      {exchange: 'COINBASE', symbol: 'USDTUSDC', toInvert: true},
-      {exchange: 'COINBASE', symbol: 'USDTUSD', toInvert: false}
-    ],
-    [
-      {exchange: 'OKX', symbol: 'USDCUSDT', toInvert: false},
-      {exchange: 'BITSTAMP', symbol: 'USDTUSD', toInvert: false}
-    ],
-    [
-      {exchange: 'BITGET', symbol: 'USDCUSDT', toInvert: false},
-      {exchange: 'KRAKEN', symbol: 'USDTUSD', toInvert: false}
-    ],
-    [
-      {exchange: 'KUCOIN', symbol: 'USDCUSDT', toInvert: false},
-      {exchange: 'COINBASE', symbol: 'USDTUSD', toInvert: false}
-    ]
-  ]"
-  # Additional sources missing adapters
-  # [
-  # {exchange: 'Bybit', symbol: 'USDTUSDC', toInvert: true},
-  # {exchange: 'Kraken', symbol: 'USDTUSD', toInvert: false}
-  # ],
-  minPriceSourceCount: 5
-  reportStrategy: BLOCK_BASED
-  reporter:
-    blockBased:
-      minReportPriceChangeThreshold: 0.0005 # 0.05%
diff --git a/packages/helm-charts/oracle/USDTUSD.yaml b/packages/helm-charts/oracle/USDTUSD.yaml
deleted file mode 100644
index 1a979a67e7d..00000000000
--- a/packages/helm-charts/oracle/USDTUSD.yaml
+++ /dev/null
@@ -1,44 +0,0 @@
-oracle:
-  currencyPair: USDTUSD
-  aggregation:
-    mid:
-      maxExchangeVolumeShare: 1
-      maxPercentageDeviation: 0.005
-      maxPercentageBidAskSpread: 0.005
-  metrics:
-    enabled: true
-    prometheusPort: 9090
-  apiRequestTimeoutMs: 5000
-  circuitBreakerPriceChangeThreshold: 0.25
-  gasPriceMultiplier: 1.5
-  priceSources: "[
-    [
-      {exchange: 'OKX', symbol: 'USDCUSDT', toInvert: true},
-      {exchange: 'KRAKEN', symbol: 'USDCUSD', toInvert: false }
-    ],
-    [
-      {exchange: 'KRAKEN', symbol: 'USDTUSD', toInvert: false}
-    ],
-    [
-      {exchange: 'BITSTAMP', symbol: 'USDTUSD', toInvert: false}
-    ],
-    [
-      {exchange: 'COINBASE', symbol: 'USDTUSD', toInvert: false}
-    ],
-  ]"
-  # Additional sources missing adapters
-  # [
-  # {exchange: 'BYBIT', symbol: 'USDCUSDT', toInvert: true},
-  # {exchange: 'BITSTAMP', symbol: 'USDCUSD', toInvert: false }
-  # ],
-  #
-  # https://api.bybit.com/v5/market/tickers?category=spot&symbol=USDCUSDT
-  # [
-  # {exchange: 'CRYPTO', symbol: 'USDTUSD', toInvert: false}
-  # ],
-  # https://api.crypto.com/exchange/v1/public/get-tickers?instrument_name=USDT_USD
-  minPriceSourceCount: 3 # 4 with additional sources
-  reportStrategy: BLOCK_BASED
-  reporter:
-    blockBased:
-      minReportPriceChangeThreshold: 0.0005 # 0.05%
diff --git a/packages/helm-charts/oracle/templates/_helper.tpl b/packages/helm-charts/oracle/templates/_helper.tpl
deleted file mode 100644
index ece6db7dbb1..00000000000
--- a/packages/helm-charts/oracle/templates/_helper.tpl
+++ /dev/null
@@ -1,55 +0,0 @@
-{{/*
-The name of the deployment
-*/}}
-{{- define "name" -}}
-{{- .Values.environment.name -}}-{{- .Values.oracle.currencyPair | lower -}}-oracle
-{{- end -}}
-
-{{/*
-Common labels that are recommended to be used by Helm and Kubernetes
-*/}}
-{{- define "labels" -}}
-app.kubernetes.io/name: {{ template "name" . }}
-helm.sh/chart: {{ .Chart.Name }}-{{ .Chart.Version | replace "+" "_" }}
-app.kubernetes.io/managed-by: {{ .Release.Service }}
-app.kubernetes.io/instance: {{ .Release.Name }}
-{{- end -}}
-
-{{/*
-Annotations to indicate to the prometheus server that this node should be scraped for metrics
-*/}}
-{{- define "metric-annotations" -}}
-prometheus.io/scrape: "true"
-prometheus.io/port: "{{ .Values.oracle.metrics.prometheusPort }}"
-{{- end -}}
-
-{{/*
-Label specific to the oracle client component
-*/}}
-{{- define "oracle-client-component-label" -}}
-app.kubernetes.io/component: oracle-client
-{{- end -}}
-
-{{/*
-The name of the azure identity binding for all oracles
-*/}}
-{{- define "azure-identity-binding-name" -}}
-{{- with .dot -}}{{ template "name" . }}{{- end -}}-{{ .index }}-identity-binding
-{{- end -}}
-
-{{/*
-The name of the azure identity for all oracles
-*/}}
-{{- define "azure-identity-name" -}}
-{{- with .dot -}}{{ template "name" . }}{{- end -}}-{{ .index }}-identity
-{{- end -}}
-
-{{/*
-The name of the pkey secret
-*/}}
-{{- define "pkey-secret-name" -}}
-pkey-secret-{{- .Values.oracle.currencyPair | lower -}}
-{{- end -}}
-{{- define "api-keys-secret-name" -}}
-api-keys-{{- .Values.oracle.currencyPair | lower -}}
-{{- end -}}
diff --git a/packages/helm-charts/oracle/templates/api-keys-secret.yaml b/packages/helm-charts/oracle/templates/api-keys-secret.yaml
deleted file mode 100644
index db2d40d782a..00000000000
--- a/packages/helm-charts/oracle/templates/api-keys-secret.yaml
+++ /dev/null
@@ -1,9 +0,0 @@
-apiVersion: v1
-kind: Secret
-metadata:
-  name: {{ template "api-keys-secret-name" . }}
-  labels:
-{{ include "labels" . | indent 4 }}
-type: Opaque
-data:
-  api_keys: {{ .Values.oracle.api_keys | b64enc }}
diff --git a/packages/helm-charts/oracle/templates/azure-identity-binding.yaml b/packages/helm-charts/oracle/templates/azure-identity-binding.yaml
deleted file mode 100644
index 14d73a9da3f..00000000000
--- a/packages/helm-charts/oracle/templates/azure-identity-binding.yaml
+++ /dev/null
@@ -1,12 +0,0 @@
-{{- range $index, $identity := .Values.oracle.identities -}}
-{{ if (hasKey $identity "azure") }}
-apiVersion: "aadpodidentity.k8s.io/v1"
-kind: AzureIdentityBinding
-metadata:
-  name: {{ template "azure-identity-binding-name" (dict "dot" $ "index" $index) }}
-spec:
-  azureIdentity: {{ template "azure-identity-name" (dict "dot" $ "index" $index) }}
-  selector: {{ template "azure-identity-binding-name" (dict "dot" $ "index" $index) }}
----
-{{ end }}
-{{ end }}
diff --git a/packages/helm-charts/oracle/templates/azure-identity.yaml b/packages/helm-charts/oracle/templates/azure-identity.yaml
deleted file mode 100644
index 1ed705377b7..00000000000
--- a/packages/helm-charts/oracle/templates/azure-identity.yaml
+++ /dev/null
@@ -1,15 +0,0 @@
-{{- range $index, $identity := .Values.oracle.identities -}}
-{{ if (hasKey $identity "azure") }}
-apiVersion: aadpodidentity.k8s.io/v1
-kind: AzureIdentity
-metadata:
-  name: {{ template "azure-identity-name" (dict "dot" $ "index" $index) }}
-  annotations:
-    aadpodidentity.k8s.io/Behavior: namespaced
-spec:
-  type: 0
-  resourceID: {{ $identity.azure.id }}
-  clientID: {{ $identity.azure.clientId }}
----
-{{ end }}
-{{ end }}
diff --git a/packages/helm-charts/oracle/templates/pkey-secret.yaml b/packages/helm-charts/oracle/templates/pkey-secret.yaml
deleted file mode 100644
index 6436ee7bbac..00000000000
--- a/packages/helm-charts/oracle/templates/pkey-secret.yaml
+++ /dev/null
@@ -1,13 +0,0 @@
-apiVersion: v1
-kind: Secret
-metadata:
-  name: {{ template "pkey-secret-name" . }}
-  labels:
-{{ include "labels" . | indent 4 }}
-type: Opaque
-data:
-{{ range $index, $identity := .Values.oracle.identities }}
-{{ if (hasKey $identity "privateKey") }}
-  private-key-{{ $index }}: {{ $identity.privateKey | b64enc }}
-{{ end }}
-{{ end }}
diff --git a/packages/helm-charts/oracle/templates/statefulset.yaml b/packages/helm-charts/oracle/templates/statefulset.yaml
deleted file mode 100644
index dbc43037927..00000000000
--- a/packages/helm-charts/oracle/templates/statefulset.yaml
+++ /dev/null
@@ -1,163 +0,0 @@
-apiVersion: v1
-kind: Service
-metadata:
-  name: {{ template "name" . }}
-  labels:
-{{ include "labels" . | indent 4 }}
-{{ include "oracle-client-component-label" . | indent 4 }}
-spec:
-  clusterIP: None
-  selector:
-{{ include "oracle-client-component-label" . | indent 4 }}
----
-apiVersion: apps/v1
-kind: StatefulSet
-metadata:
-  name: {{ template "name" . }}
-  labels:
-{{ include "labels" . | indent 4 }}
-{{ include "oracle-client-component-label" . | indent 4 }}
-spec:
-  podManagementPolicy: Parallel
-  updateStrategy:
-    type: RollingUpdate
-  replicas: {{ .Values.oracle.replicas }}
-  serviceName: oracle
-  selector:
-    matchLabels:
-{{ include "labels" . | indent 6 }}
-{{ include "oracle-client-component-label" . | indent 6 }}
-  template:
-    metadata:
-      labels:
-{{ include "labels" . | indent 8 }}
-{{ include "oracle-client-component-label" . | indent 8 }}
-      annotations:
-{{ if .Values.oracle.metrics.enabled }}
-{{ include "metric-annotations" . | indent 8 }}
-{{ end }}
-    spec:
-{{ if .Values.kube.serviceAccountSecretNames }}
-      initContainers:
-      - name: set-metadata
-        image: {{ .Values.kubectl.image.repository }}:{{ .Values.kubectl.image.tag }}
-        command:
-        - /bin/bash
-        - -c
-        args:
-        - |
-          RID=${POD_NAME##*-}
-          TOKEN_ENV_VAR_NAME="TOKEN_$RID"
-          {{ if (eq .Values.kube.cloudProvider "azure") }}
-          # Azure
-          CMD="label pod $POD_NAME aadpodidbinding=$POD_NAME-identity-binding"
-          {{ else }}
-          # AWS
-          ROLE_ARNS={{- range $index, $identity := .Values.oracle.identities -}}{{ $identity.aws.roleArn }},{{- end }}
-          ROLE_ARN=`echo -n $ROLE_ARNS | cut -d ',' -f $((RID + 1))`
-          CMD="annotate pod $POD_NAME iam.amazonaws.com/role=$ROLE_ARN"
-          {{ end }}
-
-          kubectl \
-            --namespace "$POD_NAMESPACE" \
-            --server="https://kubernetes.default.svc" \
-            --token="${!TOKEN_ENV_VAR_NAME}" \
-            --certificate-authority="/var/run/secrets/kubernetes.io/serviceaccount/ca.crt" \
-            --overwrite=true \
-            $CMD
-        env:
-        - name: POD_NAMESPACE
-          valueFrom:
-            fieldRef:
-              fieldPath: metadata.namespace
-        - name: POD_NAME
-          valueFrom:
-            fieldRef:
-              fieldPath: metadata.name
-        {{ range $index, $e := .Values.kube.serviceAccountSecretNames }}
-        - name: TOKEN_{{ $index }}
-          valueFrom:
-            secretKeyRef:
-              key: token
-              name: {{ $e }}
-        {{ end }}
-{{ end }}
-      containers:
-      - name: oracle-client
-        image: {{ .Values.image.repository }}:{{ .Values.image.tag }}
-        imagePullPolicy: Always
-        ports:
-        - name: prometheus
-          containerPort: {{ .Values.oracle.metrics.prometheusPort }}
-        command:
-        - bash
-        - "-c"
-        - |
-          [[ $REPLICA_NAME =~ -([0-9]+)$ ]] || exit 1
-          RID=${BASH_REMATCH[1]}
-
-          # Set the private key path. If Azure HSM signing is specified,
-          # it will take precedence.
-          export PRIVATE_KEY_PATH="/private-keys/private-key-$RID"
-
-          # Get the correct key vault name. If this oracle's identity is not
-          # using Azure HSM signing, the key vault name will be empty and ignored
-          AZURE_KEY_VAULT_NAMES={{- range $index, $identity := .Values.oracle.identities -}}{{- if (hasKey $identity "azure" ) -}}{{ $identity.azure.keyVaultName | default "" }}{{- end }},{{- end }}
-          export AZURE_KEY_VAULT_NAME=`echo -n $AZURE_KEY_VAULT_NAMES | cut -d ',' -f $((RID + 1))`
-
-          # Get the correct oracle account address
-          ADDRESSES={{- range $index, $identity := .Values.oracle.identities -}}{{ $identity.address }},{{- end }}
-          export ADDRESS=`echo -n $ADDRESSES | cut -d ',' -f $((RID + 1))`
-
-          exec pnpm start
        env:
-        - name: REPLICA_NAME
-          valueFrom:
-            fieldRef:
-              fieldPath: metadata.name
-        - name: API_KEYS
-          valueFrom:
-            secretKeyRef:
-              key: api_keys
-              name: {{ template "api-keys-secret-name" . }}
-{{ include "common.env-var" (dict "name" "API_REQUEST_TIMEOUT" "dict" .Values.oracle "value_name" "apiRequestTimeoutMs" "optional" true) | indent 8 }}
-{{ include "common.env-var" (dict "name" "AZURE_HSM_INIT_TRY_COUNT" "dict" .Values.oracle.azureHsm "value_name" "initTryCount") | indent 8 }}
-{{ include "common.env-var" (dict "name" "AZURE_HSM_INIT_MAX_RETRY_BACKOFF_MS" "dict" .Values.oracle.azureHsm "value_name" "initMaxRetryBackoffMs") | indent 8 }}
-{{ include "common.env-var" (dict "name" "CIRCUIT_BREAKER_PRICE_CHANGE_THRESHOLD" "dict" .Values.oracle "value_name" "circuitBreakerPriceChangeThreshold") | indent 8 }}
-{{ include "common.env-var" (dict "name" "CURRENCY_PAIR" "dict" .Values.oracle "value_name" "currencyPair") | indent 8 }}
-{{ include "common.env-var" (dict "name" "MINIMUM_PRICE_SOURCES" "dict" .Values.oracle "value_name" "minPriceSourceCount") | indent 8 }}
-{{ include "common.env-var" (dict "name" "PRICE_SOURCES" "dict" .Values.oracle "value_name" "priceSources") | indent 8 }}
-{{ include "common.env-var" (dict "name" "GAS_PRICE_MULTIPLIER" "dict" .Values.oracle "value_name" "gasPriceMultiplier") | indent 8 }}
-{{ include "common.env-var" (dict "name" "HTTP_RPC_PROVIDER_URL" "dict" .Values.oracle.rpcProviderUrls "value_name" "http") | indent 8 }}
-{{ include "common.env-var" (dict "name" "MAX_BLOCK_TIMESTAMP_AGE_MS" "dict" .Values.oracle "value_name" "maxBlockTimestampAgeMs" "optional" true) | indent 8 }}
-{{ include "common.env-var" (dict "name" "METRICS" "dict" .Values.oracle.metrics "value_name" "enabled") | indent 8 }}
-{{ include "common.env-var" (dict "name" "MID_AGGREGATION_MAX_PERCENTAGE_DEVIATION" "dict" .Values.oracle.aggregation.mid "value_name" "maxPercentageDeviation" "optional" true) | indent 8 }}
-{{ include "common.env-var" (dict "name" "MID_AGGREGATION_MAX_EXCHANGE_VOLUME_SHARE" "dict" .Values.oracle.aggregation.mid "value_name" "maxExchangeVolumeShare" "optional" true) | indent 8 }}
-{{ include "common.env-var" (dict "name" "MID_AGGREGATION_MAX_PERCENTAGE_BID_ASK_SPREAD" "dict" .Values.oracle.aggregation.mid "value_name" "maxPercentageBidAskSpread" "optional" true) | indent 8 }}
-{{ include "common.env-var" (dict "name" "MIN_REPORT_PRICE_CHANGE_THRESHOLD" "dict" .Values.oracle.reporter.blockBased "value_name" "minReportPriceChangeThreshold" "optional" true) | indent 8 }}
-{{ include "common.env-var" (dict "name" "OVERRIDE_INDEX" "dict" .Values.oracle "value_name" "overrideIndex" "optional" true) | indent 8 }}
-{{ include "common.env-var" (dict "name" "OVERRIDE_ORACLE_COUNT" "dict" .Values.oracle "value_name" "overrideOracleCount" "optional" true) | indent 8 }}
-{{ include "common.env-var" (dict "name" "PRIVATE_KEY_PATH" "dict" .Values.oracle "value_name" "privateKeyPath" "optional" true) | indent 8 }}
-{{ include "common.env-var" (dict "name" "PROMETHEUS_PORT" "dict" .Values.oracle.metrics "value_name" "prometheusPort") | indent 8 }}
-{{ include "common.env-var" (dict "name" "REPORT_STRATEGY" "dict" .Values.oracle "value_name" "reportStrategy") | indent 8 }}
-{{ include "common.env-var" (dict "name" "REPORT_TARGET_OVERRIDE" "dict" .Values.oracle "value_name" "reportTargetOverride"
"optional" true) | indent 8 }} -{{ include "common.env-var" (dict "name" "TARGET_MAX_HEARTBEAT_PERIOD_MS" "dict" .Values.oracle.reporter.blockBased "value_name" "targetMaxHeartbeatPeriodMs" "optional" true) | indent 8 }} -{{ include "common.env-var" (dict "name" "UNUSED_ORACLE_ADDRESSES" "dict" .Values.oracle "value_name" "unusedOracleAddresses") | indent 8 }} -{{ include "common.env-var" (dict "name" "WALLET_TYPE" "dict" .Values.oracle "value_name" "walletType") | indent 8 }} -{{ include "common.env-var" (dict "name" "WS_RPC_PROVIDER_URL" "dict" .Values.oracle.rpcProviderUrls "value_name" "ws") | indent 8 }} - readinessProbe: - exec: - command: - - /celo-oracle/readinessProbe.sh - - "{{ .Values.oracle.metrics.prometheusPort }}" - - "{{ .Values.oracle.currencyPair }}" - initialDelaySeconds: 10 - periodSeconds: 5 - volumeMounts: - - name: private-key-volume - readOnly: true - mountPath: "/private-keys" - volumes: - - name: private-key-volume - secret: - secretName: {{ template "pkey-secret-name" . }} diff --git a/packages/helm-charts/oracle/values.yaml b/packages/helm-charts/oracle/values.yaml deleted file mode 100644 index e9138382150..00000000000 --- a/packages/helm-charts/oracle/values.yaml +++ /dev/null @@ -1,62 +0,0 @@ -# This file is intended to show the expected value structure with placeholder values. -# Many values are optional, and the defaults are left up to the client. -# These values are commented out in this file, but show the correct structure -# if they were to be specified. 
- -environment: - name: test - -image: - repository: oracletest.azurecr.io/test/oracle - tag: hsmtest - -kubectl: - image: - repository: bitnami/kubectl - tag: 1.29.3 - -kube: - cloudProvider: azure - -oracle: - replicas: 2 - rpcProviderUrls: - ws: wss://alfajoresstaging-forno.celo-testnet.org - http: https://alfajoresstaging-forno.celo-testnet.org - identities: - - address: "0x0000000000000000000000000000000000000000" - azure: - id: defaultId - clientId: defaultClientId - keyVaultName: defaultKeyVaultName - - address: "0x0000000000000000000000000000000000000001" - azure: - id: defaultId1 - clientId: defaultClientId1 - keyVaultName: defaultKeyVaultName1 - azureHsm: - initTryCount: 5 - initMaxRetryBackoffMs: 30000 - aggregation: - mid: - maxExchangeVolumeShare: 1 - askMaxPercentageDeviation: 0.05 - bidMaxPercentageDeviation: 0.05 - maxPercentageBidAskSpread: 0.025 - # minReportPriceChangeThreshold - metrics: - enabled: true - prometheusPort: 9090 - apiRequestTimeoutMs: 5000 - circuitBreakerPriceChangeThreshold: 0.25 - gasPriceMultiplier: 6 - reportStrategy: BLOCK_BASED - reporter: - blockBased: - minReportPriceChangeThreshold: 0.005 # 0.5% - # targetMaxHeartbeatPeriodMs - # privateKeyPath - # unusedOracleAddresses - # overrideIndex - # overrideOracleCount - # maxBlockTimestampAgeMs diff --git a/packages/helm-charts/prometheus-stackdriver/Chart.yaml b/packages/helm-charts/prometheus-stackdriver/Chart.yaml deleted file mode 100644 index 9ce46dd137c..00000000000 --- a/packages/helm-charts/prometheus-stackdriver/Chart.yaml +++ /dev/null @@ -1,5 +0,0 @@ -apiVersion: v1 -appVersion: "1.0" -description: A Helm chart for installing prometheus and the stackdriver-prometheus-sidecar -name: prometheus-stackdriver -version: 0.1.0 diff --git a/packages/helm-charts/prometheus-stackdriver/templates/_helpers.tpl b/packages/helm-charts/prometheus-stackdriver/templates/_helpers.tpl deleted file mode 100644 index d7be3393d0c..00000000000 --- 
a/packages/helm-charts/prometheus-stackdriver/templates/_helpers.tpl +++ /dev/null @@ -1,58 +0,0 @@ -{{/* -Expand the name of the chart. -*/}} -{{- define "prometheus-stackdriver.name" -}} -{{- default .Chart.Name .Values.nameOverride | trunc 63 | trimSuffix "-" }} -{{- end }} - -{{/* -Create a default fully qualified app name. -We truncate at 63 chars because some Kubernetes name fields are limited to this (by the DNS naming spec). -If release name contains chart name it will be used as a full name. -*/}} -{{- define "prometheus-stackdriver.fullname" -}} -{{- if .Values.fullnameOverride }} -{{- .Values.fullnameOverride | trunc 63 | trimSuffix "-" }} -{{- else }} -{{- $name := default .Chart.Name .Values.nameOverride }} -{{- if contains $name .Release.Name }} -{{- .Release.Name | trunc 63 | trimSuffix "-" }} -{{- else }} -{{- printf "%s-%s" .Release.Name $name | trunc 63 | trimSuffix "-" }} -{{- end }} -{{- end }} -{{- end }} - -{{/* -Create chart name and version as used by the chart label. -*/}} -{{- define "prometheus-stackdriver.chart" -}} -{{- printf "%s-%s" .Chart.Name .Chart.Version | replace "+" "_" | trunc 63 | trimSuffix "-" }} -{{- end }} - -{{/* -Common labels -*/}} -{{- define "prometheus-stackdriver.labels" -}} -helm.sh/chart: {{ include "prometheus-stackdriver.chart" . }} -{{ include "prometheus-stackdriver.selectorLabels" . }} -{{- if .Chart.AppVersion }} -app.kubernetes.io/version: {{ .Chart.AppVersion | quote }} -{{- end }} -app.kubernetes.io/managed-by: {{ .Release.Service }} -{{- end }} - -{{/* -Selector labels -*/}} -{{- define "prometheus-stackdriver.selectorLabels" -}} -app.kubernetes.io/name: {{ include "prometheus-stackdriver.name" . }} -app.kubernetes.io/instance: {{ .Release.Name }} -{{- end }} - -{{/* -Create the name of the service account to use -*/}} -{{- define "prometheus-stackdriver.serviceAccountName" -}} -{{- default (include "prometheus-stackdriver.fullname" .) 
.Values.serviceAccount.name }} -{{- end }} diff --git a/packages/helm-charts/prometheus-stackdriver/templates/configmap.yaml b/packages/helm-charts/prometheus-stackdriver/templates/configmap.yaml deleted file mode 100644 index 613631d329a..00000000000 --- a/packages/helm-charts/prometheus-stackdriver/templates/configmap.yaml +++ /dev/null @@ -1,177 +0,0 @@ -#Copyright 2019 Google LLC - -#Licensed under the Apache License, Version 2.0 (the "License"); -#you may not use this file except in compliance with the License. -#You may obtain a copy of the License at - -#https://www.apache.org/licenses/LICENSE-2.0 - -#Unless required by applicable law or agreed to in writing, software -#distributed under the License is distributed on an "AS IS" BASIS, -#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -#See the License for the specific language governing permissions and -#limitations under the License. - -apiVersion: v1 -kind: ConfigMap -metadata: - name: prometheus-server-conf - labels: -{{ include "prometheus-stackdriver.labels" . | indent 4 }} -data: - prometheus.yml: |- - global: - scrape_interval: 60s - evaluation_interval: 5s - # Label the metrics with a custom label if using multiple prometheus for same environments - external_labels: - cluster_name: {{ required "Valid .Values.cluster entry required!" .Values.cluster }} - - {{- if .Values.remote_write }} - remote_write: - - basic_auth: - password: {{ .Values.remote_write.basic_auth.password }} - username: {{ .Values.remote_write.basic_auth.username }} - url: {{ .Values.remote_write.url }} - write_relabel_configs: - {{- with .Values.remote_write.write_relabel_configs }} - {{- toYaml . 
| nindent 8 }} - {{- end }} - {{- end }} - - scrape_configs: - {{- if .Values.jobs.prometheus | default true }} - - job_name: 'prometheus' - scrape_interval: 20s - static_configs: - - targets: ['localhost:9090'] - {{- end }} - - {{- if .Values.jobs.kubernetes_apiservers | default true }} - - job_name: 'kubernetes-apiservers' - kubernetes_sd_configs: - - role: endpoints - scheme: https - tls_config: - ca_file: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt - bearer_token_file: /var/run/secrets/kubernetes.io/serviceaccount/token - relabel_configs: - - source_labels: [__meta_kubernetes_namespace, __meta_kubernetes_service_name, __meta_kubernetes_endpoint_port_name] - action: keep - regex: default;kubernetes;https - {{- end }} - - {{- if .Values.jobs.kubernetes_nodes | default true }} - - job_name: 'kubernetes-nodes' - scheme: https - tls_config: - ca_file: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt - bearer_token_file: /var/run/secrets/kubernetes.io/serviceaccount/token - kubernetes_sd_configs: - - role: node - relabel_configs: - - action: labelmap - regex: __meta_kubernetes_node_label_(.+) - - target_label: __address__ - replacement: kubernetes.default.svc:443 - - source_labels: [__meta_kubernetes_node_name] - regex: (.+) - target_label: __metrics_path__ - replacement: /api/v1/nodes/${1}/proxy/metrics - {{- end }} - - {{- if .Values.jobs.kubernetes_pods | default true }} - - job_name: 'kubernetes-pods' - kubernetes_sd_configs: - - role: pod - relabel_configs: - - source_labels: [__meta_kubernetes_pod_annotation_prometheus_io_scrape] - action: keep - regex: true - - source_labels: [__meta_kubernetes_pod_annotation_prometheus_io_path] - action: replace - target_label: __metrics_path__ - regex: (.+) - - source_labels: [__address__, __meta_kubernetes_pod_annotation_prometheus_io_port] - action: replace - regex: ([^:]+)(?::\d+)?;(\d+) - replacement: $1:$2 - target_label: __address__ - - action: labelmap - regex: __meta_kubernetes_pod_label_(.+) - - 
source_labels: [__meta_kubernetes_namespace] - action: replace - target_label: kubernetes_namespace - - source_labels: [__meta_kubernetes_pod_name] - action: replace - target_label: kubernetes_pod_name - {{- end }} - - {{- if .Values.jobs.kubernetes_cadvisor | default true }} - - job_name: 'kubernetes-cadvisor' - scheme: https - tls_config: - ca_file: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt - bearer_token_file: /var/run/secrets/kubernetes.io/serviceaccount/token - kubernetes_sd_configs: - - role: node - relabel_configs: - - action: labelmap - regex: __meta_kubernetes_node_label_(.+) - - target_label: __address__ - replacement: kubernetes.default.svc:443 - - source_labels: [__meta_kubernetes_node_name] - regex: (.+) - target_label: __metrics_path__ - replacement: /api/v1/nodes/${1}/proxy/metrics/cadvisor - {{- end }} - - {{- if .Values.jobs.kubernetes_service_endpoints | default true }} - - job_name: 'kubernetes-service-endpoints' - kubernetes_sd_configs: - - role: endpoints - relabel_configs: - - source_labels: [__meta_kubernetes_service_annotation_prometheus_io_scrape] - action: keep - regex: true - - source_labels: [__meta_kubernetes_service_annotation_prometheus_io_scheme] - action: replace - target_label: __scheme__ - regex: (https?) 
- - source_labels: [__meta_kubernetes_service_annotation_prometheus_io_path] - action: replace - target_label: __metrics_path__ - regex: (.+) - - source_labels: [__address__, __meta_kubernetes_service_annotation_prometheus_io_port] - action: replace - target_label: __address__ - regex: ([^:]+)(?::\d+)?;(\d+) - replacement: $1:$2 - - action: labelmap - regex: __meta_kubernetes_service_label_(.+) - - source_labels: [__meta_kubernetes_namespace] - action: replace - target_label: kubernetes_namespace - - source_labels: [__meta_kubernetes_service_name] - action: replace - target_label: kubernetes_name - {{- end }} - -{{- if .Values.scrapeJob }} - {{- if .Values.scrapeJob.Name }} - - job_name: {{ .Values.scrapeJob.Name }} - {{- end }} - static_configs: - {{- if .Values.scrapeJob.Targets }} - - targets: - {{- range .Values.scrapeJob.Targets }} - - {{ . }} - {{- end -}} - {{- end -}} - {{ if .Values.scrapeJob.Labels }} - labels: - {{- range .Values.scrapeJob.Labels }} - {{ . }} - {{- end -}} - {{- end -}} -{{- end -}} \ No newline at end of file diff --git a/packages/helm-charts/prometheus-stackdriver/templates/deployment.yaml b/packages/helm-charts/prometheus-stackdriver/templates/deployment.yaml deleted file mode 100644 index a053d85ee3c..00000000000 --- a/packages/helm-charts/prometheus-stackdriver/templates/deployment.yaml +++ /dev/null @@ -1,103 +0,0 @@ -#Copyright 2019 Google LLC - -#Licensed under the Apache License, Version 2.0 (the "License"); -#you may not use this file except in compliance with the License. -#You may obtain a copy of the License at - -#https://www.apache.org/licenses/LICENSE-2.0 - -#Unless required by applicable law or agreed to in writing, software -#distributed under the License is distributed on an "AS IS" BASIS, -#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -#See the License for the specific language governing permissions and -#limitations under the License. 
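The `__address__` rewrite used by the pod and service-endpoint scrape jobs above is worth unpacking: Prometheus joins the `source_labels` with `;`, matches `([^:]+)(?::\d+)?;(\d+)`, and substitutes `$1:$2` — that is, it replaces any existing port with the one from the `prometheus.io/port` annotation. A rough stand-alone illustration with `sed` (sample values are invented; `sed -E` lacks non-capturing groups, so the old port is group 2 and the annotation port is group 3):

```shell
# __address__ is "10.0.0.1:8080" and the port annotation is "9090";
# Prometheus sees them joined as "10.0.0.1:8080;9090".
echo '10.0.0.1:8080;9090' | sed -E 's/([^:]+)(:[0-9]+)?;([0-9]+)/\1:\3/'
# -> 10.0.0.1:9090
```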
- -apiVersion: apps/v1 -kind: Deployment -metadata: - name: prometheus - labels: -{{ include "prometheus-stackdriver.labels" . | indent 4 }} -spec: - replicas: 1 - strategy: - type: Recreate - selector: - matchLabels: - app: prometheus-server - template: - metadata: - annotations: - prometheus.io/scrape: "true" - prometheus.io/path: "/metrics" - prometheus.io/port: "9090" - labels: -{{ include "prometheus-stackdriver.labels" . | indent 8 }} - app: prometheus-server - spec: - securityContext: - runAsUser: 1000 - runAsGroup: 1000 - fsGroup: 1000 - serviceAccountName: {{ .Values.serviceAccount.name }} - containers: - - name: prometheus - image: prom/prometheus:{{ .Values.prometheus.imageTag }} - resources: - requests: - memory: 4Gi - cpu: 2 - args: - - "--config.file=/etc/prometheus/prometheus.yml" - - "--storage.tsdb.retention.time={{ .Values.prometheus.retention_time | default "7d" }}" - - "--storage.tsdb.path=/prometheus/" - ports: - - containerPort: 9090 - volumeMounts: - - name: prometheus-config-volume - mountPath: /etc/prometheus/ - - name: prometheus-storage-volume - mountPath: /prometheus/ -{{- if not .Values.stackdriver.disabled }} - - name: sidecar - image: gcr.io/stackdriver-prometheus/stackdriver-prometheus-sidecar:{{ .Values.stackdriver.sidecar.imageTag }} - imagePullPolicy: Always - args: - - --stackdriver.project-id={{ .Values.gcloud.project }} - - --prometheus.wal-directory=/prometheus/wal - - --stackdriver.kubernetes.location={{ .Values.gcloud.region }} - - --stackdriver.kubernetes.cluster-name={{ .Values.cluster }} -{{ if .Values.stackdriver.includeFilter -}} -{{ indent 12 (printf "- --include=%s" .Values.stackdriver.includeFilter) }} -{{- end -}} -{{/* This used to be enabled, but now is not. Enable this for oracle clusters only. 
*/}} -{{- if (and true .Values.stackdriver.metricsPrefix) }} - - --stackdriver.metrics-prefix={{ .Values.stackdriver.metricsPrefix }} -{{- end }} - ports: - - name: sidecar - containerPort: 9091 - volumeMounts: - - name: prometheus-storage-volume - mountPath: /prometheus -{{- if not (kindIs "invalid" .Values.stackdriver.gcloudServiceAccountKeyBase64) }} - - name: prometheus-service-account-key - mountPath: /var/secrets/google - env: - - name: GOOGLE_APPLICATION_CREDENTIALS - value: /var/secrets/google/prometheus-service-account.json -{{- end }} -{{- end }} - volumes: - - name: prometheus-config-volume - configMap: - defaultMode: 420 - name: prometheus-server-conf - - name: prometheus-storage-volume - persistentVolumeClaim: - claimName: prometheus -{{- if not (kindIs "invalid" .Values.stackdriver.gcloudServiceAccountKeyBase64) }} - - name: prometheus-service-account-key - secret: - secretName: prometheus-service-account-key -{{- end }} diff --git a/packages/helm-charts/prometheus-stackdriver/templates/gcloud-service-account-key-secret.yaml b/packages/helm-charts/prometheus-stackdriver/templates/gcloud-service-account-key-secret.yaml deleted file mode 100644 index 34527b976d2..00000000000 --- a/packages/helm-charts/prometheus-stackdriver/templates/gcloud-service-account-key-secret.yaml +++ /dev/null @@ -1,11 +0,0 @@ -{{- if not (kindIs "invalid" .Values.stackdriver.gcloudServiceAccountKeyBase64) }} -apiVersion: v1 -kind: Secret -metadata: - name: prometheus-service-account-key - labels: -{{ include "prometheus-stackdriver.labels" . 
| indent 4 }} -type: Opaque -data: - prometheus-service-account.json: {{ .Values.stackdriver.gcloudServiceAccountKeyBase64 }} -{{ end -}} diff --git a/packages/helm-charts/prometheus-stackdriver/templates/prometheus-pvc.yaml b/packages/helm-charts/prometheus-stackdriver/templates/prometheus-pvc.yaml deleted file mode 100644 index 2cafdaa9a3e..00000000000 --- a/packages/helm-charts/prometheus-stackdriver/templates/prometheus-pvc.yaml +++ /dev/null @@ -1,15 +0,0 @@ -apiVersion: v1 -kind: PersistentVolumeClaim -metadata: - labels: -{{ include "prometheus-stackdriver.labels" . | indent 4 }} - name: prometheus -spec: - accessModes: - - ReadWriteOnce - resources: - requests: - storage: {{ .Values.storageSize }} -{{- if .Values.storageClassName }} - storageClassName: {{ .Values.storageClassName }} -{{- end }} diff --git a/packages/helm-charts/prometheus-stackdriver/templates/service-account.yaml b/packages/helm-charts/prometheus-stackdriver/templates/service-account.yaml deleted file mode 100644 index 3dd19cb03cb..00000000000 --- a/packages/helm-charts/prometheus-stackdriver/templates/service-account.yaml +++ /dev/null @@ -1,62 +0,0 @@ -#Copyright 2019 Google LLC - -#Licensed under the Apache License, Version 2.0 (the "License"); -#you may not use this file except in compliance with the License. -#You may obtain a copy of the License at - -#https://www.apache.org/licenses/LICENSE-2.0 - -#Unless required by applicable law or agreed to in writing, software -#distributed under the License is distributed on an "AS IS" BASIS, -#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -#See the License for the specific language governing permissions and -#limitations under the License. - -apiVersion: rbac.authorization.k8s.io/v1 -kind: ClusterRole -metadata: - name: prometheus - labels: - {{- include "prometheus-stackdriver.labels" . 
| nindent 4 }} -rules: -- apiGroups: [""] - resources: - - nodes - - nodes/proxy - - services - - endpoints - - pods - verbs: ["get", "list", "watch"] -- apiGroups: - - extensions - resources: - - ingresses - verbs: ["get", "list", "watch"] -- nonResourceURLs: ["/metrics"] - verbs: ["get"] ---- -apiVersion: rbac.authorization.k8s.io/v1 -kind: ClusterRoleBinding -metadata: - name: prometheus - labels: -{{ include "prometheus-stackdriver.labels" . | indent 4 }} -roleRef: - apiGroup: rbac.authorization.k8s.io - kind: ClusterRole - name: prometheus -subjects: -- kind: ServiceAccount - name: {{ include "prometheus-stackdriver.serviceAccountName" . }} - namespace: {{ .Release.Namespace }} ---- -apiVersion: v1 -kind: ServiceAccount -metadata: - name: {{ include "prometheus-stackdriver.serviceAccountName" . }} - labels: - {{- include "prometheus-stackdriver.labels" . | nindent 4 }} - {{- with .Values.serviceAccount.annotations }} - annotations: - {{- toYaml . | nindent 4 }} - {{- end }} diff --git a/packages/helm-charts/prometheus-stackdriver/templates/service.yaml b/packages/helm-charts/prometheus-stackdriver/templates/service.yaml deleted file mode 100644 index c374552cf32..00000000000 --- a/packages/helm-charts/prometheus-stackdriver/templates/service.yaml +++ /dev/null @@ -1,17 +0,0 @@ -apiVersion: v1 -kind: Service -metadata: - name: prometheus-server - labels: -{{ include "prometheus-stackdriver.labels" . 
| indent 4 }} -spec: - ports: - - name: web - port: 9090 - protocol: TCP - targetPort: 9090 - selector: - app: prometheus-server - sessionAffinity: None - type: ClusterIP - diff --git a/packages/helm-charts/prometheus-stackdriver/values.yaml b/packages/helm-charts/prometheus-stackdriver/values.yaml deleted file mode 100644 index 201a465f94b..00000000000 --- a/packages/helm-charts/prometheus-stackdriver/values.yaml +++ /dev/null @@ -1,230 +0,0 @@ -cluster: null - -gcloud: - project: null - region: null - -namespace: prometheus - -jobs: {} - -prometheus: - imageTag: v2.27.1 - retention_time: 7d - -# TODO: remove values from the .env files... -scrapeJob: {} - # Labels: null - # Name: null - # Targets: null - -serviceAccount: - # Annotations to add to the service account - annotations: {} - # The name of the service account to use. - # If not set and create is true, a name is generated using the fullname template - name: null - -stackdriver: - disabled: true - # When sending metrics from outside GCP to GCP Stackdriver. - gcloudServiceAccountKeyBase64: null - # To save $, don't send metrics to SD that probably won't be used. - # nginx metrics currently breaks sidecar. - # Stackdriver allows a maximum of 10 custom labels. - # kube-state-metrics has some metrics of the form "kube_.+_labels" - # that provides the labels of k8s resources as metric labels. - # If some k8s resources have too many labels, this results in a bunch of - # errors when the sidecar tries to send metrics to Stackdriver. 
- includeFilter: >- - {job=~".+", - __name__!~"apiserver_.+", - __name__!~"container_cpu_load_average_10s", - __name__!~"container_cpu_system_seconds_total", - __name__!~"container_cpu_user_seconds_total", - __name__!~"container_file_.+", - __name__!~"container_fs_[^w].+", - __name__!~"container_last_.+", - __name__!~"container_memory_[^uw].*", - __name__!~"container_network_.+", - __name__!~"container_processes", - __name__!~"container_sockets", - __name__!~"container_spec_.+", - __name__!~"container_start_.+", - __name__!~"container_tasks_state", - __name__!~"container_threads", - __name__!~"container_threads_max", - __name__!~"erlang_.+", - __name__!~"etcd_.+", - __name__!~"kube_.+_labels", - __name__!~"kube_certificatesigningrequest_.+", - __name__!~"kube_configmap_.+", - __name__!~"kube_cronjob_.+", - __name__!~"kube_endpoint_.+", - __name__!~"kube_horizontalpodautoscaler_.+", - __name__!~"kube_ingress_.+", - __name__!~"kube_job_.+", - __name__!~"kube_lease_.+", - __name__!~"kube_limitrange_.+", - __name__!~"kube_mutatingwebhookconfiguration_.+", - __name__!~"kube_namespace_.+", - __name__!~"kube_networkpolicy_.+", - __name__!~"kube_pod_[^cs].+", - __name__!~"kube_pod_container_[^r].+", - __name__!~"kube_pod_container_status_last_terminated_reason", - __name__!~"kube_pod_container_status_terminated_reason", - __name__!~"kube_pod_container_status_waiting_reason", - __name__!~"kube_poddisruptionbudget_.+", - __name__!~"kube_replicaset_.+", - __name__!~"kube_replicationcontroller_.+", - __name__!~"kube_resourcequota_.+", - __name__!~"kube_secret_.+", - __name__!~"kube_service_.+", - __name__!~"kube_node_status_condition", - __name__!~"kube_storageclass_.+", - __name__!~"kube_service_.+", - __name__!~"kube_validatingwebhookconfiguration_.+", - __name__!~"kube_verticalpodautoscaler_.+", - __name__!~"kube_volumeattachment_.+", - __name__!~"kubelet_.+", - __name__!~"nginx_.+", - __name__!~"phoenix_.+", - __name__!~"rest_client_.+", - __name__!~"storage_.+", - 
__name__!~"workqueue_.+"} - # metricsPrefix: external.googleapis.com/prometheus/prefix - sidecar: - imageTag: 0.7.3 - -# storageClassName: null -storageSize: 50Gi - -remote_write: - url: null - basic_auth: - username: null - password: null - write_relabel_configs: - - action: drop - regex: "(aadpodidentity.*\ - |aggregator_.*\ - |apiserver_.+\ - |authentication.*\ - |cadvisor_version_info\ - |certmanager.*\ - |cloudprovider_gce.*\ - |configconnector.*\ - |container_blkio.*\ - |container_cpu_cfs_throttled_seconds_total\ - |container_cpu_load_average_10s\ - |container_cpu_system_seconds_total\ - |container_cpu_user_seconds_total\ - |container_file_.+\ - |container_fs_.+\ - |container_last_.+\ - |container_memory_[^wu].*\ - |container_network_.+\ - |container_processes\ - |container_scrape_error\ - |container_sockets\ - |container_spec_.+\ - |container_start_.+\ - |container_tasks_state\ - |container_threads_max\ - |container_threads\ - |container_ulimits.*\ - |coredns.*\ - |cortex.*\ - |csi_operations.*\ - |erlang_.+\ - |etcd_.+\ - |go_gc_.*\ - |go_info.*\ - |go_memstats_.*\ - |go_threads.*\ - |grpc_client_.*\ - |jaeger_.*\ - |kube_configmap.*\ - |kube_deployment_created\ - |kube_deployment_labels\ - |kube_deployment_metadata_generation\ - |kube_deployment_spec_paused\ - |kube_deployment_spec_replicas\ - |kube_deployment_spec_strategy_rollingupdate_max_surge\ - |kube_deployment_spec_strategy_rollingupdate_max_unavailable\ - |kube_deployment_status_condition\ - |kube_deployment_status_observed_generation\ - |kube_deployment_status_replicas_unavailable\ - |kube_deployment_status_replicas_updated\ - |kube_endpoint.*\ - |kube_job.*\ - |kube_namespace.*\ - |kube_node_status_condition\ - |kube_persistentvolume.*\ - |kube_pod_[^cs].+\ - |kube_pod_container_info\ - |kube_pod_container_state_started\ - |kube_pod_container_status_last_terminated_reason\ - |kube_pod_container_status_ready\ - |kube_pod_container_status_running\ - |kube_pod_container_status_terminated_reason\ - 
|kube_pod_container_status_terminated\ - |kube_pod_container_status_waiting_reason\ - |kube_pod_container_status_waiting\ - |kube_pod_status_ready\ - |kube_pod_status_scheduled\ - |kube_pod_status_scheduled_time\ - |kube_replicaset.*\ - |kube_resourcequota.*\ - |kube_secret_.+\ - |kube_service.*\ - |kubedns.*\ - |kubelet_[^v].+\ - |loki.*\ - |machine_cpu_physical_cores\ - |machine_cpu_sockets\ - |machine_scrape.*\ - |net_conntrack_.*\ - |nginx_.+\ - |nodejs_active_.*\ - |nodejs_gc_.*\ - |oracle_action_duration_bucket\ - |oracle_exchange_api_request_duration_seconds_bucket\ - |oracle_exchange_api_request_error_count\ - |oracle_ticker_property\ - |phoenix_.+\ - |process_cpu_system.*\ - |process_cpu_user.*\ - |process_disk_reads_total\ - |process_disk_writes_total\ - |process_heap_bytes\ - |process_involuntary_context_switches_total\ - |process_io_pagefaults_total\ - |process_max_fds\ - |process_max_resident_memory_bytes\ - |process_noio_pagefaults_total\ - |process_open_fds\ - |process_signals_delivered_total\ - |process_start_time_seconds\ - |process_swaps_total\ - |process_threads_total\ - |process_uptime_seconds\ - |process_virtual_memory_bytes\ - |process_virtual_memory_max_bytes\ - |process_voluntary_context_switches_total\ - |prometheus.*\ - |promhttp_.*\ - |python.*\ - |relay_nodejs_gc_.*\ - |rest_client_.+\ - |scrape.*\ - |state_.+\ - |storage_.+\ - |workqueue_.+)" - source_labels: [__name__] - - action: drop - regex: lens-metrics - source_labels: [kubernetes_namespace] - - action: drop - regex: (container_.*|kube_.*);(kube.+|lens-metrics|default|kong|pl) - source_labels: [__name__,namespace] diff --git a/packages/helm-charts/promtail/README.md b/packages/helm-charts/promtail/README.md deleted file mode 100644 index 727d8d242ce..00000000000 --- a/packages/helm-charts/promtail/README.md +++ /dev/null @@ -1,29 +0,0 @@ -# Promtail - -Helm chart values to manage the Promtail deployment used to ingest k8s logs into Grafana Cloud's Loki instance. 
- -- Documentation: - - - - -- Helm repo: -- Code: - -## Configuration - -- Promtail ingests Kubernetes logs from allow-listed namespaces. -- Nothing else gets scraped (eg: ) — to be re-evaluated. -- Promtail exposes metrics to Prometheus, which can scrape them. See . -- Each k8s cluster has its own ServiceAccount. - -## Querying logs - -Head to and select the logs datasource. -Have a look at the LogQL specs: . - -## Deployment - -Use `celotool`: - -```sh -celotool deploy {initial,upgrade,destroy} promtail -e [--context ] -``` diff --git a/packages/helm-charts/promtail/values.yaml b/packages/helm-charts/promtail/values.yaml deleted file mode 100644 index f48a513aa6c..00000000000 --- a/packages/helm-charts/promtail/values.yaml +++ /dev/null @@ -1,41 +0,0 @@ -annotations: - # Enable Prometheus scraping of log based metrics, - # see https://grafana.com/docs/loki/latest/clients/promtail/configuration/#pipeline_stages: metrics - prometheus.io/scrape: 'true' - prometheus.io/port: 'http-metrics' - -config: - # add a destination for the logs ingestion - # lokiAddress: - - snippets: - pipelineStages: - - cri: {} - - docker: {} - - extraRelabelConfigs: - # Do not ingest logs for fluentd. - - action: drop - source_labels: - - app - regex: fluentd-log-agent - # Keep Kubernetes pod labels. - - action: labelmap - regex: __meta_kubernetes_pod_label_(.+) - # Only keep logs from these namespaces. 
- - action: keep - source_labels: - - namespace - # TODO: reconsider what should be kept from the default namespace - # eg: ingress controller - regex: (alfajores|baklava|blockscout|default|komenci|rc1|rc1staging|staging|walletconnect) - - action: labeldrop - regex: (app_kubernetes_io_component|app_kubernetes_io_instance|app.kubernetes.io/managed-by|app_kubernetes_io_managed_by|app_kubernetes_io_name|app_kubernetes_io_version|controller-revision-hash|controller_revision_hash|pod_template_hash|filename) - -serviceAccount: - # add a iam.gke.io/gcp-service-account, see - # https://cloud.google.com/kubernetes-engine/docs/how-to/workload-identity#gcloud - annotations: {} - -updateStrategy: - type: RollingUpdate diff --git a/packages/helm-charts/testnet/Chart.yaml b/packages/helm-charts/testnet/Chart.yaml deleted file mode 100644 index 530e1567823..00000000000 --- a/packages/helm-charts/testnet/Chart.yaml +++ /dev/null @@ -1,22 +0,0 @@ ---- -name: testnet -apiVersion: v1 -version: 0.2.1 -description: private Ethereum network Helm chart for Kubernetes -keywords: - - celo - - ethereum - - blockchain -home: https://www.celo.org/ -sources: - - https://github.com/celo-org/celo-monorepo -maintainers: - - name: celo - email: contact@celo.org -icon: https://avatars1.githubusercontent.com/u/37552875?s=200&v=4 -appVersion: v1.0.0 -dependencies: - - name: common - repository: oci://us-west1-docker.pkg.dev/devopsre/clabs-public-oci - version: 0.2.7 - diff --git a/packages/helm-charts/testnet/README.md b/packages/helm-charts/testnet/README.md deleted file mode 100644 index 45c1f0f69cd..00000000000 --- a/packages/helm-charts/testnet/README.md +++ /dev/null @@ -1,132 +0,0 @@ -# Testnet Helm Chart - -This helm chart allows you to deploy testnets, on which you can deploy smart contracts or interact with our app. See the README at the parent folder for more general Helm - -`NAMESPACE_NAME` is the Kubernetes namespace all Kubernetes primitives are getting deployed to. 
This isolates various networks from each other. `RELEASE_NAME` is the helm chart release name, i.e. a consistent name that refers to the primitives as a group. By convention, `NAMESPACE_NAME` and `RELEASE_NAME` should be the same name and just use [a-z0-9\-] characters so that for most scripts you can just pass `NAME` instead of having to specify all the names separately. However, if you would like to, you can generally use the `-r` or `-n` flags to do so. - -(These commands assume your current directory is `packages/helm-charts`.) - -Deploy a release of the helm chart by running the following command: - -```bash -export NAME=my-name -./testnet/scripts/create-network.sh -r $NAME -``` - -> if you are deploying to non-development envs (i.e. testnet_dev, testnet_staging or testnet_prod), also pass `celo-testnet` as the domain name, -> or the ethstats domain will not resolve properly: `./testnet/scripts/create-network.sh -r $NAME -d celo-testnet` - -You can also upgrade an environment by passing the `-u` flag: - -```bash -export NAME=my-name -./testnet/scripts/create-network.sh -u -r $NAME -``` - -The output of the above includes further instructions for getting connection info for the nodes you'll likely want to connect to. You'll have to wait a minute or two for the load balancers to provision. You can also use a script to produce the necessary connection info for the mobile package. - -```bash -./testnet/scripts/write-mobile-network-config.sh $NAME -``` - -If you need to connect via RPC, you can run: - -```bash -./testnet/scripts/port-forward.sh $NAME -``` - -All the port-forward script really does is find the pod under the `gethminer1` service of your release and port-forwards it to your machine.
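The `[a-z0-9\-]` naming convention mentioned at the top of this section is easy to get wrong; here is a tiny illustrative guard (not part of the repo's scripts) that checks a candidate `NAME` before running them:

```shell
NAME=my-name
# Reject empty names or names containing anything outside a-z, 0-9 and "-".
case "$NAME" in
  *[!a-z0-9-]*|'') echo "invalid" ;;
  *) echo "valid" ;;
esac
# -> valid
```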
- -So a contract deploy as per the [protocol README](../../protocol/README.md) would look like: - -```bash -# pwd: .../packages/protocol -# portforward is active -# Don't forget to set $NAME in the new terminal -yarn run init-network -n $NAME -``` - -You can then share the contract build artifacts by running: - -```bash -yarn run upload-artifacts -n $NAME -``` - -This will upload the build artifacts to the cluster; they can subsequently be downloaded via: - -```bash -yarn run download-artifacts -n $NAME -``` - -This will download the build artifacts to your build folder, as if you had deployed the contracts yourself. - -Be sure to update the appropriate yaml file in `packages/blockchain-api/` with the addresses of the GoldToken and StableToken proxy contracts. - -The last step is to update the contract ABIs and addresses for use in the mobile app as per the [mobile README](../../mobile/README.md): - -```bash -# pwd: .../packages/mobile -yarn run update-contracts --testnets=testnet_prod,integration,argentinastaging,argentinaproduction,$NAME -``` - -After you are done, you can (and should after usage) tear down your testnet by running: - -```console -./testnet/scripts/destroy-network.sh -r $NAME -```
- -Then run: - -```console -cd $CELO/geth -make clean && make all -./dockerize_testnet.sh -p PROJECT_ID -t TAG -``` - -where `TAG` is an arbitrary string. - -For a Docker build reflecting an actual commit of geth, we tend to use the commit hash: - -```console -git rev-parse HEAD -``` - -This script will produce and upload two Docker images (one for Geth regular nodes, and one for the Bootnode) to the [GCP Container Registry](https://console.cloud.google.com/gcr/images/celo-testnet/GLOBAL/testnet-geth) under `gcr.io/PROJECT_ID/testnet-geth:TAG`. - -You can then start a network with your custom builds by modifying the `geth/image/repository/tag` value in `values.yaml`. (Alternatively, you can pass the values in your `helm install` command with `--set geth.miner.tag=TAG`, but that makes it harder to use `create-network` and the other scripts.) - -When you are finally happy with your changes to geth: - -- Raise a PR and get it reviewed and merged -- Identify the tag of the latest Docker image built -- Update the value of the geth.miner.tag field in `values.yaml` and raise a PR - -## Configuration - -The following table lists the configurable parameters of the testnet chart and their default values.
- -| Parameter | Description | Default | -| --------------------------- | ------------------------------------------------------------------ | ---------------------------- | -| `imagePullPolicy` | Container pull policy | `IfNotPresent` | -| `nodeSelector` | Node labels for pod assignment | | -| `bootnode.image.repository` | bootnode container image to use | `ethereum/client-go` | -| `bootnode.image.tag` | bootnode container image tag to deploy | `alltools-v1.7.3` | -| `geth.image.repository` | geth container image to use | `ethereum/client-go` | -| `geth.image.tag` | geth container image tag to deploy | `v1.7.3` | -| `geth.tx.replicaCount` | geth transaction nodes replica count | `1` | -| `geth.miner.replicaCount` | geth miner nodes replica count | `1` | -| `geth.miner.account.secret` | geth account secret | `my-secret-account-password` | -| `geth.genesis.networkId` | Ethereum network id | `1101` | -| `geth.genesis.difficulty` | Ethereum network difficulty | `0x0400` | -| `geth.genesis.gasLimit` | Ethereum network gas limit | `0x8000000` | -| `geth.account.address` | Geth Account to be initially funded and deposited with mined Ether | | -| `geth.account.privateKey` | Geth Private Key | | -| `geth.account.secret` | Geth Account Secret | | diff --git a/packages/helm-charts/testnet/scripts/create-network.sh b/packages/helm-charts/testnet/scripts/create-network.sh deleted file mode 100755 index 2e89fbe1780..00000000000 --- a/packages/helm-charts/testnet/scripts/create-network.sh +++ /dev/null @@ -1,77 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -NAMESPACE="" -RELEASE="" -DOMAIN_NAME_OPT="" -ACTION=install -TEST_OPT="" -ZONE="us-west1-a" - -while getopts ':utn:r:z:d:v:a:' flag; do - case "${flag}" in - u) ACTION=upgrade ;; - t) TEST_OPT="--debug --dry-run" ;; - n) NAMESPACE="${OPTARG}" ;; - r) RELEASE="${OPTARG}" ;; - z) ZONE="${OPTARG}" ;; - d) DOMAIN_NAME_OPT="--set domain.name=${OPTARG}" ;; - a) VERIFICATION_REWARDS_ADDRESS="${OPTARG}" ;; - *) echo "Unexpected
option ${flag}" ;; - esac -done -shift $((OPTIND -1)) - -[ -z "$RELEASE" ] && echo "Need to set release via the -r flag" && exit 1; -[ -z "$NAMESPACE" ] && NAMESPACE=$RELEASE - -# Create blockscout DB only if we are installing -if [ "$ACTION" = "install" ]; then - - # Create a new username and password - BLOCKSCOUT_DB_USERNAME=$(openssl rand -hex 12) - BLOCKSCOUT_DB_PASSWORD=$(openssl rand -hex 24) - - if [ "z$TEST_OPT" = "z" ]; then - echo "Creating Cloud SQL database, this might take a minute or two ..." - gcloud sql instances create $RELEASE --zone $ZONE --database-version POSTGRES_9_6 --cpu 1 --memory 4G - gcloud sql users create $BLOCKSCOUT_DB_USERNAME -i $RELEASE --password $BLOCKSCOUT_DB_PASSWORD - gcloud sql databases create blockscout -i $RELEASE - kubectl create namespace $NAMESPACE - - # This command assumes the secret being available on the cluster in the default namespace - kubectl get secret blockscout-cloudsql-credentials --namespace default --export -o yaml |\ - grep -v creationTimestamp | grep -v resourceVersion | grep -v selfLink | grep -v uid | grep -v namespace |\ - kubectl apply --namespace=$NAMESPACE -f - - fi -fi - -# Get the connection name for the database -BLOCKSCOUT_DB_CONNECTION_NAME=$(gcloud sql instances describe $RELEASE --format="value(connectionName)") - -if [ "$ACTION" = "install" ]; then - - echo "Deploying new environment..." 
- - helm install ./testnet --name $RELEASE --namespace $NAMESPACE \ - $DOMAIN_NAME_OPT $TEST_OPT \ - --set miner.verificationrewards=$VERIFICATION_REWARDS_ADDRESS \ - --set blockscout.db.username=$BLOCKSCOUT_DB_USERNAME \ - --set blockscout.db.password=$BLOCKSCOUT_DB_PASSWORD \ - --set blockscout.db.connection_name=$BLOCKSCOUT_DB_CONNECTION_NAME - -elif [ "$ACTION" = "upgrade" ]; then - - # Get existing username and password from the database - BLOCKSCOUT_DB_USERNAME=`kubectl get secret $RELEASE-blockscout --export -o jsonpath='{.data.DB_USERNAME}' -n $NAMESPACE | base64 --decode` - BLOCKSCOUT_DB_PASSWORD=`kubectl get secret $RELEASE-blockscout --export -o jsonpath='{.data.DB_PASSWORD}' -n $NAMESPACE | base64 --decode` - - echo "Upgrading existing environment..." - - helm upgrade $RELEASE ./testnet \ - $DOMAIN_NAME_OPT $TEST_OPT \ - --set miner.verificationrewards=$VERIFICATION_REWARDS_ADDRESS \ - --set blockscout.db.username=$BLOCKSCOUT_DB_USERNAME \ - --set blockscout.db.password=$BLOCKSCOUT_DB_PASSWORD \ - --set blockscout.db.connection_name=$BLOCKSCOUT_DB_CONNECTION_NAME -fi diff --git a/packages/helm-charts/testnet/scripts/destroy-network.sh b/packages/helm-charts/testnet/scripts/destroy-network.sh deleted file mode 100755 index 4a72aeb94b9..00000000000 --- a/packages/helm-charts/testnet/scripts/destroy-network.sh +++ /dev/null @@ -1,39 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -RELEASE="" -DELETE=false -[ -n "$1" ] && RELEASE=$1 -while getopts ':r:d' flag; do - case "${flag}" in - r) RELEASE="${OPTARG}" ;; - d) DELETE=true ;; - *) echo "Unexpected option ${flag}" ;; - esac -done -[ -z "$RELEASE" ] && echo "Need to set RELEASE_NAME via the -r flag" && exit 1; - -if [[ ( "$RELEASE" = "integration") || ( "$RELEASE" = "staging" ) || ( "$RELEASE" = "production" ) ]]; then - echo "You just tried to delete $RELEASE. You probably did not want to do that. Exiting the script. 
If this is a mistake, modify this script or do it manually" - exit 1; -fi - - -echo "You are about to delete the network $RELEASE" -if $DELETE -then - PVCS="$(kubectl get pvc --namespace=$RELEASE | grep $RELEASE | awk '{print $1}')" -fi - -gcloud sql instances delete $RELEASE -helm del --purge $RELEASE -kubectl delete namespace $RELEASE - -echo $PVCS -if $DELETE -then - while read -r pvc; do - kubectl delete pvc --namespace=$RELEASE "$pvc" - done <<< "$PVCS" - gcloud sql databases delete blockscout -i $RELEASE -fi diff --git a/packages/helm-charts/testnet/scripts/get-bootnode.sh b/packages/helm-charts/testnet/scripts/get-bootnode.sh deleted file mode 100644 index 7fb2e125e34..00000000000 --- a/packages/helm-charts/testnet/scripts/get-bootnode.sh +++ /dev/null @@ -1,21 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -apk add --no-cache curl; -CNT=0; -echo "retrieving bootnodes from $BOOTNODE_SVC" -while [ $CNT -le 90 ] -do - curl -m 5 -s $BOOTNODE_SVC | xargs echo -n > /geth/bootnodes; - if [ -s /geth/bootnodes ] - then - cat /geth/bootnodes; - exit 0; - fi; - - echo "no bootnodes found. retrying $CNT..."; - sleep 2 || break; - CNT=$((CNT+1)); -done; -echo "WARNING. unable to find bootnodes. continuing but geth may not be able to find any peers."; -exit 0; \ No newline at end of file diff --git a/packages/helm-charts/testnet/scripts/port-forward.sh b/packages/helm-charts/testnet/scripts/port-forward.sh deleted file mode 100755 index 7b36a382569..00000000000 --- a/packages/helm-charts/testnet/scripts/port-forward.sh +++ /dev/null @@ -1,22 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -# Basic Usage: -# ./port-forward.sh NAME -# Specify -n NAMESPACE -r RELEASE or just pass NAME to the script and that will set NAMESPACE and RELEASE to it - -NAMESPACE="" -RELEASE="" -# Substitute _ for - because of the discrepancy between GCP and monorepo testnet names.
-[ -n "$1" ] && NAMESPACE=${1/_/-} && RELEASE=${1/_/-} -while getopts ':r:n:' flag; do - case "${flag}" in - n) NAMESPACE="${OPTARG}" ;; - r) RELEASE="${OPTARG}" ;; - *) error "Unexpected option ${flag}" ;; - esac -done -[ -z "$NAMESPACE" ] && echo "Need to set the NAMESPACE_NAME via the -n flag" && exit 1; -[ -z "$RELEASE" ] && echo "Need to set RELEASE_NAME via the -r flag" && exit 1; - -kubectl port-forward --namespace $NAMESPACE $(kubectl get pods --namespace $NAMESPACE -l "app in (ethereum,testnet), component=gethminer1, release=$RELEASE" --field-selector=status.phase=Running -o jsonpath="{.items[0].metadata.name}") 8545:8545 8546:8546 diff --git a/packages/helm-charts/testnet/scripts/write-mobile-network-config.sh b/packages/helm-charts/testnet/scripts/write-mobile-network-config.sh deleted file mode 100755 index f6b17120c18..00000000000 --- a/packages/helm-charts/testnet/scripts/write-mobile-network-config.sh +++ /dev/null @@ -1,51 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -# Basic Usage: -# ./write-mobile-network-config.sh NAME -# Specify -n NAMESPACE -r RELEASE -t TESTNET_NAME or just pass NAME to the script and that will set NAMESPACE, RELEASE and TESTNET_NAME to it - - -NAMESPACE="" -RELEASE="" -TESTNET_NAME="" -[ -n "$1" ] && NAMESPACE=$1 && RELEASE=$1 && TESTNET_NAME=$1 -while getopts ':r:n:' flag; do - case "${flag}" in - n) NAMESPACE="${OPTARG}" ;; - r) RELEASE="${OPTARG}" ;; - t) TESTNET_NAME="${OPTARG}" ;; - *) error "Unexpected option ${flag}" ;; - esac -done -[ -z "$NAMESPACE" ] && echo "Need to set the NAMESPACE_NAME via the -n flag" && exit 1; -[ -z "$RELEASE" ] && echo "Need to set RELEASE_NAME via the -r flag" && exit 1; -[ -z "$TESTNET_NAME" ] && echo "Need to set TESTNET_NAME via the -t flag" && exit 1; - -UNDERSCORE_TESTNET_NAME="${TESTNET_NAME/-/_}" - -[ -z $(kubectl get svc --namespace $NAMESPACE $RELEASE-gethtx1 -o jsonpath='{.status.loadBalancer.ingress[0].ip}') ] && echo "Wait a minute, gethtx1 load balancer has not yet 
provisioned" && exit 1; - -[ -z $(kubectl get svc --namespace $NAMESPACE $RELEASE-gethtx2 -o jsonpath='{.status.loadBalancer.ingress[0].ip}') ] && echo "Wait a minute, gethtx2 load balancer has not yet provisioned" && exit 1; - -[ -z $(kubectl get svc --namespace $NAMESPACE $RELEASE-gethtx3 -o jsonpath='{.status.loadBalancer.ingress[0].ip}') ] && echo "Wait a minute, gethtx3 load balancer has not yet provisioned" && exit 1; - -[ -z $(kubectl get svc --namespace $NAMESPACE $RELEASE-gethtx4 -o jsonpath='{.status.loadBalancer.ingress[0].ip}') ] && echo "Wait a minute, gethtx4 load balancer has not yet provisioned" && exit 1; - -cat > ../mobile/src/geth/additionalNetworks.ts << EOF -export default { - '$UNDERSCORE_TESTNET_NAME': { - nodeDir: '.$UNDERSCORE_TESTNET_NAME', - enodes: [ - 'enode://$(kubectl get configmaps $NAMESPACE-geth-config -n $NAMESPACE -o jsonpath='{.data.gethtx1NodeId}')@$(kubectl get svc --namespace $NAMESPACE $RELEASE-gethtx1 -o jsonpath='{.status.loadBalancer.ingress[0].ip}'):30303', - 'enode://$(kubectl get configmaps $NAMESPACE-geth-config -n $NAMESPACE -o jsonpath='{.data.gethtx2NodeId}' - )@$(kubectl get svc --namespace $NAMESPACE $RELEASE-gethtx2 -o jsonpath='{.status.loadBalancer.ingress[0].ip}'):30303', - 'enode://$(kubectl get configmaps $NAMESPACE-geth-config -n $NAMESPACE -o jsonpath='{.data.gethtx3NodeId}')@$(kubectl get svc --namespace $NAMESPACE $RELEASE-gethtx3 -o jsonpath='{.status.loadBalancer.ingress[0].ip}'):30303', - 'enode://$(kubectl get configmaps $NAMESPACE-geth-config -n $NAMESPACE -o jsonpath='{.data.gethtx4NodeId}' - )@$(kubectl get svc --namespace $NAMESPACE $RELEASE-gethtx4 -o jsonpath='{.status.loadBalancer.ingress[0].ip}'):30303' - ], - networkID: 1101, - genesis: $(kubectl get configmaps $NAMESPACE-geth-config -n $NAMESPACE -o jsonpath="{.data['genesis\.json']}") - } -} -EOF diff --git a/packages/helm-charts/testnet/ssdstorageclass.yaml b/packages/helm-charts/testnet/ssdstorageclass.yaml deleted file mode 100644 
index 142f29778c2..00000000000 --- a/packages/helm-charts/testnet/ssdstorageclass.yaml +++ /dev/null @@ -1,8 +0,0 @@ -allowVolumeExpansion: true -apiVersion: storage.k8s.io/v1 -kind: StorageClass -metadata: - name: ssd -provisioner: kubernetes.io/gce-pd -parameters: - type: pd-ssd diff --git a/packages/helm-charts/testnet/templates/NOTES.txt b/packages/helm-charts/testnet/templates/NOTES.txt deleted file mode 100644 index adfe9d92dfd..00000000000 --- a/packages/helm-charts/testnet/templates/NOTES.txt +++ /dev/null @@ -1 +0,0 @@ -Deployed {{ .Release.Name }}! diff --git a/packages/helm-charts/testnet/templates/_helpers.tpl b/packages/helm-charts/testnet/templates/_helpers.tpl deleted file mode 100644 index 2764475eebd..00000000000 --- a/packages/helm-charts/testnet/templates/_helpers.tpl +++ /dev/null @@ -1,255 +0,0 @@ -{{/* vim: set filetype=mustache: */}} - -{{- define "celo.geth-exporter-container" -}} -- name: geth-exporter - image: "{{ .Values.gethexporter.image.repository }}:{{ .Values.gethexporter.image.tag }}" - imagePullPolicy: {{ .Values.imagePullPolicy }} - ports: - - name: profiler - containerPort: 9200 - command: - - /usr/local/bin/geth_exporter - - -ipc - - /root/.celo/geth.ipc - - -filter - - (.*overall|percentiles_95) - resources: - requests: - memory: 50M - cpu: 50m - volumeMounts: - - name: data - mountPath: /root/.celo -{{- end -}} - -{{- /* This template does not define ports that will be exposed */ -}} -{{- define "celo.node-service" -}} -kind: Service -apiVersion: v1 -metadata: - name: {{ template "common.fullname" $ }}-{{ .svc_name | default .node_name }}-{{ .index }}{{ .svc_name_suffix | default "" }} - labels: - {{- include "common.standard.labels" . 
| nindent 4 }} - component: {{ .component_label }} -spec: - selector: - statefulset.kubernetes.io/pod-name: {{ template "common.fullname" $ }}-{{ .node_name }}-{{ .index }} - type: {{ .service_type }} - publishNotReadyAddresses: true - {{- if (eq .service_type "LoadBalancer") }} - loadBalancerIP: {{ .load_balancer_ip }} - {{- end -}} -{{- end -}} - -{{- define "celo.full-node-statefulset" -}} -apiVersion: v1 -kind: Service -metadata: - name: {{ .name }} - labels: - {{- if .proxy | default false -}} - {{- $validatorProxied := printf "%s-validators-%d" .Release.Namespace .validator_index }} - validator-proxied: "{{ $validatorProxied }}" - {{- end }} - component: {{ .component_label }} -spec: - sessionAffinity: None - ports: - - port: 8545 - name: rpc - {{- $wsPort := ((.ws_port | default .Values.geth.ws_port) | int) -}} - {{- if ne $wsPort 8545 }} - - port: {{ $wsPort }} - name: ws - {{- end }} - selector: - {{- if .proxy | default false -}} - {{- $validatorProxied := printf "%s-validators-%d" .Release.Namespace .validator_index }} - validator-proxied: "{{ $validatorProxied }}" - {{- end }} - component: {{ .component_label }} ---- -apiVersion: v1 -kind: Service -metadata: - name: {{ .name }}-headless - labels: - {{- if .proxy | default false -}} - {{- $validatorProxied := printf "%s-validators-%d" .Release.Namespace .validator_index }} - validator-proxied: "{{ $validatorProxied }}" - {{- end }} - component: {{ .component_label }} -spec: - type: ClusterIP - clusterIP: None - ports: - - port: 8545 - name: rpc - {{- if ne $wsPort 8545 }} - - port: {{ .ws_port | default .Values.geth.ws_port }} - name: ws - {{- end }} - selector: - {{- if .proxy | default false }} - {{- $validatorProxied := printf "%s-validators-%d" .Release.Namespace .validator_index }} - validator-proxied: "{{ $validatorProxied }}" - {{- end }} - component: {{ .component_label }} ---- -apiVersion: apps/v1 -kind: StatefulSet -metadata: - name: {{ template "common.fullname" . 
}}-{{ .name }} - labels: - {{- include "common.standard.labels" . | nindent 4 }} - component: {{ .component_label }} - {{- if .proxy | default false -}} - {{- $validatorProxied := printf "%s-validators-%d" .Release.Namespace .validator_index }} - validator-proxied: "{{ $validatorProxied }}" - {{- end }} -spec: - {{- $updateStrategy := index $.Values.updateStrategy $.component_label }} - updateStrategy: - {{- toYaml $updateStrategy | nindent 4 }} - {{- if .Values.geth.ssd_disks }} - volumeClaimTemplates: - - metadata: - name: data - {{- if .pvc_annotations }} - annotations: - {{- toYaml .pvc_annotations | nindent 8 }} - {{- end }} - spec: - storageClassName: {{ $.Values.geth.storageClass }} - accessModes: [ "ReadWriteOnce" ] - resources: - requests: - {{- $disk_size := ((eq .name "tx-nodes-private" ) | ternary .Values.geth.privateTxNodediskSizeGB .Values.geth.diskSizeGB ) }} - storage: {{ $disk_size }}Gi - {{- end }} - podManagementPolicy: Parallel - replicas: {{ .replicas }} - serviceName: {{ .name }} - selector: - matchLabels: - {{- include "common.standard.labels" . | nindent 6 }} - component: {{ .component_label }} - {{- if .proxy | default false -}} - {{- $validatorProxied := printf "%s-validators-%d" .Release.Namespace .validator_index }} - validator-proxied: "{{ $validatorProxied }}" - {{- end }} - template: - metadata: - labels: - {{- include "common.standard.labels" . | nindent 8 }} - component: {{ .component_label }} - {{- if .extraPodLabels -}} - {{- toYaml .extraPodLabels | nindent 8 }} - {{- end }} - {{- if .proxy | default false }} - {{- $validatorProxied := printf "%s-validators-%d" .Release.Namespace .validator_index }} - validator-proxied: "{{ $validatorProxied }}" - {{- end }} - {{- if .Values.metrics | default false }} - annotations: - {{- include "common.prometheus-annotations" . | nindent 8 }} - {{- end }} - spec: - initContainers: - {{- include "common.conditional-init-genesis-container" . 
| nindent 6 }} - {{- include "common.celotool-full-node-statefulset-container" (dict - "Values" .Values - "Release" .Release - "Chart" .Chart - "proxy" .proxy - "mnemonic_account_type" .mnemonic_account_type - "service_ip_env_var_prefix" .service_ip_env_var_prefix - "ip_addresses" .ip_addresses - "validator_index" .validator_index - ) | nindent 6 }} - {{- if .unlock | default false }} - {{- include "common.import-geth-account-container" . | nindent 6 }} - {{- end }} - containers: - {{- include "common.full-node-container" (dict - "Values" .Values - "Release" .Release - "Chart" .Chart - "proxy" .proxy - "proxy_allow_private_ip_flag" .proxy_allow_private_ip_flag - "unlock" .unlock - "rpc_apis" .rpc_apis - "expose" .expose - "syncmode" .syncmode - "gcmode" .gcmode - "resources" .resources - "ws_port" (default .Values.geth.ws_port .ws_port) - "pprof" (or (.Values.metrics) (.Values.pprof.enabled)) - "pprof_port" (.Values.pprof.port) - "light_serve" .Values.geth.light.serve - "light_maxpeers" .Values.geth.light.maxpeers - "maxpeers" .Values.geth.maxpeers - "metrics" .Values.metrics - "public_ips" .public_ips - "ethstats" (printf "%s-ethstats.%s" (include "common.fullname" .) .Release.Namespace) - "extra_setup" .extra_setup - ) | nindent 6 }} - terminationGracePeriodSeconds: {{ .Values.geth.terminationGracePeriodSeconds | default 300 }} - {{- with .affinity }} - affinity: - {{- toYaml . | nindent 8 }} - {{- end }} - {{- with .node_selector }} - nodeSelector: - {{- toYaml . | nindent 8 }} - {{- end }} - {{- with .tolerations }} - tolerations: - {{- toYaml . | nindent 8 }} - {{- end }} - volumes: - - name: data - emptyDir: {} - - name: data-shared - emptyDir: {} - - name: config - configMap: - name: {{ template "common.fullname" . }}-geth-config - - name: account - secret: - secretName: {{ template "common.fullname" . }}-geth-account -{{- end -}} - -{{- /* This template puts a semicolon-separated pair of proxy enodes into $PROXY_ENODE_URL_PAIR. */ -}} -{{- /* I.e. <proxy internal enode url>;<proxy external enode url>.
*/ -}} -{{- /* Expects env variables MNEMONIC, RID (the validator index), and PROXY_INDEX */ -}} -{{- define "celo.proxyenodeurlpair" -}} -echo "Generating proxy enode url pair for proxy $PROXY_INDEX" -PROXY_INTERNAL_IP_ENV_VAR={{ $.Release.Namespace | upper }}_VALIDATORS_${RID}_PROXY_INTERNAL_${PROXY_INDEX}_SERVICE_HOST -echo "PROXY_INTERNAL_IP_ENV_VAR=$PROXY_INTERNAL_IP_ENV_VAR" -PROXY_INTERNAL_IP=`eval "echo \\${${PROXY_INTERNAL_IP_ENV_VAR}}"` -# If $PROXY_IPS is not empty, then we use the IPs from there. Otherwise, -# we use the IP address of the proxy internal service -if [ ! -z $PROXY_IPS ]; then - echo "Proxy external IP from PROXY_IPS=$PROXY_IPS: " - PROXY_EXTERNAL_IP=`echo -n $PROXY_IPS | cut -d '/' -f $((PROXY_INDEX + 1))` -else - PROXY_EXTERNAL_IP=$PROXY_INTERNAL_IP -fi -echo "Proxy internal IP: $PROXY_INTERNAL_IP" -echo "Proxy external IP: $PROXY_EXTERNAL_IP" -# Proxy key index to allow for a high number of proxies per validator without overlap -PROXY_KEY_INDEX=$(( ($RID * 10000) + $PROXY_INDEX )) -PROXY_ENODE_ADDRESS=`celotooljs.sh generate public-key --mnemonic "$MNEMONIC" --accountType proxy --index $PROXY_KEY_INDEX` -PROXY_INTERNAL_ENODE=enode://${PROXY_ENODE_ADDRESS}@${PROXY_INTERNAL_IP}:30503 -PROXY_EXTERNAL_ENODE=enode://${PROXY_ENODE_ADDRESS}@${PROXY_EXTERNAL_IP}:30303 -echo "Proxy internal enode: $PROXY_INTERNAL_ENODE" -echo "Proxy external enode: $PROXY_EXTERNAL_ENODE" -PROXY_ENODE_URL_PAIR=$PROXY_INTERNAL_ENODE\;$PROXY_EXTERNAL_ENODE -{{- end -}} - -{{- define "celo.proxyipaddresses" -}} -{{- if .Values.geth.static_ips -}} -{{- index .Values.geth.proxyIPAddressesPerValidatorArray .validatorIndex -}} -{{- end -}} -{{- end -}} diff --git a/packages/helm-charts/testnet/templates/bootnode.deployment.yaml b/packages/helm-charts/testnet/templates/bootnode.deployment.yaml deleted file mode 100644 index ab9aca5506d..00000000000 --- a/packages/helm-charts/testnet/templates/bootnode.deployment.yaml +++ /dev/null @@ -1,101 +0,0 @@ -apiVersion: apps/v1 
-kind: Deployment -metadata: - name: {{ template "common.fullname" . }}-bootnode - labels: - {{- include "common.standard.labels" . | nindent 4 }} - component: bootnode -spec: - strategy: - type: Recreate - replicas: 1 - selector: - matchLabels: - {{- include "common.standard.labels" . | nindent 6 }} - component: bootnode - template: - metadata: - labels: - {{- include "common.standard.labels" . | nindent 8 }} - component: bootnode - spec: - containers: - - name: bootnode - image: {{ .Values.bootnode.image.repository }}:{{ .Values.bootnode.image.tag }} - imagePullPolicy: {{ .Values.imagePullPolicy }} - command: - - /bin/sh - - -c - args: - - | - set -euo pipefail - NAT_FLAG="" - [[ "$IP_ADDRESS" != "none" ]] && NAT_FLAG="--nat=extip:$IP_ADDRESS" - /usr/local/bin/bootnode --nodekey=/etc/bootnode/node.key --writeaddress > /enode.key - exec /usr/local/bin/bootnode --nodekey=/etc/bootnode/node.key --verbosity=5 ${NAT_FLAG} --networkid=${NETWORK_ID} --ping-ip-from-packet={{ .Values.geth.ping_ip_from_packet }} - env: - - name: IP_ADDRESS - value: {{ default "none" .Values.geth.bootnodeIpAddress }} - - name: NETWORK_ID - valueFrom: - configMapKeyRef: - name: {{ template "common.fullname" . 
}}-geth-config - key: networkid - livenessProbe: - exec: - command: - - /bin/sh - - -c - - | - devp2p discv4 --networkid {{ .Values.genesis.networkId }} ping "enode://$(cat /enode.key)@127.0.0.1:0?discport=30301" - initialDelaySeconds: 30 - periodSeconds: 30 - volumeMounts: - {{- if not .Values.geth.overwriteBootnodePrivateKey }} - - name: data - mountPath: /etc/bootnode - {{- else }} - - name: bootnode-pkey - mountPath: /etc/bootnode - readOnly: true - {{- end }} - ports: - - name: discovery - containerPort: 30301 - protocol: UDP - {{- if not .Values.geth.overwriteBootnodePrivateKey }} - initContainers: - - name: generate-node-key - image: {{ .Values.celotool.image.repository }}:{{ .Values.celotool.image.tag }} - imagePullPolicy: {{ .Values.imagePullPolicy }} - command: - - /bin/bash - - -c - - | - celotooljs.sh generate bip32 --mnemonic "$MNEMONIC" --accountType bootnode --index 0 > /etc/bootnode/node.key - celotooljs.sh generate public-key --mnemonic "$MNEMONIC" --accountType bootnode --index 0 > /etc/bootnode/enode.key - env: - - name: MNEMONIC - valueFrom: - secretKeyRef: - name: {{ template "common.fullname" . }}-geth-account - key: mnemonic - volumeMounts: - - name: data - mountPath: /etc/bootnode - {{- end }} - volumes: - - name: data - emptyDir: {} - {{- if .Values.geth.overwriteBootnodePrivateKey }} - - name: bootnode-pkey - secret: - secretName: {{ template "common.fullname" . }}-geth-account - items: - - key: bootnodePrivateKey - path: node.key - {{- end }} - {{- with .Values.nodeSelector }} - nodeSelector: - {{- toYaml . | nindent 8 }} - {{- end }} diff --git a/packages/helm-charts/testnet/templates/bootnode.service.yaml b/packages/helm-charts/testnet/templates/bootnode.service.yaml deleted file mode 100644 index e8b60f0329d..00000000000 --- a/packages/helm-charts/testnet/templates/bootnode.service.yaml +++ /dev/null @@ -1,20 +0,0 @@ -apiVersion: v1 -kind: Service -metadata: - name: {{ template "common.fullname" . 
}}-bootnode - labels: - {{- include "common.standard.labels" . | nindent 4 }} - component: bootnode -spec: - {{- if $.Values.geth.static_ips }} - type: LoadBalancer - loadBalancerIP: {{ .Values.geth.bootnodeIpAddress }} - {{- end }} - selector: - app: {{ template "common.name" . }} - release: {{ .Release.Name }} - component: bootnode - ports: - - name: discovery - port: 30301 - protocol: UDP diff --git a/packages/helm-charts/testnet/templates/forno.ingress.yaml b/packages/helm-charts/testnet/templates/forno.ingress.yaml deleted file mode 100644 index b16d44e088c..00000000000 --- a/packages/helm-charts/testnet/templates/forno.ingress.yaml +++ /dev/null @@ -1,31 +0,0 @@ -apiVersion: networking.k8s.io/v1 -kind: Ingress -metadata: - name: {{ .Release.Namespace }}-forno-ingress - labels: - {{- include "common.standard.labels" . | nindent 4 }} - component: forno - annotations: - kubernetes.io/tls-acme: "true" - nginx.ingress.kubernetes.io/enable-cors: "true" - # Allows WS connections to be 20 minutes long, see https://kubernetes.github.io/ingress-nginx/user-guide/miscellaneous/#websockets - nginx.ingress.kubernetes.io/proxy-read-timeout: "1200" - nginx.ingress.kubernetes.io/proxy-send-timeout: "1200" - nginx.ingress.kubernetes.io/rewrite-target: / -spec: - ingressClassName: {{ default "nginx" .Values.ingressClassName }} - tls: - - hosts: - - {{ .Release.Namespace }}-forno.{{ .Values.domain.name }}.org - secretName: {{ .Release.Namespace }}-forno-web-tls - rules: - - host: {{ .Release.Namespace }}-forno.{{ .Values.domain.name }}.org - http: - paths: - - path: /(.*) - pathType: Prefix - backend: - service: - name: tx-nodes - port: - number: 8545 diff --git a/packages/helm-charts/testnet/templates/geth-account.secret.yaml b/packages/helm-charts/testnet/templates/geth-account.secret.yaml deleted file mode 100644 index 06189239035..00000000000 --- a/packages/helm-charts/testnet/templates/geth-account.secret.yaml +++ /dev/null @@ -1,13 +0,0 @@ -apiVersion: v1 -kind: Secret 
-metadata: - name: {{ template "common.fullname" . }}-geth-account - labels: - {{- include "common.standard.labels" . | nindent 4 }} -type: Opaque -data: - accountSecret: {{ .Values.geth.account.secret | b64enc }} - mnemonic: {{ .Values.mnemonic | b64enc }} - {{- if .Values.geth.overwriteBootnodePrivateKey }} - bootnodePrivateKey: {{ .Values.geth.bootnodePrivateKey | b64enc }} - {{- end }} diff --git a/packages/helm-charts/testnet/templates/geth.configmap.yaml b/packages/helm-charts/testnet/templates/geth.configmap.yaml deleted file mode 100644 index 40be96e952f..00000000000 --- a/packages/helm-charts/testnet/templates/geth.configmap.yaml +++ /dev/null @@ -1 +0,0 @@ -{{- include "common.geth-configmap" . }} diff --git a/packages/helm-charts/testnet/templates/proxy.service.yaml b/packages/helm-charts/testnet/templates/proxy.service.yaml deleted file mode 100644 index f3c00335877..00000000000 --- a/packages/helm-charts/testnet/templates/proxy.service.yaml +++ /dev/null @@ -1,67 +0,0 @@ -{{- range $validatorIndex, $proxyCount := .Values.geth.proxiesPerValidator }} -{{- range $index, $e := until ($proxyCount | int) }} -{{- template "celo.node-service" (dict - "Values" $.Values - "Release" $.Release - "Chart" $.Chart - "index" $index - "service_type" "ClusterIP" - "svc_name" (print "validators-" $validatorIndex "-proxy-internal") - "node_name" (print "validators-" $validatorIndex "-proxy") - "name_suffix" "" - "component_label" "proxy" - "load_balancer_ip" "" -) }} - ports: - - name: internal-tcp - port: 30503 - protocol: TCP - - name: internal-udp - port: 30503 - protocol: UDP - {{- if not $.Values.geth.static_ips }} - - name: external-tcp - port: 30303 - protocol: TCP - - name: external-udp - port: 30303 - protocol: UDP - {{- end }} ---- -{{- if $.Values.geth.static_ips }} -{{- $proxyIpAddresses := (splitList "/" (index $.Values.geth.proxyIPAddressesPerValidatorArray $validatorIndex)) }} -{{- $loadBalancerIP := index $proxyIpAddresses $index }} -{{- template 
"celo.node-service" (dict - "Values" $.Values - "Release" $.Release - "Chart" $.Chart - "index" $index - "service_type" "LoadBalancer" - "node_name" (print "validators-" $validatorIndex "-proxy") - "component_label" "proxy" - "load_balancer_ip" $loadBalancerIP -) }} - ports: - - name: discovery - port: 30303 - protocol: UDP ---- -{{- template "celo.node-service" (dict - "Values" $.Values - "Release" $.Release - "Chart" $.Chart - "index" $index - "service_type" "LoadBalancer" - "node_name" (print "validators-" $validatorIndex "-proxy") - "svc_name_suffix" "-tcp" - "component_label" "proxy" - "load_balancer_ip" $loadBalancerIP -) }} - ports: - - name: celo - port: 30303 - protocol: TCP ---- -{{- end }} -{{- end }} -{{- end }} diff --git a/packages/helm-charts/testnet/templates/proxy.statefulset.yaml b/packages/helm-charts/testnet/templates/proxy.statefulset.yaml deleted file mode 100644 index 4bb8d19c96b..00000000000 --- a/packages/helm-charts/testnet/templates/proxy.statefulset.yaml +++ /dev/null @@ -1,29 +0,0 @@ -{{- range $validatorIndex, $proxyCount := .Values.geth.proxiesPerValidator }} -{{- include "celo.full-node-statefulset" (dict - "Values" $.Values - "Release" $.Release - "Chart" $.Chart - "name" (print "validators-" $validatorIndex "-proxy") - "component_label" "proxy" - "mnemonic_account_type" "proxy" - "replicas" $proxyCount - "proxy" true - "proxy_allow_private_ip_flag" true - "unlock" true - "expose" false - "rpc_apis" "eth,net,web3" - "syncmode" "full" - "gcmode" "full" - "resources" (default $.Values.geth.resources $.Values.geth.proxyResources) - "service_ip_env_var_prefix" (printf "%s%s%d%s" ($.Release.Namespace | upper) "_VALIDATORS_" $validatorIndex "_PROXY_INTERNAL_") - "validator_index" $validatorIndex - "ip_addresses" (include "celo.proxyipaddresses" (dict "Values" $.Values "validatorIndex" $validatorIndex) ) - "extra_setup" $.Values.geth.proxyExtraSnippet - "affinity" $.Values.geth.proxyAffinity - "node_selector" 
$.Values.geth.proxyNodeSelector - "tolerations" $.Values.geth.proxyTolerations - "extraPodLabels" $.Values.extraPodLabels.proxy - "pvc_annotations" $.Values.pvcAnnotations.proxy -) }} ---- -{{- end }} diff --git a/packages/helm-charts/testnet/templates/secondaries.service.yaml b/packages/helm-charts/testnet/templates/secondaries.service.yaml deleted file mode 100644 index 7c0e102c3e4..00000000000 --- a/packages/helm-charts/testnet/templates/secondaries.service.yaml +++ /dev/null @@ -1,19 +0,0 @@ -{{- if .Values.geth.static_ips }} -{{- range $index, $e := until (.Values.geth.secondaries | int) }} -{{- if (ge $index (len $.Values.geth.proxiesPerValidator)) }} -{{- $loadBalancerIP := index $.Values.geth.validatorsIPAddressArray $index }} -{{- template "celo.node-service" (dict "Values" $.Values "Release" $.Release "Chart" $.Chart "index" $index "service_type" "LoadBalancer" "node_name" "validators" "component_label" "validators" "load_balancer_ip" $loadBalancerIP ) }} - ports: - - name: discovery - port: 30303 - protocol: UDP ---- -{{- template "celo.node-service" (dict "Values" $.Values "Release" $.Release "Chart" $.Chart "index" $index "service_type" "LoadBalancer" "node_name" "validators" "svc_name_suffix" "-tcp" "component_label" "validators" "load_balancer_ip" $loadBalancerIP ) }} - ports: - - name: celo - port: 30303 - protocol: TCP ---- -{{- end }} -{{- end }} -{{- end }} diff --git a/packages/helm-charts/testnet/templates/secondaries.statefulset.yaml b/packages/helm-charts/testnet/templates/secondaries.statefulset.yaml deleted file mode 100644 index 2213234e9c7..00000000000 --- a/packages/helm-charts/testnet/templates/secondaries.statefulset.yaml +++ /dev/null @@ -1,290 +0,0 @@ -apiVersion: v1 -kind: Service -metadata: - name: secondaries - labels: - component: secondaries -spec: - ports: - - port: 80 - name: web - clusterIP: None - selector: - component: secondaries ---- -apiVersion: apps/v1 -kind: StatefulSet -metadata: - name: {{ template "common.fullname" 
. }}-secondaries - labels: - {{- include "common.standard.labels" . | nindent 4 }} - component: secondaries -spec: - {{- if .Values.geth.ssd_disks }} - volumeClaimTemplates: - - metadata: - name: data - {{- if .Values.pvcAnnotations.secondary }} - annotations: - {{- toYaml .Values.pvcAnnotations.secondary | nindent 8 }} - {{- end }} - spec: - storageClassName: {{ .Values.geth.storageClass }} - accessModes: [ "ReadWriteOnce" ] - resources: - requests: - storage: {{ .Values.geth.diskSizeGB }}Gi - {{- end }} - podManagementPolicy: Parallel - updateStrategy: - {{- toYaml .Values.updateStrategy.secondaries | nindent 4 }} - replicas: {{ .Values.geth.secondaries }} - serviceName: secondaries - selector: - matchLabels: - {{- include "common.standard.labels" . | nindent 6 }} - component: secondaries - template: - metadata: - labels: - {{- include "common.standard.labels" . | nindent 8 }} - component: secondaries - {{- if .Values.extraPodLabels.secondaries }} - {{- toYaml .Values.extraPodLabels.secondaries | nindent 8 }} - {{- end }} - {{- if .Values.metrics | default false }} - annotations: - {{- include "common.prometheus-annotations" . | nindent 8 }} - {{- end }} - spec: - initContainers: - {{- include "common.conditional-init-genesis-container" . 
| nindent 6 }} - - name: get-account - image: {{ .Values.celotool.image.repository }}:{{ .Values.celotool.image.tag }} - imagePullPolicy: Always - command: - - /bin/bash - args: - - "-c" - - |- - [[ $REPLICA_NAME =~ -([0-9]+)$ ]] || exit 1 - RID=${BASH_REMATCH[1]} - echo -n "$RID" >/root/.celo/replica_id - echo "Generating private key for rid=$RID" - celotooljs.sh generate bip32 --mnemonic "$MNEMONIC" --accountType validator --index "$RID" > /root/.celo/pkey - echo 'Generating address' - celotooljs.sh generate account-address --private-key $(cat /root/.celo/pkey) > /root/.celo/address - - # If this is a proxied validator, it will not have an external IP address - # and EXTERNAL_IP_ADDRESS will be empty - EXTERNAL_IP_ADDRESS=$(echo -n "$IP_ADDRESSES" | cut -d '/' -f $((RID + 1))) - echo "$EXTERNAL_IP_ADDRESS" > /root/.celo/externalIpAddress - - # Put the proxies per validator array into a comma separated string - # so we can index it at runtime - PROXIES_PER_VALIDATOR="{{ join "," .Values.geth.proxiesPerValidator }}" - PROXY_COUNT=$(echo -n $PROXIES_PER_VALIDATOR | cut -d ',' -f $((RID + 1))) - echo -n "$PROXY_COUNT" > /root/.celo/proxyCount - - if [[ -z "$EXTERNAL_IP_ADDRESS" ]]; then - echo "$POD_IP" > /root/.celo/ipAddress - else - cat /root/.celo/externalIpAddress > /root/.celo/ipAddress - fi - echo -n "Generating IP address for validator: " - cat /root/.celo/ipAddress - - celotooljs.sh generate public-key --mnemonic "$MNEMONIC" --accountType bootnode --index 0 > /root/.celo/bootnodeEnodeAddress - echo -n "Generating Bootnode enode address for the validator: " - cat /root/.celo/bootnodeEnodeAddress - - [[ "$BOOTNODE_IP_ADDRESS" == 'none' ]] && BOOTNODE_IP_ADDRESS=${{ .Release.Namespace | upper }}_BOOTNODE_SERVICE_HOST - echo "enode://$(cat /root/.celo/bootnodeEnodeAddress)@$BOOTNODE_IP_ADDRESS:30301" > /root/.celo/bootnodeEnode - echo -n "Generating Bootnode enode for the validator: " - cat /root/.celo/bootnodeEnode - - # If this validator is meant to be proxied 
- if [[ ! -z "$PROXY_COUNT" ]]; then - # Put the all the validator's proxy IP addresses into a comma separated string - # so we can access it at runtime. Validators are separated by commas, - # and individual IP addresses are separated by /'s. For example, - # if one validator has 2 proxies, and the other has 1 proxy: - # ALL_VALIDATOR_PROXY_IPS would have the form X.X.X.X/X.X.X.X,X.X.X.X - ALL_VALIDATOR_PROXY_IPS='{{ join "," .Values.geth.proxyIPAddressesPerValidatorArray }}' - [[ $ALL_VALIDATOR_PROXY_IPS = '' ]] && ALL_VALIDATOR_PROXY_IPS='' - - PROXY_IPS=$(echo -n $ALL_VALIDATOR_PROXY_IPS | cut -d ',' -f $((RID + 1))) - - # Clear the proxy enode file because it's persisted - rm -f /root/.celo/proxyEnodeUrlPairs - # Generate all proxy enode urls and put them into /root/.celo/proxyEnodeUrlPairs - PROXY_INDEX=0 - while [ "$PROXY_INDEX" -lt "$PROXY_COUNT" ]; do - if [ "$PROXY_INDEX" -gt 0 ]; then - echo -n "," >> /root/.celo/proxyEnodeUrlPairs - fi - - # gives us PROXY_ENODE_URL_PAIR - {{- include "celo.proxyenodeurlpair" . | nindent 14 }} - echo -n $PROXY_ENODE_URL_PAIR >> /root/.celo/proxyEnodeUrlPairs - - PROXY_INDEX=$(( $PROXY_INDEX + 1 )) - done - fi - env: - - name: POD_IP - valueFrom: - fieldRef: - apiVersion: v1 - fieldPath: status.podIP - - name: REPLICA_NAME - valueFrom: - fieldRef: - fieldPath: metadata.name - - name: MNEMONIC - valueFrom: - secretKeyRef: - name: {{ template "common.fullname" . }}-geth-account - key: mnemonic - - name: IP_ADDRESSES - value: {{ join "/" .Values.geth.validatorsIPAddressArray }} - - name: BOOTNODE_IP_ADDRESS - value: "{{ default "none" .Values.geth.bootnodeIpAddress }}" - volumeMounts: - - name: data - mountPath: /root/.celo - {{- include "common.import-geth-account-container" . 
| nindent 6 }} - containers: - - name: geth - image: {{ .Values.geth.image.repository }}:{{ .Values.geth.image.tag }} - imagePullPolicy: Always - command: ["/bin/sh"] - args: - - "-c" - - |- - set -euo pipefail - rm /root/.celo/pkey || true - ADDITIONAL_FLAGS='' - - ACCOUNT_ADDRESS=$(cat /root/.celo/address) - RID=$(cat /root/.celo/replica_id) - - if [ "$RID" -lt "$FAULTY_NODES" ]; then - ADDITIONAL_FLAGS="${ADDITIONAL_FLAGS} --istanbul.faultymode $FAULTY_NODE_TYPE" - fi - - if geth --help | grep 'proxy.proxyenodeurlpairs' >/dev/null; then - PROXY_FLAG="--proxy.proxyenodeurlpairs" - else - PROXY_FLAG="--proxy.proxyenodeurlpair" - fi - - PROXY_COUNT=$(cat /root/.celo/proxyCount) - if [ "$PROXY_COUNT" -gt 0 ]; then - ADDITIONAL_FLAGS="${ADDITIONAL_FLAGS} --proxy.proxied ${PROXY_FLAG}=$(cat /root/.celo/proxyEnodeUrlPairs) --nodiscover --proxy.allowprivateip" - else - ADDITIONAL_FLAGS="${ADDITIONAL_FLAGS} --ethstats=${HOSTNAME}@${ETHSTATS_SVC}" - fi - - [[ "$PING_IP_FROM_PACKET" == "true" ]] && ADDITIONAL_FLAGS="${ADDITIONAL_FLAGS} --ping-ip-from-packet" - - [[ "$IN_MEMORY_DISCOVERY_TABLE" == "true" ]] && ADDITIONAL_FLAGS="${ADDITIONAL_FLAGS} --use-in-memory-discovery-table" - - {{- include "common.bootnode-flag-script" . | nindent 10 }} - - {{- include "common.geth-add-metrics-pprof-config" . 
| nindent 10 }} - - {{- include "common.geth-http-ws-flags" (dict "Values" $.Values "rpc_apis" "eth,net,web3,debug" "ws_port" "8545" "listen_address" "0.0.0.0") | nindent 10 }} - - {{- .Values.geth.secondayExtraSnippet | nindent 10 }} - - exec geth \ - $BOOTNODE_FLAG \ - --datadir /root/.celo \ - --ipcpath=geth.ipc \ - --nousb \ - --password=/root/.celo/account/accountSecret \ - --unlock=${ACCOUNT_ADDRESS} \ - --mine \ - --etherbase=${ACCOUNT_ADDRESS} \ - --syncmode=full \ - --consoleformat=json \ - --consoleoutput=stdout \ - --verbosity={{ .Values.geth.verbosity }} \ - --vmodule={{ .Values.geth.vmodule }} \ - --istanbul.replica \ - --maxpeers=125 \ - --nat=extip:`cat /root/.celo/ipAddress` \ - --allow-insecure-unlock \ - ${ADDITIONAL_FLAGS} - env: - - name: POD_IP - valueFrom: - fieldRef: - apiVersion: v1 - fieldPath: status.podIP - - name: ETHSTATS_SVC - value: {{ template "common.fullname" . }}-ethstats.{{ .Release.Namespace }} - - name: NETWORK_ID - valueFrom: - configMapKeyRef: - name: {{ template "common.fullname" . }}-geth-config - key: networkid - - name: FAULTY_NODES - value: {{ .Values.geth.faultyValidators | quote }} - - name: FAULTY_NODE_TYPE - value: {{ .Values.geth.faultyValidatorType | quote }} - - name: STATIC_IPS_FOR_GETH_NODES - value: "{{ default false .Values.geth.static_ips }}" - - name: PING_IP_FROM_PACKET - value: "{{ default false .Values.geth.ping_ip_from_packet }}" - - name: IN_MEMORY_DISCOVERY_TABLE - value: "{{ default "false" .Values.geth.in_memory_discovery_table }}" - ports: - - name: discovery-udp - containerPort: 30303 - protocol: UDP - - name: discovery-tcp - containerPort: 30303 - - name: rpc - containerPort: 8545 - - name: ws - containerPort: 8546 - {{- if .Values.geth.enable_metrics | default false }} - - name: metrics - containerPort: 6060 - {{- end }} - {{- $resources := default .Values.geth.resources .Values.geth.secondaryResources -}} - {{- with $resources }} - resources: - {{- toYaml . 
| nindent 10 }} - {{- end }} - volumeMounts: - - name: data - mountPath: /root/.celo - - name: account - mountPath: /root/.celo/account - readOnly: true - {{- with .Values.geth.secondaryAffinity }} - affinity: - {{- toYaml . | nindent 8 }} - {{- end }} - {{- with .Values.secondayNodeSelector }} - nodeSelector: - {{- toYaml . | nindent 8 }} - {{- end }} - {{- with .Values.secondaryTolerations }} - tolerations: - {{- toYaml . | nindent 8 }} - {{- end }} - volumes: - - name: data - emptyDir: {} - - name: data-shared - emptyDir: {} - - name: config - configMap: - name: {{ template "common.fullname" . }}-geth-config - - name: account - secret: - secretName: {{ template "common.fullname" . }}-geth-account diff --git a/packages/helm-charts/testnet/templates/txnode-private.service.yaml b/packages/helm-charts/testnet/templates/txnode-private.service.yaml deleted file mode 100644 index 9d00ba22a6d..00000000000 --- a/packages/helm-charts/testnet/templates/txnode-private.service.yaml +++ /dev/null @@ -1,17 +0,0 @@ -{{- if .Values.geth.static_ips }} -{{- range $index, $e := until (.Values.geth.private_tx_nodes | int) }} -{{- $loadBalancerIP := default "" (index $.Values.geth (print "private_tx_nodes_" $index "IpAddress")) }} -{{- template "celo.node-service" (dict "Values" $.Values "Release" $.Release "Chart" $.Chart "index" $index "service_type" "LoadBalancer" "svc_name" "tx-nodes-private" "node_name" "tx-nodes-private" "component_label" "tx_nodes_private" "load_balancer_ip" $loadBalancerIP ) }} - ports: - - name: discovery - port: 30303 - protocol: UDP ---- -{{- template "celo.node-service" (dict "Values" $.Values "Release" $.Release "Chart" $.Chart "index" $index "service_type" "LoadBalancer" "svc_name" "tx-nodes-private" "node_name" "tx-nodes-private" "svc_name_suffix" "-tcp" "component_label" "tx_nodes_private" "load_balancer_ip" $loadBalancerIP ) }} - ports: - - name: celo - port: 30303 - protocol: TCP ---- -{{- end }} -{{- end }} diff --git 
a/packages/helm-charts/testnet/templates/txnode-private.statefulset.yaml b/packages/helm-charts/testnet/templates/txnode-private.statefulset.yaml deleted file mode 100644 index 2f9b5ccf07b..00000000000 --- a/packages/helm-charts/testnet/templates/txnode-private.statefulset.yaml +++ /dev/null @@ -1,22 +0,0 @@ -{{- include "celo.full-node-statefulset" (dict - "Values" $.Values - "Release" $.Release - "Chart" $.Chart - "name" "tx-nodes-private" - "component_label" "tx_nodes_private" - "replicas" .Values.geth.private_tx_nodes - "mnemonic_account_type" "tx_node_private" - "expose" true - "syncmode" "full" - "gcmode" "archive" - "rpc_apis" "eth,net,web3,debug,txpool" - "ws_port" "8545" - "resources" (default $.Values.geth.resources $.Values.geth.txNodePrivateResources) - "ip_addresses" (join "/" .Values.geth.private_tx_node_ip_addresses) - "extra_setup" $.Values.geth.txNodePrivateExtraSnippet - "affinity" $.Values.geth.txNodePrivateAffinity - "node_selector" $.Values.geth.txNodePrivateNodeSelector - "tolerations" $.Values.geth.txNodePrivateTolerations - "extraPodLabels" $.Values.extraPodLabels.txnode_private - "pvc_annotations" $.Values.pvcAnnotations.txnode_private -) }} diff --git a/packages/helm-charts/testnet/templates/txnode.service.yaml b/packages/helm-charts/testnet/templates/txnode.service.yaml deleted file mode 100644 index ed1656917a8..00000000000 --- a/packages/helm-charts/testnet/templates/txnode.service.yaml +++ /dev/null @@ -1,42 +0,0 @@ -{{- range $index, $e := until (.Values.geth.tx_nodes | int) }} -{{- $loadBalancerIP := (index $.Values.geth.txNodesIPAddressArray $index) }} -{{- template "celo.node-service" (dict - "Values" $.Values - "Release" $.Release - "Chart" $.Chart - "index" $index - "service_type" "LoadBalancer" - "svc_name" "service" - "node_name" "tx-nodes" - "component_label" "tx_nodes" - "load_balancer_ip" $loadBalancerIP -) }} - ports: - - name: discovery - port: 30303 - protocol: UDP ---- -{{- template "celo.node-service" (dict - "Values" 
$.Values - "Release" $.Release - "Chart" $.Chart - "index" $index - "service_type" "LoadBalancer" - "svc_name" "service" - "node_name" "tx-nodes" - "svc_name_suffix" "-tcp" - "component_label" "tx_nodes" - "load_balancer_ip" $loadBalancerIP -) }} - ports: - - name: celo - port: 30303 - protocol: TCP - - name: rpc - port: 8545 - protocol: TCP - - name: ws - port: 8546 - protocol: TCP ---- -{{- end }} diff --git a/packages/helm-charts/testnet/templates/txnode.statefulset.yaml b/packages/helm-charts/testnet/templates/txnode.statefulset.yaml deleted file mode 100644 index 06a081d5ca4..00000000000 --- a/packages/helm-charts/testnet/templates/txnode.statefulset.yaml +++ /dev/null @@ -1,22 +0,0 @@ -{{- include "celo.full-node-statefulset" (dict - "Values" $.Values - "Release" $.Release - "Chart" $.Chart - "name" "tx-nodes" - "component_label" "tx_nodes" - "replicas" .Values.geth.tx_nodes - "mnemonic_account_type" "tx_node" - "expose" true - "syncmode" "full" - "gcmode" "full" - "rpc_apis" "eth,net,web3" - "ws_port" "8545" - "resources" $.Values.geth.txNodeResources - "ip_addresses" (join "/" .Values.geth.txNodesIPAddressArray) - "extra_setup" $.Values.geth.txNodeExtraSnippet - "affinity" $.Values.geth.txNodeAffinity - "node_selector" $.Values.geth.txNodeNodeSelector - "tolerations" $.Values.geth.txNodeTolerations - "extraPodLabels" $.Values.extraPodLabels.txnode - "pvc_annotations" $.Values.pvcAnnotations.txnode -) }} diff --git a/packages/helm-charts/testnet/templates/validators.service.yaml b/packages/helm-charts/testnet/templates/validators.service.yaml deleted file mode 100644 index 995d8557c73..00000000000 --- a/packages/helm-charts/testnet/templates/validators.service.yaml +++ /dev/null @@ -1,38 +0,0 @@ -{{- if .Values.geth.static_ips }} -{{- range $index, $e := until (.Values.geth.validators | int) }} -{{- if (eq (index $.Values.geth.proxiesPerValidator $index | int) 0) }} -{{- $loadBalancerIP := index $.Values.geth.validatorsIPAddressArray $index }} -{{- template 
"celo.node-service" (dict - "Values" $.Values - "Release" $.Release - "Chart" $.Chart - "index" $index - "service_type" "LoadBalancer" - "node_name" "validators" - "component_label" "validators" - "load_balancer_ip" $loadBalancerIP -) }} - ports: - - name: discovery - port: 30303 - protocol: UDP ---- -{{- template "celo.node-service" (dict - "Values" $.Values - "Release" $.Release - "Chart" $.Chart - "index" $index - "service_type" "LoadBalancer" - "node_name" "validators" - "svc_name_suffix" "-tcp" - "component_label" "validators" - "load_balancer_ip" $loadBalancerIP -) }} - ports: - - name: celo - port: 30303 - protocol: TCP ---- -{{- end }} -{{- end }} -{{- end }} diff --git a/packages/helm-charts/testnet/templates/validators.statefulset.yaml b/packages/helm-charts/testnet/templates/validators.statefulset.yaml deleted file mode 100644 index 9b203c60f45..00000000000 --- a/packages/helm-charts/testnet/templates/validators.statefulset.yaml +++ /dev/null @@ -1,301 +0,0 @@ -apiVersion: v1 -kind: Service -metadata: - name: validators - labels: - component: validators -spec: - ports: - - port: 80 - name: web - clusterIP: None - selector: - component: validators ---- -apiVersion: apps/v1 -kind: StatefulSet -metadata: - name: {{ template "common.fullname" . }}-validators - labels: - {{- include "common.standard.labels" . | nindent 4 }} - component: validators -spec: - {{- if .Values.geth.ssd_disks }} - volumeClaimTemplates: - - metadata: - name: data - {{- if .Values.pvcAnnotations.validator }} - annotations: - {{- toYaml .Values.pvcAnnotations.validator | nindent 8 }} - {{- end }} - spec: - storageClassName: {{ .Values.geth.storageClass }} - accessModes: [ "ReadWriteOnce" ] - resources: - requests: - storage: {{ .Values.geth.diskSizeGB }}Gi - {{- end }} - podManagementPolicy: Parallel - {{- with .Values.updateStrategy.validators }} - updateStrategy: - {{- toYaml . 
| nindent 4 }} - {{- end }} - replicas: {{ .Values.geth.validators }} - serviceName: validators - selector: - matchLabels: - {{- include "common.standard.labels" . | nindent 6 }} - component: validators - template: - metadata: - labels: - {{- include "common.standard.labels" . | nindent 8 }} - component: validators - {{- if .Values.extraPodLabels.validators }} - {{- toYaml .Values.extraPodLabels.validators | nindent 8 }} - {{- end }} - {{- if .Values.metrics | default false }} - annotations: - {{- include "common.prometheus-annotations" . | nindent 8 }} - {{- end }} - spec: - initContainers: - {{- include "common.conditional-init-genesis-container" . | nindent 6 }} - - name: get-account - image: {{ .Values.celotool.image.repository }}:{{ .Values.celotool.image.tag }} - imagePullPolicy: Always - command: - - /bin/bash - - "-c" - args: - - | - [[ $REPLICA_NAME =~ -([0-9]+)$ ]] || exit 1 - RID=${BASH_REMATCH[1]} - echo -n "$RID" >/root/.celo/replica_id - echo "Generating private key for rid=$RID" - celotooljs.sh generate bip32 --mnemonic "$MNEMONIC" --accountType validator --index "$RID" > /root/.celo/pkey - echo 'Generating address' - celotooljs.sh generate account-address --private-key `cat /root/.celo/pkey` > /root/.celo/address - - # If this is a proxied validator, it will not have an external IP address - # and EXTERNAL_IP_ADDRESS will be empty - EXTERNAL_IP_ADDRESS=`echo -n "$IP_ADDRESSES" | cut -d '/' -f $((RID + 1))` - echo "$EXTERNAL_IP_ADDRESS" > /root/.celo/externalIpAddress - - # Put the proxies per validator array into a comma separated string - # so we can index it at runtime - PROXIES_PER_VALIDATOR="{{ join "," .Values.geth.proxiesPerValidator }}" - PROXY_COUNT=`echo -n $PROXIES_PER_VALIDATOR | cut -d ',' -f $((RID + 1))` - echo -n "$PROXY_COUNT" > /root/.celo/proxyCount - - if [[ -z "$EXTERNAL_IP_ADDRESS" ]]; then - echo "$POD_IP" > /root/.celo/ipAddress - else - cat /root/.celo/externalIpAddress > /root/.celo/ipAddress - fi - echo -n "Generating IP 
address for validator: " - cat /root/.celo/ipAddress - - celotooljs.sh generate public-key --mnemonic "$MNEMONIC" --accountType bootnode --index 0 > /root/.celo/bootnodeEnodeAddress - echo -n "Generating Bootnode enode address for the validator: " - cat /root/.celo/bootnodeEnodeAddress - - [[ "$BOOTNODE_IP_ADDRESS" == 'none' ]] && BOOTNODE_IP_ADDRESS=${{ .Release.Namespace | upper }}_BOOTNODE_SERVICE_HOST - echo "enode://$(cat /root/.celo/bootnodeEnodeAddress)@$BOOTNODE_IP_ADDRESS:30301" > /root/.celo/bootnodeEnode - echo -n "Generating Bootnode enode for the validator: " - cat /root/.celo/bootnodeEnode - - # Generate a fee recipient address if available - if celotooljs.sh generate address-from-env --help | grep tx_fee_recipient >/dev/null; then - celotooljs.sh generate account-address --mnemonic "$MNEMONIC" --accountType tx_fee_recipient --index $RID > /root/.celo/feeRecipientAddress - fi - - # If this validator is meant to be proxied - if [[ ! -z "$PROXY_COUNT" ]]; then - # Put the all the validator's proxy IP addresses into a comma separated string - # so we can access it at runtime. Validators are separated by commas, - # and individual IP addresses are separated by /'s. For example, - # if one validator has 2 proxies, and the other has 1 proxy: - # ALL_VALIDATOR_PROXY_IPS would have the form X.X.X.X/X.X.X.X,X.X.X.X - ALL_VALIDATOR_PROXY_IPS='{{ join "," .Values.geth.proxyIPAddressesPerValidatorArray }}' - [[ $ALL_VALIDATOR_PROXY_IPS = '' ]] && ALL_VALIDATOR_PROXY_IPS='' - PROXY_IPS=`echo -n $ALL_VALIDATOR_PROXY_IPS | cut -d ',' -f $((RID + 1))` - # Clear the proxy enode file because it's persisted - rm -f /root/.celo/proxyEnodeUrlPairs - # Generate all proxy enode urls and put them into /root/.celo/proxyEnodeUrlPairs - PROXY_INDEX=0 - while [ "$PROXY_INDEX" -lt "$PROXY_COUNT" ]; do - if [ "$PROXY_INDEX" -gt 0 ]; then - echo -n "," >> /root/.celo/proxyEnodeUrlPairs - fi - # gives us PROXY_ENODE_URL_PAIR - {{- include "celo.proxyenodeurlpair" . 
| nindent 14 }} - echo -n $PROXY_ENODE_URL_PAIR >> /root/.celo/proxyEnodeUrlPairs - PROXY_INDEX=$(( $PROXY_INDEX + 1 )) - done - fi - env: - - name: POD_IP - valueFrom: - fieldRef: - apiVersion: v1 - fieldPath: status.podIP - - name: REPLICA_NAME - valueFrom: - fieldRef: - fieldPath: metadata.name - - name: MNEMONIC - valueFrom: - secretKeyRef: - name: {{ template "common.fullname" . }}-geth-account - key: mnemonic - - name: IP_ADDRESSES - value: {{ join "/" .Values.geth.validatorsIPAddressArray }} - - name: BOOTNODE_IP_ADDRESS - value: "{{ default "none" .Values.geth.bootnodeIpAddress }}" - volumeMounts: - - name: data - mountPath: /root/.celo - {{- include "common.import-geth-account-container" . | nindent 6 }} - containers: - - name: geth - image: {{ .Values.geth.image.repository }}:{{ .Values.geth.image.tag }} - imagePullPolicy: Always - command: ["/bin/sh"] - args: - - "-c" - - |- - set -euo pipefail - rm /root/.celo/pkey || true - ADDITIONAL_FLAGS='' - - ACCOUNT_ADDRESS=$(cat /root/.celo/address) - RID=$(cat /root/.celo/replica_id) - - if [ "$RID" -lt "$FAULTY_NODES" ]; then - ADDITIONAL_FLAGS="${ADDITIONAL_FLAGS} --istanbul.faultymode $FAULTY_NODE_TYPE" - fi - - if geth --help | grep 'proxy.proxyenodeurlpairs' >/dev/null; then - PROXY_FLAG="--proxy.proxyenodeurlpairs" - else - PROXY_FLAG="--proxy.proxyenodeurlpair" - fi - - PROXY_COUNT=$(cat /root/.celo/proxyCount) - if [ "$PROXY_COUNT" -gt 0 ]; then - ADDITIONAL_FLAGS="${ADDITIONAL_FLAGS} --proxy.proxied ${PROXY_FLAG}=$(cat /root/.celo/proxyEnodeUrlPairs) --nodiscover --proxy.allowprivateip" - else - ADDITIONAL_FLAGS="${ADDITIONAL_FLAGS} --ethstats=${HOSTNAME}@${ETHSTATS_SVC}" - ADDITIONAL_FLAGS="${ADDITIONAL_FLAGS} --light.maxpeers=1000 --maxpeers=1200" - fi - - [[ "$PING_IP_FROM_PACKET" == "true" ]] && ADDITIONAL_FLAGS="${ADDITIONAL_FLAGS} --ping-ip-from-packet" - - [[ "$IN_MEMORY_DISCOVERY_TABLE" == "true" ]] && ADDITIONAL_FLAGS="${ADDITIONAL_FLAGS} --use-in-memory-discovery-table" - - # Use a different 
fee recipient address if option available - set +e - geth --help | grep 'tx-fee-recipient' >/dev/null - tx_fee_available=$? - set -e - if [ $tx_fee_available -eq 0 ] && [ -f /root/.celo/feeRecipientAddress ] ; then - ADDITIONAL_FLAGS="${ADDITIONAL_FLAGS} --miner.validator ${ACCOUNT_ADDRESS}" - ADDITIONAL_FLAGS="${ADDITIONAL_FLAGS} --tx-fee-recipient $(cat /root/.celo/feeRecipientAddress)" - else - ADDITIONAL_FLAGS="${ADDITIONAL_FLAGS} --etherbase=${ACCOUNT_ADDRESS}" - fi - - {{- include "common.bootnode-flag-script" . | nindent 10 }} - {{- include "common.geth-add-metrics-pprof-config" . | nindent 10 }} - {{- include "common.geth-http-ws-flags" (dict "Values" $.Values "rpc_apis" "eth,net,web3,debug" "ws_port" "8545" "listen_address" "0.0.0.0") | nindent 10 }} - - {{- .Values.geth.validatorExtraSnippet | nindent 10 }} - exec geth \ - $BOOTNODE_FLAG \ - --datadir /root/.celo \ - --ipcpath=geth.ipc \ - --nousb \ - --password=/root/.celo/account/accountSecret \ - --unlock=${ACCOUNT_ADDRESS} \ - --mine \ - --syncmode=full \ - --consoleformat=json \ - --consoleoutput=stdout \ - --verbosity={{ .Values.geth.verbosity }} \ - --vmodule={{ .Values.geth.vmodule }} \ - --nat=extip:`cat /root/.celo/ipAddress` \ - --allow-insecure-unlock \ - ${ADDITIONAL_FLAGS} - env: - - name: POD_IP - valueFrom: - fieldRef: - apiVersion: v1 - fieldPath: status.podIP - - name: ETHSTATS_SVC - value: {{ template "common.fullname" . }}-ethstats.{{ .Release.Namespace }} - - name: NETWORK_ID - valueFrom: - configMapKeyRef: - name: {{ template "common.fullname" . 
}}-geth-config - key: networkid - - name: FAULTY_NODES - value: {{ .Values.geth.faultyValidators | quote }} - - name: FAULTY_NODE_TYPE - value: {{ .Values.geth.faultyValidatorType | quote }} - - name: STATIC_IPS_FOR_GETH_NODES - value: "{{ default false .Values.geth.static_ips }}" - - name: PING_IP_FROM_PACKET - value: "{{ default false .Values.geth.ping_ip_from_packet }}" - - name: IN_MEMORY_DISCOVERY_TABLE - value: "{{ default "false" .Values.geth.in_memory_discovery_table }}" - ports: - - name: discovery-udp - containerPort: 30303 - protocol: UDP - - name: discovery-tcp - containerPort: 30303 - - name: rpc - containerPort: 8545 - - name: ws - containerPort: 8546 - {{- if .Values.geth.enable_metrics | default false }} - - name: metrics - containerPort: 6060 - {{- end }} - {{- $resources := default .Values.geth.resources .Values.geth.validatorResources -}} - {{- with $resources }} - resources: - {{- toYaml . | nindent 10 }} - {{- end }} - volumeMounts: - - name: data - mountPath: /root/.celo - - name: account - mountPath: /root/.celo/account - readOnly: true - terminationGracePeriodSeconds: {{ .Values.geth.terminationGracePeriodSeconds | default 300 }} - {{- with .Values.geth.validatorAffinity }} - affinity: - {{- toYaml . | nindent 8 }} - {{- end }} - {{- with .Values.geth.validatorNodeSelector }} - nodeSelector: - {{ toYaml . | nindent 8 }} - {{- end }} - {{- with .Values.geth.validatorTolerations }} - tolerations: - {{ toYaml . | nindent 8 }} - {{- end }} - volumes: - - name: data - emptyDir: {} - - name: data-shared - emptyDir: {} - - name: config - configMap: - name: {{ template "common.fullname" . }}-geth-config - - name: account - secret: - secretName: {{ template "common.fullname" . 
}}-geth-account diff --git a/packages/helm-charts/testnet/values-alfajores.yaml b/packages/helm-charts/testnet/values-alfajores.yaml deleted file mode 100644 index 1ae0dcf8828..00000000000 --- a/packages/helm-charts/testnet/values-alfajores.yaml +++ /dev/null @@ -1,81 +0,0 @@ ---- -geth: - resources: - requests: - cpu: 400m - memory: 3Gi - validatorResources: - requests: - cpu: 0.2 - memory: 3G - txNodeResources: - requests: - cpu: 0.4 - memory: 9G - txNodePrivateResources: - requests: - cpu: 1.7 - memory: 4Gi - validatorAffinity: - nodeAffinity: - preferredDuringSchedulingIgnoredDuringExecution: - - weight: 100 - preference: - matchExpressions: - - key: node.kubernetes.io/instance-type - operator: In - values: - - n2-highmem-2 - podAntiAffinity: {} - # preferredDuringSchedulingIgnoredDuringExecution: - # - weight: 100 - # podAffinityTerm: - # labelSelector: - # matchExpressions: - # - key: component - # operator: In - # values: - # - validators - # topologyKey: kubernetes.io/hostname - txNodeAffinity: - nodeAffinity: - preferredDuringSchedulingIgnoredDuringExecution: - - weight: 99 - preference: - matchExpressions: - - key: node.kubernetes.io/instance-type - operator: In - values: - - n2-highmem-2 - podAntiAffinity: - preferredDuringSchedulingIgnoredDuringExecution: - - weight: 100 - podAffinityTerm: - labelSelector: - matchExpressions: - - key: component - operator: In - values: - - tx_nodes - topologyKey: kubernetes.io/hostname - txNodePrivateAffinity: - nodeAffinity: - preferredDuringSchedulingIgnoredDuringExecution: - - weight: 100 - preference: - matchExpressions: - - key: node.kubernetes.io/instance-type - operator: In - values: - - n2-highcpu-8 - podAntiAffinity: - preferredDuringSchedulingIgnoredDuringExecution: - - weight: 99 - podAffinityTerm: - labelSelector: - matchExpressions: - - key: component - operator: In - values: - - tx_nodes_private - topologyKey: kubernetes.io/hostname diff --git a/packages/helm-charts/testnet/values.yaml 
b/packages/helm-charts/testnet/values.yaml deleted file mode 100644 index 3c2ad7ad1c0..00000000000 --- a/packages/helm-charts/testnet/values.yaml +++ /dev/null @@ -1,141 +0,0 @@ -# Default values for ethereum. -# This is a YAML-formatted file. -# Declare variables to be passed into your templates. - -imagePullPolicy: Always - -# Node labels for pod assignment -# ref: https://kubernetes.io/docs/user-guide/node-selection/ -nodeSelector: {} - -bootnode: - image: - repository: us.gcr.io/celo-testnet/geth-all - tag: fc254b550a4993956ac7aa3fcd8dd4db63b8c9d2 - -celotool: - image: - repository: gcr.io/celo-testnet/celo-monorepo - tag: celotool-dc5e5dfa07231a4ff4664816a95eae606293eae9 - -genesis: - networkId: 1110 - network: testnet - useGenesisFileBase64: true - genesisFileBase64: "" - -geth: - image: - repository: us.gcr.io/celo-testnet/geth - tag: fc254b550a4993956ac7aa3fcd8dd4db63b8c9d2 - resources: - requests: - memory: "256Mi" - cpu: "500m" - limits: {} - # limits: - # memory: "4Gi" - # cpu: "4" - validatorResources: {} - secondaryResources: {} - proxyResources: {} - txNodeResources: {} - txNodePrivateResources: {} - ws_port: 8546 - rpc_gascap: 10000000 - validatorExtraSnippet: | - echo "Validator" - secondayExtraSnippet: | - echo "secondary-validator" - proxyExtraSnippet: | - echo "Proxy" - txNodeExtraSnippet: | - echo "txnode" - txNodePrivateExtraSnippet: | - echo "txnode-private" - ADDITIONAL_FLAGS="${ADDITIONAL_FLAGS} --http.timeout.read 600 --http.timeout.write 600 --http.timeout.idle 2400" - validatorAffinity: {} - validatorNodeSelector: {} - validatorTolerations: [] - secondaryAffinity: {} - secondaryNodeSelector: {} - secondaryTolerations: [] - proxyAffinity: {} - proxyNodeSelector: {} - proxyTolerations: [] - txNodeAffinity: {} - txNodeNodeSelector: {} - txNodeTolerations: [] - txNodePrivateAffinity: {} - txNodePrivateNodeSelector: {} - txNodePrivateTolerations: [] - storageClass: ssd - maxpeers: 1150 - light: - maxpeers: 1000 - serve: 70 - -# UpdateStrategy 
for statefulsets only. Partition=0 is default rollingUpdate behaviour. -updateStrategy: - validators: - type: RollingUpdate - rollingUpdate: - partition: 0 - secondaries: - type: RollingUpdate - rollingUpdate: - partition: 0 - proxy: - type: RollingUpdate - rollingUpdate: - partition: 0 - tx_nodes: - type: RollingUpdate - rollingUpdate: - partition: 0 - tx_nodes_private: - type: RollingUpdate - rollingUpdate: - partition: 0 - -gethexporter: - image: - repository: gcr.io/celo-testnet/geth-exporter - tag: ed7d21bd50592709173368cd697ef73c1774a261 - -blockscout: - image: - repository: gcr.io/celo-testnet/blockscout - webTag: web - indexerTag: indexer - db: - # ip: must be provided at runtime # IP address of the postgres DB - # connection_name: must be provided at runtime # name of the cloud sql connection - # username: blockscout - # password: password - name: blockscout - -domain: - name: celo-networks-dev - -ingressClassName: nginx - -extraPodLabels: - validator: - mode: full - secondary: - mode: full - proxy: - mode: full - txnode: - mode: full - txnode_private: - stack: blockscout - mode: archive - -pvcAnnotations: - validator: {} - secondary: {} - proxy: {} - txnode: {} - txnode_private: {} diff --git a/packages/helm-charts/tracer-tool/Chart.yaml b/packages/helm-charts/tracer-tool/Chart.yaml deleted file mode 100644 index 72305665cd0..00000000000 --- a/packages/helm-charts/tracer-tool/Chart.yaml +++ /dev/null @@ -1,8 +0,0 @@ -name: tracer-tool -version: 0.0.1 -description: Chart which runs tracer tool -keywords: -- ethereum -- blockchain -- tracer -appVersion: v1.7.3 diff --git a/packages/helm-charts/tracer-tool/scripts/run.sh b/packages/helm-charts/tracer-tool/scripts/run.sh deleted file mode 100644 index acda36c5a6c..00000000000 --- a/packages/helm-charts/tracer-tool/scripts/run.sh +++ /dev/null @@ -1,29 +0,0 @@ -FIRST_ACCOUNT="0x4da58d267cd465b9313fdb19b120ec591d957ad2"; -SECOND_ACCOUNT="0xc70947239385c2422866e20b6cafffa29157e4b3"; 
-CELOTOOL="/celo-monorepo/packages/celotool/bin/celotooljs.sh"; -GETH_DIR="/celo-monorepo/node_modules/@celo/geth"; -DATA_DIR="/geth-data"; -ENV_NAME="$(cat /root/envname)" - -wget https://dl.google.com/go/go1.11.5.linux-amd64.tar.gz; -tar xf go1.11.5.linux-amd64.tar.gz -C /tmp; -PATH=$PATH:/tmp/go/bin; - -cd "/celo-monorepo" && yarn run build-sdk $ENV_NAME; - -mkdir $DATA_DIR; - -$CELOTOOL geth build --geth-dir $GETH_DIR -c && -$CELOTOOL geth init --geth-dir $GETH_DIR --data-dir $DATA_DIR -e $ENV_NAME --genesis "/geth/genesis.json" --fetch-static-nodes-from-network=false; - -cat /root/pk2740 >> $DATA_DIR/keystore/UTC--2019-03-02T04-27-40.724063000Z--c70947239385c2422866e20b6cafffa29157e4b3; -cat /root/pk2745 >> $DATA_DIR/keystore/UTC--2019-03-02T04-27-45.410695000Z--4da58d267cd465b9313fdb19b120ec591d957ad2; -cat /root/staticnodes >> $DATA_DIR/static-nodes.json; - -echo "Running geth..."; - -$CELOTOOL geth run --geth-dir $GETH_DIR --data-dir $DATA_DIR --sync-mode ultralight --verbosity 1 & - -sleep 15; - -$CELOTOOL geth trace $FIRST_ACCOUNT $SECOND_ACCOUNT --data-dir $DATA_DIR -e $ENV_NAME diff --git a/packages/helm-charts/tracer-tool/templates/tracer-tool.cronjob.yaml b/packages/helm-charts/tracer-tool/templates/tracer-tool.cronjob.yaml deleted file mode 100644 index 6d32b64d4ba..00000000000 --- a/packages/helm-charts/tracer-tool/templates/tracer-tool.cronjob.yaml +++ /dev/null @@ -1,42 +0,0 @@ -apiVersion: batch/v1beta1 -kind: CronJob -metadata: - name: {{ .Values.environment }}-tracer-tool - labels: - app: tracer-tool - chart: tracer-tool - release: {{ .Release.Name }} - heritage: {{ .Release.Service }} - component: tracer-tool -spec: - schedule: "*/5 * * * *" - concurrencyPolicy: Forbid - jobTemplate: - spec: - backoffLimit: 1 - replicas: 1 - template: - spec: - containers: - - name: tracer-tool - image: {{ .Values.imageRepository }}:{{ .Values.imageTag }} - imagePullPolicy: {{ .Values.imagePullPolicy }} - command: ["/bin/bash"] - args: ["-c", '{{ .Files.Get 
"scripts/run.sh" }}', "{{ .Values.environment }}"] - resources: - requests: - memory: 1G - cpu: 300m - volumeMounts: - - name: starting-files - mountPath: /root - - name: geth-config - mountPath: /geth - volumes: - - name: starting-files - configMap: - name: {{ .Values.environment }}-tracer-tool-config - - name: geth-config - configMap: - name: {{ .Values.environment }}-geth-config - restartPolicy: Never diff --git a/packages/helm-charts/tracer-tool/templates/tracer.configmap.yaml b/packages/helm-charts/tracer-tool/templates/tracer.configmap.yaml deleted file mode 100644 index 18cd3085bbc..00000000000 --- a/packages/helm-charts/tracer-tool/templates/tracer.configmap.yaml +++ /dev/null @@ -1,18 +0,0 @@ -apiVersion: v1 -kind: ConfigMap -metadata: - name: {{ .Values.environment }}-tracer-tool-config - labels: - app: tracer-tool - chart: tracer-tool - release: {{ .Release.Name }} - heritage: {{ .Release.Service }} - component: tracer-tool -data: - envname: {{ .Values.environment }} - staticnodes: |- - {{ .Values.enodes | b64dec }} - pk2740: |- - {"address":"c70947239385c2422866e20b6cafffa29157e4b3","crypto":{"cipher":"aes-128-ctr","ciphertext":"d39cfef5d85d61972410720b2769f716db3daa19c58b6884a5afe57a99138d17","cipherparams":{"iv":"48af2b5f12455782e8494c42cbd8e7f6"},"kdf":"scrypt","kdfparams":{"dklen":32,"n":262144,"p":1,"r":8,"salt":"2b8f7f1b33b495ab8500d1e84aef63888c284aa8ee34d6203a88594b4d3a0246"},"mac":"0e7446c44056f6afa5a47978371cd75a1f9e6cb34a71a8c8b2d8b73405be4614"},"id":"7cd8dd56-1996-485b-ac6a-b6e508f40355","version":3} - pk2745: |- - 
{"address":"4da58d267cd465b9313fdb19b120ec591d957ad2","crypto":{"cipher":"aes-128-ctr","ciphertext":"f09751814fc90ce684c83c5197e9c0588bcececa4e02dedad8acd9a9a0b27b33","cipherparams":{"iv":"82d833b076e3a7be6c6c366cad39f31d"},"kdf":"scrypt","kdfparams":{"dklen":32,"n":262144,"p":1,"r":8,"salt":"3c19137687ca4f0a3b09df901553f68c08e95b6c5e06d7aee7706574840c0348"},"mac":"1f4bcf6b0be9d77a2e451cfe77ca6d6ba5871d99b4d4ee19b084c3e15cade5f8"},"id":"e6721754-3f34-44f5-a2b9-221edb67b4b1","version":3} diff --git a/packages/helm-charts/tracer-tool/values.yaml b/packages/helm-charts/tracer-tool/values.yaml deleted file mode 100644 index d4289f1a1f2..00000000000 --- a/packages/helm-charts/tracer-tool/values.yaml +++ /dev/null @@ -1 +0,0 @@ -imagePullPolicy: IfNotPresent diff --git a/packages/helm-charts/transaction-metrics-exporter/Chart.yaml b/packages/helm-charts/transaction-metrics-exporter/Chart.yaml deleted file mode 100644 index 30477922bed..00000000000 --- a/packages/helm-charts/transaction-metrics-exporter/Chart.yaml +++ /dev/null @@ -1,10 +0,0 @@ -name: transaction-metrics-exporter -version: 0.0.1 -description: A small chart that runs our exporter for transaction metrics -keywords: -- blockchain -- prometheus -home: https://github.com/celo-org/celo-monorepo/packages/transaction-metrics-exporter -sources: -- https://github.com/celo-org/celo-monorepo/packages/transaction-metrics-exporter -appVersion: v1.7.3 diff --git a/packages/helm-charts/transaction-metrics-exporter/templates/deployment.yaml b/packages/helm-charts/transaction-metrics-exporter/templates/deployment.yaml deleted file mode 100644 index 3bc523d5dee..00000000000 --- a/packages/helm-charts/transaction-metrics-exporter/templates/deployment.yaml +++ /dev/null @@ -1,46 +0,0 @@ -apiVersion: apps/v1 -kind: Deployment -metadata: - name: {{ .Values.environment }}-transaction-metrics-exporter-{{ .Values.deploymentSuffix }} - labels: - app: transaction-metrics-exporter - chart: transaction-metrics-exporter - release: {{ 
.Release.Name }} - heritage: {{ .Release.Service }} - component: transaction-metrics-exporter -spec: - replicas: 1 - selector: - matchLabels: - app: transaction-metrics-exporter - release: {{ .Release.Name }} - component: transaction-metrics-exporter - template: - metadata: - labels: - app: transaction-metrics-exporter - release: {{ .Release.Name }} - component: transaction-metrics-exporter - spec: - containers: - - name: transaction-metrics-exporter-{{ .Values.deploymentSuffix }} - image: {{ required "A valid image repository required!" .Values.imageRepository }}:{{ required "A valid image tag required!" .Values.imageTag }} - imagePullPolicy: {{ .Values.imagePullPolicy }} - env: - - name: WEB3_PROVIDER - value: {{ .Values.web3Provider }} - - name: FROM_BLOCK - value: "{{ .Values.fromBlock }}" - - name: TO_BLOCK - value: "{{ .Values.toBlock }}" - - name: BLOCK_INTERVAL - value: "{{ .Values.blockInterval }}" - - name: WATCH_ADDRESS - value: "{{ .Values.watchAddress }}" - ports: - - name: http - containerPort: 3000 - resources: - requests: - memory: 100M - cpu: 100m diff --git a/packages/helm-charts/transaction-metrics-exporter/values.yaml b/packages/helm-charts/transaction-metrics-exporter/values.yaml deleted file mode 100644 index 1985a13bdb9..00000000000 --- a/packages/helm-charts/transaction-metrics-exporter/values.yaml +++ /dev/null @@ -1,3 +0,0 @@ -imagePullPolicy: IfNotPresent - -web3Provider: ws://tx-nodes:8546 diff --git a/packages/helm-charts/voting-bot/Chart.yaml b/packages/helm-charts/voting-bot/Chart.yaml deleted file mode 100644 index f173ab264f1..00000000000 --- a/packages/helm-charts/voting-bot/Chart.yaml +++ /dev/null @@ -1,5 +0,0 @@ -apiVersion: v1 -appVersion: "1.0" -description: A bot that votes on validator groups -name: voting-bot -version: 0.1.0 \ No newline at end of file diff --git a/packages/helm-charts/voting-bot/templates/voting-bot.cronjob.yaml b/packages/helm-charts/voting-bot/templates/voting-bot.cronjob.yaml deleted file mode 100644 
index 816e07b0f84..00000000000 --- a/packages/helm-charts/voting-bot/templates/voting-bot.cronjob.yaml +++ /dev/null @@ -1,45 +0,0 @@ -apiVersion: batch/v1beta1 -kind: CronJob -metadata: - name: {{ .Release.Name }} - labels: - app: voting-bot - chart: voting-bot - release: {{ .Release.Service }} - component: voting-bot -spec: - schedule: "{{ .Values.cronSchedule }}" - concurrencyPolicy: Forbid - jobTemplate: - spec: - backoffLimit: 1 - template: - spec: - containers: - - name: voting-bot - image: {{ .Values.imageRepository }}:{{ .Values.imageTag }} - imagePullPolicy: {{ .Values.imagePullPolicy }} - command: - - bash - - "-c" - - | - CELOTOOL="/celo-monorepo/packages/celotool/bin/celotooljs.sh"; - - $CELOTOOL bots auto-vote --celoProvider {{ .Values.celoProvider }} --excludedGroups {{ .Values.votingBot.excludedGroups }} - env: - - name: MNEMONIC - valueFrom: - secretKeyRef: - name: {{ .Values.environment }}-voting-bot-secrets - key: mnemonic - - name: VOTING_BOT_CHANGE_BASELINE - value: {{ .Values.votingBot.changeBaseline | quote }} - - name: VOTING_BOT_EXPLORE_PROBABILITY - value: {{ .Values.votingBot.exploreProbability | quote }} - - name: VOTING_BOT_SCORE_SENSITIVITY - value: {{ .Values.votingBot.scoreSensitivity | quote }} - - name: VOTING_BOT_WAKE_PROBABILITY - value: {{ .Values.votingBot.wakeProbability | quote }} - - name: VOTING_BOTS - value: {{ .Values.votingBot.count | quote }} - restartPolicy: Never diff --git a/packages/helm-charts/voting-bot/templates/voting-bot.secret.yaml b/packages/helm-charts/voting-bot/templates/voting-bot.secret.yaml deleted file mode 100644 index 43e568d1920..00000000000 --- a/packages/helm-charts/voting-bot/templates/voting-bot.secret.yaml +++ /dev/null @@ -1,13 +0,0 @@ -apiVersion: v1 -kind: Secret -metadata: - name: {{ .Values.environment }}-voting-bot-secrets - labels: - app: voting-bot - chart: voting-bot - release: {{ .Release.Name }} - heritage: {{ .Release.Service }} - component: voting-bot -type: Opaque -data: - 
mnemonic: {{ .Values.mnemonic | b64enc }} \ No newline at end of file diff --git a/packages/helm-charts/voting-bot/values.yaml b/packages/helm-charts/voting-bot/values.yaml deleted file mode 100644 index d4289f1a1f2..00000000000 --- a/packages/helm-charts/voting-bot/values.yaml +++ /dev/null @@ -1 +0,0 @@ -imagePullPolicy: IfNotPresent diff --git a/packages/helm-charts/wallet-connect/Chart.yaml b/packages/helm-charts/wallet-connect/Chart.yaml deleted file mode 100644 index 57e76866cf2..00000000000 --- a/packages/helm-charts/wallet-connect/Chart.yaml +++ /dev/null @@ -1,9 +0,0 @@ -apiVersion: v1 -appVersion: "0.1.0" -description: A Helm chart for Wallet Connect app -name: walletconnect -version: 0.1.0 -dependencies: -- name: redis - version: 12.8.3 - repository: https://charts.bitnami.com/bitnami diff --git a/packages/helm-charts/wallet-connect/templates/_helpers.tpl b/packages/helm-charts/wallet-connect/templates/_helpers.tpl deleted file mode 100644 index 7899e732f38..00000000000 --- a/packages/helm-charts/wallet-connect/templates/_helpers.tpl +++ /dev/null @@ -1,51 +0,0 @@ -{{/* -Expand the name of the chart. -*/}} -{{- define "walletconnect.name" -}} -{{- default .Chart.Name .Values.nameOverride | trunc 63 | trimSuffix "-" }} -{{- end }} - -{{/* -Create a default fully qualified app name. -We truncate at 63 chars because some Kubernetes name fields are limited to this (by the DNS naming spec). -If release name contains chart name it will be used as a full name. -*/}} -{{- define "walletconnect.fullname" -}} -{{- if .Values.fullnameOverride }} -{{- .Values.fullnameOverride | trunc 63 | trimSuffix "-" }} -{{- else }} -{{- $name := default .Chart.Name .Values.nameOverride }} -{{- if contains $name .Release.Name }} -{{- .Release.Name | trunc 63 | trimSuffix "-" }} -{{- else }} -{{- printf "%s-%s" .Release.Name $name | trunc 63 | trimSuffix "-" }} -{{- end }} -{{- end }} -{{- end }} - -{{/* -Create chart name and version as used by the chart label. 
-*/}} -{{- define "walletconnect.chart" -}} -{{- printf "%s-%s" .Chart.Name .Chart.Version | replace "+" "_" | trunc 63 | trimSuffix "-" }} -{{- end }} - -{{/* -Common labels -*/}} -{{- define "walletconnect.labels" -}} -helm.sh/chart: {{ include "walletconnect.chart" . }} -{{ include "walletconnect.selectorLabels" . }} -{{- if .Chart.AppVersion }} -app.kubernetes.io/version: {{ .Chart.AppVersion | quote }} -{{- end }} -app.kubernetes.io/managed-by: {{ .Release.Service }} -{{- end }} - -{{/* -Selector labels -*/}} -{{- define "walletconnect.selectorLabels" -}} -app.kubernetes.io/name: {{ include "walletconnect.name" . }} -app.kubernetes.io/instance: {{ .Release.Name }} -{{- end }} \ No newline at end of file diff --git a/packages/helm-charts/wallet-connect/templates/wallet-connect-deployment.yaml b/packages/helm-charts/wallet-connect/templates/wallet-connect-deployment.yaml deleted file mode 100644 index 861e9637d90..00000000000 --- a/packages/helm-charts/wallet-connect/templates/wallet-connect-deployment.yaml +++ /dev/null @@ -1,42 +0,0 @@ -apiVersion: apps/v1 -kind: Deployment -metadata: - name: {{ .Release.Name }} - labels: -{{ include "walletconnect.labels" . | indent 4 }} -spec: - replicas: 1 - selector: - matchLabels: -{{ include "walletconnect.selectorLabels" . | indent 6 }} - template: - metadata: - labels: -{{ include "walletconnect.selectorLabels" . 
| indent 8 }} - annotations: - prometheus.io/scrape: 'true' - prometheus.io/port: '5000' - prometheus.io/path: "/metrics" - spec: - containers: - - name: walletconnect - image: {{ .Values.walletconnect.image.repository }}:{{ .Values.walletconnect.image.tag }} - imagePullPolicy: {{ .Values.walletconnect.imagePullPolicy }} - env: - - name: REDIS_URL - value: redis://{{ .Release.Name }}-redis-headless:6379/0 - - name: REDIS_PREFIX - value: walletconnect-bridge - - name: NODE_ENV - value: production - ports: - - name: http - containerPort: 5000 - resources: - requests: - cpu: 100m - memory: 250Mi - {{- with .Values.nodeSelector }} - nodeSelector: -{{ toYaml . | indent 8 }} - {{- end }} diff --git a/packages/helm-charts/wallet-connect/templates/wallet-connect-ingress.yaml b/packages/helm-charts/wallet-connect/templates/wallet-connect-ingress.yaml deleted file mode 100644 index b2695174fdf..00000000000 --- a/packages/helm-charts/wallet-connect/templates/wallet-connect-ingress.yaml +++ /dev/null @@ -1,25 +0,0 @@ -apiVersion: networking.k8s.io/v1 -kind: Ingress -metadata: - name: {{ .Release.Name }} - labels: -{{ include "walletconnect.labels" . 
| indent 4 }} - annotations: - kubernetes.io/tls-acme: "true" -spec: - ingressClassName: {{ default "nginx" .Values.ingressClassName }} - tls: - - hosts: - - {{ .Release.Name }}.{{ .Values.domain.name }}.org - secretName: {{ .Release.Name }}-tls - rules: - - host: {{ .Release.Name }}.{{ .Values.domain.name }}.org - http: - paths: - - path: / - pathType: Prefix - backend: - service: - name: {{ .Release.Name }} - port: - number: 5000 diff --git a/packages/helm-charts/wallet-connect/templates/wallet-connect-service.yaml b/packages/helm-charts/wallet-connect/templates/wallet-connect-service.yaml deleted file mode 100644 index 10c12a7ac11..00000000000 --- a/packages/helm-charts/wallet-connect/templates/wallet-connect-service.yaml +++ /dev/null @@ -1,14 +0,0 @@ -kind: Service -apiVersion: v1 -metadata: - name: {{ .Release.Name }} - labels: -{{ include "walletconnect.labels" . | indent 4 }} -spec: - selector: -{{ include "walletconnect.selectorLabels" . | indent 4 }} - clusterIP: None - type: {{ .Values.walletconnect.service.type }} - ports: - - port: 5000 - targetPort: http \ No newline at end of file diff --git a/packages/helm-charts/wallet-connect/values.yaml b/packages/helm-charts/wallet-connect/values.yaml deleted file mode 100644 index a6a88cb6746..00000000000 --- a/packages/helm-charts/wallet-connect/values.yaml +++ /dev/null @@ -1,24 +0,0 @@ -environment: - name: test - network: alfajores - cluster: - name: test-cluster - location: location - -walletconnect: - image: - repository: us.gcr.io/celo-testnet/walletconnect - tag: 1472bcaad57e3746498f7a661c42ff5cf9acaf5a - imagePullPolicy: IfNotPresent - service: - type: ClusterIP - -domain: - name: walletconnect - -redis: - cluster: - enabled: false - usePassword: false - -ingressClassName: nginx diff --git a/packages/metadata-crawler/.gitignore b/packages/metadata-crawler/.gitignore deleted file mode 100644 index 4fd586cad7a..00000000000 --- a/packages/metadata-crawler/.gitignore +++ /dev/null @@ -1,3 +0,0 @@ -lib/ 
-tmp/ -.tmp/ \ No newline at end of file diff --git a/packages/metadata-crawler/CHANGELOG.md b/packages/metadata-crawler/CHANGELOG.md deleted file mode 100644 index 0d85ca35b43..00000000000 --- a/packages/metadata-crawler/CHANGELOG.md +++ /dev/null @@ -1,67 +0,0 @@ -# @celo/metadata-crawler - -## 0.0.4-beta.0 - -### Patch Changes - -- Updated dependencies [1c9c844cf] - - @celo/contractkit@6.0.0-beta.0 -## 0.0.4 - -### Patch Changes - -- Updated dependencies [9ab9d00eb] -- Updated dependencies [1c9c844cf] -- Updated dependencies [9ab9d00eb] - - @celo/contractkit@6.0.0 - -## 0.0.3 - -### Patch Changes - -- Updated dependencies -- Updated dependencies [679ef0c60] -- Updated dependencies [32face3d8] -- Updated dependencies [87647b46b] - - @celo/contractkit@5.2.0 - - @celo/connect@5.1.1 - - @celo/utils@5.0.6 - -## 0.0.3-beta.0 - -### Patch Changes - -- Updated dependencies -- Updated dependencies [32face3d8] -- Updated dependencies [87647b46b] - - @celo/contractkit@5.2.0-beta.0 - - @celo/connect@5.1.1-beta.0 - - @celo/utils@5.0.6-beta.0 - -## 0.0.2 - -### Patch Changes - -- Updated dependencies [d48c68afc] -- Updated dependencies [d48c68afc] -- Updated dependencies [53bbd4958] -- Updated dependencies [d48c68afc] -- Updated dependencies [53bbd4958] -- Updated dependencies [d48c68afc] - - @celo/contractkit@5.1.0 - - @celo/connect@5.1.0 - - @celo/utils@5.0.5 - -## 0.0.2-beta.0 - -### Patch Changes - -- Updated dependencies [d48c68afc] -- Updated dependencies [d48c68afc] -- Updated dependencies [53bbd4958] -- Updated dependencies [d48c68afc] -- Updated dependencies [53bbd4958] -- Updated dependencies [d48c68afc] - - @celo/contractkit@5.1.0-beta.0 - - @celo/connect@5.1.0-beta.0 - - @celo/utils@5.0.5-beta.0 diff --git a/packages/metadata-crawler/README.md b/packages/metadata-crawler/README.md deleted file mode 100644 index 4f7ad561fd5..00000000000 --- a/packages/metadata-crawler/README.md +++ /dev/null @@ -1,36 +0,0 @@ -# Metadata Crawler - -This package connects to 
Blockscout database, get all the metadata urls, -verify the metadata claims and update the database if the user claims -could be verified succesfully. - -For this package to work properly, the software must have SELECT and UPDATE -access to the Blockscout database. - - -## Build - -You can build the crawler executing: - -```bash -yarn && yarn build -``` - -## Running the crawler - -For connecting to the Blockscout is necessary to setup the following environment variables: - -```bash -export PGUSER="postgres" # Database user -export PGPASSWORD="" # Database password -export PGHOST="127.0.0.1" # Database host -export PGPORT="5432" # Database port -export PGDATABASE="blockscout" # Database name -export PROVIDER_URL="http://localhost:8545" # Provider Url -``` - -You can start the crawler executing: - -```bash -yarn dev -``` diff --git a/packages/metadata-crawler/index.d.ts b/packages/metadata-crawler/index.d.ts deleted file mode 100644 index 102dc17cf17..00000000000 --- a/packages/metadata-crawler/index.d.ts +++ /dev/null @@ -1 +0,0 @@ -declare module 'bunyan-debug-stream' diff --git a/packages/metadata-crawler/package.json b/packages/metadata-crawler/package.json deleted file mode 100644 index 5c1fb1ed824..00000000000 --- a/packages/metadata-crawler/package.json +++ /dev/null @@ -1,38 +0,0 @@ -{ - "name": "@celo/metadata-crawler", - "version": "0.0.4", - "description": "Celo metadata crawler", - "main": "index.js", - "types": "./lib/index.d.ts", - "author": "Celo", - "license": "Apache-2.0", - "homepage": "https://github.com/celo-org/celo-monorepo/tree/master/packages/metadata-crawler", - "repository": "https://github.com/celo-org/celo-monorepo/tree/master/packages/metadata-crawler", - "dependencies": { - "@celo/connect": "^5.1.2", - "@celo/contractkit": "^7.0.0", - "@celo/utils": "^5.0.6", - "@types/pg": "^7.14.3", - "bunyan": "1.8.12", - "bunyan-gke-stackdriver": "0.1.2", - "debug": "^4.1.1", - "dotenv": "^8.2.0", - "googleapis": "^39.2.0", - "pg": "^7.18.0", - 
"ts-node": "^10.9.2", - "web3": "1.10.0" - }, - "devDependencies": { - "@tsconfig/recommended": "^1.0.3", - "@types/bunyan": "1.8.8", - "@types/dotenv": "^8.2.0", - "bunyan-debug-stream": "^2.0.0" - }, - "scripts": { - "dev": "yarn run ts-node src/crawler.ts", - "build": "tsc -b .", - "clean": "tsc -b . --clean", - "clean:all": "yarn clean && rm -rf lib" - }, - "private": true -} diff --git a/packages/metadata-crawler/src/crawler.ts b/packages/metadata-crawler/src/crawler.ts deleted file mode 100644 index bc6f8d1d132..00000000000 --- a/packages/metadata-crawler/src/crawler.ts +++ /dev/null @@ -1,183 +0,0 @@ -import { Address } from '@celo/connect' -import { newKitFromWeb3 } from '@celo/contractkit' -import { ClaimTypes, IdentityMetadataWrapper } from '@celo/contractkit/lib/identity' -import { - verifyAccountClaim, - verifyDomainRecord, -} from '@celo/contractkit/lib/identity/claims/verify' -import { normalizeAddressWith0x, trimLeading0x } from '@celo/utils/lib/address' -import { concurrentMap } from '@celo/utils/lib/async' -import Logger from 'bunyan' -import { Client } from 'pg' -import Web3 from 'web3' -import { dataLogger, logger, operationalLogger } from './logger' - -const CONCURRENCY = 10 - -const PGUSER = process.env['PGUSER'] || 'postgres' -const PGPASSWORD = process.env['PGPASSWORD'] || '' -const PGHOST = process.env['PGHOST'] || '127.0.0.1' -const PGPORT = process.env['PGPORT'] || '5432' -const PGDATABASE = process.env['PGDATABASE'] || 'blockscout' -const PROVIDER_URL = process.env['PROVIDER_URL'] || 'http://localhost:8545' - -const client = new Client({ - user: PGUSER, - password: PGPASSWORD, - host: PGHOST, - port: Number(PGPORT), - database: PGDATABASE, -}) - -const kit = newKitFromWeb3(new Web3(PROVIDER_URL)) - -async function jsonQuery(query: string) { - let res = await client.query(`SELECT json_agg(t) FROM (${query}) t`) - return res.rows[0].json_agg -} - -async function createVerificationClaims( - address: string, - domain: string, - verified: 
boolean, - accounts: Array
-) { - await addDatabaseVerificationClaims(address, domain, verified) - await concurrentMap(CONCURRENCY, accounts, (account) => - addDatabaseVerificationClaims(account, domain, verified) - ) -} - -async function addDatabaseVerificationClaims(address: string, domain: string, verified: boolean) { - try { - const query = `INSERT INTO celo_claims (address, type, element, verified, timestamp, inserted_at, updated_at) VALUES - (decode($1, 'hex'), 'domain', $2, $3, now(), now(), now()) - ON CONFLICT (address, type, element) DO - UPDATE SET verified=$3, timestamp=now(), updated_at=now() ` - // Trim 0x to match Blockscout convention - const values = [trimLeading0x(address), domain, verified] - - await client - .query(query, values) - .catch((err) => logger.error({ err, query }, 'addDataBaseVerificationClaims error')) - .then(() => dataLogger.info({ domain, address }, 'VERIFIED_DOMAIN_CLAIM')) - } catch (err) { - logger.error({ err }, 'addDataBaseVerificationClaims error') - } -} - -async function getVerifiedAccounts(metadata: IdentityMetadataWrapper, address: Address) { - const unverifiedAccounts = metadata.filterClaims(ClaimTypes.ACCOUNT) - const accountVerification = await Promise.all( - unverifiedAccounts.map(async (claim) => ({ - claim, - verified: await verifyAccountClaim(kit, claim, address), - })) - ) - const accounts = accountVerification - .filter(({ verified }) => verified === undefined) - .map((a) => a.claim.address) - - return accounts -} - -async function getVerifiedDomains( - metadata: IdentityMetadataWrapper, - address: Address, - logger: Logger -) { - const unverifiedDomains = metadata.filterClaims(ClaimTypes.DOMAIN) - - const domainVerification = await concurrentMap(CONCURRENCY, unverifiedDomains, async (claim) => { - try { - const verificationStatus = await verifyDomainRecord(kit, claim, address) - logger.debug({ claim, verificationStatus }, `verified_domain`) - return { - claim, - verified: verificationStatus === undefined, - } - } catch (err) { - 
logger.error({ err, claim }) - return { - claim, - verified: false, - } - } - }) - - const domains = domainVerification.filter(({ verified }) => verified).map((_) => _.claim.domain) - - return domains -} - -async function processDomainClaimForValidator(item: { url: string; address: string }) { - const itemLogger = operationalLogger.child({ url: item.url, address: item.address }) - try { - itemLogger.debug('fetch_metadata') - const metadata = await IdentityMetadataWrapper.fetchFromURL( - await kit.contracts.getAccounts(), - item.url - ) - const verifiedAccounts = await getVerifiedAccounts(metadata, item.address) - const verifiedDomains = await getVerifiedDomains(metadata, item.address, itemLogger) - - await concurrentMap(CONCURRENCY, verifiedDomains, (domain) => - createVerificationClaims(item.address, domain, true, verifiedAccounts) - ) - - itemLogger.debug( - { - verfiedAccountClaims: verifiedAccounts.length, - verifiedDomainClaims: verifiedDomains.length, - }, - 'processDomainClaimForValidator done' - ) - } catch (err) { - itemLogger.error({ err }, 'processDomainClaimForValidator error') - } -} - -async function processDomainClaims() { - let items: { address: string; url: string }[] = await jsonQuery( - `SELECT address, url FROM celo_account WHERE url is NOT NULL ` - ) - - operationalLogger.debug({ length: items.length }, 'fetching all accounts') - - items = items || [] - items = items.map((a) => ({ - ...a, - // Addresses are stored by blockscout as just the bytes prepended with \x - address: normalizeAddressWith0x(a.address.substring(2)), - })) - - return concurrentMap(CONCURRENCY, items, (item) => processDomainClaimForValidator(item)) - .then(() => { - operationalLogger.info('Closing DB connecting and finishing') - }) - .catch((err) => { - operationalLogger.error({ err }, 'processDomainClaimForValidator error') - client.end() - process.exit(1) - }) -} - -async function main() { - operationalLogger.info({ host: PGHOST }, 'Connecting DB') - await client.connect() 
- - client.on('error', (err) => { - operationalLogger.error({ err }, 'Reconnecting after error') - client.connect() - }) - - await processDomainClaims() - - client.end() - process.exit(0) -} - -main().catch((err) => { - operationalLogger.error({ err }) - process.exit(1) -}) diff --git a/packages/metadata-crawler/src/env.ts b/packages/metadata-crawler/src/env.ts deleted file mode 100644 index fc8e9306b08..00000000000 --- a/packages/metadata-crawler/src/env.ts +++ /dev/null @@ -1,19 +0,0 @@ -import * as dotenv from 'dotenv' - -if (process.env.CONFIG) { - dotenv.config({ path: process.env.CONFIG }) -} - -export function fetchEnv(name: string): string { - if (process.env[name] === undefined || process.env[name] === '') { - console.error(`ENV var '${name}' was not defined`) - throw new Error(`ENV var '${name}' was not defined`) - } - return process.env[name] as string -} - -export function fetchEnvOrDefault(name: string, defaultValue: string): string { - return process.env[name] === undefined || process.env[name] === '' - ? 
defaultValue - : (process.env[name] as string) -} diff --git a/packages/metadata-crawler/src/logger.ts b/packages/metadata-crawler/src/logger.ts deleted file mode 100644 index d5a18a8288b..00000000000 --- a/packages/metadata-crawler/src/logger.ts +++ /dev/null @@ -1,30 +0,0 @@ -import Logger, { createLogger, levelFromName, LogLevelString, stdSerializers } from 'bunyan' -// @ts-ignore -import bunyanDebugStream from 'bunyan-debug-stream' -import { createStream } from 'bunyan-gke-stackdriver' -import { fetchEnvOrDefault } from './env' - -const logLevel = fetchEnvOrDefault('LOG_LEVEL', 'debug') as LogLevelString -const logFormat = fetchEnvOrDefault('LOG_FORMAT', 'human') - -let stream: any -switch (logFormat) { - case 'stackdriver': - stream = createStream(levelFromName[logLevel]) - break - case 'json': - stream = { stream: process.stdout, level: logLevel } - break - default: - stream = { level: logLevel, stream: bunyanDebugStream() } - break -} - -export const logger: Logger = createLogger({ - name: 'metadata-crawler', - serializers: stdSerializers, - streams: [stream], -}) - -export const operationalLogger = logger.child({ logger: 'operation' }) -export const dataLogger = logger.child({ logger: 'data' }) diff --git a/packages/metadata-crawler/tsconfig.json b/packages/metadata-crawler/tsconfig.json deleted file mode 100644 index 015d57d38e4..00000000000 --- a/packages/metadata-crawler/tsconfig.json +++ /dev/null @@ -1,8 +0,0 @@ -{ - "extends": "@tsconfig/recommended/tsconfig.json", - "compilerOptions": { - "rootDir": "src", - "outDir": "lib" - }, - "include": ["src", "types/", "index.d.ts"], -} diff --git a/packages/op-tooling/README.md b/packages/op-tooling/README.md new file mode 100644 index 00000000000..16d83bf5673 --- /dev/null +++ b/packages/op-tooling/README.md @@ -0,0 +1,12 @@ +# Collection of tools useful during Op releases + +Sections: +- [deposit](./deposit/) - scripts for performing L1 to L2 deposit via OptimismPortal +- [exec](./exec/) - scripts for 
executing upgrade transactions +- [fork](./fork/) - scripts for forking & mocking networks (most useful: `fork_l1.sh` & `mock-mainnet.sh`) +- [impls](./impls/) - scripts for deployment & upgrade of individual OpStack contracts +- [op-deployer](./op-deployer/) - scripts for interacting with the op-deployer upgrade pipeline (most useful: `run_upgrade.sh`) +- [safe](./safe/) - scripts for submitting transactions via the Safe API (requires that delegatecall transaction creation is not disabled over the API) +- [scripts](./scripts/) - scripts for interacting with OpStack contracts +- [verify](./verify/) - scripts for performing smart contract verification for OpStack +- [withdrawal](./withdrawal/) - scripts for performing L2 to L1 withdrawal via L2ToL1MessagePasser & OptimismPortal diff --git a/packages/op-tooling/deposit/README.md b/packages/op-tooling/deposit/README.md new file mode 100644 index 00000000000..b188982b713 --- /dev/null +++ b/packages/op-tooling/deposit/README.md @@ -0,0 +1,100 @@ +# L1 to L2 Deposits + +This directory contains tooling for performing L1 to L2 deposits of CELO to Celo's L2 networks. The workflow follows the Optimism-style deposit process using ERC20 token deposits. + +## Supported Networks + +The tooling supports two network configurations: + +- **Sepolia**: L1 (Ethereum Sepolia) ↔ L2 (Celo Sepolia) - *Testnet* +- **Mainnet**: L1 (Ethereum Mainnet) ↔ L2 (Celo Mainnet) + +Set the `NETWORK` environment variable to specify which network to use (`sepolia` or `mainnet`). This variable is **required**. + +## Important Notes + +- **Private Keys**: Always provide private keys without the `0x` prefix to all scripts +- **Values**: All VALUE parameters should be specified in wei +- **Approval**: The script automatically handles ERC20 token approval before the deposit +- **Timing**: Deposits typically take ~15 minutes to appear on L2 + +## Performing a Deposit + +The [deposit.sh](deposit.sh) script initiates the deposit process from L1 to L2 using the Optimism Portal contract. + +```sh +RECIPIENT=0x...
VALUE=1000000000000000000 PK=123... L1_RPC_URL=https://... ./deposit.sh +``` + +**Required Environment Variables:** +- `RECIPIENT`: L2 address that will receive the funds +- `VALUE`: Amount to deposit in wei +- `PK`: Private key (without 0x prefix) of the sender +- `L1_RPC_URL`: L1 RPC URL to use for the deposit +- `NETWORK`: Network to use (`sepolia` or `mainnet`) + +**Optional Environment Variables:** +- `GAS_LIMIT`: Gas limit for the L2 transaction (default: 100000) +- `IS_CREATION`: Whether this is a contract creation (default: false) +- `DATA`: Additional data to include (default: "0x00") + +**Output:** Transaction hashes for approval and deposit transactions + +## How It Works + +The deposit script performs three automated steps: + +1. **Retrieves Gas Paying Token**: Queries the System Config contract to get the gas paying token (CELO) address on L1 +2. **Approves Token**: Calls `approve()` on the gas paying token contract to allow the Optimism Portal to spend the specified amount +3. **Deposits Token**: Calls `depositERC20Transaction()` on the Optimism Portal to bridge tokens to L2 + +## Contract Addresses + +### Network-Specific Contract Addresses + +**Sepolia (L1: Ethereum Sepolia, L2: Celo Sepolia):** +- **SYSTEM_CONFIG**: `0x760a5f022c9940f4a074e0030be682f560d29818` (Ethereum Sepolia) +- **OPTIMISM_PORTAL**: `0x44ae3d41a335a7d05eb533029917aad35662dcc2` (Ethereum Sepolia) + +**Mainnet (L1: Ethereum Mainnet, L2: Celo Mainnet):** +- **SYSTEM_CONFIG**: `0x89E31965D844a309231B1f17759Ccaf1b7c09861` (Ethereum Mainnet) +- **OPTIMISM_PORTAL**: `0xc5c5D157928BDBD2ACf6d0777626b6C75a9EAEDC` (Ethereum Mainnet) + +**Note:** The [deposit.sh](deposit.sh) script supports both networks via the `NETWORK` environment variable. 
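Both the environment-variable list and the notes above expect `PK` without a `0x` prefix. A minimal sketch of a normalizing helper (`strip_0x` is a hypothetical name, not part of this tooling):

```shell
# Hypothetical helper: strip a leading 0x from a private key, since the
# deposit scripts expect the bare hex form. Leaves already-bare keys unchanged.
strip_0x() {
  printf '%s\n' "${1#0x}"
}

strip_0x 0xdeadbeef   # -> deadbeef
strip_0x deadbeef     # -> deadbeef
```

A key exported elsewhere with a `0x` prefix could then be passed as `PK=$(strip_0x "$RAW_PK")`.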
+ +## Example Usage + +### Deposit 0.1 CELO to Sepolia (Testnet) + +```sh +# Using Sepolia - testnet +NETWORK=sepolia RECIPIENT=0x742d35Cc6634C0532925a3b844Bc9e7595f0bEb VALUE=100000000000000000 PK=your_private_key L1_RPC_URL=https://sepolia.infura.io/v3/YOUR_KEY ./deposit.sh +``` + +### Deposit 0.1 CELO to Mainnet + +```sh +# Using Mainnet +NETWORK=mainnet RECIPIENT=0x742d35Cc6634C0532925a3b844Bc9e7595f0bEb VALUE=100000000000000000 PK=your_private_key L1_RPC_URL=https://mainnet.infura.io/v3/YOUR_KEY ./deposit.sh +``` + +## Checking Deposit on L2 + +After initiating a deposit, you can check if it has arrived on L2 by: + +1. Monitoring the L2 transaction using a block explorer: + - Sepolia Testnet: https://sepolia.celoscan.io + - Celo Mainnet: https://celoscan.io + +## Troubleshooting + +- **Transaction failed**: Ensure you have enough CELO on L1 to cover the deposit amount plus gas fees +- **RPC errors**: Ensure your L1_RPC_URL is valid and accessible +- **Private key format**: Ensure PK is provided without 0x prefix +- **Value format**: Ensure VALUE is in wei (not ETH/CELO in decimal form) +- **Network errors**: Ensure NETWORK is set to one of: `sepolia`, `mainnet` +- **Deposit not appearing on L2**: Wait up to 15 minutes for the deposit to be processed + +## Related Documentation + +For the reverse operation (withdrawing from L2 to L1), see the [withdrawal README](../withdrawal/README.md). 
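The private-key format rule from the troubleshooting list can also be checked before invoking the script. A minimal sketch; the `pk_format_ok` helper is hypothetical, not part of the tooling:

```sh
# pk_format_ok: succeed only when the key is 64 hex chars with no 0x prefix,
# the format these scripts expect. Illustrative helper, not part of deposit.sh.
pk_format_ok() {
  [[ "$1" =~ ^[0-9a-fA-F]{64}$ ]]
}

PK_EXAMPLE=$(printf 'ab%.0s' {1..32})   # 64 hex chars, dummy value
pk_format_ok "$PK_EXAMPLE" && echo "PK format ok"
pk_format_ok "0x$PK_EXAMPLE" || echo "strip the 0x prefix first"
```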
diff --git a/packages/op-tooling/deposit/deposit.sh b/packages/op-tooling/deposit/deposit.sh new file mode 100755 index 00000000000..cc4c14f98d7 --- /dev/null +++ b/packages/op-tooling/deposit/deposit.sh @@ -0,0 +1,60 @@ +#!/bin/bash +set -euo pipefail + +# Determine network +NETWORK=${NETWORK:-}; [ -z "${NETWORK:-}" ] && echo "Need to set the NETWORK via env" && exit 1; +case $NETWORK in + sepolia) + SYSTEM_CONFIG=0x760a5f022c9940f4a074e0030be682f560d29818 + OPTIMISM_PORTAL=0x44ae3d41a335a7d05eb533029917aad35662dcc2 + ;; + mainnet) + SYSTEM_CONFIG=0x89E31965D844a309231B1f17759Ccaf1b7c09861 + OPTIMISM_PORTAL=0xc5c5D157928BDBD2ACf6d0777626b6C75a9EAEDC + ;; + *) + echo "Unsupported network: $NETWORK" + exit 1 + ;; +esac + +# Required environment variables +RECIPIENT=${RECIPIENT:-}; [ -z "${RECIPIENT:-}" ] && echo "Need to set the RECIPIENT via env" && exit 1; +VALUE=${VALUE:-}; [ -z "${VALUE:-}" ] && echo "Need to set the VALUE via env" && exit 1; +PK=${PK:-}; [ -z "${PK:-}" ] && echo "Need to set the PK via env" && exit 1; +L1_RPC_URL=${L1_RPC_URL:-}; [ -z "${L1_RPC_URL:-}" ] && echo "Need to set the L1_RPC_URL via env" && exit 1; + +# Optional environment variables +GAS_LIMIT=${GAS_LIMIT:-100000} +IS_CREATION=${IS_CREATION:-false} +DATA=${DATA:-"0x00"} + +# Retrieve gas paying token on L1 +echo "1. Retrieving gas paying token on L1..." +CELO_L1=$(cast call $SYSTEM_CONFIG "gasPayingToken()(address,uint8)" --rpc-url $L1_RPC_URL) +read -r CELO_L1 <<< "$CELO_L1" +echo " >>> Gas paying token on L1: $CELO_L1" + +# Give approval on L1 +echo "2. Giving approval on L1..." +cast send $CELO_L1 \ + "approve(address,uint256)" \ + $OPTIMISM_PORTAL \ + $VALUE \ + --private-key $PK \ + --rpc-url $L1_RPC_URL +echo " >>> Approval transaction sent." + +# Perform deposit on L1 +echo "3. Performing deposit on L1..." 
+cast send $OPTIMISM_PORTAL \ + "depositERC20Transaction(address,uint256,uint256,uint64,bool,bytes)" \ + $RECIPIENT \ + $VALUE \ + $VALUE \ + $GAS_LIMIT \ + $IS_CREATION \ + $DATA \ + --private-key $PK \ + --rpc-url $L1_RPC_URL +echo " >>> Deposit transaction sent." diff --git a/packages/op-tooling/exec/README.md b/packages/op-tooling/exec/README.md new file mode 100644 index 00000000000..0c45067ebab --- /dev/null +++ b/packages/op-tooling/exec/README.md @@ -0,0 +1,188 @@ +# Transaction Execution Scripts + +This directory contains scripts for executing OPCM upgrade transactions through the Celo Safe Multisig. + +## Prerequisites + +- Valid signatures from desired Multisig signatories for transaction execution + +## Scripts + +### `exec.sh` + +Generalized upgrade script for future migrations. Executes OPCM upgrade transactions through the complete multisig approval chain. + +**Features:** +- Executes OPCM upgrade transactions with proper multisig approvals +- Supports custom calldata and signatures +- Handles the complete approval chain: Mento → Council → cLabs → Parent +- Uses delegatecall for OPCM upgrades + +**Required Environment Variables:** +- `PK` - Private key for transaction execution +- `OPCM_ADDRESS` - Address of the Optimism Chain Manager contract +- `OPCM_UPGRADE_CALLDATA` - Calldata for the upgrade transaction +- `MENTO_SIG` - Signature from Mento multisig +- `COUNCIL_SIG` - Signature from Council multisig +- `CLABS_SIG` - Signature from cLabs multisig + +**Optional Environment Variables:** +- `RPC_URL` - RPC endpoint (defaults to `http://127.0.0.1:8545`) + +**Multisig Addresses:** +- **Parent Safe**: `0x4092A77bAF58fef0309452cEaCb09221e556E112` +- **cLabs Safe**: `0x9Eb44Da23433b5cAA1c87e35594D15FcEb08D34d` +- **Council Safe**: `0xC03172263409584f7860C25B6eB4985f0f6F4636` +- **Mento Safe**: `0xD1C635987B6Aa287361d08C6461491Fa9df087f2` + +**Example Execution:** +```bash +PK="0x..." OPCM_ADDRESS="0x..." OPCM_UPGRADE_CALLDATA="0x..." MENTO_SIG="0x..." 
COUNCIL_SIG="0x..." CLABS_SIG="0x..." ./exec.sh +``` + +### `exec-v2v3.sh` + +The final script used for the migrations from `1.8.0` to `2.0.0` and from `2.0.0` to `3.0.0`. Contains hardcoded transaction data and signatures for the specific v2 and v3 upgrades. + +**Features:** +- Hardcoded transaction data for v2 and v3 upgrades +- Pre-configured signatures from all multisig members +- Executes both v2 and v3 upgrades in sequence +- Uses deterministic nonces and contract addresses + +**Required Environment Variables:** +- `PK` - Private key for transaction execution + +**Optional Environment Variables:** +- `RPC_URL` - RPC endpoint (defaults to `http://127.0.0.1:8545`) + +**Version-Specific Data:** + +#### V2.0.0 Configuration +- **OPCM Address**: `0x597f110a3bee7f260b1657ab63c36d86b3740f36` +- **Parent Nonce**: 22 +- **cLabs Nonce**: 19 +- **Council Nonce**: 21 +- **Mento Nonce**: 2 +- **Prestate Hash**: `0x03b357b30095022ecbb44ef00d1de19df39cf69ee92a60683a6be2c6f8fe6a3e` + +#### V3.0.0 Configuration +- **OPCM Address**: `0x2e8cd74af534f5eeb53f889d92fd4220546a15e7` +- **Parent Nonce**: 23 +- **cLabs Nonce**: 20 +- **Council Nonce**: 22 +- **Mento Nonce**: 3 +- **Prestate Hash**: `0x034b32d11f017711ce7122ac71d87b1c6cc73e10a0dbd957d8b27f6360acaf8f` + +**Example Execution:** +```bash +PK="0x..." ./exec-v2v3.sh +``` + +### `exec-mocked.sh` + +A simplified, mocked simulation of the network upgrade, with support for supplying an arbitrary external account and signature. Designed for testing and development.
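Signatures in these scripts must be concatenated in ascending signer-address order (see the ordering checks in exec-mocked.sh). The comparison rule can be sketched as follows; `addr_lt` is an illustrative name, and the addresses are the Safe addresses from this README reused purely as example values:

```sh
# addr_lt: case-insensitive lexicographic "less than" for 0x-prefixed addresses.
# Mirrors the ascending-signer-order rule used when concatenating multisig sigs.
addr_lt() {
  local a="${1,,}" b="${2,,}"   # lowercase both sides before comparing
  [[ "$a" < "$b" ]]
}

A=0x9Eb44Da23433b5cAA1c87e35594D15FcEb08D34d
B=0xC03172263409584f7860C25B6eB4985f0f6F4636
addr_lt "$A" "$B" && echo "A's signature goes first"
```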
+ +**Features:** +- Mocked multisig environment for testing +- Support for external account signatures with enhanced validation +- Configurable approval or signing behavior +- Grand Child multisig support for Council team +- Simplified execution flow for development +- Sender address validation for security + +**Required Environment Variables:** +- `VERSION` - Target version (`v2` or `v3`) +- `PK` - Private key for transaction execution +- `SENDER` - Expected sender address (must match the address derived from `PK`) +- `SIGNER_1_PK` - Private key for first signer (unless external account used) +- `SIGNER_2_PK` - Private key for second signer +- `SIGNER_3_PK` - Private key for third signer (unless external account used) +- `SIGNER_4_PK` - Private key for fourth signer + +> **Important**: The signer addresses used in `exec-mocked.sh` must correspond exactly to the mocked signers configured in `mock-mainnet.sh`: +> - `SIGNER_1_PK` → `MOCKED_SIGNER_1` (cLabs team signer) +> - `SIGNER_2_PK` → `MOCKED_SIGNER_2` (cLabs team signer) +> - `SIGNER_3_PK` → `MOCKED_SIGNER_3` (Council team signer) +> - `SIGNER_4_PK` → `MOCKED_SIGNER_4` (Council team signer) +> +> The `SENDER` address is validated against the `PK` to ensure proper account control. + +**Optional Environment Variables:** +- `SIG` - External signature (used with `ACCOUNT` and `TEAM`) +- `ACCOUNT` - External account address +- `TEAM` - Team identifier (`clabs` or `council`) +- `GC_MULTISIG` - Grand Child multisig address (`council` team only) + +**Behavior Modes:** +- **`approve`**: Uses `approveHash` calls for multisig approvals +- **`sign`**: Uses signature-based execution (default) + +**Example Executions:** + +#### Basic Mocked Execution +```bash +VERSION="v3" PK="0x..." SENDER="0x..." SIGNER_1_PK="0x..." SIGNER_2_PK="0x..." SIGNER_3_PK="0x..." SIGNER_4_PK="0x..." ./exec-mocked.sh +``` + +#### With External cLabs Account +```bash +VERSION="v3" PK="0x..." SENDER="0x..." SIG="0x..." ACCOUNT="0x..." 
TEAM="clabs" SIGNER_2_PK="0x..." SIGNER_3_PK="0x..." SIGNER_4_PK="0x..." ./exec-mocked.sh +``` + +#### With External Council Account +```bash +VERSION="v3" PK="0x..." SENDER="0x..." SIG="0x..." ACCOUNT="0x..." TEAM="council" SIGNER_1_PK="0x..." SIGNER_2_PK="0x..." SIGNER_4_PK="0x..." ./exec-mocked.sh +``` + +#### With Mento Member from External Council Account and Grand Child Multisig +```bash +VERSION="v3" PK="0x..." SENDER="0x..." SIG="0x..." ACCOUNT="0x..." TEAM="council" GC_MULTISIG="0x..." SIGNER_1_PK="0x..." SIGNER_2_PK="0x..." SIGNER_4_PK="0x..." ./exec-mocked.sh +``` + +## Execution Flow + +### Standard Flow (exec.sh, exec-v2v3.sh) +1. **Mento Approval**: Approve Council transaction +2. **Council Approval**: Approve Parent transaction +3. **cLabs Approval**: Approve Parent transaction +4. **Parent Execution**: Execute OPCM upgrade + +### Mocked Flow (exec-mocked.sh) +1. **cLabs Approval**: Approve Parent transaction +2. **Council Approval**: Approve Parent transaction (with optional Grand Child) +3. 
**Parent Execution**: Execute OPCM upgrade + +## Transaction Parameters + +### Common Parameters +- **Value**: 0 ETH +- **Operation**: 1 (delegatecall) for OPCM upgrades +- **Safe Tx Gas**: 0 (unlimited) +- **Base Gas**: 0 +- **Gas Price**: 0 +- **Gas Token**: Zero address +- **Refund Receiver**: Zero address + +### Calldata Structure +The upgrade calldata follows the format: +``` +0xa4589780 + [chain configs array] + [system config proxy] + [proxy admin] + [prestate hash] +``` + +## Network Support + +| Network | Environment | Use Case | +|---------|-------------|----------| +| Local Fork | Development | Testing with mocked environment | +| Testnet | Staging | Pre-production validation | +| Mainnet | Production | Live network upgrades | + +## Notes + +- Signatures must be ordered by signer address for proper multisig execution +- The `exec-v2v3.sh` script contains production-ready transaction data and signatures +- The `exec-mocked.sh` script is designed for development and testing scenarios +- Gas limits are set to 16,000,000 for OPCM upgrade transactions +- Nonces must be sequential and match the current multisig state +- **Critical**: The signer private keys in `exec-mocked.sh` must correspond to the addresses configured in `mock-mainnet.sh` for proper multisig operation diff --git a/packages/op-tooling/exec/exec-mocked.sh b/packages/op-tooling/exec/exec-mocked.sh new file mode 100755 index 00000000000..1e167e06e7a --- /dev/null +++ b/packages/op-tooling/exec/exec-mocked.sh @@ -0,0 +1,252 @@ +#!/usr/bin/env bash +set -euo pipefail + +# optionally allow to specify signer +EXTERNAL_SIG=${SIG:-} +EXTERNAL_ACCOUNT=${ACCOUNT:-} +EXTERNAL_TEAM=${TEAM:-} +GRAND_CHILD_MULTISIG=${GC_MULTISIG:-} + +# determine if using internal signers +USE_INTERNAL_CLABS=$([ -z "$EXTERNAL_SIG" ] || [ -z "$EXTERNAL_ACCOUNT" ] || [ "$EXTERNAL_TEAM" != "clabs" ] && echo "true" || echo "false") +USE_INTERNAL_COUNCIL=$([ -z "$EXTERNAL_SIG" ] || [ -z "$EXTERNAL_ACCOUNT" ] || [ 
"$EXTERNAL_TEAM" != "council" ] && echo "true" || echo "false") + +if [ -n "$EXTERNAL_SIG" ] && [ -n "$EXTERNAL_ACCOUNT" ]; then + echo "Detected external account: $EXTERNAL_ACCOUNT" + case $EXTERNAL_TEAM in + "clabs"|"council") + echo "Detected valid team: $EXTERNAL_TEAM" + ;; + *) + echo "Invalid team: $EXTERNAL_TEAM" && exit 1 + ;; + esac + echo "External sig: $EXTERNAL_SIG" +fi +if [ -n "$GRAND_CHILD_MULTISIG" ] && [ "$EXTERNAL_TEAM" != "council" ]; then + echo "Grand Child multisig is only supported for the council team" && exit 1 +fi + +# required envs +[ -z "${VERSION:-}" ] && echo "Need to set the VERSION via env" && exit 1; +[ -z "${PK:-}" ] && echo "Need to set the PK via env" && exit 1; +[ -z "${SENDER:-}" ] && echo "Need to set the SENDER via env" && exit 1; +[ "$USE_INTERNAL_CLABS" = "true" ] && [ -z "${SIGNER_1_PK:-}" ] && echo "Need to set the SIGNER_1_PK via env" && exit 1; +[ -z "${SIGNER_2_PK:-}" ] && echo "Need to set the SIGNER_2_PK via env" && exit 1; +[ "$USE_INTERNAL_COUNCIL" = "true" ] && [ -z "${SIGNER_3_PK:-}" ] && echo "Need to set the SIGNER_3_PK via env" && exit 1; +[ -z "${SIGNER_4_PK:-}" ] && echo "Need to set the SIGNER_4_PK via env" && exit 1; + +# check version +case $VERSION in + "v2"|"v3"|"succinct") + echo "Detected supported version: $VERSION" + ;; + *) + echo "Invalid version: $VERSION" && exit 1 + ;; +esac + +# addresses +if [ "$SENDER" != "$(cast wallet address --private-key $PK)" ]; then + echo "Invalid PK"; exit 1; +fi +[ "$USE_INTERNAL_CLABS" = "true" ] && MOCKED_SIGNER_1=$(cast wallet address --private-key $SIGNER_1_PK) +MOCKED_SIGNER_2=$(cast wallet address --private-key $SIGNER_2_PK) +[ "$USE_INTERNAL_COUNCIL" = "true" ] && MOCKED_SIGNER_3=$(cast wallet address --private-key $SIGNER_3_PK) +MOCKED_SIGNER_4=$(cast wallet address --private-key $SIGNER_4_PK) + +# validate signer ordering (lowercase the 0x-prefixed addresses before comparing; +# ${var:2,,} is not valid bash) +if [ "$USE_INTERNAL_CLABS" = "true" ] && [[ ${MOCKED_SIGNER_1,,} > ${MOCKED_SIGNER_2,,} ]]; then + echo "Error: MOCKED_SIGNER_1
must be < MOCKED_SIGNER_2 (addresses must be in ascending order)" && exit 1 +fi +if [ "$USE_INTERNAL_COUNCIL" = "true" ] && [[ ${MOCKED_SIGNER_3,,} > ${MOCKED_SIGNER_4,,} ]]; then + echo "Error: MOCKED_SIGNER_3 must be < MOCKED_SIGNER_4 (addresses must be in ascending order)" && exit 1 +fi + +# rpc +RPC_URL=http://127.0.0.1:8545 + +# defaults +VALUE=0 +OP_CALL=0 +OP_DELEGATECALL=1 +SAFE_TX_GAS=0 +BASE_GAS=0 +GAS_PRICE=0 +GAS_TOKEN=0x0000000000000000000000000000000000000000 +REFUND_RECEIVER=0x95ffac468e37ddeef407ffef18f0cc9e86d8f13b + +# safes +PARENT_SAFE_ADDRESS=0x4092A77bAF58fef0309452cEaCb09221e556E112 +CLABS_SAFE_ADDRESS=0x9Eb44Da23433b5cAA1c87e35594D15FcEb08D34d +COUNCIL_SAFE_ADDRESS=0xC03172263409584f7860C25B6eB4985f0f6F4636 + +# tx data +# @dev TX_CALLDATA is the calldata for performing the tx... +# ...for v2 & v3: +# ...it was generated during last step of interaction with op-deployer (op-deployer upgrade) +# ...bytes4(keccak256(abi.encodePacked("upgrade((address,address,bytes32)[],bool)"))) = 0xa4589780 +# ...for succinct: +# ...it was generated via interaction between safe & multicall contract (multicall aggregate3) +# ...bytes4(keccak256(abi.encodePacked("aggregate3((address,bool,bytes)[])"))) = 0x82ad56cb +if [ "$VERSION" = "v2" ]; then + PARENT_SAFE_NONCE=22 + CLABS_SAFE_NONCE=19 + COUNCIL_SAFE_NONCE=21 + TARGET=0x597f110a3bee7f260b1657ab63c36d86b3740f36 + TX_CALLDATA=0xa458978000000000000000000000000000000000000000000000000000000000000000400000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000100000000000000000000000089e31965d844a309231b1f17759ccaf1b7c09861000000000000000000000000783a434532ee94667979213af1711505e8bfe37403b357b30095022ecbb44ef00d1de19df39cf69ee92a60683a6be2c6f8fe6a3e +elif [ "$VERSION" = "v3" ]; then + PARENT_SAFE_NONCE=23 + CLABS_SAFE_NONCE=20 + COUNCIL_SAFE_NONCE=22 + TARGET=0x2e8cd74af534f5eeb53f889d92fd4220546a15e7 + 
TX_CALLDATA=0xa458978000000000000000000000000000000000000000000000000000000000000000400000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000100000000000000000000000089e31965d844a309231b1f17759ccaf1b7c09861000000000000000000000000783a434532ee94667979213af1711505e8bfe374034b32d11f017711ce7122ac71d87b1c6cc73e10a0dbd957d8b27f6360acaf8f +elif [ "$VERSION" = "succinct" ]; then + PARENT_SAFE_NONCE=24 + CLABS_SAFE_NONCE=21 + COUNCIL_SAFE_NONCE=23 + TARGET=0xcA11bde05977b3631167028862bE2a173976CA11 + TX_CALLDATA=0x82ad56cb0000000000000000000000000000000000000000000000000000000000000020000000000000000000000000000000000000000000000000000000000000000200000000000000000000000000000000000000000000000000000000000000400000000000000000000000000000000000000000000000000000000000000120000000000000000000000000fbac162162f4009bb007c6debc36b1dac10af6830000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000006000000000000000000000000000000000000000000000000000000000000000441e334240000000000000000000000000000000000000000000000000000000000000002a000000000000000000000000000000000000000000000000002386f26fc1000000000000000000000000000000000000000000000000000000000000000000000000000000000000fbac162162f4009bb007c6debc36b1dac10af68300000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000060000000000000000000000000000000000000000000000000000000000000004414f6b1a3000000000000000000000000000000000000000000000000000000000000002a000000000000000000000000113f434f82ff82678ae7f69ea122791fe1f6b73e00000000000000000000000000000000000000000000000000000000 +fi + +echo "--- Parent prep ---" + +# parent hash +PARENT_TX_HASH=$(cast call $PARENT_SAFE_ADDRESS \ + "getTransactionHash(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,uint256)(bytes32)" \ + $TARGET $VALUE $TX_CALLDATA 
$OP_DELEGATECALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $PARENT_SAFE_NONCE \ + -r $RPC_URL +) +echo "Parent hash: $PARENT_TX_HASH" + +echo "--- cLabs part ---" + +# cLabs tx +APPROVE_ON_PARENT_CALLDATA=$(cast calldata 'approveHash(bytes32)' $PARENT_TX_HASH) +CLABS_TX_HASH=$(cast call $CLABS_SAFE_ADDRESS \ + "getTransactionHash(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,uint256)(bytes32)" \ + $PARENT_SAFE_ADDRESS $VALUE $APPROVE_ON_PARENT_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $CLABS_SAFE_NONCE \ + -r $RPC_URL +) +echo "cLabs hash: $CLABS_TX_HASH" + +# sign by cLabs +echo "Sign cLabs hash" +if [ "$USE_INTERNAL_CLABS" = "true" ]; then + CLABS_SIG_1=$(cast wallet sign --no-hash $CLABS_TX_HASH --private-key $SIGNER_1_PK) +else + CLABS_SIG_1=$EXTERNAL_SIG +fi +echo "Sig 1: $CLABS_SIG_1" +CLABS_SIG_2=$(cast wallet sign --no-hash $CLABS_TX_HASH --private-key $SIGNER_2_PK) +echo "Sig 2: $CLABS_SIG_2" +echo "cLabs hash signed" + +# concat cLabs sigs (external sig goes first only if its signer address sorts lower) +if [ "$USE_INTERNAL_CLABS" = "true" ]; then + CLABS_SIG=0x${CLABS_SIG_1:2}${CLABS_SIG_2:2} +elif [[ ${EXTERNAL_ACCOUNT,,} < ${MOCKED_SIGNER_2,,} ]]; then + CLABS_SIG=0x${EXTERNAL_SIG:2}${CLABS_SIG_2:2} +else + CLABS_SIG=0x${CLABS_SIG_2:2}${EXTERNAL_SIG:2} +fi + +# exec cLabs +echo "Exec cLabs approval" +echo "cLabs sig: $CLABS_SIG" +cast send $CLABS_SAFE_ADDRESS \ + "execTransaction(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,bytes)" \ + $PARENT_SAFE_ADDRESS $VALUE $APPROVE_ON_PARENT_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $CLABS_SIG \ + --private-key $PK \ + -r $RPC_URL +echo "cLabs approval executed" + +echo "--- Council part ---" + +# Council tx +COUNCIL_TX_HASH=$(cast call $COUNCIL_SAFE_ADDRESS \ + "getTransactionHash(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,uint256)(bytes32)" \ + $PARENT_SAFE_ADDRESS $VALUE $APPROVE_ON_PARENT_CALLDATA
$OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $COUNCIL_SAFE_NONCE \ + -r $RPC_URL +) +echo "Council hash: $COUNCIL_TX_HASH" + +# optional Grand child section +if [ -n "$GRAND_CHILD_MULTISIG" ]; then + echo "Detected Grand Child multisig: $GRAND_CHILD_MULTISIG" + + GRAND_CHILD_NONCE=$(cast call $GRAND_CHILD_MULTISIG "nonce()(uint256)" -r $RPC_URL) + echo "Grand Child nonce: $GRAND_CHILD_NONCE" + + APPROVE_ON_CHILD_CALLDATA=$(cast calldata 'approveHash(bytes32)' $COUNCIL_TX_HASH) + GRAND_CHILD_TX_HASH=$(cast call $GRAND_CHILD_MULTISIG \ + "getTransactionHash(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,uint256)(bytes32)" \ + $COUNCIL_SAFE_ADDRESS $VALUE $APPROVE_ON_CHILD_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $GRAND_CHILD_NONCE \ + -r $RPC_URL + ) + echo "Grand Child hash: $GRAND_CHILD_TX_HASH" + echo "Grand Child sig: $EXTERNAL_SIG" + cast send $GRAND_CHILD_MULTISIG \ + "execTransaction(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,bytes)" \ + $COUNCIL_SAFE_ADDRESS $VALUE $APPROVE_ON_CHILD_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $EXTERNAL_SIG \ + --private-key $PK \ + -r $RPC_URL + echo "Grand Child approval executed" +fi + +# sign by Council +echo "Sign Council hash" +if [ -z "$GRAND_CHILD_MULTISIG" ]; then + if [ "$USE_INTERNAL_COUNCIL" = "true" ]; then + COUNCIL_SIG_1=$(cast wallet sign --no-hash $COUNCIL_TX_HASH --private-key $SIGNER_3_PK) + else + COUNCIL_SIG_1=$EXTERNAL_SIG + fi + echo "Sig 1: $COUNCIL_SIG_1" +fi +COUNCIL_SIG_2=$(cast wallet sign --no-hash $COUNCIL_TX_HASH --private-key $SIGNER_4_PK) +echo "Sig 2: $COUNCIL_SIG_2" +echo "Council hash signed" + +# concat Council sigs +if [ -z "$GRAND_CHILD_MULTISIG" ]; then + if [ "$USE_INTERNAL_COUNCIL" = "true" ]; then + COUNCIL_SIG=0x${COUNCIL_SIG_1:2}${COUNCIL_SIG_2:2} + elif [[ ${EXTERNAL_ACCOUNT,,} < ${MOCKED_SIGNER_4,,} ]]; then + 
COUNCIL_SIG=0x${EXTERNAL_SIG:2}${COUNCIL_SIG_2:2} + else + COUNCIL_SIG=0x${COUNCIL_SIG_2:2}${EXTERNAL_SIG:2} + fi +else + if [[ ${GRAND_CHILD_MULTISIG,,} < ${MOCKED_SIGNER_4,,} ]]; then + COUNCIL_SIG=0x000000000000000000000000${GRAND_CHILD_MULTISIG:2}000000000000000000000000000000000000000000000000000000000000000001${COUNCIL_SIG_2:2} + else + COUNCIL_SIG=0x${COUNCIL_SIG_2:2}000000000000000000000000${GRAND_CHILD_MULTISIG:2}000000000000000000000000000000000000000000000000000000000000000001 + fi +fi + +# exec Council +echo "Exec Council approval" +echo "Council sig: $COUNCIL_SIG" +cast send $COUNCIL_SAFE_ADDRESS \ + "execTransaction(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,bytes)" \ + $PARENT_SAFE_ADDRESS $VALUE $APPROVE_ON_PARENT_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $COUNCIL_SIG \ + --private-key $PK \ + -r $RPC_URL +echo "Council approval executed" + +echo "--- Parent exec ---" + +# signature in format where signer is nested safe (https://github.com/safe-global/safe-smart-account/blob/main/contracts/Safe.sol#L349C17-L351C94) +PARENT_SIG=0x000000000000000000000000${CLABS_SAFE_ADDRESS:2}000000000000000000000000000000000000000000000000000000000000000001000000000000000000000000${COUNCIL_SAFE_ADDRESS:2}000000000000000000000000000000000000000000000000000000000000000001000000000000000000000000000000000000000000000000000000000000 +echo "Parent sig: $PARENT_SIG" + +# exec parent tx +echo "Exec tx" +cast send $PARENT_SAFE_ADDRESS \ + "execTransaction(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,bytes)" \ + $TARGET $VALUE $TX_CALLDATA $OP_DELEGATECALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $PARENT_SIG \ + --gas-limit 16000000 \ + --private-key $PK \ + -r $RPC_URL +echo "Tx executed" diff --git a/packages/op-tooling/exec/exec-succinct.sh b/packages/op-tooling/exec/exec-succinct.sh new file mode 100755 index 00000000000..e3e106456d8 --- /dev/null +++ 
b/packages/op-tooling/exec/exec-succinct.sh @@ -0,0 +1,212 @@ +#!/usr/bin/env bash +set -euo pipefail + +# get repo root +REPO_ROOT=$(git rev-parse --show-toplevel) + +# required decoded files +[ ! -f "$REPO_ROOT/secrets/.env.signers.succinct" ] && echo "Need to decode .env.signers.succinct.enc first" && exit 1; + +# load decoded signers +source "$REPO_ROOT/secrets/.env.signers.succinct" + +# required envs +[ -z "${PK:-}" ] && echo "Need to set the PK via env" && exit 1; +echo "Logged in as wallet: $(cast wallet address --private-key $PK)" + +# optional envs +RPC_URL=${RPC_URL:-"http://127.0.0.1:8545"} + +# safes +PARENT_SAFE_ADDRESS=0x4092A77bAF58fef0309452cEaCb09221e556E112 +CLABS_SAFE_ADDRESS=0x9Eb44Da23433b5cAA1c87e35594D15FcEb08D34d +COUNCIL_SAFE_ADDRESS=0xC03172263409584f7860C25B6eB4985f0f6F4636 +GC_SAFE_ADDRESS=0xD1C635987B6Aa287361d08C6461491Fa9df087f2 + +# clabs signers (sorted by addresses) + +# 09c +CLABS_SIGNER_09C=${CLABS_SIGNER_09C__ADDRESS} +CLABS_SIGNER_09C_SIG=${CLABS_SIGNER_09C__SIG} + +# 21e +CLABS_SIGNER_21E=${CLABS_SIGNER_21E__ADDRESS} +CLABS_SIGNER_21E_SIG=${CLABS_SIGNER_21E__SIG} +# 481 +CLABS_SIGNER_481=${CLABS_SIGNER_481__ADDRESS} +CLABS_SIGNER_481_SIG=${CLABS_SIGNER_481__SIG} + +# 4D8 +CLABS_SIGNER_4D8=${CLABS_SIGNER_4D8__ADDRESS} +CLABS_SIGNER_4D8_SIG=${CLABS_SIGNER_4D8__SIG} +# 8b4 +CLABS_SIGNER_8B4=${CLABS_SIGNER_8B4__ADDRESS} +CLABS_SIGNER_8B4_SIG=${CLABS_SIGNER_8B4__SIG} + +# E00 +CLABS_SIGNER_E00=${CLABS_SIGNER_E00__ADDRESS} +CLABS_SIGNER_E00_SIG=${CLABS_SIGNER_E00__SIG} + +# council signers (sorted by addresses) + +# 148 +COUNCIL_SIGNER_148=${COUNCIL_SIGNER_148__ADDRESS} +COUNCIL_SIGNER_148_SIG=${COUNCIL_SIGNER_148__SIG} + +# 5f7 +COUNCIL_SIGNER_5F7=${COUNCIL_SIGNER_5F7__ADDRESS} +COUNCIL_SIGNER_5F7_SIG=${COUNCIL_SIGNER_5F7__SIG} + +# 6FD +COUNCIL_SIGNER_6FD=${COUNCIL_SIGNER_6FD__ADDRESS} +COUNCIL_SIGNER_6FD_SIG=${COUNCIL_SIGNER_6FD__SIG} + +# b96 +COUNCIL_SIGNER_B96=${COUNCIL_SIGNER_B96__ADDRESS} 
+COUNCIL_SIGNER_B96_SIG=${COUNCIL_SIGNER_B96__SIG} + +# d0c +COUNCIL_SIGNER_D0C=${COUNCIL_SIGNER_D0C__ADDRESS} +COUNCIL_SIGNER_D0C_SIG=${COUNCIL_SIGNER_D0C__SIG} + +# signers of grand child: 0xD1C + +# c96 +GC_D1C_SIGNER_C96=${GC_D1C_SIGNER_C96__ADDRESS} +GC_D1C_SIGNER_C96_SIG=${GC_D1C_SIGNER_C96__SIG} + +# D80 +GC_D1C_SIGNER_D80=${GC_D1C_SIGNER_D80__ADDRESS} +GC_D1C_SIGNER_D80_SIG=${GC_D1C_SIGNER_D80__SIG} + +# defaults +VALUE=0 +OP_CALL=0 +OP_DELEGATECALL=1 +SAFE_TX_GAS=0 +BASE_GAS=0 +GAS_PRICE=0 +GAS_TOKEN=0x0000000000000000000000000000000000000000 +REFUND_RECEIVER=0x95ffac468e37ddeef407ffef18f0cc9e86d8f13b + +function performUpgrade() { + # tx data + # @dev CALLDATA is the calldata for performing the upgrade... + # ...it was generated using ConfigureDeploymentSafe script (celo-org/op-succinct repo) + # ...bytes4(keccak256(abi.encodePacked("aggregate3((address,bool,bytes)[])"))) = 0x82ad56cb + PARENT_SAFE_NONCE=24 + CLABS_SAFE_NONCE=21 + COUNCIL_SAFE_NONCE=23 + GC_SAFE_NONCE=5 + TARGET_ADDRESS=0xcA11bde05977b3631167028862bE2a173976CA11 + 
CALLDATA=0x82ad56cb0000000000000000000000000000000000000000000000000000000000000020000000000000000000000000000000000000000000000000000000000000000200000000000000000000000000000000000000000000000000000000000000400000000000000000000000000000000000000000000000000000000000000120000000000000000000000000fbac162162f4009bb007c6debc36b1dac10af6830000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000006000000000000000000000000000000000000000000000000000000000000000441e334240000000000000000000000000000000000000000000000000000000000000002a000000000000000000000000000000000000000000000000002386f26fc1000000000000000000000000000000000000000000000000000000000000000000000000000000000000fbac162162f4009bb007c6debc36b1dac10af68300000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000060000000000000000000000000000000000000000000000000000000000000004414f6b1a3000000000000000000000000000000000000000000000000000000000000002a000000000000000000000000113f434f82ff82678ae7f69ea122791fe1f6b73e00000000000000000000000000000000000000000000000000000000 + + echo "--- Parent prep ---" + + # parent hash + PARENT_TX_HASH=$(cast call $PARENT_SAFE_ADDRESS \ + "getTransactionHash(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,uint256)(bytes32)" \ + $TARGET_ADDRESS $VALUE $CALLDATA $OP_DELEGATECALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $PARENT_SAFE_NONCE \ + -r $RPC_URL + ) + echo "Parent hash: $PARENT_TX_HASH" + + echo "--- Council part ---" + + # council hash + APPROVE_PARENT_CALLDATA=$(cast calldata 'approveHash(bytes32)' $PARENT_TX_HASH) + COUNCIL_TX_HASH=$(cast call $COUNCIL_SAFE_ADDRESS \ + "getTransactionHash(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,uint256)(bytes32)" \ + $PARENT_SAFE_ADDRESS $VALUE $APPROVE_PARENT_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER 
$COUNCIL_SAFE_NONCE \ + -r $RPC_URL + ) + echo "Council hash: $COUNCIL_TX_HASH" + + echo "--- Grand child part ---" + + # gc hash + APPROVE_COUNCIL_CALLDATA=$(cast calldata 'approveHash(bytes32)' $COUNCIL_TX_HASH) + GC_TX_HASH=$(cast call $GC_SAFE_ADDRESS \ + "getTransactionHash(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,uint256)(bytes32)" \ + $COUNCIL_SAFE_ADDRESS $VALUE $APPROVE_COUNCIL_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $GC_SAFE_NONCE \ + -r $RPC_URL + ) + echo "Grand child hash: $GC_TX_HASH" + + # gc sig + GC_SIG=0x${GC_D1C_SIGNER_C96__SIG:2}${GC_D1C_SIGNER_D80__SIG:2} + echo "Grand child sig: $GC_SIG" + + echo "--- Grand child exec ---" + + # gc exec + cast send $GC_SAFE_ADDRESS \ + "execTransaction(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,bytes)" \ + $COUNCIL_SAFE_ADDRESS $VALUE $APPROVE_COUNCIL_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $GC_SIG \ + --private-key $PK \ + -r $RPC_URL + echo "Grand child executed" + + echo "--- Grand child done ---" + + # council sig + COUNCIL_SIG=0x${COUNCIL_SIGNER_148__SIG:2}${COUNCIL_SIGNER_5F7__SIG:2}${COUNCIL_SIGNER_6FD__SIG:2}${COUNCIL_SIGNER_B96__SIG:2}${COUNCIL_SIGNER_D0C__SIG:2}000000000000000000000000${GC_SAFE_ADDRESS:2}000000000000000000000000000000000000000000000000000000000000000001 + echo "Council sig: $COUNCIL_SIG" + + echo "--- Council exec ---" + + # council exec + cast send $COUNCIL_SAFE_ADDRESS \ + "execTransaction(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,bytes)" \ + $PARENT_SAFE_ADDRESS $VALUE $APPROVE_PARENT_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $COUNCIL_SIG \ + --private-key $PK \ + -r $RPC_URL + echo "Council executed" + + echo "--- Council done ---" + + echo "--- cLabs part ---" + + # clabs hash + CLABS_TX_HASH=$(cast call $CLABS_SAFE_ADDRESS \ + 
"getTransactionHash(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,uint256)(bytes32)" \ + $PARENT_SAFE_ADDRESS $VALUE $APPROVE_PARENT_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $CLABS_SAFE_NONCE \ + -r $RPC_URL + ) + echo "cLabs hash: $CLABS_TX_HASH" + + # clabs sig + CLABS_SIG=0x${CLABS_SIGNER_09C__SIG:2}${CLABS_SIGNER_21E__SIG:2}${CLABS_SIGNER_481__SIG:2}${CLABS_SIGNER_4D8__SIG:2}${CLABS_SIGNER_8B4__SIG:2}${CLABS_SIGNER_E00__SIG:2} + echo "cLabs sig: $CLABS_SIG" + + echo "--- cLabs exec ---" + + # clabs exec + cast send $CLABS_SAFE_ADDRESS \ + "execTransaction(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,bytes)" \ + $PARENT_SAFE_ADDRESS $VALUE $APPROVE_PARENT_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $CLABS_SIG \ + --private-key $PK \ + -r $RPC_URL + echo "cLabs executed" + + echo "--- cLabs done ---" + + echo "--- Parent exec ---" + + # exec parent tx + echo "Exec upgrade" + # signature in format where signer is nested safe (https://github.com/safe-global/safe-smart-account/blob/main/contracts/Safe.sol#L349C17-L351C94) + PARENT_SIG=0x000000000000000000000000${CLABS_SAFE_ADDRESS:2}000000000000000000000000000000000000000000000000000000000000000001000000000000000000000000${COUNCIL_SAFE_ADDRESS:2}000000000000000000000000000000000000000000000000000000000000000001000000000000000000000000000000000000000000000000000000000000 + echo "Parent sig: $PARENT_SIG" + cast send $PARENT_SAFE_ADDRESS \ + "execTransaction(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,bytes)" \ + $TARGET_ADDRESS $VALUE $CALLDATA $OP_DELEGATECALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $PARENT_SIG \ + --gas-limit 16000000 \ + --private-key $PK \ + -r $RPC_URL + echo "Upgrade executed" + + echo "--- Parent done ---" +} + +performUpgrade diff --git a/packages/op-tooling/exec/exec-v2v3.sh b/packages/op-tooling/exec/exec-v2v3.sh new file mode 
100755 index 00000000000..6d8eec84d0a --- /dev/null +++ b/packages/op-tooling/exec/exec-v2v3.sh @@ -0,0 +1,267 @@ +#!/usr/bin/env bash +set -euo pipefail + +# get repo root +REPO_ROOT=$(git rev-parse --show-toplevel) + +# required decoded files +[ ! -f "$REPO_ROOT/secrets/.env.signers.v2" ] && echo "Need to decode .env.signers.v2.enc first" && exit 1; +[ ! -f "$REPO_ROOT/secrets/.env.signers.v3" ] && echo "Need to decode .env.signers.v3.enc first" && exit 1; + +# load decoded signers +source "$REPO_ROOT/secrets/.env.signers.v2" +source "$REPO_ROOT/secrets/.env.signers.v3" + +# required envs +[ -z "${PK:-}" ] && echo "Need to set the PK via env" && exit 1; +echo "Logged in as wallet: $(cast wallet address --private-key $PK)" + +# optional envs +RPC_URL=${RPC_URL:-"http://127.0.0.1:8545"} + +# safes +PARENT_SAFE_ADDRESS=0x4092A77bAF58fef0309452cEaCb09221e556E112 +CLABS_SAFE_ADDRESS=0x9Eb44Da23433b5cAA1c87e35594D15FcEb08D34d +COUNCIL_SAFE_ADDRESS=0xC03172263409584f7860C25B6eB4985f0f6F4636 +GC_SAFE_ADDRESS=0xD1C635987B6Aa287361d08C6461491Fa9df087f2 + +# clabs signers (sorted by addresses) + +# 09c +CLABS_SIGNER_09C=${CLABS_SIGNER_09C__ADDRESS} +CLABS_SIGNER_09C_SIG_V2=${CLABS_SIGNER_09C__SIG_V2} +CLABS_SIGNER_09C_SIG_V3=${CLABS_SIGNER_09C__SIG_V3} + +# 21e +CLABS_SIGNER_21E=${CLABS_SIGNER_21E__ADDRESS} +CLABS_SIGNER_21E_SIG_V2=${CLABS_SIGNER_21E__SIG_V2} +CLABS_SIGNER_21E_SIG_V3=${CLABS_SIGNER_21E__SIG_V3} + +# 481 +CLABS_SIGNER_481=${CLABS_SIGNER_481__ADDRESS} +CLABS_SIGNER_481_SIG_V2=${CLABS_SIGNER_481__SIG_V2} +CLABS_SIGNER_481_SIG_V3=${CLABS_SIGNER_481__SIG_V3} + +# 4D8 +CLABS_SIGNER_4D8=${CLABS_SIGNER_4D8__ADDRESS} +CLABS_SIGNER_4D8_SIG_V2=${CLABS_SIGNER_4D8__SIG_V2} +CLABS_SIGNER_4D8_SIG_V3=${CLABS_SIGNER_4D8__SIG_V3} + +# 8b4 +CLABS_SIGNER_8B4=${CLABS_SIGNER_8B4__ADDRESS} +CLABS_SIGNER_8B4_SIG_V2=${CLABS_SIGNER_8B4__SIG_V2} +CLABS_SIGNER_8B4_SIG_V3=${CLABS_SIGNER_8B4__SIG_V3} + +# E00 +CLABS_SIGNER_E00=${CLABS_SIGNER_E00__ADDRESS} 
+CLABS_SIGNER_E00_SIG_V2=${CLABS_SIGNER_E00__SIG_V2} +CLABS_SIGNER_E00_SIG_V3=${CLABS_SIGNER_E00__SIG_V3} + +# council signers (sorted by addresses) + +# 148 +COUNCIL_SIGNER_148=${COUNCIL_SIGNER_148__ADDRESS} +COUNCIL_SIGNER_148_SIG_V2=${COUNCIL_SIGNER_148__SIG_V2} +COUNCIL_SIGNER_148_SIG_V3=${COUNCIL_SIGNER_148__SIG_V3} + +# 2BE +COUNCIL_SIGNER_2BE=${COUNCIL_SIGNER_2BE__ADDRESS} +COUNCIL_SIGNER_2BE_SIG_V2=${COUNCIL_SIGNER_2BE__SIG_V2} +COUNCIL_SIGNER_2BE_SIG_V3=${COUNCIL_SIGNER_2BE__SIG_V3} + +# 6FD +COUNCIL_SIGNER_6FD=${COUNCIL_SIGNER_6FD__ADDRESS} +COUNCIL_SIGNER_6FD_SIG_V2=${COUNCIL_SIGNER_6FD__SIG_V2} +COUNCIL_SIGNER_6FD_SIG_V3=${COUNCIL_SIGNER_6FD__SIG_V3} + +# b96 +COUNCIL_SIGNER_B96=${COUNCIL_SIGNER_B96__ADDRESS} +COUNCIL_SIGNER_B96_SIG_V2=${COUNCIL_SIGNER_B96__SIG_V2} +COUNCIL_SIGNER_B96_SIG_V3=${COUNCIL_SIGNER_B96__SIG_V3} + +# d0c +COUNCIL_SIGNER_D0C=${COUNCIL_SIGNER_D0C__ADDRESS} +COUNCIL_SIGNER_D0C_SIG_V2=${COUNCIL_SIGNER_D0C__SIG_V2} +COUNCIL_SIGNER_D0C_SIG_V3=${COUNCIL_SIGNER_D0C__SIG_V3} + +# signers of grand child: 0xD1C + +# c96 +GC_D1C_SIGNER_C96=${GC_D1C_SIGNER_C96__ADDRESS} +GC_D1C_SIGNER_C96_SIG_V2=${GC_D1C_SIGNER_C96__SIG_V2} +GC_D1C_SIGNER_C96_SIG_V3=${GC_D1C_SIGNER_C96__SIG_V3} + +# D80 +GC_D1C_SIGNER_D80=${GC_D1C_SIGNER_D80__ADDRESS} +GC_D1C_SIGNER_D80_SIG_V2=${GC_D1C_SIGNER_D80__SIG_V2} +GC_D1C_SIGNER_D80_SIG_V3=${GC_D1C_SIGNER_D80__SIG_V3} + +# defaults +VALUE=0 +OP_CALL=0 +OP_DELEGATECALL=1 +SAFE_TX_GAS=0 +BASE_GAS=0 +GAS_PRICE=0 +GAS_TOKEN=0x0000000000000000000000000000000000000000 +REFUND_RECEIVER=0x0000000000000000000000000000000000000000 + +function performUpgrade() { + # params + VERSION=$1 + + # check version + case $VERSION in + "v2"|"v3") + echo "Detected supported version: $VERSION" + ;; + *) + echo "Invalid version: $VERSION" && exit 1 + ;; + esac + + # tx data + # @dev OPCM_UPGRADE_CALLDATA is the calldata for performing the upgrade through OPCM... 
+# ...it was generated during last step of interaction with op-deployer (op-deployer upgrade) +# ...bytes4(keccak256(abi.encodePacked("upgrade((address,address,bytes32)[],bool)"))) = 0xa4589780 + if [ $VERSION = 'v2' ]; then + PARENT_SAFE_NONCE=22 + CLABS_SAFE_NONCE=19 + COUNCIL_SAFE_NONCE=21 + GC_SAFE_NONCE=2 + OPCM_ADDRESS=0x597f110a3bee7f260b1657ab63c36d86b3740f36 + OPCM_UPGRADE_CALLDATA=0xa458978000000000000000000000000000000000000000000000000000000000000000400000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000100000000000000000000000089e31965d844a309231b1f17759ccaf1b7c09861000000000000000000000000783a434532ee94667979213af1711505e8bfe37403b357b30095022ecbb44ef00d1de19df39cf69ee92a60683a6be2c6f8fe6a3e + else + PARENT_SAFE_NONCE=23 + CLABS_SAFE_NONCE=20 + COUNCIL_SAFE_NONCE=22 + GC_SAFE_NONCE=3 + OPCM_ADDRESS=0x2e8cd74af534f5eeb53f889d92fd4220546a15e7 + OPCM_UPGRADE_CALLDATA=0xa458978000000000000000000000000000000000000000000000000000000000000000400000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000100000000000000000000000089e31965d844a309231b1f17759ccaf1b7c09861000000000000000000000000783a434532ee94667979213af1711505e8bfe374034b32d11f017711ce7122ac71d87b1c6cc73e10a0dbd957d8b27f6360acaf8f + fi + + echo "--- Parent prep ---" + + # parent hash + PARENT_TX_HASH=$(cast call $PARENT_SAFE_ADDRESS \ + "getTransactionHash(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,uint256)(bytes32)" \ + $OPCM_ADDRESS $VALUE $OPCM_UPGRADE_CALLDATA $OP_DELEGATECALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $PARENT_SAFE_NONCE \ + -r $RPC_URL + ) + echo "Parent hash: $PARENT_TX_HASH" + + echo "--- Council part ---" + + # council hash + APPROVE_PARENT_CALLDATA=$(cast calldata 'approveHash(bytes32)' $PARENT_TX_HASH) + COUNCIL_TX_HASH=$(cast call $COUNCIL_SAFE_ADDRESS \ + 
"getTransactionHash(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,uint256)(bytes32)" \ + $PARENT_SAFE_ADDRESS $VALUE $APPROVE_PARENT_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $COUNCIL_SAFE_NONCE \ + -r $RPC_URL + ) + echo "Council hash: $COUNCIL_TX_HASH" + + echo "--- Grand child part ---" + + # gc hash + APPROVE_COUNCIL_CALLDATA=$(cast calldata 'approveHash(bytes32)' $COUNCIL_TX_HASH) + GC_TX_HASH=$(cast call $GC_SAFE_ADDRESS \ + "getTransactionHash(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,uint256)(bytes32)" \ + $COUNCIL_SAFE_ADDRESS $VALUE $APPROVE_COUNCIL_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $GC_SAFE_NONCE \ + -r $RPC_URL + ) + echo "Grand child hash: $GC_TX_HASH" + + # gc sig + if [ $VERSION = "v2" ]; then + GC_SIG=0x${GC_D1C_SIGNER_C96__SIG_V2:2}${GC_D1C_SIGNER_D80__SIG_V2:2} + else + GC_SIG=0x${GC_D1C_SIGNER_C96__SIG_V3:2}${GC_D1C_SIGNER_D80__SIG_V3:2} + fi + echo "Grand child sig: $GC_SIG" + + echo "--- Grand child exec ---" + + # gc exec + cast send $GC_SAFE_ADDRESS \ + "execTransaction(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,bytes)" \ + $COUNCIL_SAFE_ADDRESS $VALUE $APPROVE_COUNCIL_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $GC_SIG \ + --private-key $PK \ + -r $RPC_URL + echo "Grand child executed" + + echo "--- Grand child done ---" + + # council sig + if [ $VERSION = "v2" ]; then + COUNCIL_SIG=0x${COUNCIL_SIGNER_148__SIG_V2:2}${COUNCIL_SIGNER_2BE__SIG_V2:2}${COUNCIL_SIGNER_6FD__SIG_V2:2}${COUNCIL_SIGNER_B96__SIG_V2:2}${COUNCIL_SIGNER_D0C__SIG_V2:2}000000000000000000000000${GC_SAFE_ADDRESS:2}000000000000000000000000000000000000000000000000000000000000000001 + else + 
COUNCIL_SIG=0x${COUNCIL_SIGNER_148__SIG_V3:2}${COUNCIL_SIGNER_2BE__SIG_V3:2}${COUNCIL_SIGNER_6FD__SIG_V3:2}${COUNCIL_SIGNER_B96__SIG_V3:2}${COUNCIL_SIGNER_D0C__SIG_V3:2}000000000000000000000000${GC_SAFE_ADDRESS:2}000000000000000000000000000000000000000000000000000000000000000001 + fi + echo "Council sig: $COUNCIL_SIG" + + echo "--- Council exec ---" + + # council exec + cast send $COUNCIL_SAFE_ADDRESS \ + "execTransaction(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,bytes)" \ + $PARENT_SAFE_ADDRESS $VALUE $APPROVE_PARENT_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $COUNCIL_SIG \ + --private-key $PK \ + -r $RPC_URL + echo "Council executed" + + echo "--- Council done ---" + + echo "--- cLabs part ---" + + # clabs hash + CLABS_TX_HASH=$(cast call $CLABS_SAFE_ADDRESS \ + "getTransactionHash(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,uint256)(bytes32)" \ + $PARENT_SAFE_ADDRESS $VALUE $APPROVE_PARENT_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $CLABS_SAFE_NONCE \ + -r $RPC_URL + ) + echo "cLabs hash: $CLABS_TX_HASH" + + # clabs sig + if [ $VERSION = "v2" ]; then + CLABS_SIG=0x${CLABS_SIGNER_09C__SIG_V2:2}${CLABS_SIGNER_21E__SIG_V2:2}${CLABS_SIGNER_481__SIG_V2:2}${CLABS_SIGNER_4D8__SIG_V2:2}${CLABS_SIGNER_8B4__SIG_V2:2}${CLABS_SIGNER_E00__SIG_V2:2} + else + CLABS_SIG=0x${CLABS_SIGNER_09C__SIG_V3:2}${CLABS_SIGNER_21E__SIG_V3:2}${CLABS_SIGNER_481__SIG_V3:2}${CLABS_SIGNER_4D8__SIG_V3:2}${CLABS_SIGNER_8B4__SIG_V3:2}${CLABS_SIGNER_E00__SIG_V3:2} + fi + echo "cLabs sig: $CLABS_SIG" + + echo "--- cLabs exec ---" + + # clabs exec + cast send $CLABS_SAFE_ADDRESS \ + "execTransaction(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,bytes)" \ + $PARENT_SAFE_ADDRESS $VALUE $APPROVE_PARENT_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $CLABS_SIG \ + --private-key $PK \ + -r $RPC_URL + echo "cLabs executed" + + 
echo "--- cLabs done ---" + + echo "--- Parent exec ---" + + # exec parent tx + echo "Exec OPCM upgrade" + # signature in format where signer is nested safe (https://github.com/safe-global/safe-smart-account/blob/main/contracts/Safe.sol#L349C17-L351C94) + PARENT_SIG=0x000000000000000000000000${CLABS_SAFE_ADDRESS:2}000000000000000000000000000000000000000000000000000000000000000001000000000000000000000000${COUNCIL_SAFE_ADDRESS:2}000000000000000000000000000000000000000000000000000000000000000001000000000000000000000000000000000000000000000000000000000000 + echo "Parent sig: $PARENT_SIG" + cast send $PARENT_SAFE_ADDRESS \ + "execTransaction(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,bytes)" \ + $OPCM_ADDRESS $VALUE $OPCM_UPGRADE_CALLDATA $OP_DELEGATECALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $PARENT_SIG \ + --gas-limit 16000000 \ + --private-key $PK \ + -r $RPC_URL + echo "OPCM upgrade executed" + + echo "--- Parent done ---" +} + +echo "--------- V2 ---------" +performUpgrade "v2" +echo "--------- V3 ---------" +performUpgrade "v3" +echo "--------- EOF ---------" diff --git a/packages/op-tooling/exec/exec.sh b/packages/op-tooling/exec/exec.sh new file mode 100755 index 00000000000..a28f616898e --- /dev/null +++ b/packages/op-tooling/exec/exec.sh @@ -0,0 +1,148 @@ +#!/usr/bin/env bash +set -euo pipefail + +# required envs +[ -z "${PK:-}" ] && echo "Need to set the PK via env" && exit 1; +[ -z "${OPCM_ADDRESS:-}" ] && echo "Need to set the OPCM_ADDRESS via env" && exit 1; +[ -z "${OPCM_UPGRADE_CALLDATA:-}" ] && echo "Need to set the OPCM_UPGRADE_CALLDATA via env" && exit 1; +[ -z "${GC_SIG:-}" ] && echo "Need to set the GC_SIG via env" && exit 1; +[ -z "${COUNCIL_SIG:-}" ] && echo "Need to set the COUNCIL_SIG via env" && exit 1; +[ -z "${CLABS_SIG:-}" ] && echo "Need to set the CLABS_SIG via env" && exit 1; +echo "Logged in as wallet: $(cast wallet address --private-key $PK)" +echo "Detected OPCM under address: 
$OPCM_ADDRESS" +echo "Detected upgrade calldata: $OPCM_UPGRADE_CALLDATA" + +# optional envs +RPC_URL=${RPC_URL:-"http://127.0.0.1:8545"} + +# safes +PARENT_SAFE_ADDRESS=0x4092A77bAF58fef0309452cEaCb09221e556E112 +CLABS_SAFE_ADDRESS=0x9Eb44Da23433b5cAA1c87e35594D15FcEb08D34d +COUNCIL_SAFE_ADDRESS=0xC03172263409584f7860C25B6eB4985f0f6F4636 +GC_SAFE_ADDRESS=0xD1C635987B6Aa287361d08C6461491Fa9df087f2 + +# defaults +VALUE=0 +OP_CALL=0 +OP_DELEGATECALL=1 +SAFE_TX_GAS=0 +BASE_GAS=0 +GAS_PRICE=0 +GAS_TOKEN=0x0000000000000000000000000000000000000000 +REFUND_RECEIVER=0x0000000000000000000000000000000000000000 + +function performUpgrade() { + # fetch current nonces + PARENT_SAFE_NONCE=$(cast call $PARENT_SAFE_ADDRESS "nonce()(uint256)" -r $RPC_URL) + CLABS_SAFE_NONCE=$(cast call $CLABS_SAFE_ADDRESS "nonce()(uint256)" -r $RPC_URL) + COUNCIL_SAFE_NONCE=$(cast call $COUNCIL_SAFE_ADDRESS "nonce()(uint256)" -r $RPC_URL) + GC_SAFE_NONCE=$(cast call $GC_SAFE_ADDRESS "nonce()(uint256)" -r $RPC_URL) + + echo "--- Parent prep ---" + + # parent hash + PARENT_TX_HASH=$(cast call $PARENT_SAFE_ADDRESS \ + "getTransactionHash(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,uint256)(bytes32)" \ + $OPCM_ADDRESS $VALUE $OPCM_UPGRADE_CALLDATA $OP_DELEGATECALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $PARENT_SAFE_NONCE \ + -r $RPC_URL + ) + echo "Parent hash: $PARENT_TX_HASH" + + echo "--- Council part ---" + + # council hash + APPROVE_PARENT_CALLDATA=$(cast calldata 'approveHash(bytes32)' $PARENT_TX_HASH) + COUNCIL_TX_HASH=$(cast call $COUNCIL_SAFE_ADDRESS \ + "getTransactionHash(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,uint256)(bytes32)" \ + $PARENT_SAFE_ADDRESS $VALUE $APPROVE_PARENT_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $COUNCIL_SAFE_NONCE \ + -r $RPC_URL + ) + echo "Council hash: $COUNCIL_TX_HASH" + + echo "--- Grand child part ---" + + # gc hash + APPROVE_COUNCIL_CALLDATA=$(cast 
calldata 'approveHash(bytes32)' $COUNCIL_TX_HASH) + GC_TX_HASH=$(cast call $GC_SAFE_ADDRESS \ + "getTransactionHash(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,uint256)(bytes32)" \ + $COUNCIL_SAFE_ADDRESS $VALUE $APPROVE_COUNCIL_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $GC_SAFE_NONCE \ + -r $RPC_URL + ) + echo "Grand child hash: $GC_TX_HASH" + + # gc sig + echo "Grand child sig: $GC_SIG" + + echo "--- Grand child exec ---" + + # gc exec + cast send $GC_SAFE_ADDRESS \ + "execTransaction(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,bytes)" \ + $COUNCIL_SAFE_ADDRESS $VALUE $APPROVE_COUNCIL_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $GC_SIG \ + --private-key $PK \ + -r $RPC_URL + echo "Grand child executed" + + echo "--- Grand child done ---" + + # council sig + echo "Council sig: $COUNCIL_SIG" + + echo "--- Council exec ---" + + # council exec + cast send $COUNCIL_SAFE_ADDRESS \ + "execTransaction(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,bytes)" \ + $PARENT_SAFE_ADDRESS $VALUE $APPROVE_PARENT_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $COUNCIL_SIG \ + --private-key $PK \ + -r $RPC_URL + echo "Council executed" + + echo "--- Council done ---" + + echo "--- cLabs part ---" + + # clabs hash + CLABS_TX_HASH=$(cast call $CLABS_SAFE_ADDRESS \ + "getTransactionHash(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,uint256)(bytes32)" \ + $PARENT_SAFE_ADDRESS $VALUE $APPROVE_PARENT_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $CLABS_SAFE_NONCE \ + -r $RPC_URL + ) + echo "cLabs hash: $CLABS_TX_HASH" + + # clabs sig + echo "cLabs sig: $CLABS_SIG" + + echo "--- cLabs exec ---" + + # clabs exec + cast send $CLABS_SAFE_ADDRESS \ + "execTransaction(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,bytes)" \ + $PARENT_SAFE_ADDRESS 
$VALUE $APPROVE_PARENT_CALLDATA $OP_CALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $CLABS_SIG \ + --private-key $PK \ + -r $RPC_URL + echo "cLabs executed" + + echo "--- cLabs done ---" + + echo "--- Parent exec ---" + + # exec parent tx + echo "Exec OPCM upgrade" + # signature in format where signer is nested safe (https://github.com/safe-global/safe-smart-account/blob/main/contracts/Safe.sol#L349C17-L351C94) + PARENT_SIG=0x000000000000000000000000${CLABS_SAFE_ADDRESS:2}000000000000000000000000000000000000000000000000000000000000000001000000000000000000000000${COUNCIL_SAFE_ADDRESS:2}000000000000000000000000000000000000000000000000000000000000000001000000000000000000000000000000000000000000000000000000000000 + echo "Parent sig: $PARENT_SIG" + cast send $PARENT_SAFE_ADDRESS \ + "execTransaction(address,uint256,bytes,uint8,uint256,uint256,uint256,address,address,bytes)" \ + $OPCM_ADDRESS $VALUE $OPCM_UPGRADE_CALLDATA $OP_DELEGATECALL $SAFE_TX_GAS $BASE_GAS $GAS_PRICE $GAS_TOKEN $REFUND_RECEIVER $PARENT_SIG \ + --gas-limit 16000000 \ + --private-key $PK \ + -r $RPC_URL + echo "OPCM upgrade executed" + + echo "--- Parent done ---" +} + +echo "--------- START ---------" +performUpgrade +echo "--------- END ---------" diff --git a/packages/op-tooling/fork/README.md b/packages/op-tooling/fork/README.md new file mode 100644 index 00000000000..5a04d28b0bf --- /dev/null +++ b/packages/op-tooling/fork/README.md @@ -0,0 +1,128 @@ +# Network Forking Scripts + +This directory contains scripts for forking various blockchain networks for local development and testing purposes. + +## Scripts + +### `fork_l1.sh` + +Forks an L1 network (Ethereum mainnet, Sepolia testnet, or Holesky testnet) using Anvil. 
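The chain-ID selection and Alchemy-URL fallback that `fork_l1.sh` performs can be sketched in Python; this is an illustrative sketch with values copied from the script, and the `fork_rpc_url` helper name is ours, not part of the tooling:

```python
# Sketch of fork_l1.sh's NETWORK -> chain-id mapping and RPC URL fallback.
# Chain IDs and the Alchemy URL template are copied from the script.
CHAIN_IDS = {"mainnet": 1, "sepolia": 11155111, "holesky": 17000}

def fork_rpc_url(network: str, alchemy_api_key: str) -> str:
    """Build the Alchemy fork URL the script defaults to when RPC_URL is unset."""
    if network not in CHAIN_IDS:
        # Mirrors the script's "Unsupported network" exit path
        raise ValueError(f"Unsupported network: {network}")
    return f"https://eth-{network}.g.alchemy.com/v2/{alchemy_api_key}"

print(fork_rpc_url("sepolia", "my-key"))
```

When `RPC_URL` is exported, the script bypasses this Alchemy template entirely and forks from the provided endpoint.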
+ +**Required Environment Variables:** +- `ALCHEMY_API_KEY` - Your Alchemy API key (not needed if `RPC_URL` is set) +- `NETWORK` - Network to fork (`mainnet`, `sepolia`, or `holesky`) +- `BLOCK_NUMBER` - Block number to fork from + +**Supported Networks:** +- **mainnet**: Chain ID 1 +- **sepolia**: Chain ID 11155111 +- **holesky**: Chain ID 17000 + +**Example Execution:** +```bash +ALCHEMY_API_KEY="..." NETWORK="mainnet" BLOCK_NUMBER="..." ./fork_l1.sh +``` + +The script will start an Anvil instance on port 8545 with the specified network forked. + +### `fork_l2.sh` + +Forks the Celo Sepolia testnet (L2) using Anvil. + +**No environment variables required.** + +**Network Details:** +- **Network**: Celo Sepolia testnet +- **Chain ID**: 11142220 +- **Block Number**: 9055818 +- **RPC URL**: https://forno.celo-sepolia.celo-testnet.org + +**Example Execution:** +```bash +./fork_l2.sh +``` + +The script will start an Anvil instance on port 8546 with the Celo Sepolia testnet forked. + +### `mock-mainnet.sh` + +Sets up a mocked mainnet environment with predefined multisig configurations and account balances. This script is designed to work with a running Anvil instance (typically started by `fork_l1.sh`).
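The balance and storage-slot constants that `mock-mainnet.sh` writes via `anvil_setBalance`/`anvil_setStorageAt` can be sanity-checked with a short Python sketch; values and slot numbers are copied from the script (they correspond to the Safe `OwnerManager` layout the script assumes), and the helper names here are ours:

```python
# Sanity checks for the constants mock-mainnet.sh writes to the fork.
WEI_PER_ETH = 10**18

# anvil_setBalance value used for every mocked account: 10,000 ETH
assert int("0x21e19e0c9bab2400000", 16) == 10_000 * WEI_PER_ETH

# Storage slots patched on each Safe via anvil_setStorageAt
OWNERS_MAPPING_SLOT = 2  # owners circular linked list (cast index address <owner> 2)
OWNER_COUNT_SLOT = 3     # ownerCount
THRESHOLD_SLOT = 4       # threshold

def storage_word(n: int) -> str:
    """Encode an integer as the 32-byte hex word anvil_setStorageAt expects."""
    return "0x" + n.to_bytes(32, "big").hex()

# threshold = 2, as written for the Parent/cLabs/Council multisigs
assert storage_word(2).endswith("02") and len(storage_word(2)) == 66
print(storage_word(THRESHOLD_SLOT))
```

The same `storage_word` shape is what the script inlines as 64-digit hex literals for thresholds and owner counts.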
+ +**Optional Environment Variables:** +- `ACCOUNT` - External account address to use as a signer +- `TEAM` - Team identifier (`clabs` or `council`) +- `GC_MULTISIG` - Grand Child multisig address (only supported for council team) + +**Features:** +- Sets 10,000 ETH balance for all mocked accounts +- Configures multisig thresholds and owner counts +- Sets up ownership circular linked lists for multisigs +- Enhanced external account integration with dynamic signer replacement +- Comprehensive Grand Child multisig support for Council team +- Detailed validation reporting for all multisig configurations + +**Multisig Structure:** +- **Parent Multisig**: Controls cLabs and Council multisigs (threshold: 2) +- **cLabs Multisig**: Controlled by Signer #1 and Signer #2 (threshold: 2) +- **Council Multisig**: Controlled by Signer #3 and Signer #4 (threshold: 2) or Grand Child + Signer #4 +- **Grand Child Multisig**: Controlled by Signer #3 (threshold: 1, only for Council team) + +**Required Environment Variables:** +- `MOCKED_SIGNER_1` - Address for first signer (unless external account used for cLabs) +- `MOCKED_SIGNER_2` - Address for second signer +- `MOCKED_SIGNER_3` - Address for third signer (unless external account used for council) +- `MOCKED_SIGNER_4` - Address for fourth signer + +> **Important**: These signer addresses must correspond exactly to the private keys used in `exec-mocked.sh`: +> - `MOCKED_SIGNER_1` → `SIGNER_1_PK` (cLabs team signer) +> - `MOCKED_SIGNER_2` → `SIGNER_2_PK` (cLabs team signer) +> - `MOCKED_SIGNER_3` → `SIGNER_3_PK` (Council team signer) +> - `MOCKED_SIGNER_4` → `SIGNER_4_PK` (Council team signer) + +**Example Execution:** +```bash +# Basic execution (uses default mocked accounts) +MOCKED_SIGNER_1="0x..." MOCKED_SIGNER_2="0x..." MOCKED_SIGNER_3="0x..." MOCKED_SIGNER_4="0x..." ./mock-mainnet.sh + +# With external account for cLabs team (replaces MOCKED_SIGNER_1) +ACCOUNT="0x..." TEAM="clabs" MOCKED_SIGNER_2="0x..." MOCKED_SIGNER_3="0x..." 
MOCKED_SIGNER_4="0x..." ./mock-mainnet.sh + +# With external account and Grand Child multisig for council team (replaces MOCKED_SIGNER_3) +ACCOUNT="0x..." TEAM="council" GC_MULTISIG="0x..." MOCKED_SIGNER_1="0x..." MOCKED_SIGNER_2="0x..." MOCKED_SIGNER_4="0x..." ./mock-mainnet.sh +``` + +## Usage Workflow + +1. **Start L1 fork:** + ```bash + export ALCHEMY_API_KEY="your_key" + export NETWORK="mainnet" + export BLOCK_NUMBER="desired_block" + ./fork_l1.sh + ``` + +2. **Start L2 fork (optional):** + ```bash + ./fork_l2.sh + ``` + +3. **In another terminal, set up mocked environment:** + ```bash + # Use the same addresses that correspond to your private keys in exec-mocked.sh + MOCKED_SIGNER_1="0x..." MOCKED_SIGNER_2="0x..." MOCKED_SIGNER_3="0x..." MOCKED_SIGNER_4="0x..." ./mock-mainnet.sh + ``` + +## Ports Used + +- **8545**: L1 fork (mainnet/sepolia/holesky) +- **8546**: L2 fork (Celo Sepolia) + +## Notes + +- The mock-mainnet script requires a running Anvil instance on port 8545 +- External accounts can be used to replace default mocked signers for testing specific scenarios +- The Grand Child multisig feature is only available for the council team configuration +- All signer addresses must be provided via environment variables for proper multisig setup +- The script performs comprehensive validation of all multisig configurations after setup +- External account integration dynamically replaces the appropriate signer based on team selection +- **Critical**: The signer addresses configured here must match the private keys used in `exec-mocked.sh` for proper multisig operation diff --git a/packages/op-tooling/fork/fork_l1.sh b/packages/op-tooling/fork/fork_l1.sh new file mode 100755 index 00000000000..a3d76cb793d --- /dev/null +++ b/packages/op-tooling/fork/fork_l1.sh @@ -0,0 +1,35 @@ +#!/usr/bin/env bash +set -euo pipefail + +[ -z "${NETWORK:-}" ] && echo "Need to set the NETWORK via env" && exit 1; +[ -z "${BLOCK_NUMBER:-}" ] && echo "Need to set the BLOCK_NUMBER via env" &&
exit 1; + +if [ -z "${RPC_URL:-}" ]; then + [ -z "${ALCHEMY_API_KEY:-}" ] && echo "Need to set the ALCHEMY_API_KEY via env" && exit 1; + RPC_URL="https://eth-$NETWORK.g.alchemy.com/v2/$ALCHEMY_API_KEY" +fi + +# Check network +case $NETWORK in + "mainnet") + echo "Detected supported network: $NETWORK" + CHAIN_ID=1 + ;; + "sepolia") + echo "Detected supported network: $NETWORK" + CHAIN_ID=11155111 + ;; + "holesky") + echo "Detected supported network: $NETWORK" + CHAIN_ID=17000 + ;; + *) + echo "Unsupported network: $NETWORK" && exit 1 + ;; +esac + +anvil \ + --port 8545 \ + --fork-url $RPC_URL \ + --fork-chain-id $CHAIN_ID \ + --fork-block-number $BLOCK_NUMBER diff --git a/packages/op-tooling/fork/fork_l2.sh b/packages/op-tooling/fork/fork_l2.sh new file mode 100755 index 00000000000..c955d1dff36 --- /dev/null +++ b/packages/op-tooling/fork/fork_l2.sh @@ -0,0 +1,8 @@ +#!/usr/bin/env bash +set -euo pipefail + +anvil \ + --port 8546 \ + --fork-url https://forno.celo-sepolia.celo-testnet.org \ + --fork-chain-id 11142220 \ + --fork-block-number 9055818 diff --git a/packages/op-tooling/fork/mock-mainnet.sh b/packages/op-tooling/fork/mock-mainnet.sh new file mode 100755 index 00000000000..e3336bfa023 --- /dev/null +++ b/packages/op-tooling/fork/mock-mainnet.sh @@ -0,0 +1,172 @@ +#!/usr/bin/env bash +set -euo pipefail + +# addresses +MOCKED_DEPLOYER=0x95FFAC468e37DdeEF407FfEf18f0cC9E86D8f13B + +# multisigs +PARENT_MULTISIG=0x4092A77bAF58fef0309452cEaCb09221e556E112 +CLABS_MULTISIG=0x9Eb44Da23433b5cAA1c87e35594D15FcEb08D34d +COUNCIL_MULTISIG=0xC03172263409584f7860C25B6eB4985f0f6F4636 + +# optionally allow to specify signer +EXTERNAL_ACCOUNT=${ACCOUNT:-} +EXTERNAL_TEAM=${TEAM:-} +GRAND_CHILD_MULTISIG=${GC_MULTISIG:-} + +# determine if using internal signers +USE_INTERNAL_CLABS=$([ -z "$EXTERNAL_ACCOUNT" ] || [ "$EXTERNAL_TEAM" != "clabs" ] && echo "true" || echo "false") +USE_INTERNAL_COUNCIL=$([ -z "$EXTERNAL_ACCOUNT" ] || [ "$EXTERNAL_TEAM" != "council" ] && echo "true" || 
echo "false") + +if [ -n "$EXTERNAL_ACCOUNT" ]; then + echo "Detected external account: $EXTERNAL_ACCOUNT" + case $EXTERNAL_TEAM in + "clabs"|"council") + echo "Detected valid team: $EXTERNAL_TEAM" + ;; + *) + echo "Invalid team: $EXTERNAL_TEAM" && exit 1 + ;; + esac +fi +if [ -n "$GRAND_CHILD_MULTISIG" ] && [ "$EXTERNAL_TEAM" != "council" ]; then + echo "Grand Child multisig is only supported for the council team" && exit 1 +fi + +# signers +[ "$USE_INTERNAL_CLABS" = "true" ] && [ -z "${MOCKED_SIGNER_1:-}" ] && echo "Need to set the MOCKED_SIGNER_1 via env" && exit 1 +[ "$USE_INTERNAL_CLABS" = "false" ] && MOCKED_SIGNER_1=$EXTERNAL_ACCOUNT +[ -z "${MOCKED_SIGNER_2:-}" ] && echo "Need to set the MOCKED_SIGNER_2 via env" && exit 1 +[ "$USE_INTERNAL_COUNCIL" = "true" ] && [ -z "${MOCKED_SIGNER_3:-}" ] && echo "Need to set the MOCKED_SIGNER_3 via env" && exit 1 +[ "$USE_INTERNAL_COUNCIL" = "false" ] && MOCKED_SIGNER_3=$EXTERNAL_ACCOUNT +[ -z "${MOCKED_SIGNER_4:-}" ] && echo "Need to set the MOCKED_SIGNER_4 via env" && exit 1 + +# validate signer ordering (compare lowercased addresses; substring and case modification cannot be combined in one expansion) +if [ "$USE_INTERNAL_CLABS" = "true" ] && [[ ${MOCKED_SIGNER_1,,} > ${MOCKED_SIGNER_2,,} ]]; then + echo "Error: MOCKED_SIGNER_1 must be < MOCKED_SIGNER_2 (addresses must be in ascending order)" && exit 1 +fi +if [ "$USE_INTERNAL_COUNCIL" = "true" ] && [[ ${MOCKED_SIGNER_3,,} > ${MOCKED_SIGNER_4,,} ]]; then + echo "Error: MOCKED_SIGNER_3 must be < MOCKED_SIGNER_4 (addresses must be in ascending order)" && exit 1 +fi + +# safe internal +SENTINEL_ADDRESS=0x0000000000000000000000000000000000000001 + +# rpc +RPC_URL=http://127.0.0.1:8545 + +# set 10_000 ETH on mocked accounts +echo "Mock accounts balance" +cast rpc anvil_setBalance $MOCKED_DEPLOYER 0x21e19e0c9bab2400000 -r $RPC_URL +cast rpc anvil_setBalance $MOCKED_SIGNER_1 0x21e19e0c9bab2400000 -r $RPC_URL +cast rpc anvil_setBalance $MOCKED_SIGNER_2 0x21e19e0c9bab2400000 -r $RPC_URL +cast rpc anvil_setBalance $MOCKED_SIGNER_3 0x21e19e0c9bab2400000 -r $RPC_URL
+cast rpc anvil_setBalance $MOCKED_SIGNER_4 0x21e19e0c9bab2400000 -r $RPC_URL + +# change threshold of signers to 2 for each multisig +echo "Change threshold for multisigs" +cast rpc anvil_setStorageAt $PARENT_MULTISIG 0x0000000000000000000000000000000000000000000000000000000000000004 0x0000000000000000000000000000000000000000000000000000000000000002 -r $RPC_URL +cast rpc anvil_setStorageAt $CLABS_MULTISIG 0x0000000000000000000000000000000000000000000000000000000000000004 0x0000000000000000000000000000000000000000000000000000000000000002 -r $RPC_URL +cast rpc anvil_setStorageAt $COUNCIL_MULTISIG 0x0000000000000000000000000000000000000000000000000000000000000004 0x0000000000000000000000000000000000000000000000000000000000000002 -r $RPC_URL + +# change threshold to 1 for Grand Child multisig +if [ -n "$GRAND_CHILD_MULTISIG" ]; then + cast rpc anvil_setStorageAt $GRAND_CHILD_MULTISIG 0x0000000000000000000000000000000000000000000000000000000000000004 0x0000000000000000000000000000000000000000000000000000000000000001 -r $RPC_URL +fi + +# change owner count to 2 for each multisig +echo "Change owner count for multisigs" +cast rpc anvil_setStorageAt $PARENT_MULTISIG 0x0000000000000000000000000000000000000000000000000000000000000003 0x0000000000000000000000000000000000000000000000000000000000000002 -r $RPC_URL +cast rpc anvil_setStorageAt $CLABS_MULTISIG 0x0000000000000000000000000000000000000000000000000000000000000003 0x0000000000000000000000000000000000000000000000000000000000000002 -r $RPC_URL +cast rpc anvil_setStorageAt $COUNCIL_MULTISIG 0x0000000000000000000000000000000000000000000000000000000000000003 0x0000000000000000000000000000000000000000000000000000000000000002 -r $RPC_URL + +# change owner count to 1 for Grand Child multisig +if [ -n "$GRAND_CHILD_MULTISIG" ]; then + cast rpc anvil_setStorageAt $GRAND_CHILD_MULTISIG 0x0000000000000000000000000000000000000000000000000000000000000003 0x0000000000000000000000000000000000000000000000000000000000000001 -r $RPC_URL
+fi + +# mock ownership circular linked list +echo "Mock ownership for multisigs" +# [Parent] +# Sentinel -> cLabs +SENTINEL_SLOT=$(cast index address 0x0000000000000000000000000000000000000001 2) +cast rpc anvil_setStorageAt $PARENT_MULTISIG $SENTINEL_SLOT 0x000000000000000000000000${CLABS_MULTISIG:2} -r $RPC_URL +# cLabs -> Council +CLABS_SLOT=$(cast index address 0x9Eb44Da23433b5cAA1c87e35594D15FcEb08D34d 2) +cast rpc anvil_setStorageAt $PARENT_MULTISIG $CLABS_SLOT 0x000000000000000000000000${COUNCIL_MULTISIG:2} -r $RPC_URL +# Council -> Sentinel +COUNCIL_SLOT=$(cast index address 0xC03172263409584f7860C25B6eB4985f0f6F4636 2) +cast rpc anvil_setStorageAt $PARENT_MULTISIG $COUNCIL_SLOT 0x000000000000000000000000${SENTINEL_ADDRESS:2} -r $RPC_URL +# [cLabs] +# Sentinel -> Signer #1 +cast rpc anvil_setStorageAt $CLABS_MULTISIG $SENTINEL_SLOT 0x000000000000000000000000${MOCKED_SIGNER_1:2} -r $RPC_URL +# Signer #1 -> Signer #2 +SIGNER_1_SLOT=$(cast index address $MOCKED_SIGNER_1 2) +cast rpc anvil_setStorageAt $CLABS_MULTISIG $SIGNER_1_SLOT 0x000000000000000000000000${MOCKED_SIGNER_2:2} -r $RPC_URL +# Signer #2 -> Sentinel +SIGNER_2_SLOT=$(cast index address $MOCKED_SIGNER_2 2) +cast rpc anvil_setStorageAt $CLABS_MULTISIG $SIGNER_2_SLOT 0x000000000000000000000000${SENTINEL_ADDRESS:2} -r $RPC_URL +# [Council] +if [ -z "$GRAND_CHILD_MULTISIG" ]; then + # Sentinel -> Signer #3 + cast rpc anvil_setStorageAt $COUNCIL_MULTISIG $SENTINEL_SLOT 0x000000000000000000000000${MOCKED_SIGNER_3:2} -r $RPC_URL + # Signer #3 -> Signer #4 + SIGNER_3_SLOT=$(cast index address $MOCKED_SIGNER_3 2) + cast rpc anvil_setStorageAt $COUNCIL_MULTISIG $SIGNER_3_SLOT 0x000000000000000000000000${MOCKED_SIGNER_4:2} -r $RPC_URL + # Signer #4 -> Sentinel + SIGNER_4_SLOT=$(cast index address $MOCKED_SIGNER_4 2) + cast rpc anvil_setStorageAt $COUNCIL_MULTISIG $SIGNER_4_SLOT 0x000000000000000000000000${SENTINEL_ADDRESS:2} -r $RPC_URL +else + if [[ ${GRAND_CHILD_MULTISIG,,} < ${MOCKED_SIGNER_4,,} ]]; 
then + # Sentinel -> Grand Child + cast rpc anvil_setStorageAt $COUNCIL_MULTISIG $SENTINEL_SLOT 0x000000000000000000000000${GRAND_CHILD_MULTISIG:2} -r $RPC_URL + # Grand Child -> Signer #4 + GC_SLOT=$(cast index address $GRAND_CHILD_MULTISIG 2) + cast rpc anvil_setStorageAt $COUNCIL_MULTISIG $GC_SLOT 0x000000000000000000000000${MOCKED_SIGNER_4:2} -r $RPC_URL + # Signer #4 -> Sentinel + SIGNER_4_SLOT=$(cast index address $MOCKED_SIGNER_4 2) + cast rpc anvil_setStorageAt $COUNCIL_MULTISIG $SIGNER_4_SLOT 0x000000000000000000000000${SENTINEL_ADDRESS:2} -r $RPC_URL + else + # Sentinel -> Signer #4 + cast rpc anvil_setStorageAt $COUNCIL_MULTISIG $SENTINEL_SLOT 0x000000000000000000000000${MOCKED_SIGNER_4:2} -r $RPC_URL + # Signer #4 -> Grand Child + SIGNER_4_SLOT=$(cast index address $MOCKED_SIGNER_4 2) + cast rpc anvil_setStorageAt $COUNCIL_MULTISIG $SIGNER_4_SLOT 0x000000000000000000000000${GRAND_CHILD_MULTISIG:2} -r $RPC_URL + # Grand Child -> Sentinel + GC_SLOT=$(cast index address $GRAND_CHILD_MULTISIG 2) + cast rpc anvil_setStorageAt $COUNCIL_MULTISIG $GC_SLOT 0x000000000000000000000000${SENTINEL_ADDRESS:2} -r $RPC_URL + fi + + # GC Sentinel -> Signer #3 + cast rpc anvil_setStorageAt $GRAND_CHILD_MULTISIG $SENTINEL_SLOT 0x000000000000000000000000${MOCKED_SIGNER_3:2} -r $RPC_URL + # Signer #3 -> GC Sentinel + SIGNER_3_SLOT=$(cast index address $MOCKED_SIGNER_3 2) + cast rpc anvil_setStorageAt $GRAND_CHILD_MULTISIG $SIGNER_3_SLOT 0x000000000000000000000000${SENTINEL_ADDRESS:2} -r $RPC_URL +fi + +# validate safe correctly mocked +echo "Validation" +echo "--- Parent ---" +echo "Parent threshold: $(cast call $PARENT_MULTISIG "getThreshold()(uint256)" -r $RPC_URL)" +echo "Parent owners: $(cast call $PARENT_MULTISIG "getOwners()(address[])" -r $RPC_URL)" +echo "Parent signer is cLabs: $(cast call $PARENT_MULTISIG "isOwner(address)(bool)" $CLABS_MULTISIG -r $RPC_URL)" +echo "Parent signer is Council: $(cast call $PARENT_MULTISIG "isOwner(address)(bool)" $COUNCIL_MULTISIG -r 
$RPC_URL)" +echo "--- cLabs ---" +echo "cLabs threshold: $(cast call $CLABS_MULTISIG "getThreshold()(uint256)" -r $RPC_URL)" +echo "cLabs owners: $(cast call $CLABS_MULTISIG "getOwners()(address[])" -r $RPC_URL)" +echo "cLabs signer is Signer #1: $(cast call $CLABS_MULTISIG "isOwner(address)(bool)" $MOCKED_SIGNER_1 -r $RPC_URL)" +echo "cLabs signer is Signer #2: $(cast call $CLABS_MULTISIG "isOwner(address)(bool)" $MOCKED_SIGNER_2 -r $RPC_URL)" +echo "--- Council ---" +echo "Council threshold: $(cast call $COUNCIL_MULTISIG "getThreshold()(uint256)" -r $RPC_URL)" +echo "Council owners: $(cast call $COUNCIL_MULTISIG "getOwners()(address[])" -r $RPC_URL)" +if [ -z "$GRAND_CHILD_MULTISIG" ]; then + echo "Council signer is Signer #3: $(cast call $COUNCIL_MULTISIG "isOwner(address)(bool)" $MOCKED_SIGNER_3 -r $RPC_URL)" + echo "Council signer is Signer #4: $(cast call $COUNCIL_MULTISIG "isOwner(address)(bool)" $MOCKED_SIGNER_4 -r $RPC_URL)" +else + echo "Council signer is Grand Child: $(cast call $COUNCIL_MULTISIG "isOwner(address)(bool)" $GRAND_CHILD_MULTISIG -r $RPC_URL)" + echo "Council signer is Signer #4: $(cast call $COUNCIL_MULTISIG "isOwner(address)(bool)" $MOCKED_SIGNER_4 -r $RPC_URL)" + echo "--- Grand Child ---" + echo "Grand Child threshold: $(cast call $GRAND_CHILD_MULTISIG "getThreshold()(uint256)" -r $RPC_URL)" + echo "Grand Child owners: $(cast call $GRAND_CHILD_MULTISIG "getOwners()(address[])" -r $RPC_URL)" + echo "Grand Child signer is Signer #3: $(cast call $GRAND_CHILD_MULTISIG "isOwner(address)(bool)" $MOCKED_SIGNER_3 -r $RPC_URL)" +fi diff --git a/packages/op-tooling/impls/DeployMIPS.sol b/packages/op-tooling/impls/DeployMIPS.sol new file mode 100644 index 00000000000..67dd6084675 --- /dev/null +++ b/packages/op-tooling/impls/DeployMIPS.sol @@ -0,0 +1,50 @@ +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.15; + +import { Script } from "forge-std/Script.sol"; +import { console } from "forge-std/console.sol"; + +import { IMIPS } from 
"interfaces/cannon/IMIPS.sol"; +import { IPreimageOracle } from "interfaces/cannon/IPreimageOracle.sol"; +import { DeployUtils } from "scripts/libraries/DeployUtils.sol"; + +contract DeployMIPS is Script { + // This script requires running with --root and the following env vars: + // PREIMAGE_ORACLE (optional) - if not provided, MIN_PROPOSAL_SIZE and CHALLENGE_PERIOD must be provided + // MIN_PROPOSAL_SIZE (optional) - minimum proposal size for a new PreimageOracle if PREIMAGE_ORACLE is not provided + // CHALLENGE_PERIOD (optional) - challenge period for a new PreimageOracle if PREIMAGE_ORACLE is not provided + + error MissingEnvVars(); + + function run() external { + address oracle_ = vm.envOr("PREIMAGE_ORACLE", address(0)); + uint256 minProposalSize_ = vm.envOr("MIN_PROPOSAL_SIZE", uint256(0)); + uint256 challengePeriod_ = vm.envOr("CHALLENGE_PERIOD", uint256(0)); + + if (oracle_ == address(0) && !(minProposalSize_ > 0 && challengePeriod_ > 0)) { + revert MissingEnvVars(); + } + + if (oracle_ == address(0)) { + oracle_ = DeployUtils.createDeterministic({ + _name: "PreimageOracle", + _args: DeployUtils.encodeConstructor( + abi.encodeCall(IPreimageOracle.__constructor__, (minProposalSize_, challengePeriod_)) + ), + _salt: DeployUtils.DEFAULT_SALT + }); + console.log("Using new PreimageOracle:", oracle_); + } else { + console.log("Using provided PreimageOracle:", oracle_); + } + + address mips_ = DeployUtils.createDeterministic({ + _name: "MIPS", + _args: DeployUtils.encodeConstructor( + abi.encodeCall(IMIPS.__constructor__, (IPreimageOracle(oracle_))) + ), + _salt: DeployUtils.DEFAULT_SALT + }); + console.log("MIPS deployed at:", mips_); + } +} diff --git a/packages/op-tooling/impls/DeployPortalImpl.s.sol b/packages/op-tooling/impls/DeployPortalImpl.s.sol new file mode 100644 index 00000000000..1b8858e721b --- /dev/null +++ b/packages/op-tooling/impls/DeployPortalImpl.s.sol @@ -0,0 +1,31 @@ +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.15; + +import { 
Script } from "forge-std/Script.sol"; +import { console } from "forge-std/console.sol"; + +import { IOptimismPortal2 } from "interfaces/L1/IOptimismPortal2.sol"; +import { DeployUtils } from "scripts/libraries/DeployUtils.sol"; + +contract DeployPortalImpl is Script { + // This script requires running with --root and the following env vars: + // PROOF_MATURITY_DELAY_SECONDS (required) - proof maturity delay for the OptimismPortal2 + // DISPUTE_GAME_FINALITY_DELAY_SECONDS (required) - dispute game finality delay + + function run() external { + uint256 proofMaturityDelaySeconds_ = vm.envUint("PROOF_MATURITY_DELAY_SECONDS"); + uint256 disputeGameFinalityDelaySeconds_ = vm.envUint("DISPUTE_GAME_FINALITY_DELAY_SECONDS"); + + address impl_ = DeployUtils.createDeterministic({ + _name: "OptimismPortal2", + _args: DeployUtils.encodeConstructor( + abi.encodeCall( + IOptimismPortal2.__constructor__, + (proofMaturityDelaySeconds_, disputeGameFinalityDelaySeconds_) + ) + ), + _salt: DeployUtils.DEFAULT_SALT + }); + console.log("OptimismPortal2Impl deployed at:", impl_); + } +} diff --git a/packages/op-tooling/impls/README.md b/packages/op-tooling/impls/README.md new file mode 100644 index 00000000000..f2128b09a3b --- /dev/null +++ b/packages/op-tooling/impls/README.md @@ -0,0 +1,167 @@ +# Optimism Implementation Scripts + +This directory contains Foundry scripts for deploying and managing Optimism contract implementations. These scripts interact with the Optimism repository and must be executed from the Optimism contracts-bedrock directory. + +## Important Usage Note + +**All scripts in this directory must be executed with root pointing to the Optimism repository contracts folder:** +```bash +forge script --root $PATH_TO_OP_REPO/packages/contracts-bedrock +``` + +## Scripts + +### `DeployMIPS.sol` + +Deploys a new MIPS fault-proof virtual machine (FPVM) implementation contract with configurable preimage oracle parameters. 
+ +**Features:** +- Deploys MIPS implementation using deterministic deployment +- Optionally deploys a new PreimageOracle if not provided +- Configurable minimum proposal size and challenge period for the oracle +- Uses DeployUtils for consistent deployment patterns +- Outputs deployed MIPS and oracle addresses + +**Required Environment Variables:** +- Either provide an existing oracle OR configure oracle parameters: + - `PREIMAGE_ORACLE` - Address of existing PreimageOracle (optional) + - `MIN_PROPOSAL_SIZE` - Minimum proposal size for new oracle (required if no existing oracle) + - `CHALLENGE_PERIOD` - Challenge period for new oracle (required if no existing oracle) + +**Example Execution:** +```bash +# Using existing oracle +PREIMAGE_ORACLE="0x..." forge script DeployMIPS.sol --root $PATH_TO_OP_REPO/packages/contracts-bedrock --broadcast --private-key $PK --rpc-url $RPC + +# Deploying new oracle with MIPS +MIN_PROPOSAL_SIZE=1000 CHALLENGE_PERIOD=3600 forge script DeployMIPS.sol --root $PATH_TO_OP_REPO/packages/contracts-bedrock --broadcast --private-key $PK --rpc-url $RPC +``` + +### `DeployPortalImpl.s.sol` + +Deploys a new OptimismPortal2 implementation contract with configurable proof maturity and dispute game finality delays. 
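Both delays are supplied as raw seconds (seven days is 604800). A tiny, purely illustrative helper avoids computing these values by hand:

```bash
#!/usr/bin/env bash
set -euo pipefail

# Convert a day count into the seconds value expected by the env vars.
days_to_seconds() { echo $(( $1 * 24 * 60 * 60 )); }

# Seven-day delays, as used in the example invocation.
echo "PROOF_MATURITY_DELAY_SECONDS=$(days_to_seconds 7)"          # -> 604800
echo "DISPUTE_GAME_FINALITY_DELAY_SECONDS=$(days_to_seconds 7)"   # -> 604800
```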
+ +**Features:** +- Deploys OptimismPortal2 implementation using deterministic deployment +- Configurable proof maturity delay and dispute game finality delay +- Uses DeployUtils for consistent deployment patterns +- Outputs deployed implementation address + +**Required Environment Variables:** +- `PROOF_MATURITY_DELAY_SECONDS` - Proof maturity delay in seconds +- `DISPUTE_GAME_FINALITY_DELAY_SECONDS` - Dispute game finality delay in seconds + +**Example Execution:** +```bash +PROOF_MATURITY_DELAY_SECONDS=604800 DISPUTE_GAME_FINALITY_DELAY_SECONDS=604800 forge script DeployPortalImpl.s.sol --root $PATH_TO_OP_REPO/packages/contracts-bedrock --broadcast --private-key $PK --rpc-url $RPC +``` + +### `SafeSetPortal.s.sol` + +Executes Safe multisig transactions to upgrade the OptimismPortal proxy to a new implementation. + +**Features:** +- Builds Safe transactions for proxy upgrades via ProxyAdmin +- Generates Safe transaction hashes for signature collection +- Executes Safe transactions with provided signatures +- Uses delegatecall operations for proper proxy upgrades + +**Required Environment Variables:** +- `PORTAL_PROXY` - Address of the OptimismPortal proxy +- `PORTAL_IMPL` - Address of the new OptimismPortal implementation +- `PROXY_ADMIN` - Address of the ProxyAdmin contract +- `SAFE` - Address of the Safe multisig wallet +- `SENDER` - Address of the transaction sender + +**Optional Environment Variables:** +- `SIG` - Transaction signatures (required for execution) + +**Functions:** +- `getTransactionHash()` - Generates transaction hash for signature collection +- `execTransaction()` - Executes the Safe transaction with signatures + +**Example Execution:** +```bash +# Generate transaction hash +PORTAL_PROXY="0x..." PORTAL_IMPL="0x..." PROXY_ADMIN="0x..." SAFE="0x..." SENDER="0x..." 
forge script SafeSetPortal.s.sol --sig "getTransactionHash()" --root $PATH_TO_OP_REPO/packages/contracts-bedrock --rpc-url $RPC + +# Sign transaction hash +cast wallet sign --no-hash --private-key $PK $HASH + +# Execute transaction with signature +PORTAL_PROXY="0x..." PORTAL_IMPL="0x..." PROXY_ADMIN="0x..." SAFE="0x..." SENDER="0x..." SIG="0x..." forge script SafeSetPortal.s.sol --sig "execTransaction()" --root $PATH_TO_OP_REPO/packages/contracts-bedrock --broadcast --private-key $PK --rpc-url $RPC +``` + +### `RedeployGames.s.sol` + +Redeploys dispute game implementations with updated configuration parameters, particularly for modifying the maximum clock duration. + +**Features:** +- Redeploys both permissioned and permissionless dispute games +- Updates maximum clock duration for dispute games +- Preserves existing game configuration while updating specific parameters +- Supports both Cannon (permissionless) and Permissioned Cannon game types +- Uses deterministic salt-based deployment for consistency + +**Required Environment Variables:** +- `OPCM` - Address of existing Optimism Contracts Manager +- `FACTORY` - Address of the DisputeGameFactory +- `SYSTEM_CONFIG` - Address of the SystemConfig proxy +- `MAX_CLOCK_DURATION` - New maximum clock duration in seconds + +**Optional Environment Variables:** +- `CLOCK_EXTENSION` - Clock extension duration in seconds (defaults to existing value) +- `MIPS` - Address of MIPS implementation to use (defaults to existing VM) + +**Example Execution:** +```bash +# Basic redeployment with new max clock duration +OPCM="0x..." FACTORY="0x..." SYSTEM_CONFIG="0x..." MAX_CLOCK_DURATION=604800 forge script RedeployGames.s.sol --root $PATH_TO_OP_REPO/packages/contracts-bedrock --broadcast --private-key $PK --rpc-url $RPC + +# Redeployment with custom clock extension and MIPS +OPCM="0x..." FACTORY="0x..." SYSTEM_CONFIG="0x..." MAX_CLOCK_DURATION=604800 CLOCK_EXTENSION=1800 MIPS="0x..." 
forge script RedeployGames.s.sol --root $PATH_TO_OP_REPO/packages/contracts-bedrock --broadcast --private-key $PK --rpc-url $RPC +``` + +### `SafeSetGames.s.sol` + +Executes Safe multisig transactions to update dispute game implementations in the DisputeGameFactory. + +**Features:** +- Builds multicall transactions for updating both permissioned and permissionless game implementations +- Generates Safe transaction hashes for signature collection +- Executes Safe transactions with provided signatures +- Uses delegatecall operations for proper proxy upgrades + +**Required Environment Variables:** +- `FACTORY` - Address of the DisputeGameFactory +- `PERMISSIONED_GAME` - Address of the new permissioned dispute game implementation +- `PERMISSIONLESS_GAME` - Address of the new permissionless dispute game implementation +- `SAFE` - Address of the Safe multisig wallet +- `SENDER` - Address of the transaction sender + +**Optional Environment Variables:** +- `SIG` - Transaction signatures (required for execution) + +**Functions:** +- `getTransactionHash()` - Generates transaction hash for signature collection +- `execTransaction()` - Executes the Safe transaction with signatures + +**Example Execution:** +```bash +FACTORY="0x..." PERMISSIONED_GAME="0x..." PERMISSIONLESS_GAME="0x..." SAFE="0x..." SENDER="0x..." forge script SafeSetGames.s.sol --sig "getTransactionHash()" --root $PATH_TO_OP_REPO/packages/contracts-bedrock --rpc-url $RPC + +# Sign transaction hash +cast wallet sign --no-hash --private-key $PK $HASH + +# Execute transaction with signatures +FACTORY="0x..." PERMISSIONED_GAME="0x..." PERMISSIONLESS_GAME="0x..." SAFE="0x..." SENDER="0x..." SIG="0x..." 
forge script SafeSetGames.s.sol --sig "execTransaction()" --root $PATH_TO_OP_REPO/packages/contracts-bedrock --broadcast --private-key $PK --rpc-url $RPC +``` + +## Notes + +- **Critical**: All scripts must be executed from the Optimism contracts-bedrock directory using the `--root` flag +- Safe transaction signatures must be ordered by signer address for proper multisig execution +- Implementation addresses are deterministic based on salt and constructor parameters +- The `RedeployGames.s.sol` script preserves existing game configuration while updating specific parameters +- All Safe transactions use a combination of delegatecall and multicall for proper proxy upgrades +- Transaction hashes generated by `getTransactionHash()` functions must be signed by Safe owners before execution diff --git a/packages/op-tooling/impls/RedeployGames.s.sol b/packages/op-tooling/impls/RedeployGames.s.sol new file mode 100644 index 00000000000..123a73397da --- /dev/null +++ b/packages/op-tooling/impls/RedeployGames.s.sol @@ -0,0 +1,238 @@ +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.15; + +import { Script } from "forge-std/Script.sol"; +import { console } from "forge-std/console.sol"; +import { console2 } from "forge-std/console2.sol"; + +import { Bytes } from "src/libraries/Bytes.sol"; +import { Blueprint } from "src/libraries/Blueprint.sol"; +import { GameType, GameTypes, Duration } from "src/dispute/lib/Types.sol"; + +import { IAnchorStateRegistry } from "interfaces/dispute/IAnchorStateRegistry.sol"; +import { IBigStepper } from "interfaces/dispute/IBigStepper.sol"; +import { IDelayedWETH } from "interfaces/dispute/IDelayedWETH.sol"; +import { IDisputeGame } from "interfaces/dispute/IDisputeGame.sol"; +import { IDisputeGameFactory } from "interfaces/dispute/IDisputeGameFactory.sol"; +import { IFaultDisputeGame } from "interfaces/dispute/IFaultDisputeGame.sol"; +import { IPermissionedDisputeGame } from "interfaces/dispute/IPermissionedDisputeGame.sol"; +import { 
IOPContractsManager } from "interfaces/L1/IOPContractsManager.sol"; + +contract RedeployGames is Script { + // This script requires running with --root and the following env vars: + // OPCM (required) - address of the old OPContractsManager + // FACTORY (required) - address of the DisputeGameFactory + // SYSTEM_CONFIG (required) - address of the SystemConfig proxy + // MAX_CLOCK_DURATION (required) - new max clock duration for the dispute games + // CLOCK_EXTENSION (optional) - new clock extension for the dispute games + // MIPS (optional) - new MIPS address for the dispute games + + struct Blueprints { + address permissionedDisputeGame1; + address permissionedDisputeGame2; + address permissionlessDisputeGame1; + address permissionlessDisputeGame2; + } + + function run() external { + console.log("Starting redeployment of dispute games..."); + + IOPContractsManager oldOpcm_ = IOPContractsManager(vm.envAddress("OPCM")); + console.log("Old OPCM:", address(oldOpcm_)); + + IOPContractsManager.Blueprints memory oldBlueprints_ = oldOpcm_.blueprints(); + Blueprints memory blueprints_ = Blueprints({ + permissionedDisputeGame1: oldBlueprints_.permissionedDisputeGame1, + permissionedDisputeGame2: oldBlueprints_.permissionedDisputeGame2, + permissionlessDisputeGame1: oldBlueprints_.permissionlessDisputeGame1, + permissionlessDisputeGame2: oldBlueprints_.permissionlessDisputeGame2 + }); + console.log("Old PermissionedDisputeGame blueprint 1:", blueprints_.permissionedDisputeGame1); + console.log("Old PermissionedDisputeGame blueprint 2:", blueprints_.permissionedDisputeGame2); + console.log( + "Old PermissionlessDisputeGame blueprint 1:", + blueprints_.permissionlessDisputeGame1 + ); + console.log( + "Old PermissionlessDisputeGame blueprint 2:", + blueprints_.permissionlessDisputeGame2 + ); + + IDisputeGameFactory factory_ = IDisputeGameFactory(vm.envAddress("FACTORY")); + console.log("DisputeGameFactory:", address(factory_)); + + IPermissionedDisputeGame oldPermissionedGame_ = 
IPermissionedDisputeGame( + address(factory_.gameImpls(GameTypes.PERMISSIONED_CANNON)) + ); + console.log("Old PermissionedDisputeGame:", address(oldPermissionedGame_)); + IFaultDisputeGame oldPermissionlessGame_ = IFaultDisputeGame( + address(factory_.gameImpls(GameTypes.CANNON)) + ); + console.log("Old PermissionlessDisputeGame:", address(oldPermissionlessGame_)); + uint256 chainId_ = oldPermissionedGame_.l2ChainId(); + console.log("L2 Chain ID:", chainId_); + + address systemConfigProxy_ = vm.envAddress("SYSTEM_CONFIG"); + console.log("SystemConfig proxy:", systemConfigProxy_); + uint256 maxClock_ = vm.envUint("MAX_CLOCK_DURATION"); + console.log("New max clock duration (seconds):", maxClock_); + + uint256 clock_ = vm.envOr("CLOCK_EXTENSION", uint256(0)); + if (clock_ != 0) { + console.log("Changing clock extension during deployment to:", clock_); + } + + address mips_ = vm.envOr("MIPS", address(0)); + if (mips_ != address(0)) { + console.log("Changing MIPS during deployment to:", mips_); + } + + vm.startBroadcast(); + deployAndSetNewGameImpl({ + _l2ChainId: chainId_, + _disputeGame: IDisputeGame(address(oldPermissionedGame_)), + _gameType: GameTypes.PERMISSIONED_CANNON, + _blueprints: blueprints_, + _systemConfig: systemConfigProxy_, + _maxClock: uint64(maxClock_), + _clock: uint64(clock_), + _mips: mips_ + }); + if (address(oldPermissionlessGame_) != address(0)) { + deployAndSetNewGameImpl({ + _l2ChainId: chainId_, + _disputeGame: IDisputeGame(address(oldPermissionlessGame_)), + _gameType: GameTypes.CANNON, + _blueprints: blueprints_, + _systemConfig: systemConfigProxy_, + _maxClock: uint64(maxClock_), + _clock: uint64(clock_), + _mips: mips_ + }); + } + vm.stopBroadcast(); + console.log("Redeployment of dispute games complete."); + } + + function deployAndSetNewGameImpl( + uint256 _l2ChainId, + IDisputeGame _disputeGame, + GameType _gameType, + Blueprints memory _blueprints, + address _systemConfig, + uint64 _maxClock, + uint64 _clock, + address _mips + ) 
internal { + console.log("Deploying new implementation for game type:", GameType.unwrap(_gameType)); + + // Get the constructor params for the game + IFaultDisputeGame.GameConstructorParams memory params_ = getGameConstructorParams( + IFaultDisputeGame(address(_disputeGame)) + ); + + // Modify the params with the new vm values. + params_.maxClockDuration = Duration.wrap(_maxClock); + if (_clock != 0) { + params_.clockExtension = Duration.wrap(_clock); + } + if (_mips != address(0)) { + params_.vm = IBigStepper(_mips); + } + + IDisputeGame newGame; + if (GameType.unwrap(_gameType) == GameType.unwrap(GameTypes.PERMISSIONED_CANNON)) { + address proposer = IPermissionedDisputeGame(address(_disputeGame)).proposer(); + console.log("Proposer:", proposer); + address challenger = IPermissionedDisputeGame(address(_disputeGame)).challenger(); + console.log("Challenger:", challenger); + console2.log("Clock extension (seconds):", uint64(Duration.unwrap(params_.clockExtension))); + console2.log( + "Split depth extension (seconds):", + uint64(Duration.unwrap(params_.clockExtension)) * 2 + ); + console2.log("Challenge period (seconds):", params_.vm.oracle().challengePeriod()); + console2.log( + "Max game depth extension (seconds):", + params_.clockExtension.raw() + params_.vm.oracle().challengePeriod() + ); + console2.log( + "Max clock duration (seconds):", + uint64(Duration.unwrap(params_.maxClockDuration)) + ); + newGame = IDisputeGame( + Blueprint.deployFrom( + _blueprints.permissionedDisputeGame1, + _blueprints.permissionedDisputeGame2, + computeSalt(_l2ChainId, reusableSaltMixer(_systemConfig), "PermissionedDisputeGame"), + encodePermissionedFDGConstructor(params_, proposer, challenger) + ) + ); + console.log("New PermissionedDisputeGame:", address(newGame)); + } else { + newGame = IDisputeGame( + Blueprint.deployFrom( + _blueprints.permissionlessDisputeGame1, + _blueprints.permissionlessDisputeGame2, + computeSalt(_l2ChainId, reusableSaltMixer(_systemConfig), 
"PermissionlessDisputeGame"), + encodePermissionlessFDGConstructor(params_) + ) + ); + console.log("New PermissionlessDisputeGame:", address(newGame)); + } + } + + function computeSalt( + uint256 _l2ChainId, + string memory _saltMixer, + string memory _contractName + ) internal pure returns (bytes32) { + bytes32 salt_ = keccak256(abi.encode(_l2ChainId, _saltMixer, _contractName)); + console.log("Computed salt:"); + console.logBytes32(salt_); + return salt_; + } + + function encodePermissionlessFDGConstructor( + IFaultDisputeGame.GameConstructorParams memory _params + ) internal view virtual returns (bytes memory) { + bytes memory dataWithSelector_ = abi.encodeCall(IFaultDisputeGame.__constructor__, (_params)); + return Bytes.slice(dataWithSelector_, 4); + } + + function encodePermissionedFDGConstructor( + IFaultDisputeGame.GameConstructorParams memory _params, + address _proposer, + address _challenger + ) internal view virtual returns (bytes memory) { + bytes memory dataWithSelector_ = abi.encodeCall( + IPermissionedDisputeGame.__constructor__, + (_params, _proposer, _challenger) + ); + return Bytes.slice(dataWithSelector_, 4); + } + + function reusableSaltMixer(address _systemConfigProxy) internal pure returns (string memory) { + string memory mixer_ = string(bytes.concat(bytes32(uint256(uint160(_systemConfigProxy))))); + return mixer_; + } + + /// @notice Retrieves the constructor params for a given game. 
+ function getGameConstructorParams( + IFaultDisputeGame _disputeGame + ) internal view returns (IFaultDisputeGame.GameConstructorParams memory) { + return + IFaultDisputeGame.GameConstructorParams({ + gameType: _disputeGame.gameType(), + absolutePrestate: _disputeGame.absolutePrestate(), + maxGameDepth: _disputeGame.maxGameDepth(), + splitDepth: _disputeGame.splitDepth(), + clockExtension: _disputeGame.clockExtension(), + maxClockDuration: _disputeGame.maxClockDuration(), + vm: _disputeGame.vm(), + weth: _disputeGame.weth(), + anchorStateRegistry: _disputeGame.anchorStateRegistry(), + l2ChainId: _disputeGame.l2ChainId() + }); + } +} diff --git a/packages/op-tooling/impls/SafeSetGames.s.sol b/packages/op-tooling/impls/SafeSetGames.s.sol new file mode 100644 index 00000000000..018920c4eee --- /dev/null +++ b/packages/op-tooling/impls/SafeSetGames.s.sol @@ -0,0 +1,123 @@ +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.15; + +import { Script } from "forge-std/Script.sol"; +import { console } from "forge-std/console.sol"; +import { IMulticall3 } from "forge-std/interfaces/IMulticall3.sol"; + +import { GnosisSafe } from "safe-contracts/GnosisSafe.sol"; +import { Enum } from "safe-contracts/common/Enum.sol"; + +import { GameTypes } from "src/dispute/lib/Types.sol"; +import { IDisputeGameFactory } from "interfaces/dispute/IDisputeGameFactory.sol"; + +contract SafeSetGames is Script { + // This script requires running with --root and the following env vars: + // FACTORY (required) - address of the DisputeGameFactory + // PERMISSIONED_GAME (required) - address of the new PermissionedDisputeGame implementation + // PERMISSIONLESS_GAME (required) - address of the new PermissionlessDisputeGame implementation + // SAFE (required) - address of the Gnosis Safe to execute the transaction + // SENDER (required) - address of the sender in the Gnosis Safe + // SIG (optional) - signatures for the Safe transaction; if not provided, the transaction can only be hashed, not executed + + error MissingSignatures(); 
+ + struct EnvConfig { + address factory; + address permissionedGame; + address permissionlessGame; + address safe; + address sender; + bytes signatures; + } + + function readEnv() internal view returns (EnvConfig memory) { + return + EnvConfig( + vm.envAddress("FACTORY"), + vm.envAddress("PERMISSIONED_GAME"), + vm.envAddress("PERMISSIONLESS_GAME"), + vm.envAddress("SAFE"), + vm.envAddress("SENDER"), + vm.envOr("SIG", bytes(hex"00")) + ); + } + + function buildSafeTx(EnvConfig memory config) internal pure returns (bytes memory) { + IMulticall3.Call3[] memory calls = new IMulticall3.Call3[](2); + calls[0] = IMulticall3.Call3( + config.factory, + false, + abi.encodeWithSelector( + IDisputeGameFactory.setImplementation.selector, + GameTypes.PERMISSIONED_CANNON, + config.permissionedGame + ) + ); + calls[1] = IMulticall3.Call3( + config.factory, + false, + abi.encodeWithSelector( + IDisputeGameFactory.setImplementation.selector, + GameTypes.CANNON, + config.permissionlessGame + ) + ); + + return abi.encodeWithSelector(IMulticall3.aggregate3.selector, calls); + } + + function getTransactionHash() public view returns (bytes32) { + EnvConfig memory config = readEnv(); + + // Build tx + bytes memory calls = buildSafeTx(config); + + // Build tx hash + GnosisSafe safe = GnosisSafe(payable(config.safe)); + bytes32 txHash = safe.getTransactionHash( + MULTICALL3_ADDRESS, + 0, // value + calls, + Enum.Operation(1), // delegate call + 0, // safeTxGas + 0, // baseGas + 0, // gasPrice + address(0), // gasToken + config.sender, // refundReceiver + safe.nonce() + ); + console.log("Transaction hash for Safe: "); + console.logBytes32(txHash); + + return txHash; + } + + function execTransaction() public { + EnvConfig memory config = readEnv(); + if (config.signatures.length == 0) { + revert MissingSignatures(); + } + + // Build tx + bytes memory calls = buildSafeTx(config); + + // Exec tx + GnosisSafe safe = GnosisSafe(payable(config.safe)); + vm.startBroadcast(); + 
safe.execTransaction( + MULTICALL3_ADDRESS, + 0, // value + calls, + Enum.Operation(1), // delegate call + 0, // safeTxGas + 0, // baseGas + 0, // gasPrice + address(0), // gasToken + payable(config.sender), // refundReceiver + config.signatures + ); + vm.stopBroadcast(); + console.log("Transaction executed with Safe"); + } +} diff --git a/packages/op-tooling/impls/SafeSetPortal.s.sol b/packages/op-tooling/impls/SafeSetPortal.s.sol new file mode 100644 index 00000000000..60384972d78 --- /dev/null +++ b/packages/op-tooling/impls/SafeSetPortal.s.sol @@ -0,0 +1,109 @@ +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.15; + +import { Script } from "forge-std/Script.sol"; +import { console } from "forge-std/console.sol"; +import { IMulticall3 } from "forge-std/interfaces/IMulticall3.sol"; + +import { GnosisSafe } from "safe-contracts/GnosisSafe.sol"; +import { Enum } from "safe-contracts/common/Enum.sol"; + +import { IProxyAdmin } from "interfaces/universal/IProxyAdmin.sol"; + +contract SafeSetPortal is Script { + // This script requires running with --root and the following env vars: + // PORTAL_PROXY (required) - address of the OptimismPortal2 proxy + // PORTAL_IMPL (required) - address of the new OptimismPortal2 implementation + // PROXY_ADMIN (required) - address of the ProxyAdmin managing the OptimismPortal + // SAFE (required) - address of the Gnosis Safe to execute the transaction + // SENDER (required) - address of the sender in the Gnosis Safe + // SIG (optional) - signatures for the Safe transaction; if not provided, the transaction can only be hashed, not executed + + error MissingSignatures(); + + struct EnvConfig { + address proxy; + address impl; + address proxyAdmin; + address safe; + address sender; + bytes signatures; + } + + function readEnv() internal view returns (EnvConfig memory) { + return + EnvConfig( + vm.envAddress("PORTAL_PROXY"), + vm.envAddress("PORTAL_IMPL"), + vm.envAddress("PROXY_ADMIN"), + vm.envAddress("SAFE"), + vm.envAddress("SENDER"), + vm.envOr("SIG", 
bytes(hex"00")) + ); + } + + function buildSafeTx(EnvConfig memory config) internal pure returns (bytes memory) { + IMulticall3.Call3[] memory calls = new IMulticall3.Call3[](1); + calls[0] = IMulticall3.Call3( + config.proxyAdmin, + false, + abi.encodeWithSelector(IProxyAdmin.upgrade.selector, config.proxy, config.impl) + ); + + return abi.encodeWithSelector(IMulticall3.aggregate3.selector, calls); + } + + function getTransactionHash() public view returns (bytes32) { + EnvConfig memory config = readEnv(); + + // Build tx + bytes memory calls = buildSafeTx(config); + + // Build tx hash + GnosisSafe safe = GnosisSafe(payable(config.safe)); + bytes32 txHash = safe.getTransactionHash( + MULTICALL3_ADDRESS, + 0, // value + calls, + Enum.Operation(1), // delegate call + 0, // safeTxGas + 0, // baseGas + 0, // gasPrice + address(0), // gasToken + config.sender, // refundReceiver + safe.nonce() + ); + console.log("Transaction hash for Safe: "); + console.logBytes32(txHash); + + return txHash; + } + + function execTransaction() public { + EnvConfig memory config = readEnv(); + if (config.signatures.length == 0) { + revert MissingSignatures(); + } + + // Build tx + bytes memory calls = buildSafeTx(config); + + // Exec tx + GnosisSafe safe = GnosisSafe(payable(config.safe)); + vm.startBroadcast(); + safe.execTransaction( + MULTICALL3_ADDRESS, + 0, // value + calls, + Enum.Operation(1), // delegate call + 0, // safeTxGas + 0, // baseGas + 0, // gasPrice + address(0), // gasToken + payable(config.sender), // refundReceiver + config.signatures + ); + vm.stopBroadcast(); + console.log("Transaction executed with Safe"); + } +} diff --git a/packages/op-tooling/op-deployer/.gitignore b/packages/op-tooling/op-deployer/.gitignore new file mode 100644 index 00000000000..9dd8671630f --- /dev/null +++ b/packages/op-tooling/op-deployer/.gitignore @@ -0,0 +1,2 @@ +config-validator.json +config-upgrade.json diff --git a/packages/op-tooling/op-deployer/README.md 
b/packages/op-tooling/op-deployer/README.md new file mode 100644 index 00000000000..a005f3b0a8b --- /dev/null +++ b/packages/op-tooling/op-deployer/README.md @@ -0,0 +1,142 @@ +# Optimism Deployer Scripts + +This directory contains scripts for deploying and upgrading Optimism contracts on Celo networks using the `op-deployer` tool. + +## Prerequisites + +- [op-deployer](https://github.com/celo-org/optimism/tree/op-deployer/v3.0.0/op-deployer) - Optimism deployment tool + +## Environment Variables + +### Required Variables +- `VERSION` - Contracts version to deploy (`v2.0.0` or `v3.0.0`) +- `NETWORK` - Target network (`alfajores`, `baklava`, or `mainnet`) +- `OP_ROOT` - Path to the Optimism repository +- `DEPLOYER_PK` - Private key of the deployer account + +### Optional Variables +- `RPC_URL` - Defaults to `http://localhost:8545` (locally forked network). Change to target RPC network for on-chain deployment + +## Scripts + +### `bootstrap.sh` + +Deploys Optimism contract implementations and creates configuration files for validator and upgrade processes. + +**Features:** +- Deploys all L1 contract implementations for the specified version +- Generates `config-validator.json` with deployed contract addresses +- Generates `config-upgrade.json` for upgrade configuration +- Supports network-specific proxy addresses and configurations + +**Supported Versions:** +- **v2.0.0**: Prestate hash `0x03b357b30095022ecbb44ef00d1de19df39cf69ee92a60683a6be2c6f8fe6a3e` +- **v3.0.0**: Prestate hash `0x034b32d11f017711ce7122ac71d87b1c6cc73e10a0dbd957d8b27f6360acaf8f` + +**Network-Specific Configurations:** +- **alfajores**: Testnet configuration with upgradeable superchain config +- **baklava**: Testnet configuration with upgradeable superchain config +- **mainnet**: Production configuration with non-upgradeable superchain config + +**Example Execution:** +```bash +VERSION="v3.0.0" NETWORK="alfajores" OP_ROOT="/path/to/op/artifacts" DEPLOYER_PK="0x..." 
./bootstrap.sh +``` + +### `bootstrap-validator.sh` + +Deploys `StandardValidator` contract (required for superchain-ops) using the configuration generated by `bootstrap.sh`. + +**Features:** +- Uses `config-validator.json` for contract addresses +- Deploys validator implementations for the specified version +- Supports both v2.0.0 and v3.0.0 versions + +**Example Execution:** +```bash +VERSION="v3.0.0" NETWORK="alfajores" OP_ROOT="/path/to/op/artifacts" DEPLOYER_PK="0x..." ./bootstrap-validator.sh +``` + +### `upgrade.sh` + +Performs the actual upgrade of L1 contracts to the specified version. + +**Features:** +- Uses `config-upgrade.json` for upgrade configuration +- Supports different upgrade flows for v2.0.0 and v3.0.0 +- Handles network-specific upgrade parameters + +**Example Execution:** +```bash +VERSION="v3.0.0" NETWORK="alfajores" OP_ROOT="/path/to/op/artifacts" DEPLOYER_PK="0x..." ./upgrade.sh +``` + +### `run_upgrade.sh` + +Orchestrates the complete upgrade process by running all scripts in sequence. + +**Features:** +- Automatically sets network-specific multisig addresses +- Provides fallback to default Anvil deployer key if not specified +- Runs bootstrap, bootstrap-validator, and upgrade scripts in order + +**Network Multisig Addresses:** +- **alfajores**: `0xf05f102e890E713DC9dc0a5e13A8879D5296ee48` +- **baklava**: `0xd542f3328ff2516443FE4db1c89E427F67169D94` +- **mainnet**: `0x4092A77bAF58fef0309452cEaCb09221e556E112` + +**Example Execution:** +```bash +VERSION="v3.0.0" NETWORK="alfajores" OP_ROOT="/path/to/op/artifacts" ./run_upgrade.sh +``` + +## Configuration Files + +### `config-validator.json` + +Contains deployed contract addresses for validator operations. Generated automatically by `bootstrap.sh`. 
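Because `bootstrap-validator.sh` and `upgrade.sh` consume the files that `bootstrap.sh` generates, a quick existence check fails fast when the bootstrap step was skipped; a sketch (the helper name is illustrative):

```bash
#!/usr/bin/env bash
set -euo pipefail

# Verify that bootstrap.sh produced the expected config files before
# running bootstrap-validator.sh or upgrade.sh.
require_config() {
  for f in "$@"; do
    if [ ! -f "$f" ]; then
      echo "missing $f - run ./bootstrap.sh first" >&2
      return 1
    fi
  done
  echo "configs present"
}

# Demo against a scratch directory so the sketch is runnable anywhere.
demo_dir=$(mktemp -d)
touch "$demo_dir/config-validator.json" "$demo_dir/config-upgrade.json"
require_config "$demo_dir/config-validator.json" "$demo_dir/config-upgrade.json"
```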
+ +**Key Fields:** +- `release` - Optimism version +- `challenger` - Address of the challenger contract +- `l1PAOMultisig` - L1 Proxy Admin Owner multisig address +- Various implementation addresses for L1 contracts + +### `config-upgrade.json` + +Contains configuration for contract upgrades. Generated automatically by `bootstrap.sh`. + +**Key Fields:** +- `prank` - Address to impersonate for upgrades +- `opcm` - Optimism Chain Manager address +- `chainConfigs` - Array of chain-specific upgrade configurations +- `upgradeSuperchainConfig` - Whether to upgrade superchain config + +## Usage Workflow + +1. **Export necessary env vars:** + ```bash + export VERSION="v3.0.0" + export NETWORK="alfajores" + export OP_ROOT="/path/to/optimism" + export DEPLOYER_PK="your_private_key" + ``` + +2. **Run complete upgrade:** + ```bash + ./run_upgrade.sh + ``` + +## Network Support + +| Network | Chain ID | Environment | Superchain Config Upgradeable | +|---------|----------|-------------|------------------------------| +| alfajores | 44787 | Testnet | Yes | +| baklava | 62320 | Testnet | Yes | +| mainnet | 42220 | Production | No | + +## Notes + +- The `run_upgrade.sh` script provides a safe default Anvil private key for local testing +- Network-specific configurations are automatically applied based on the `NETWORK` environment variable +- Contract addresses are deterministic and network-specific diff --git a/packages/op-tooling/op-deployer/bootstrap-validator.sh b/packages/op-tooling/op-deployer/bootstrap-validator.sh new file mode 100755 index 00000000000..2cf3b3bf3b5 --- /dev/null +++ b/packages/op-tooling/op-deployer/bootstrap-validator.sh @@ -0,0 +1,60 @@ +#!/usr/bin/env bash +set -euo pipefail + +# Require env vars +[ -z "${VERSION:-}" ] && echo "Need to set the VERSION via env" && exit 1; +[ -z "${NETWORK:-}" ] && echo "Need to set the NETWORK via env" && exit 1; +[ -z "${OP_ROOT:-}" ] && echo "Need to set the OP_ROOT via env" && exit 1; +[ -z "${DEPLOYER_PK:-}" ] && echo 
"Need to set the DEPLOYER_PK via env" && exit 1; + +# Check version +case $VERSION in + "v2.0.0"|"v3.0.0") + echo "Detected supported version: $VERSION" + ;; + *) + echo "Invalid version: $VERSION" && exit 1 + ;; +esac + +# Check network +case $NETWORK in + "alfajores"|"baklava"|"mainnet") + echo "Detected supported network: $NETWORK" + ;; + *) + echo "Unsupported network! Choose from 'alfajores', 'baklava' or 'mainnet'" && exit 1 + ;; +esac + +# Set vars +OP_DEPLOYER_CMD="$OP_ROOT/op-deployer/bin/op-deployer" +ARTIFACTS_LOCATOR="file://$OP_ROOT/packages/contracts-bedrock/forge-artifacts" +CONFIG=./op-deployer/config-validator.json +if [[ -z "${RPC_URL:-}" ]]; then + L1_RPC_URL=http://localhost:8545 + echo "Using localhost" +else + L1_RPC_URL=$RPC_URL + echo "Using rpc: $L1_RPC_URL" +fi + +################### +# OP-DEPLOYER CMD # +################### + +# USAGE: op-deployer bootstrap validator [command options] +# OPTIONS: +# --l1-rpc-url value RPC URL for the L1 chain. Must be set for live chains. Can be blank for chains deploying to local allocs files. [$L1_RPC_URL] +# --private-key value Private key of the deployer account. [$DEPLOYER_PRIVATE_KEY] +# --outfile value Output file. Use - for stdout. (default: "-") [$DEPLOYER_OUTFILE] +# --artifacts-locator value Locator for artifacts. [$DEPLOYER_ARTIFACTS_LOCATOR] +# --config value Path to a JSON file [$DEPLOYER_CONFIG] +# --use-interop If true, deploy Interop implementations. (default: false) [$DEPLOYER_USE_INTEROP] + +echo "Performing bootstrap validator to $VERSION for $NETWORK!" 
+$OP_DEPLOYER_CMD bootstrap validator \ + --l1-rpc-url="$L1_RPC_URL" \ + --artifacts-locator="$ARTIFACTS_LOCATOR" \ + --config="$CONFIG" \ + --private-key=$DEPLOYER_PK diff --git a/packages/op-tooling/op-deployer/bootstrap.sh b/packages/op-tooling/op-deployer/bootstrap.sh new file mode 100755 index 00000000000..38527bd615e --- /dev/null +++ b/packages/op-tooling/op-deployer/bootstrap.sh @@ -0,0 +1,171 @@ +#!/usr/bin/env bash +set -euo pipefail + +# Require env vars +[ -z "${VERSION:-}" ] && echo "Need to set the VERSION via env" && exit 1; +[ -z "${NETWORK:-}" ] && echo "Need to set the NETWORK via env" && exit 1; +[ -z "${OP_ROOT:-}" ] && echo "Need to set the OP_ROOT via env" && exit 1; +[ -z "${MULTISIG_ADDRESS:-}" ] && echo "Need to set the MULTISIG_ADDRESS via env" && exit 1; +[ -z "${DEPLOYER_PK:-}" ] && echo "Need to set the DEPLOYER_PK via env" && exit 1; + +# Check version +case $VERSION in + "v2.0.0") + echo "Detected supported version: $VERSION" + PRESTATE_HASH=0x03b357b30095022ecbb44ef00d1de19df39cf69ee92a60683a6be2c6f8fe6a3e + ;; + "v3.0.0") + echo "Detected supported version: $VERSION" + PRESTATE_HASH=0x034b32d11f017711ce7122ac71d87b1c6cc73e10a0dbd957d8b27f6360acaf8f + ;; + *) + echo "Invalid version: $VERSION" && exit 1 + ;; +esac + +# Set addresses based on network +if [ "${NETWORK}" == "alfajores" ]; then + SUPERCHAIN_CONFIG_PROXY="0xdf4Fb5371B706936527B877F616eAC0e47c9b785" + PROTOCOL_VERSIONS_PROXY="0x5E5FEA4D2A8f632Af05D1E725D7ca865327A080b" + SYSTEM_CONFIG_PROXY="0x499b0C1F4BDC76d61b1D13b03384eac65FAF50c7" + PROXY_ADMIN_OWNER="0xf05f102e890E713DC9dc0a5e13A8879D5296ee48" + PROXY_ADMIN="0x4630583d066520aF0E3fda0de2C628EEd2888683" + CHALLENGER="0xe571b94CF7e95C46DFe6bEa529335f4A11d15D92" + UPGRADE_SUPERCHAIN_CONFIG=true +elif [ "${NETWORK}" == "baklava" ]; then + SUPERCHAIN_CONFIG_PROXY="0xf07502A4a950d870c43b12660fB1Dd18c170D344" + PROTOCOL_VERSIONS_PROXY="0x3d438C63e0431DA844d3F60E6c712d10FC75c529" + 
SYSTEM_CONFIG_PROXY="0x3ee24bF404e4a5D27A437d910F56E1eD999B1De8" + PROXY_ADMIN_OWNER="0xd542f3328ff2516443FE4db1c89E427F67169D94" + PROXY_ADMIN="0xBF101Bd81fb69aB00ab261465454dF1a171726Bf" + CHALLENGER="0xDc94436A193a827786270dD4F6cD4b35c3f0C8f8" + UPGRADE_SUPERCHAIN_CONFIG=true +elif [ "${NETWORK}" == "mainnet" ]; then + SUPERCHAIN_CONFIG_PROXY="0x95703e0982140D16f8ebA6d158FccEde42f04a4C" + PROTOCOL_VERSIONS_PROXY="0x1b6dEB2197418075AB314ac4D52Ca1D104a8F663" + SYSTEM_CONFIG_PROXY="0x89E31965D844a309231B1f17759Ccaf1b7c09861" + PROXY_ADMIN_OWNER="0x4092A77bAF58fef0309452cEaCb09221e556E112" + PROXY_ADMIN="0x783A434532Ee94667979213af1711505E8bFE374" + CHALLENGER="0x6b145ebf66602ec524b196426b46631259689583" + UPGRADE_SUPERCHAIN_CONFIG=false +else + echo "Unsupported network! Choose from 'alfajores', 'baklava' or 'mainnet'" + exit 1 +fi + +# Set vars +CONFIG_LOC="./op-deployer" +OP_DEPLOYER_CMD="$OP_ROOT/op-deployer/bin/op-deployer" +L1_CONTRACTS_RELEASE=celo-contracts/$VERSION +ARTIFACTS_LOCATOR="file://$OP_ROOT/packages/contracts-bedrock/forge-artifacts" +WITHDRAWAL_DELAY_SECONDS=604800 +if [[ -z "${RPC_URL:-}" ]]; then + L1_RPC_URL=http://localhost:8545 + echo "Using localhost" +else + L1_RPC_URL=$RPC_URL + echo "Using rpc: $L1_RPC_URL" +fi + +################### +# OP-DEPLOYER CMD # +################### + +# USAGE: op-deployer bootstrap implementations [command options] +# OPTIONS: +# --l1-rpc-url value RPC URL for the L1 chain. Must be set for live chains. Can be blank for chains deploying to local allocs files. [$L1_RPC_URL] +# --private-key value Private key of the deployer account. [$DEPLOYER_PRIVATE_KEY] +# --outfile value Output file. Use - for stdout. (default: "-") [$DEPLOYER_OUTFILE] +# --artifacts-locator value Locator for artifacts. [$DEPLOYER_ARTIFACTS_LOCATOR] +# --l1-contracts-release op-contracts/vX.Y.Z Release version to set OPCM implementations for, of the format op-contracts/vX.Y.Z. 
[$DEPLOYER_L1_CONTRACTS_RELEASE] +# --mips-version value MIPS version. (default: 1) [$DEPLOYER_MIPS_VERSION] +# --withdrawal-delay-seconds value Withdrawal delay in seconds. (default: 302400) [$DEPLOYER_WITHDRAWAL_DELAY_SECONDS] +# --min-proposal-size-bytes value PreimageOracle minimum proposal size in bytes. (default: 126000) [$DEPLOYER_MIN_PROPOSAL_SIZE_BYTES] +# --challenge-period-seconds value PreimageOracle challenge period in seconds. (default: 86400) [$DEPLOYER_CHALLENGE_PERIOD_SECONDS] +# --proof-maturity-delay-seconds value Proof maturity delay in seconds. (default: 604800) [$DEPLOYER_PROOF_MATURITY_DELAY_SECONDS] +# --dispute-game-finality-delay-seconds value Dispute game finality delay in seconds. (default: 302400) [$DEPLOYER_DISPUTE_GAME_FINALITY_DELAY_SECONDS] +# --superchain-config-proxy value Superchain config proxy. [$DEPLOYER_SUPERCHAIN_CONFIG_PROXY] +# --protocol-versions-proxy value Protocol versions proxy. [$DEPLOYER_PROTOCOL_VERSIONS_PROXY] +# --upgrade-controller value Upgrade controller. [$DEPLOYER_UPGRADE_CONTROLLER] +# --use-interop If true, deploy Interop implementations. (default: false) [$DEPLOYER_USE_INTEROP] + +echo "Performing bootstrap implementations to $VERSION for $NETWORK!" 
+BOOTSTRAP_OUTPUT=`mktemp` +$OP_DEPLOYER_CMD bootstrap implementations \ + --l1-rpc-url="$L1_RPC_URL" \ + --l1-contracts-release="$L1_CONTRACTS_RELEASE" \ + --artifacts-locator="$ARTIFACTS_LOCATOR" \ + --withdrawal-delay-seconds="$WITHDRAWAL_DELAY_SECONDS" \ + --superchain-config-proxy="$SUPERCHAIN_CONFIG_PROXY" \ + --protocol-versions-proxy="$PROTOCOL_VERSIONS_PROXY" \ + --upgrade-controller=$MULTISIG_ADDRESS \ + --private-key=$DEPLOYER_PK | tee $BOOTSTRAP_OUTPUT + +# Set OPCM address from bootstrap output +BOOTSTRAP_JSON=`mktemp` +awk '/{/ { json_start=1 }; json_start==1 { print }; /}/ { json_start=0 }' $BOOTSTRAP_OUTPUT > $BOOTSTRAP_JSON +OPCM=`jq --raw-output '.Opcm' $BOOTSTRAP_JSON` + +# Load addresses from bootstrap +ANCHOR_STATE_REGISTRY_IMPL=`jq --raw-output '.AnchorStateRegistryImpl' $BOOTSTRAP_JSON` +DELAYED_WETH_IMPL=`jq --raw-output '.DelayedWETHImpl' $BOOTSTRAP_JSON` +DISPUTE_GAME_FACTORY_IMPL=`jq --raw-output '.DisputeGameFactoryImpl' $BOOTSTRAP_JSON` +L1_CROSS_DOMAIN_MESSENGER_IMPL=`jq --raw-output '.L1CrossDomainMessengerImpl' $BOOTSTRAP_JSON` +L1_ERC721_BRIDGE_IMPL=`jq --raw-output '.L1ERC721BridgeImpl' $BOOTSTRAP_JSON` +L1_STANDARD_BRIDGE_IMPL=`jq --raw-output '.L1StandardBridgeImpl' $BOOTSTRAP_JSON` +MIPS_SINGLETON=`jq --raw-output '.MipsSingleton' $BOOTSTRAP_JSON` +OPTIMISM_MINTABLE_ERC20_FACTORY_IMPL=`jq --raw-output '.OptimismMintableERC20FactoryImpl' $BOOTSTRAP_JSON` +OPTIMISM_PORTAL_IMPL=`jq --raw-output '.OptimismPortalImpl' $BOOTSTRAP_JSON` +PROTOCOL_VERSIONS_IMPL=`jq --raw-output '.ProtocolVersionsImpl' $BOOTSTRAP_JSON` +SUPERCHAIN_CONFIG_IMPL=`jq --raw-output '.SuperchainConfigImpl' $BOOTSTRAP_JSON` +SYSTEM_CONFIG_IMPL=`jq --raw-output '.SystemConfigImpl' $BOOTSTRAP_JSON` + +# Workaround until we fix deterministic addresses +if [ "${VERSION}" = "v3.0.0" ] && [ -f "$CONFIG_LOC/config-validator.json" ]; then + echo "Using workaround for v3 config validator!" 
+ DELAYED_WETH_IMPL=`cat $CONFIG_LOC/config-validator.json | jq .delayedWETHImpl` + DELAYED_WETH_IMPL="${DELAYED_WETH_IMPL#\"}" # Remove leading " + DELAYED_WETH_IMPL="${DELAYED_WETH_IMPL%\"}" # Remove trailing " + OPTIMISM_MINTABLE_ERC20_FACTORY_IMPL=`cat $CONFIG_LOC/config-validator.json | jq .optimismMintableERC20FactoryImpl` + OPTIMISM_MINTABLE_ERC20_FACTORY_IMPL="${OPTIMISM_MINTABLE_ERC20_FACTORY_IMPL#\"}" # Remove leading " + OPTIMISM_MINTABLE_ERC20_FACTORY_IMPL="${OPTIMISM_MINTABLE_ERC20_FACTORY_IMPL%\"}" # Remove trailing " +fi + +# Create validator config +cat > $CONFIG_LOC/config-validator.json << END +{ + "release": "$VERSION", + "anchorStateRegistryImpl": "$ANCHOR_STATE_REGISTRY_IMPL", + "challenger": "$CHALLENGER", + "delayedWETHImpl": "$DELAYED_WETH_IMPL", + "disputeGameFactoryImpl": "$DISPUTE_GAME_FACTORY_IMPL", + "l1CrossDomainMessengerImpl": "$L1_CROSS_DOMAIN_MESSENGER_IMPL", + "l1ERC721BridgeImpl": "$L1_ERC721_BRIDGE_IMPL", + "l1PAOMultisig": "$PROXY_ADMIN_OWNER", + "l1StandardBridgeImpl": "$L1_STANDARD_BRIDGE_IMPL", + "mips": "$MIPS_SINGLETON", + "mipsImpl": "$MIPS_SINGLETON", + "optimismMintableERC20FactoryImpl": "$OPTIMISM_MINTABLE_ERC20_FACTORY_IMPL", + "optimismPortalImpl": "$OPTIMISM_PORTAL_IMPL", + "protocolVersionsImpl": "$PROTOCOL_VERSIONS_IMPL", + "superchainConfig": "$SUPERCHAIN_CONFIG_PROXY", + "superchainConfigImpl": "$SUPERCHAIN_CONFIG_IMPL", + "systemConfigImpl": "$SYSTEM_CONFIG_IMPL" +} +END + +# Create upgrade config +cat > $CONFIG_LOC/config-upgrade.json < [command options] +# OPTIONS: +# --l1-rpc-url value RPC URL for the L1 chain. Must be set for live chains. Must be blank for chains deploying to local allocs files. [$L1_RPC_URL] +# --config value path to the config file +# --override-artifacts-url value override the artifacts URL +# --log.level value The lowest log level that will be output (default: INFO) [$DEPLOYER_LOG_LEVEL] +# --log.format value Format the log output. 
Supported formats: 'text', 'terminal', 'logfmt', 'json', 'json-pretty', (default: text) [$DEPLOYER_LOG_FORMAT] +# --log.color Color the log output if in terminal mode (default: false) [$DEPLOYER_LOG_COLOR] +# --log.pid Show pid in the log (default: false) [$DEPLOYER_LOG_PID] + +echo "Performing upgrade to $VERSION for $NETWORK!" +if [ "${VERSION}" == "v2.0.0" ]; then + $OP_DEPLOYER_CMD upgrade $VERSION \ + --l1-rpc-url="$L1_RPC_URL" \ + --config="$CONFIG" \ + --override-artifacts-url="$ARTIFACTS_LOCATOR" \ + --private-key=$DEPLOYER_PK +else + $OP_DEPLOYER_CMD upgrade $VERSION \ + --l1-rpc-url="$L1_RPC_URL" \ + --config="$CONFIG" \ + --override-artifacts-url="$ARTIFACTS_LOCATOR" +fi diff --git a/packages/op-tooling/safe/README.md b/packages/op-tooling/safe/README.md new file mode 100644 index 00000000000..733bafe8aea --- /dev/null +++ b/packages/op-tooling/safe/README.md @@ -0,0 +1,84 @@ +# Safe Transaction Queue Scripts + +This directory contains scripts for interacting with Gnosis Safe multisig wallets. + +## Scripts + +### `queue-safe.sh` + +Queues a transaction to a Gnosis Safe multisig wallet using the Safe Gateway API. This script is specifically designed for proposing OPCM transactions. 
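Since the script interpolates environment variables straight into a JSON string, it can help to assemble the payload in a variable and review it before anything is sent. A sketch under that assumption (addresses and nonce are the Baklava v3.0.0 example values from this README; the field list is abridged):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Example values (Baklava v3.0.0); abridged field list for illustration
SENDER="0x22EaF69162ae49605441229EdbEF7D9FC5f4f094"
OPCM_ADDRESS="0xdd07cb5e4b2e89a618f8d3a08c8ff753acfe1c68"
NONCE="21"

# Build the Safe Gateway payload first so it can be inspected;
# operation 1 marks the transaction as a delegatecall.
PAYLOAD=$(cat << END
{
  "to": "$OPCM_ADDRESS",
  "value": "0",
  "operation": 1,
  "nonce": "$NONCE",
  "sender": "$SENDER"
}
END
)
echo "$PAYLOAD"
# Once reviewed, POST it to the gateway's /propose endpoint as queue-safe.sh does.
```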
+ +**Requirements:** +- Requires access to unrestricted Safe Gateway API (`delegatecall` operation should not be disabled via API) + +**Features:** +- Submits transactions to the Safe Gateway API +- Supports different OpStack versions (v2.0.0, v3.0.0) +- Uses pre-signed transactions with valid signatures +- Targets Holesky testnet Safe Gateway + +**Required Environment Variables:** +- `SENDER` - Address of the transaction sender +- `SAFE_ADDRESS` - Address of the target Safe multisig wallet +- `OPCM_ADDRESS` - Address of the Optimism Chain Manager contract +- `CALLDATA` - Transaction calldata (hex encoded) +- `NONCE` - Safe transaction nonce +- `TX_HASH` - Safe transaction hash +- `SIG` - Transaction signature + +**API Endpoint:** +``` +https://gateway.holesky-safe.protofire.io/v1/chains/17000/transactions/{SAFE_ADDRESS}/propose +``` + +**Example Configurations:** + +#### Baklava V2.0.0 +```bash +SENDER=0x22EaF69162ae49605441229EdbEF7D9FC5f4f094 +SAFE_ADDRESS=0xd542f3328ff2516443FE4db1c89E427F67169D94 +OPCM_ADDRESS=0xd29841fbcff24eb5157f2abe7ed0b9819340159a +CALLDATA=0xff2dd5a1000000000000000000000000000000000000000000000000000000000000002000000000000000000000000000000000000000000000000000000000000000010000000000000000000000003ee24bf404e4a5d27a437d910f56e1ed999b1de8000000000000000000000000bf101bd81fb69ab00ab261465454df1a171726bf03b357b30095022ecbb44ef00d1de19df39cf69ee92a60683a6be2c6f8fe6a3e +NONCE=20 +TX_HASH=0x2e42a7e0a7bafbc136b302f4f5b5946bb57a98a9e5085ddc225712107381c3e2 +SIG=0xb85ab66c1e782f3b801814babe680f39d19dae9ce81378f0a9acb91c41c97dd40f071f2c4f48ceb3a28f633446f9db9c15298adc2d7353d324c20f03613139f51c +``` + +#### Baklava V3.0.0 +```bash +SENDER=0x22EaF69162ae49605441229EdbEF7D9FC5f4f094 +SAFE_ADDRESS=0xd542f3328ff2516443FE4db1c89E427F67169D94 +OPCM_ADDRESS=0xdd07cb5e4b2e89a618f8d3a08c8ff753acfe1c68 
+CALLDATA=0xff2dd5a1000000000000000000000000000000000000000000000000000000000000002000000000000000000000000000000000000000000000000000000000000000010000000000000000000000003ee24bf404e4a5d27a437d910f56e1ed999b1de8000000000000000000000000bf101bd81fb69ab00ab261465454df1a171726bf034b32d11f017711ce7122ac71d87b1c6cc73e10a0dbd957d8b27f6360acaf8f +NONCE=21 +TX_HASH=0xd7bba17d3002691e9dc1da82525b97a18b994f3a189437084358bd3241900731 +SIG=0xb1772fd05b26d8febd5cb42f80b611daf7b0a87d56a241afe8d8651a2123202e2da073667efaecff7373fb8cd367306d2cb50d324ce98736ea41579b4cbf90171c +``` + +**Example Execution:** +```bash +# Set environment variables for Baklava V3 +export SENDER="0x22EaF69162ae49605441229EdbEF7D9FC5f4f094" +export SAFE_ADDRESS="0xd542f3328ff2516443FE4db1c89E427F67169D94" +export OPCM_ADDRESS="0xdd07cb5e4b2e89a618f8d3a08c8ff753acfe1c68" +export CALLDATA="0xff2dd5a1000000000000000000000000000000000000000000000000000000000000002000000000000000000000000000000000000000000000000000000000000000010000000000000000000000003ee24bf404e4a5d27a437d910f56e1ed999b1de8000000000000000000000000bf101bd81fb69ab00ab261465454df1a171726bf034b32d11f017711ce7122ac71d87b1c6cc73e10a0dbd957d8b27f6360acaf8f" +export NONCE="21" +export TX_HASH="0xd7bba17d3002691e9dc1da82525b97a18b994f3a189437084358bd3241900731" +export SIG="0xb1772fd05b26d8febd5cb42f80b611daf7b0a87d56a241afe8d8651a2123202e2da073667efaecff7373fb8cd367306d2cb50d324ce98736ea41579b4cbf90171c" + +# Execute the script +./queue-safe.sh +``` + +## Network Support + +| Network | Chain ID | Safe Gateway | Environment | +|---------|----------|--------------|-------------| +| Holesky | 17000 | https://gateway.holesky-safe.protofire.io | Testnet | + +## Notes + +- All transactions are submitted to the Holesky testnet Safe Gateway +- Transaction signatures must be valid, **ordered by the address of signer** and correspond to the Safe multisig owners +- Nonces must be sequential and match the current Safe state +- The script is designed for one-time 
use per transaction (nonces increment after each proposal) diff --git a/packages/op-tooling/safe/queue-safe.sh b/packages/op-tooling/safe/queue-safe.sh new file mode 100755 index 00000000000..0dce1d70a92 --- /dev/null +++ b/packages/op-tooling/safe/queue-safe.sh @@ -0,0 +1,20 @@ +#!/usr/bin/env bash +set -euo pipefail + +curl -X POST "https://gateway.holesky-safe.protofire.io/v1/chains/17000/transactions/$SAFE_ADDRESS/propose" \ + -H "Accept: application/json" \ + -H "Content-Type: application/json" \ + -d '{ + "to": "'$OPCM_ADDRESS'", + "value": "0", + "data": "'$CALLDATA'", + "operation": 1, + "safeTxGas": "0", + "baseGas": "0", + "gasPrice": "0", + "gasToken": "0x0000000000000000000000000000000000000000", + "nonce": "'$NONCE'", + "safeTxHash": "'$TX_HASH'", + "sender": "'$SENDER'", + "signature": "'$SIG'" + }' diff --git a/packages/op-tooling/scripts/CloseRecentGame.s.sol b/packages/op-tooling/scripts/CloseRecentGame.s.sol new file mode 100644 index 00000000000..aa7f856d5f9 --- /dev/null +++ b/packages/op-tooling/scripts/CloseRecentGame.s.sol @@ -0,0 +1,106 @@ +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.15; + +import { Script } from "forge-std/Script.sol"; +import { console } from "forge-std/console.sol"; +import { console2 } from "forge-std/console2.sol"; + +import { GameType, GameStatus, Timestamp, BondDistributionMode } from "src/dispute/lib/Types.sol"; + +import { IDisputeGameFactory } from "interfaces/dispute/IDisputeGameFactory.sol"; +import { IDisputeGame } from "interfaces/dispute/IDisputeGame.sol"; +import { IFaultDisputeGame } from "interfaces/dispute/IFaultDisputeGame.sol"; +import { IAnchorStateRegistry } from "interfaces/dispute/IAnchorStateRegistry.sol"; + +contract CloseRecentGame is Script { + // This script requires running with --root and the following env vars: + // FACTORY (required) - address of the dispute game factory + // REGISTRY (required) - address of the anchor state registry + // MAX (optional) - maximum number of recent 
games to check (default: 50) + + event AnchorStateUpdated( + address indexed game, + uint256 indexed index, + GameStatus status, + Timestamp created + ); + event AnchorStateUpToDate(); + event NoGamesFound(); + event NoEligibleGamesFound(); + + bool private foundEligibleGame; + + function run() external { + IDisputeGameFactory factory_ = IDisputeGameFactory(vm.envAddress("FACTORY")); + console.log("Factory present at:", address(factory_)); + + IAnchorStateRegistry registry_ = IAnchorStateRegistry(vm.envAddress("REGISTRY")); + console.log("Registry present at:", address(registry_)); + + uint256 MAX_GAMES_TO_CHECK = vm.envOr("MAX", uint256(50)); + + uint256 gamesCount_ = factory_.gameCount(); + console2.log("Total games:", gamesCount_); + + if (gamesCount_ == 0) { + console.log("No games found in factory"); + emit NoGamesFound(); + return; + } + + for (uint256 i = 0; i < MAX_GAMES_TO_CHECK && i < gamesCount_; i++) { + uint256 gameIndex_ = gamesCount_ - 1 - i; + (GameType gameType_, Timestamp created_, IDisputeGame game_) = factory_.gameAtIndex( + gameIndex_ + ); + console.log("Checking game at:", address(game_)); + console2.log(" Type:", uint32(gameType_.raw())); + console2.log(" Created:", uint64(created_.raw())); + + // check game status + GameStatus status_ = game_.status(); + console2.log(" Status:", uint8(status_)); + if (status_ == GameStatus.IN_PROGRESS) { + console.log(" >>> Game still in progress. Skipping..."); + continue; + } + + // check if game is resolved + if (game_.resolvedAt().raw() == 0) { + console.log(" >>> Game not resolved yet. Skipping..."); + continue; + } + + // check if game is finalized + if (!registry_.isGameFinalized(game_)) { + console.log(" >>> Game not finalized yet. Skipping..."); + continue; + } + + // check if game already closed + IFaultDisputeGame fdg_ = IFaultDisputeGame(address(game_)); + if (fdg_.bondDistributionMode() != BondDistributionMode.UNDECIDED) { + console.log(" >>> Game already closed. 
Anchor state up to date..."); + + foundEligibleGame = true; + emit AnchorStateUpToDate(); + break; + } + + // update anchor state by closing the game + vm.startBroadcast(); + console.log(" >>> Closing game at:", address(game_)); + fdg_.closeGame(); + vm.stopBroadcast(); + + foundEligibleGame = true; + emit AnchorStateUpdated(address(game_), gameIndex_, game_.status(), created_); + break; + } + + if (!foundEligibleGame) { + console.log("No eligible games found to close"); + emit NoEligibleGamesFound(); + } + } +} diff --git a/packages/op-tooling/scripts/DisputeGameFactoryPrunner.sol b/packages/op-tooling/scripts/DisputeGameFactoryPrunner.sol new file mode 100644 index 00000000000..675e357d9b3 --- /dev/null +++ b/packages/op-tooling/scripts/DisputeGameFactoryPrunner.sol @@ -0,0 +1,54 @@ +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.15; + +import { Hash, GameId, GameType, Timestamp, Claim } from "src/dispute/lib/Types.sol"; +import { IDisputeGame } from "interfaces/dispute/IDisputeGame.sol"; + +contract DisputeGameFactoryPrunner { + event GamePruned( + uint256 indexed index, + GameId indexed gameId, + Hash indexed gameUUID, + GameType gameType, + address gameProxy + ); + + uint256[103] internal __gap; // slots 0...102 + mapping(Hash => GameId) internal _disputeGames; // slot 103 + GameId[] internal _disputeGameList; // slot 104 + + function pruneGames(uint256 _desiredLength) external { + require(_desiredLength <= _disputeGameList.length, "Desired length exceeds current length"); + while (_disputeGameList.length > _desiredLength) { + // Retrieve game id to prune + uint256 gameIndex_ = _disputeGameList.length - 1; + GameId gameId_ = _disputeGameList[gameIndex_]; + + // Unpack game id + (GameType gameType_, Timestamp timestamp_, address proxy_) = gameId_.unpack(); + + // Load game data + IDisputeGame game_ = IDisputeGame(proxy_); + Claim rootClaim_ = game_.rootClaim(); + bytes memory extraData_ = game_.extraData(); + + // Compute game hash + Hash uuid_ = 
getGameUUID(gameType_, rootClaim_, extraData_); + + // Delete game from storage + _disputeGames[uuid_] = GameId.wrap(bytes32(0)); + _disputeGameList.pop(); + + // Emit event + emit GamePruned(gameIndex_, gameId_, uuid_, gameType_, proxy_); + } + } + + function getGameUUID( + GameType _gameType, + Claim _rootClaim, + bytes memory _extraData + ) public pure returns (Hash uuid_) { + uuid_ = Hash.wrap(keccak256(abi.encode(_gameType, _rootClaim, _extraData))); + } +} diff --git a/packages/op-tooling/scripts/PruneGamesFromStorage.s.sol b/packages/op-tooling/scripts/PruneGamesFromStorage.s.sol new file mode 100644 index 00000000000..93674e01e1d --- /dev/null +++ b/packages/op-tooling/scripts/PruneGamesFromStorage.s.sol @@ -0,0 +1,80 @@ +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.15; + +import { Script } from "forge-std/Script.sol"; +import { console } from "forge-std/console.sol"; +import { console2 } from "forge-std/console2.sol"; + +import { GameType, Timestamp } from "src/dispute/lib/Types.sol"; +import { IDisputeGame } from "interfaces/dispute/IDisputeGame.sol"; +import { IDisputeGameFactory } from "interfaces/dispute/IDisputeGameFactory.sol"; +import { IProxyAdmin } from "interfaces/universal/IProxyAdmin.sol"; +import { DisputeGameFactoryPrunner } from "./DisputeGameFactoryPrunner.sol"; + +contract PruneGamesFromStorage is Script { + // This script requires running with --root and the following env vars: + // FACTORY (required) - address of the dispute game factory + // PROXY_ADMIN (required) - address of the proxy admin + // RETENTION_INDEX (required) - index up to which games are retained + + function run() external { + IDisputeGameFactory factory_ = IDisputeGameFactory(vm.envAddress("FACTORY")); + console.log("Factory present at:", address(factory_)); + + IProxyAdmin proxyAdmin_ = IProxyAdmin(vm.envAddress("PROXY_ADMIN")); + console.log("ProxyAdmin present at:", address(proxyAdmin_)); + + uint256 retentionIndex_ = vm.envUint("RETENTION_INDEX"); + 
console.log("Game index to retain:", retentionIndex_); + + // Validate amount of games to prune + uint256 currentGameCount_ = factory_.gameCount(); + require(retentionIndex_ < currentGameCount_, "Retention index out of bounds"); + if ((currentGameCount_ - 1) - retentionIndex_ > 500) { + console.log( + "Too many games to prune at once (%d). Max is 500. Aborting.", + (currentGameCount_ - 1) - retentionIndex_ + ); + return; + } + + // Store factory impl + address factoryImpl_ = proxyAdmin_.getProxyImplementation(address(factory_)); + + // Start broadcast + vm.startBroadcast(); + + // Upgrade factory to DisputeGameFactoryPrunner + console.log("Upgrading factory to DisputeGameFactoryPrunner..."); + DisputeGameFactoryPrunner prunner_ = new DisputeGameFactoryPrunner(); + proxyAdmin_.upgrade(payable(address(factory_)), address(prunner_)); + console.log("Factory upgraded to DisputeGameFactoryPrunner at:", address(prunner_)); + + // Prune games from storage + console.log("Pruning games from storage..."); + uint256 targetLength_ = retentionIndex_ + 1; + DisputeGameFactoryPrunner(payable(address(factory_))).pruneGames(targetLength_); + console.log("Pruning completed."); + + // Restore factory implementation + console.log("Restoring factory implementation..."); + proxyAdmin_.upgrade(payable(address(factory_)), factoryImpl_); + console.log("Factory implementation restored to:", factoryImpl_); + + // Stop broadcast + vm.stopBroadcast(); + + // Log final game count + uint256 gameCount_ = factory_.gameCount(); + console.log("Current game count in factory:", gameCount_); + + // Log retained game at retention index + (GameType gameType_, Timestamp timestamp_, IDisputeGame proxy_) = factory_.gameAtIndex( + retentionIndex_ + ); + console.log("Game retained at index:", retentionIndex_); + console2.log(" Type:", uint32(gameType_.raw())); + console2.log(" Created:", uint64(timestamp_.raw())); + console.log(" Proxy:", address(proxy_)); + } +} diff --git 
a/packages/op-tooling/scripts/README.md b/packages/op-tooling/scripts/README.md new file mode 100644 index 00000000000..c3bfc1b4410 --- /dev/null +++ b/packages/op-tooling/scripts/README.md @@ -0,0 +1,218 @@ +# Optimism Utility Scripts + +This directory contains Foundry scripts and shell utilities for managing Optimism dispute games and anchor state. These scripts interact with the Optimism repository and must be executed from the Optimism contracts-bedrock directory. + +## Important Usage Note + +**All Foundry scripts in this directory must be executed with root pointing to the Optimism repository contracts folder:** +```bash +forge script --root $PATH_TO_OP_REPO/packages/contracts-bedrock +``` + +## Scripts + +### `CloseRecentGame.s.sol` + +Closes the most recent eligible fault dispute game, which updates the anchor state in the registry. The script searches through recent games to find one that is resolved, finalized, and ready to be closed. + +**Features:** +- Iterates through recent games in reverse chronological order (newest first) +- Validates game status (not in progress, resolved, finalized) +- Checks if game is already closed before attempting to close +- Configurable search limit via MAX environment variable +- Emits events for all execution paths +- Proper use of broadcast for state-changing operations + +**Required Environment Variables:** +- `FACTORY` - Address of the DisputeGameFactory +- `REGISTRY` - Address of the AnchorStateRegistry + +**Optional Environment Variables:** +- `MAX` - Maximum number of recent games to check (default: 50) + +**Example Execution:** +```bash +FACTORY="0x..." REGISTRY="0x..." forge script CloseRecentGame.s.sol --root $PATH_TO_OP_REPO/packages/contracts-bedrock --broadcast --private-key $PK --rpc-url $RPC + +# With custom max games to check +FACTORY="0x..." REGISTRY="0x..." 
MAX=100 forge script CloseRecentGame.s.sol --root $PATH_TO_OP_REPO/packages/contracts-bedrock --broadcast --private-key $PK --rpc-url $RPC +``` + +**Events Emitted:** +- `AnchorStateUpdated(address game, uint256 index, GameStatus status, Timestamp created)` - Game successfully closed +- `AnchorStateUpToDate()` - Most recent eligible game already closed +- `NoGamesFound()` - Factory has no games +- `NoEligibleGamesFound()` - No games in the checked range are eligible for closing + +**Validation Checks:** +1. Game status is not IN_PROGRESS +2. Game has been resolved (resolvedAt != 0) +3. Game has been finalized by the registry +4. Game hasn't already been closed (bondDistributionMode == UNDECIDED) + +### `close-recent-game.sh` + +Shell wrapper script that simplifies execution of CloseRecentGame.s.sol by handling environment variable validation and forge command construction. + +**Features:** +- Validates required environment variables before execution +- Automatically constructs --root path from OP_DIR +- Provides clear error messages for missing configuration +- Uses proper error handling with set -euo pipefail +- Executes forge script with broadcast enabled + +**Required Environment Variables:** +- `L1_RPC_URL` - RPC URL for L1 network +- `OP_DIR` - Path to Optimism repository contracts directory +- `PK` - Private key for transaction signing +- `FACTORY` - Address of the DisputeGameFactory (passed to forge script) +- `REGISTRY` - Address of the AnchorStateRegistry (passed to forge script) + +**Example Execution:** +```bash +export L1_RPC_URL="https://..." +export OP_DIR="/path/to/optimism" +export PK="0x..." +export FACTORY="0x..." +export REGISTRY="0x..." +./close-recent-game.sh +``` + +### `PruneGamesFromStorage.s.sol` + +Maintenance script that removes games from the DisputeGameFactory storage to reduce storage costs and optimize factory performance. 
This is a critical operation that temporarily upgrades the factory to a pruning contract, removes games from the end of the array (higher indices), then restores the original implementation. + +**Features:** +- Temporarily upgrades DisputeGameFactory to DisputeGameFactoryPrunner +- Prunes games from the end of the storage array (higher indices) +- Automatically restores original factory implementation +- Verifies oldest retained game after pruning +- Emits events for each pruned game via GamePruned event + +**Required Environment Variables:** +- `FACTORY` - Address of the DisputeGameFactory proxy +- `PROXY_ADMIN` - Address of the ProxyAdmin contract +- `RETENTION_INDEX` - Index up to which games are retained (all games with index ≤ RETENTION_INDEX are kept, games with higher indices are removed) + +**Example Execution:** +```bash +# Keep games 0-100, remove games 101+ +FACTORY="0x..." PROXY_ADMIN="0x..." RETENTION_INDEX=100 forge script PruneGamesFromStorage.s.sol --root $PATH_TO_OP_REPO/packages/contracts-bedrock --broadcast --private-key $PK --rpc-url $RPC +``` + +**Process Flow:** +1. Store original factory implementation address +2. Deploy and upgrade to DisputeGameFactoryPrunner +3. Call pruneGames() to remove games from storage (from end of array) +4. Restore original factory implementation +5. Verify final game count and oldest retained game + +**Safety Considerations:** +- This operation modifies critical factory storage - use with caution +- Requires ProxyAdmin privileges to upgrade factory +- Games are permanently removed from factory storage (though on-chain game contracts remain) +- Ensure RETENTION_INDEX is correct before execution +- Consider creating games backup before pruning + +### `DisputeGameFactoryPrunner.sol` + +Helper contract used by `PruneGamesFromStorage.s.sol` to remove games from DisputeGameFactory storage. This contract replicates the factory's storage layout to safely manipulate the internal game arrays and mappings. 
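The `pruneGames()` target length and the 500-game cap enforced by `PruneGamesFromStorage.s.sol` reduce to simple index arithmetic. A shell sketch of the same bounds checks, with illustrative numbers:

```shell
#!/usr/bin/env bash
set -euo pipefail

GAME_COUNT=700        # illustrative factory_.gameCount()
RETENTION_INDEX=650   # games 0..650 are kept

# Same bounds checks the Foundry script performs
(( RETENTION_INDEX < GAME_COUNT )) || { echo "Retention index out of bounds"; exit 1; }
TO_PRUNE=$(( (GAME_COUNT - 1) - RETENTION_INDEX ))
(( TO_PRUNE <= 500 )) || { echo "Too many games to prune at once ($TO_PRUNE). Max is 500."; exit 1; }

TARGET_LENGTH=$(( RETENTION_INDEX + 1 ))   # value passed to pruneGames()
echo "pruning $TO_PRUNE games; target length $TARGET_LENGTH"
```

Running the numbers ahead of time makes it obvious how many games a given `RETENTION_INDEX` removes before any storage is touched.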
+ +**Features:** +- Matches DisputeGameFactory storage layout exactly (slots 103-104) +- Provides pruneGames() function to remove games from end of array +- Emits GamePruned event for each removed game +- Computes game UUIDs to properly clean up mapping storage + +**Storage Layout:** +- `uint256[103] __gap` - Reserved slots 0-102 (matches factory layout) +- `mapping(Hash => GameId) _disputeGames` - Slot 103 (game UUID to ID mapping) +- `GameId[] _disputeGameList` - Slot 104 (array of all game IDs) + +**Key Function:** +```solidity +function pruneGames(uint256 _desiredLength) external +``` +Removes games from storage until `_disputeGameList.length == _desiredLength`. + +**Event:** +```solidity +event GamePruned( + uint256 indexed index, + GameId indexed gameId, + Hash indexed gameUUID, + GameType gameType, + address gameProxy +) +``` + +**Note:** This contract is only temporarily set as the factory implementation during the pruning operation and is immediately replaced with the original implementation afterwards. + +### `ResetAnchorGame.s.sol` + +Maintenance script that directly sets the anchor game in the AnchorStateRegistry by manipulating storage slot 3. This is a critical operation that temporarily upgrades the registry to StorageSetter, updates the anchorGame storage slot, then restores the original implementation. + +**Features:** +- Temporarily upgrades AnchorStateRegistry to StorageSetter +- Directly sets storage slot 3 (anchorGame) to a new game address +- Automatically restores original registry implementation +- Verifies anchor game after restoration +- Emits AnchorGameReset event for tracking + +**Required Environment Variables:** +- `REGISTRY` - Address of the AnchorStateRegistry proxy +- `PROXY_ADMIN` - Address of the ProxyAdmin contract +- `ANCHOR_GAME` - Address of the new anchor game to set + +**Example Execution:** +```bash +REGISTRY="0x..." PROXY_ADMIN="0x..." ANCHOR_GAME="0x..." 
forge script ResetAnchorGame.s.sol --root $PATH_TO_OP_REPO/packages/contracts-bedrock --broadcast --private-key $PK --rpc-url $RPC +``` + +**Process Flow:** +1. Store original registry implementation address +2. Read current anchor game from slot 3 +3. Deploy and upgrade to StorageSetter +4. Call setAddress() to set slot 3 to new anchor game +5. Restore original registry implementation +6. Verify anchor game via registry.anchorGame() +7. Emit AnchorGameReset event + +**Storage Slot Modified:** +- **Slot 3**: `anchorGame` (IFaultDisputeGame) - The game whose claim is currently being used as the anchor state + +**Event:** +```solidity +event AnchorGameReset( + address indexed previousGame, + address indexed newGame +) +``` + +**Safety Considerations:** +- This operation modifies critical registry storage - use with extreme caution +- Requires ProxyAdmin privileges to upgrade registry +- The anchor game address is used for proving withdrawals and disputes +- Ensure ANCHOR_GAME points to a valid, finalized fault dispute game +- Setting an invalid anchor game can break the withdrawal system +- Verify the game type and status before setting +- Consider impact on in-flight withdrawals + +**Use Cases:** +- Emergency recovery when anchor state needs manual correction +- Testing anchor state behavior in non-production environments +- Fixing anchor state after disputed game resolution issues + +## Notes + +- **Critical**: All Foundry scripts must be executed from the Optimism contracts-bedrock directory using `--root` flag +- The `CloseRecentGame.s.sol` script stops after finding and closing the first eligible game +- Games must pass all validation checks before being closed +- The `PruneGamesFromStorage.s.sol` script is a destructive operation - games are permanently removed from factory storage +- Pruning games does not affect the deployed game contracts themselves, only the factory's internal tracking +- Always verify the RETENTION_INDEX value before pruning to avoid 
removing games that should be retained +- Consider the gas cost implications of pruning large numbers of games in a single transaction +- The `ResetAnchorGame.s.sol` script is a **highly sensitive operation** that directly modifies anchor state +- Only use `ResetAnchorGame.s.sol` in emergency situations or testing environments +- Setting an incorrect anchor game can break withdrawals and dispute resolution +- Both `PruneGamesFromStorage.s.sol` and `ResetAnchorGame.s.sol` use the temporary upgrade pattern for safety diff --git a/packages/op-tooling/scripts/ResetAnchorGame.s.sol b/packages/op-tooling/scripts/ResetAnchorGame.s.sol new file mode 100644 index 00000000000..b1fe6852dfa --- /dev/null +++ b/packages/op-tooling/scripts/ResetAnchorGame.s.sol @@ -0,0 +1,68 @@ +// SPDX-License-Identifier: MIT +pragma solidity 0.8.15; + +import { Script } from "forge-std/Script.sol"; +import { console } from "forge-std/console.sol"; + +import { IAnchorStateRegistry } from "interfaces/dispute/IAnchorStateRegistry.sol"; +import { IProxyAdmin } from "interfaces/universal/IProxyAdmin.sol"; +import { StorageSetter } from "src/universal/StorageSetter.sol"; + +contract ResetAnchorGame is Script { + // This script requires running with --root and the following env vars: + // REGISTRY (required) - address of the anchor state registry + // PROXY_ADMIN (required) - address of the proxy admin + // ANCHOR_GAME (required) - address of the new anchor game + + event AnchorGameReset(address indexed previousGame, address indexed newGame); + + function run() external { + IAnchorStateRegistry registry_ = IAnchorStateRegistry(vm.envAddress("REGISTRY")); + console.log("Registry present at:", address(registry_)); + + IProxyAdmin proxyAdmin_ = IProxyAdmin(vm.envAddress("PROXY_ADMIN")); + console.log("ProxyAdmin present at:", address(proxyAdmin_)); + + address newAnchorGame_ = vm.envAddress("ANCHOR_GAME"); + console.log("New anchor game:", newAnchorGame_); + + // Store registry impl + address 
registryImpl_ = proxyAdmin_.getProxyImplementation(address(registry_)); + + // Get current anchor game before upgrade (from slot 3) + bytes32 slot3Value_ = vm.load(address(registry_), bytes32(uint256(3))); + address currentAnchorGame_ = address(uint160(uint256(slot3Value_))); + console.log("Current anchor game:", currentAnchorGame_); + + // Start broadcast + vm.startBroadcast(); + + // Upgrade registry to StorageSetter + console.log("Upgrading registry to StorageSetter..."); + StorageSetter setter_ = new StorageSetter(); + proxyAdmin_.upgrade(payable(address(registry_)), address(setter_)); + console.log("Registry upgraded to StorageSetter at:", address(setter_)); + + // Set anchor game in slot 3 + console.log("Setting anchor game in slot 3..."); + StorageSetter(payable(address(registry_))).setAddress(bytes32(uint256(3)), newAnchorGame_); + console.log("Anchor game set: ", newAnchorGame_); + + // Restore registry implementation + console.log("Restoring registry implementation..."); + proxyAdmin_.upgrade(payable(address(registry_)), registryImpl_); + console.log("Registry implementation restored to:", registryImpl_); + + // Verify anchor game was set + address anchorGame_ = address(registry_.anchorGame()); + console.log("Verified anchor game:", anchorGame_); + require(anchorGame_ == newAnchorGame_, "Anchor game verification failed"); + console.log("Anchor game successfully reset!"); + + // Emit event + emit AnchorGameReset(currentAnchorGame_, newAnchorGame_); + + // Stop broadcast + vm.stopBroadcast(); + } +} diff --git a/packages/op-tooling/scripts/close-recent-game.sh b/packages/op-tooling/scripts/close-recent-game.sh new file mode 100755 index 00000000000..a5ae7767ba5 --- /dev/null +++ b/packages/op-tooling/scripts/close-recent-game.sh @@ -0,0 +1,26 @@ +#!/usr/bin/env bash + +set -euo pipefail + +check_env_var() { + local var_name=$1 + local var_value=${!var_name:-} + + if [ -z "$var_value" ]; then + echo "Error: $var_name is not set" >&2 + echo "Please export $var_name while 
running this script" + exit 1 + fi +} + +echo "Checking environment variables..." +check_env_var "L1_RPC_URL" +check_env_var "OP_DIR" +check_env_var "PK" + +echo "Closing recent fault dispute game..." +forge script CloseRecentGame.s.sol \ + --rpc-url $L1_RPC_URL \ + --root $OP_DIR/packages/contracts-bedrock \ + --private-key $PK \ + --broadcast diff --git a/packages/op-tooling/verify/README.md b/packages/op-tooling/verify/README.md new file mode 100644 index 00000000000..b9209a0214a --- /dev/null +++ b/packages/op-tooling/verify/README.md @@ -0,0 +1,140 @@ +# Contract Verification Scripts + +This directory contains scripts for verifying the smart contracts created during deployment of a new OpStack chain and during upgrade of an existing OpStack chain to a newer version. + +## Scripts + +### `verify-new-op-chain.sh` + +Verifies contracts for new Optimism chain deployments on both L1 and L2 networks. + +**Features:** +- Supports L1 (defaults to Sepolia) and L2 (defaults to Celo-Sepolia) networks +- Automatically detects proxy implementations and verifies them +- Verifies complete contract suite including OPCM, bridges, and system contracts +- Supports interop contracts when enabled + +**Required Environment Variables:** +- `NETWORK` - Target network (`l1` or `l2`) +- `RELEASE` - Contracts release tag (e.g. `celo-contracts/v3.0.0`) +- `ALCHEMY_KEY` - Alchemy API key (required for L1 networks) + +**Optional Environment Variables:** +- `BLOCKSCOUT_API_KEY` - API key for Blockscout verification +- `ETHERSCAN_API_KEY` - API key for Etherscan verification +- `TENDERLY_URL` - Tenderly verification URL +- `TENDERLY_API_KEY` - Tenderly API key + +**Configuration:** +- **Release**: set via the `RELEASE` env var (e.g. `celo-contracts/v3.0.0`) +- **Deployer**: `0x95a40aA01d2d72b4122C19c86160710D01224ada` +- **Interop Support**: Configurable via `USE_INTEROP` flag + +**Network Configuration:** + +| Network | Chain ID | RPC Endpoint | Block Explorer | +|---------|----------|--------------|----------------| +| L1 (Sepolia) | 11155111 | Alchemy | eth-sepolia.blockscout.com | +| L2 (Celo-Sepolia) | 11142220 | 
Forno | celo-sepolia.blockscout.com | + +**Example Execution:** +```bash +NETWORK="l1" RELEASE="celo-contracts/v3.0.0" ALCHEMY_KEY="your-key" BLOCKSCOUT_API_KEY="your-key" ./verify-new-op-chain.sh +``` + +### `verify-upgrade-impls.sh` + +Verifies implementation contracts for OPCM upgrades across different versions and networks. + +**Features:** +- Supports v2 and v3 upgrade verifications +- Works with Mainnet and Holesky testnet +- Verifies OPCM container and all implementation contracts +- Includes constructor argument encoding for complex deployments + +**Required Environment Variables:** +- `NETWORK` - Target network (`mainnet` or `holesky`) +- `VERSION` - Upgrade version (`v2` or `v3`) + +**Optional Environment Variables:** +- `BLOCKSCOUT_API_KEY` - API key for Blockscout verification +- `ETHERSCAN_API_KEY` - API key for Etherscan verification +- `TENDERLY_URL` - Tenderly verification URL +- `TENDERLY_API_KEY` - Tenderly API key + +**Pre-configured Contract Addresses:** + +The script includes hardcoded addresses for various deployments: +- **Baklava**: V2 and V3 configurations +- **Alfajores**: V2 and V3 configurations +- **Mainnet**: V2 and V3 configurations + +**Version-Specific Contracts:** + +#### V2 Implementation Contracts +- DelayedWETH with a 604800-second (7-day) delay +- OptimismPortal2 with maturity and finality delays +- SystemConfig, L1CrossDomainMessenger, bridges, and factories +- DisputeGameFactory, AnchorStateRegistry +- SuperchainConfig, ProtocolVersions + +#### V3 Implementation Contracts +- OPContractsManagerContractsContainer with blueprints and implementations +- OPContractsManagerGameTypeAdder, Deployer, and Upgrader +- Enhanced versions of all V2 contracts +- Additional Celo-specific contracts + +**Example Execution:** +```bash +NETWORK="mainnet" VERSION="v3" ETHERSCAN_API_KEY="your-key" ./verify-upgrade-impls.sh +``` + +### `verify-upgrade-validator.sh` + +Verifies validator contracts used in upgrade processes. 
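The script's dispatch boils down to two small lookups. A hypothetical Python model of the version and chain-id selection (the function names here are illustrative; the real script does this with shell `case`/`if` blocks):

```python
def pick_validator_contract(version: str) -> str:
    # Mirrors the script's `case $VERSION` block
    contracts = {"v2": "StandardValidatorV200", "v3": "StandardValidatorV300"}
    if version not in contracts:
        raise ValueError(f"Invalid version: {version}")
    return contracts[version]

def pick_blockscout_url(chain_id: int = 17000) -> str:
    # Holesky (17000) is the default; any other chain falls back to mainnet
    if chain_id == 17000:
        return "https://eth-holesky.blockscout.com/api/"
    return "https://eth.blockscout.com/api/"
```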
+ +**Features:** +- Supports StandardValidator V2.0.0 and V3.0.0 +- Configurable chain ID with Holesky as default +- Automatic block explorer URL selection based on chain +- Complex constructor argument encoding + +**Required Environment Variables:** +- `VERSION` - Validator version (`v2` or `v3`) +- `VALIDATOR` - Validator contract address + +**Optional Environment Variables:** +- `CHAIN_ID` - Target chain ID (defaults to 17000 for Holesky) +- `BLOCKSCOUT_API_KEY` - API key for Blockscout verification +- `ETHERSCAN_API_KEY` - API key for Etherscan verification +- `TENDERLY_URL` - Tenderly verification URL +- `TENDERLY_API_KEY` - Tenderly API key + +**Pre-configured Validators:** +- **Baklava V3**: `0x9df52e41189e89485bb7aee1e5cc93874dd89712` +- **Alfajores V3**: `0xc6bacfa8421117677e03c3eb81d44b37a9ceef31` + +**Example Execution:** +```bash +VERSION="v3" VALIDATOR="0x..." CHAIN_ID="17000" ETHERSCAN_API_KEY="your-key" ./verify-upgrade-validator.sh +``` + +## Contract Categories + +### L1 Contracts +- **OPCM Suite**: OPContractsManager and related contracts +- **Portal Contracts**: OptimismPortal2, DelayedWETH +- **Messaging**: L1CrossDomainMessenger +- **Bridges**: L1StandardBridge, L1ERC721Bridge +- **Factories**: OptimismMintableERC20Factory, DisputeGameFactory +- **System**: SystemConfig, SuperchainConfig, ProtocolVersions +- **Celo-Specific**: CeloSuperchainConfig, CeloTokenL1 + +### L2 Contracts +- **Core System**: LegacyMessagePasser, DeployerWhitelist, WETH +- **Messaging**: L2CrossDomainMessenger +- **Bridges**: L2StandardBridge, L2ERC721Bridge +- **Fee Vaults**: SequencerFeeVault, BaseFeeVault, L1FeeVault, OperatorFeeVault +- **Factories**: OptimismMintableERC20Factory, OptimismMintableERC721Factory +- **Attestation**: SchemaRegistry, EAS +- **Governance**: GovernanceToken +- **Interop** (optional): CeloL2Interop, L2ToL2CrossDomainMessenger, SuperchainWETH, ETHLiquidity, SuperchainTokenBridge diff --git a/packages/op-tooling/verify/verify-new-op-chain.sh 
b/packages/op-tooling/verify/verify-new-op-chain.sh new file mode 100644 index 00000000000..2e62341697f --- /dev/null +++ b/packages/op-tooling/verify/verify-new-op-chain.sh @@ -0,0 +1,230 @@ +#!/usr/bin/env bash + +# Require env vars +[ -z "${NETWORK:-}" ] && echo "Need to set the NETWORK via env" && exit 1; +[ -z "${RELEASE:-}" ] && echo "Need to set the RELEASE via env (example value: celo-contracts/v3.0.0)" && exit 1; + +# Config +DEPLOYER="0x95a40aA01d2d72b4122C19c86160710D01224ada" +USE_INTEROP="false" # Set to true if interop is enabled and contracts are deployed on L2 + +# Check network +case $NETWORK in + "l1") + echo "Detected network: $NETWORK (sepolia)" + [ -z "${ALCHEMY_KEY:-}" ] && echo "Need to set the ALCHEMY_KEY via env" && exit 1; + BLOCKSCOUT_URL=https://eth-sepolia.blockscout.com/api/ + CHAIN_ID=11155111 + RPC_URL=https://eth-sepolia.g.alchemy.com/v2/$ALCHEMY_KEY + ;; + "l2") + echo "Detected network: $NETWORK (celo-sepolia)" + BLOCKSCOUT_URL=https://celo-sepolia.blockscout.com/api/ + CHAIN_ID=11142220 + RPC_URL=https://forno.celo-sepolia.celo-testnet.org + ;; + *) + echo "Unsupported network: $NETWORK" && exit 1 + ;; +esac + +# L1 contracts +# OPCM="0xf25a271ba3c290c9fc094c600179c4709f23e0dd" +# OPCM_GTA="0x6d5f8c3a8a84cc32e70097aaf2f2aa5ba6955857" +# OPCM_DEPLOYER="0xc7d9db993b9b61410299073fe748f5217bb8904a" +# OPCM_UPGRADER="0x6c8fc16bf0b1665cdb9fed485250088b67f873c1" +# DW="0x082f5f58b664cd1d51f9845fee322aba2ced9cba" # proxy for DelayedWETH +# OP="0x44ae3d41a335a7d05eb533029917aad35662dcc2" # proxy for OptimismPortal +# SC="0x760a5f022c9940f4a074e0030be682f560d29818" # proxy for SystemConfig +# LCDM="0x70b0e58e6039831954ede2ea1e9ef8a51680e4fd" # proxy for L1CrossDomainMessenger +# LEB="0xb8c8dcbccd0f7c5e7a2184b13b85d461d8711e96" # proxy for L1ERC721Bridge +# LSB="0xec18a3c30131a0db4246e785355fbc16e2eaf408" # proxy for L1StandardBridge +# OMEF="0x261be2ed7241fed9c746e0b5dff3a4a335991377" # proxy for OptimismMintableERC20Factory +# 
DGF="0x57c45d82d1a995f1e135b8d7edc0a6bb5211cfaa" # proxy for DisputeGameFactory +# ASR="0xd73ba8168a61f3e917f0930d5c0401aa47e269d6" # proxy for AnchorStateRegistry +# SU="0x31bEef32135c90AE8E56Fb071B3587de289Aaf77" # proxy for SuperchainConfig +# PV="0x0e2d45F3393C3A02ebf285F998c5bF990A1541cd" # proxy for ProtocolVersions +# PA="0xf7d7a3d3bb8abb6829249b3d3ad3d525d052027e" # ProxyAdmin +# AM="0x8f0c6fc85a53551d87899ac2a5af2b48c793eb63" # AddressManager +# PDG="0x5a4feaeb665f8049ac1e0714f82a11532d47ae49" # PermissionedDisputeGame +# CSU="0x5c34140a1273372211bd75184ccc9e434b38d86b" # proxy for CeloSuperchainConfig +# CT="0x3c7011fd5e6aed460caa4985cf8d8caba435b092" # proxy for CeloToken + +# L2 contracts +# L2_LMP="0x4200000000000000000000000000000000000000" # proxy for LegacyMessagePasser +# L2_DWL="0x4200000000000000000000000000000000000002" # proxy for DeployerWhitelist +# L2_WE="0x4200000000000000000000000000000000000006" # WETH (no proxy) +# L2_LCDM="0x4200000000000000000000000000000000000007" # proxy for L2CrossDomainMessenger (L1CrossDomainMessengerProxy) +# L2_GPO="0x420000000000000000000000000000000000000f" # proxy for GasPriceOracle +# L2_LSB="0x4200000000000000000000000000000000000010" # proxy for L2StandardBridge (L1StandardBridgeProxy) +# L2_SFV="0x4200000000000000000000000000000000000011" # proxy for SequencerFeeVault +# L2_OMEF="0x4200000000000000000000000000000000000012" # proxy for OptimismMintableERC20Factory +# L2_LBN="0x4200000000000000000000000000000000000013" # proxy for L1BlockNumber +# L2_LEB="0x4200000000000000000000000000000000000014" # proxy for L2ERC721Bridge (L1ERC721Bridge) +# L2_LB="0x4200000000000000000000000000000000000015" # proxy for L1Block +# L2_LTLMP="0x4200000000000000000000000000000000000016" # proxy for L2ToL1MessagePasser +# L2_OMEF="0x4200000000000000000000000000000000000017" # proxy for OptimismMintableERC721Factory +# L2_PA="0x4200000000000000000000000000000000000018" # proxy for ProxyAdmin +# 
L2_BFV="0x4200000000000000000000000000000000000019" # proxy for BaseFeeVault +# L2_LFV="0x420000000000000000000000000000000000001A" # proxy for L1FeeVault +# L2_OFV="0x420000000000000000000000000000000000001B" # proxy for OperatorFeeVault +# L2_SR="0x4200000000000000000000000000000000000020" # proxy for SchemaRegistry +# L2_EAS="0x4200000000000000000000000000000000000021" # proxy for EAS +# L2_GT="0x4200000000000000000000000000000000000042" # GovernanceToken (no proxy) +# if [ $USE_INTEROP = "true" ]; then +# L2_CLI="0x4200000000000000000000000000000000000022" # proxy for CeloL2Interop +# L2_LLCDM="0x4200000000000000000000000000000000000023" # proxy for L2ToL2CrossDomainMessenger +# L2_SWE="0x4200000000000000000000000000000000000024" # proxy for SuperchainWETH +# L2_EL="0x4200000000000000000000000000000000000025" # proxy for ETHLiquidity +# L2_STB="0x4200000000000000000000000000000000000026" # proxy for SuperchainTokenBridge +# fi + +verify() { + CONSTRUCTOR_SIG=${3:-} + if [ "${BLOCKSCOUT_API_KEY:-}" ]; then + echo ">>> [Blockscout] $2 ($1)" + if [ -z ${CONSTRUCTOR_SIG:-} ]; then + forge verify-contract $1 $2 \ + --chain-id $CHAIN_ID \ + --etherscan-api-key=$BLOCKSCOUT_API_KEY \ + --verifier-url=$BLOCKSCOUT_URL \ + --verifier=blockscout \ + --watch + else + forge verify-contract $1 $2 \ + --chain-id $CHAIN_ID \ + --etherscan-api-key=$BLOCKSCOUT_API_KEY \ + --verifier-url=$BLOCKSCOUT_URL \ + --verifier=blockscout \ + --constructor-args $(cast abi-encode $CONSTRUCTOR_SIG ${@:4}) \ + --watch + fi + fi + if [ "${ETHERSCAN_API_KEY:-}" ]; then + echo ">>> [Etherscan] $2 ($1)" + if [ -z ${CONSTRUCTOR_SIG:-} ]; then + forge verify-contract $1 $2 \ + --chain-id $CHAIN_ID \ + --etherscan-api-key=$ETHERSCAN_API_KEY \ + --verifier=etherscan \ + --watch + else + forge verify-contract $1 $2 \ + --chain-id $CHAIN_ID \ + --etherscan-api-key=$ETHERSCAN_API_KEY \ + --constructor-args $(cast abi-encode $CONSTRUCTOR_SIG ${@:4}) \ + --verifier=etherscan \ + --watch + fi + fi + if [ 
"${TENDERLY_URL:-}" ] && [ "${TENDERLY_API_KEY:-}" ]; then + echo ">>> [Tenderly] $2 ($1)" + if [ -z ${CONSTRUCTOR_SIG:-} ]; then + forge verify-contract $1 $2 \ + --chain-id $CHAIN_ID \ + --verifier-url=$TENDERLY_URL \ + --etherscan-api-key=$TENDERLY_API_KEY \ + --watch + else + forge verify-contract $1 $2 \ + --chain-id $CHAIN_ID \ + --verifier-url=$TENDERLY_URL \ + --etherscan-api-key=$TENDERLY_API_KEY \ + --constructor-args $(cast abi-encode $CONSTRUCTOR_SIG ${@:4}) \ + --watch + fi + fi + echo "----------------------------------------" +} + +get_impl() { + IMPL_SLOT="0x360894a13ba1a3210667c828492db98dca3e2076cc3735a920a3ca505d382bbc" # keccak256("eip1967.proxy.implementation") + export IMPL_ADDRESS_B32=$(cast storage $1 $IMPL_SLOT -r $RPC_URL) + export IMPL_ADDRESS=$(cast parse-bytes32-address $IMPL_ADDRESS_B32) +} + +verify_proxy() { + get_impl $1 + echo "Proxy: $1 ImplB32: $IMPL_ADDRESS_B32 Impl: $IMPL_ADDRESS" + verify $IMPL_ADDRESS ${@:2} +} + +if [ "$NETWORK" = "l1" ]; then + echo ">>> [L1] Verifying contracts on $NETWORK" + # start verifying contracts + OPCM_CONTAINER=$(cast call $OPCM_GTA "contractsContainer()(address)" -r $RPC_URL) + echo "OPCM Container: $OPCM_CONTAINER" + BLUEPRINTS=$(cast call $OPCM_CONTAINER "blueprints()((address,address,address,address,address,address,address,address,address))" -r $RPC_URL) + BLUEPRINTS="${BLUEPRINTS// /}" # Remove spaces + echo "OPCM Blueprints: $BLUEPRINTS" + IMPLS=$(cast call $OPCM_CONTAINER "implementations()((address,address,address,address,address,address,address,address,address,address,address,address))" -r $RPC_URL) + IMPLS="${IMPLS// /}" # Remove spaces + echo "OPCM Implementations: $IMPLS" + get_impl $CSU + CSUI=$IMPL_ADDRESS + echo "CeloSuperchainConfig Impl: $CSUI" + get_impl $PV + PVI=$IMPL_ADDRESS + echo "ProtocolVersions Impl: $PVI" + verify $OPCM_CONTAINER OPContractsManagerContractsContainer 
"constructor((address,address,address,address,address,address,address,address,address),(address,address,address,address,address,address,address,address,address,address,address,address))" "$BLUEPRINTS" "$IMPLS" + verify $OPCM_GTA OPContractsManagerGameTypeAdder "constructor(address)" $OPCM_CONTAINER # TODO: unverified + verify $OPCM_DEPLOYER OPContractsManagerDeployer "constructor(address)" $OPCM_CONTAINER # TODO: unverified + verify $OPCM_UPGRADER OPContractsManagerUpgrader "constructor(address)" $OPCM_CONTAINER # TODO: unverified + verify $OPCM OPContractsManager "constructor(address,address,address,address,address,address,string,address)" $OPCM_GTA $OPCM_DEPLOYER $OPCM_UPGRADER $CSUI $PVI $PA $RELEASE $DEPLOYER # TODO: unverified + DELAY_WETH=$(cast call $DW "delay()(uint256)" -r $RPC_URL) + echo "Delayed WETH delay: $DELAY_WETH" + verify_proxy $DW DelayedWETH "constructor(uint256)" $DELAY_WETH + PROOF_MATURITY=$(cast call $OP "proofMaturityDelaySeconds()(uint256)" -r $RPC_URL) + echo "Optimism Portal proof maturity: $PROOF_MATURITY" + GAME_FINALITY=$(cast call $OP "disputeGameFinalityDelaySeconds()(uint256)" -r $RPC_URL) + echo "Optimism Portal game finality: $GAME_FINALITY" + verify_proxy $OP OptimismPortal2 "constructor(uint256,uint256)" $PROOF_MATURITY $GAME_FINALITY + verify_proxy $LCDM L1CrossDomainMessenger + verify_proxy $LEB L1ERC721Bridge + verify_proxy $LSB L1StandardBridge + verify_proxy $CSU CeloSuperchainConfig + verify_proxy $PV ProtocolVersions + verify_proxy $SU SuperchainConfig + verify_proxy $SC SystemConfig + verify_proxy $OMEF OptimismMintableERC20Factory + verify_proxy $ASR AnchorStateRegistry + verify_proxy $DGF DisputeGameFactory + verify_proxy $CT CeloTokenL1 + verify $PA ProxyAdmin + verify $AM AddressManager + verify $PDG PermissionedDisputeGame + # end verifying contracts + echo ">>> [L1] Finished verifying contracts on $NETWORK" +else + echo ">>> [L2] Verifying contracts on $NETWORK" + # start verifying contracts + verify_proxy 
$L2_LMP LegacyMessagePasser + verify_proxy $L2_DWL DeployerWhitelist + verify $L2_WE WETH + verify_proxy $L2_LCDM L2CrossDomainMessenger "constructor(address)" $LCDM + verify_proxy $L2_GPO GasPriceOracle + verify_proxy $L2_LSB L2StandardBridge "constructor(address)" $LSB + verify_proxy $L2_SFV SequencerFeeVault + verify_proxy $L2_OMEF OptimismMintableERC20Factory + verify_proxy $L2_LBN L1BlockNumber + verify_proxy $L2_LEB L2ERC721Bridge "constructor(address)" $LEB + verify_proxy $L2_LB L1Block + verify_proxy $L2_LTLMP L2ToL1MessagePasser + verify_proxy 0x4200000000000000000000000000000000000017 OptimismMintableERC721Factory # L2_OMEF is reused above for the ERC20 factory; use the ERC721 factory predeploy address directly + verify_proxy $L2_PA ProxyAdmin + verify_proxy $L2_BFV BaseFeeVault + verify_proxy $L2_LFV L1FeeVault + verify_proxy $L2_OFV OperatorFeeVault + verify_proxy $L2_SR SchemaRegistry + verify_proxy $L2_EAS EAS + verify $L2_GT GovernanceToken # not deployed? + if [ $USE_INTEROP = "true" ]; then + verify_proxy $L2_CLI CeloL2Interop + verify_proxy $L2_LLCDM L2ToL2CrossDomainMessenger + verify_proxy $L2_SWE SuperchainWETH + verify_proxy $L2_EL ETHLiquidity + verify_proxy $L2_STB SuperchainTokenBridge + fi + # end verifying contracts + echo ">>> [L2] Finished verifying contracts on $NETWORK" +fi diff --git a/packages/op-tooling/verify/verify-upgrade-impls.sh b/packages/op-tooling/verify/verify-upgrade-impls.sh new file mode 100644 index 00000000000..6dca832022e --- /dev/null +++ b/packages/op-tooling/verify/verify-upgrade-impls.sh @@ -0,0 +1,139 @@ +#!/usr/bin/env bash +set -euo pipefail + +# Mainnet: V2 +# OPCM=0x597f110a3bee7f260b1657ab63c36d86b3740f36 +# DWI=0x1e121e21e1a11ae47c0efe8a7e13ae3eb4923796 +# OPI=0xbed463769920dac19a7e2adf47b6c6bb6480bd97 +# SCI=0x911ea44d22eb903515378625da3a0e09d2e1b074 +# LCDMI=0x3d5a67747de7e09b0d71f5d782c8b45f6307b9fd +# LEBI=0x276d3730f219f7ec22274f7263180b8452b46d47 +# LSBI=0xaf38504abc62f28e419622506698c5fa3ca15eda +# OMEFI=0x5493f4677a186f64805fe7317d6993ba4863988f +# DGFI=0x4bba758f006ef09402ef31724203f316ab74e4a0 +# 
ASRI=0x7b465370bb7a333f99edd19599eb7fb1c2d3f8d2 +# SUI=0x4da82a327773965b8d4d85fa3db8249b387458e7 +# PVI=0x37e15e4d6dffa9e5e320ee1ec036922e563cb76c + +# Mainnet: V3 +# OPCM_CONTAINER=0x75a66e525fba131313ba986aa7c70f8e756d40a7 +# OPCM_GTA=0x20d62d912b6b05e350441a2e7364c9bbe35870b3 +# OPCM_DEPLOYER=0x8bf5c8c0d9b6a721fc70324a982df562bdd3ce70 +# OPCM_UPGRADER=0xe565acc3c822d5d8298d9c7213a88dddc0ee93e1 +# OPCM=0x2e8cd74af534f5eeb53f889d92fd4220546a15e7 +# DWI=0x1e121e21e1a11ae47c0efe8a7e13ae3eb4923796 +# OPI=0x215a5ff85308a72a772f09b520da71d3520e9ac7 +# SCI=0x9c61c5a8ff9408b83ac92571278550097a9d2bb5 +# LCDMI=0x807124f75ff2120b2f26d7e6f9e39c03ee9de212 +# LEBI=0x7ae1d3bd877a4c5ca257404ce26be93a02c98013 +# LSBI=0x28841965b26d41304905a836da5c0921da7dbb84 +# OMEFI=0x6a52641d87a600ba103ccdfbe3eb02ac7e73c04a +# DGFI=0x4bba758f006ef09402ef31724203f316ab74e4a0 +# ASRI=0x7b465370bb7a333f99edd19599eb7fb1c2d3f8d2 +# SUI=0x4da82a327773965b8d4d85fa3db8249b387458e7 +# PVI=0x37e15e4d6dffa9e5e320ee1ec036922e563cb76c + +# Require env vars +[ -z "${NETWORK:-}" ] && echo "Need to set the NETWORK via env" && exit 1; +[ -z "${VERSION:-}" ] && echo "Need to set the VERSION via env" && exit 1; + +# Check network +case $NETWORK in + "mainnet") + echo "Detected network: $NETWORK" + BLOCKSCOUT_URL=https://eth.blockscout.com/api/ + CHAIN_ID=1 + ;; + "holesky") + echo "Detected network: $NETWORK" + BLOCKSCOUT_URL=https://eth-holesky.blockscout.com/api/ + CHAIN_ID=17000 + ;; + *) + echo "Unsupported network: $NETWORK" && exit 1 + ;; +esac + +# Check version +case $VERSION in + "v2"|"v3") + echo "Detected supported version: $VERSION" + ;; + *) + echo "Invalid version: $VERSION" && exit 1 + ;; +esac + +verify() { + CONSTRUCTOR_SIG=${3:-} + if [ "${BLOCKSCOUT_API_KEY:-}" ]; then + echo ">>> [Blockscout] $2" + if [ -z ${CONSTRUCTOR_SIG:-} ]; then + forge verify-contract $1 $2 \ + --chain-id $CHAIN_ID \ + --etherscan-api-key=$BLOCKSCOUT_API_KEY \ + --verifier-url=$BLOCKSCOUT_URL \ + --watch + else + 
forge verify-contract $1 $2 \ + --chain-id $CHAIN_ID \ + --etherscan-api-key=$BLOCKSCOUT_API_KEY \ + --verifier-url=$BLOCKSCOUT_URL \ + --constructor-args $(cast abi-encode $CONSTRUCTOR_SIG ${@:4}) \ + --watch + fi + fi + if [ "${ETHERSCAN_API_KEY:-}" ]; then + echo ">>> [Etherscan] $2" + if [ -z ${CONSTRUCTOR_SIG:-} ]; then + forge verify-contract $1 $2 \ + --chain-id $CHAIN_ID \ + --etherscan-api-key=$ETHERSCAN_API_KEY \ + --watch + else + forge verify-contract $1 $2 \ + --chain-id $CHAIN_ID \ + --etherscan-api-key=$ETHERSCAN_API_KEY \ + --constructor-args $(cast abi-encode $CONSTRUCTOR_SIG ${@:4}) \ + --watch + fi + fi + if [ "${TENDERLY_URL:-}" ] && [ "${TENDERLY_API_KEY:-}" ]; then + echo ">>> [Tenderly] $2" + if [ -z ${CONSTRUCTOR_SIG:-} ]; then + forge verify-contract $1 $2 \ + --chain-id $CHAIN_ID \ + --verifier-url=$TENDERLY_URL \ + --etherscan-api-key=$TENDERLY_API_KEY \ + --watch + else + forge verify-contract $1 $2 \ + --chain-id $CHAIN_ID \ + --verifier-url=$TENDERLY_URL \ + --etherscan-api-key=$TENDERLY_API_KEY \ + --constructor-args $(cast abi-encode $CONSTRUCTOR_SIG ${@:4}) \ + --watch + fi + fi +} + +if [ $VERSION = "v3" ]; then + verify $OPCM_CONTAINER OPContractsManagerContractsContainer "constructor((address,address,address,address,address,address,address,address,address),(address,address,address,address,address,address,address,address,address,address,address,address))" "(0x765c6637a370595845F637739279C353484a26A6,0xA643EA8ee60D92f615eC70AF0248c449bBCEcF4d,0x2Fa0D0f6d92061344Db35132379dB419bD1c56f7,0xA5d36DEaf2267B267278a4a1458deFe0d65620eb,0x7096758bDD076a4cC42255c278F2Cb216D6D8ce3,0x2E5A428E3C65080D51e9c0d581DDa85cE8489189,0xc10A417e3A00B3e6cC70bbB998b6ad3689CeBBB9,0x011d2556c6b858f5f5Fa69f33f0Cd8D52dE0E222,0xbbcC9cdDA0B1ea8058B45FA4DC56E43BA69890e1)" 
"(0x4da82a327773965b8d4D85Fa3dB8249b387458E7,0x37E15e4d6DFFa9e5E320Ee1eC036922E563CB76C,0x7aE1d3BD877a4C5CA257404ce26BE93A02C98013,0x215A5fF85308A72A772F09B520dA71D3520e9aC7,0x9c61C5a8FF9408B83ac92571278550097A9d2BB5,0x6A52641d87a600bA103CcdfbE3EB02Ac7E73C04A,0x807124F75FF2120b2f26D7e6f9e39C03ee9DE212,0x28841965B26d41304905A836Da5C0921DA7dBB84,0x4bbA758F006Ef09402eF31724203F316ab74e4a0,0x7b465370BB7A333f99edd19599EB7Fb1c2D3F8D2,0x1e121E21E1A11Ae47C0EFE8A7E13ae3eb4923796,0xaA59A0777648BC75cd10364083e878c1cCd6112a)" + verify $OPCM_GTA OPContractsManagerGameTypeAdder "constructor(address)" $OPCM_CONTAINER + verify $OPCM_DEPLOYER OPContractsManagerDeployer "constructor(address)" $OPCM_CONTAINER + verify $OPCM_UPGRADER OPContractsManagerUpgrader "constructor(address)" $OPCM_CONTAINER + verify $OPCM OPContractsManager "constructor(address,address,address,address,address,address,string,address)" $OPCM_GTA $OPCM_DEPLOYER $OPCM_UPGRADER "0x95703e0982140D16f8ebA6d158FccEde42f04a4C" "0x1b6dEB2197418075AB314ac4D52Ca1D104a8F663" "0x783A434532Ee94667979213af1711505E8bFE374" "celo-contracts/v3.0.0" "0x4092A77bAF58fef0309452cEaCb09221e556E112" +else + verify $OPCM OPContractsManager "constructor(address,address,address,string,(address,address,address,address,address,address,address,address,address),(address,address,address,address,address,address,address,address,address,address,address,address),address)" "0x95703e0982140D16f8ebA6d158FccEde42f04a4C" "0x1b6dEB2197418075AB314ac4D52Ca1D104a8F663" "0x783A434532Ee94667979213af1711505E8bFE374" "celo-contracts/v2.0.0" "(0x765c6637a370595845F637739279C353484a26A6,0xA643EA8ee60D92f615eC70AF0248c449bBCEcF4d,0x2Fa0D0f6d92061344Db35132379dB419bD1c56f7,0xA5d36DEaf2267B267278a4a1458deFe0d65620eb,0x7096758bDD076a4cC42255c278F2Cb216D6D8ce3,0x2E5A428E3C65080D51e9c0d581DDa85cE8489189,0xc10A417e3A00B3e6cC70bbB998b6ad3689CeBBB9,0x011d2556c6b858f5f5Fa69f33f0Cd8D52dE0E222,0xbbcC9cdDA0B1ea8058B45FA4DC56E43BA69890e1)" 
"(0x4da82a327773965b8d4D85Fa3dB8249b387458E7,0x37E15e4d6DFFa9e5E320Ee1eC036922E563CB76C,0x276d3730f219f7ec22274f7263180b8452B46d47,0xBeD463769920dAc19a7E2aDf47B6C6Bb6480bD97,0x911EA44d22EB903515378625dA3a0E09D2E1B074,0x5493f4677A186f64805fe7317D6993ba4863988F,0x3d5a67747dE7E09b0d71F5d782c8b45f6307B9Fd,0xAF38504abC62F28e419622506698C5Fa3ca15EDA,0x4bbA758F006Ef09402eF31724203F316ab74e4a0,0x7b465370BB7A333f99edd19599EB7Fb1c2D3F8D2,0x1e121E21E1A11Ae47C0EFE8A7E13ae3eb4923796,0xaA59A0777648BC75cd10364083e878c1cCd6112a)" "0x4092A77bAF58fef0309452cEaCb09221e556E112" +fi +verify $DWI DelayedWETH "constructor(uint256)" 604800 +verify $OPI OptimismPortal2 "constructor(uint256,uint256)" 604800 302400 +verify $SCI SystemConfig +verify $LCDMI L1CrossDomainMessenger +verify $LEBI L1ERC721Bridge +verify $LSBI L1StandardBridge +verify $OMEFI OptimismMintableERC20Factory +verify $DGFI DisputeGameFactory +verify $ASRI AnchorStateRegistry +verify $SUI SuperchainConfig +verify $PVI ProtocolVersions diff --git a/packages/op-tooling/verify/verify-upgrade-validator.sh b/packages/op-tooling/verify/verify-upgrade-validator.sh new file mode 100644 index 00000000000..d9aaa867cd4 --- /dev/null +++ b/packages/op-tooling/verify/verify-upgrade-validator.sh @@ -0,0 +1,94 @@ +#!/usr/bin/env bash +set -euo pipefail + + +# Alfajores: V3 +# VALIDATOR=0xc6bacfa8421117677e03c3eb81d44b37a9ceef31 + +# Require env vars +[ -z "${VERSION:-}" ] && echo "Need to set the VERSION via env" && exit 1; +[ -z "${VALIDATOR:-}" ] && echo "Need to set the VALIDATOR via env" && exit 1; + +# Optional env vars +if [ -z "${CHAIN_ID:-}" ]; then + # Fallback to Holesky + CHAIN_ID=17000 +fi + +if [ "$CHAIN_ID:-" = "17000"]; then + BLOCKSCOUT_URL=https://eth-holesky.blockscout.com/api/ +else + BLOCKSCOUT_URL=https://eth.blockscout.com/api/ +fi + +# Check version +case $VERSION in + "v2") + echo "Detected supported version: $VERSION" + CONTRACT="StandardValidatorV200" + ;; + "v3") + echo "Detected supported version: $VERSION" + 
CONTRACT="StandardValidatorV300" + ;; + *) + echo "Invalid version: $VERSION" && exit 1 + ;; +esac + +verify() { + CONSTRUCTOR_SIG=${3:-} + if [ "${BLOCKSCOUT_API_KEY:-}" ]; then + echo ">>> [Blockscout] $2" + if [ -z "${CONSTRUCTOR_SIG:-}" ]; then + forge verify-contract $1 $2 \ + --chain-id $CHAIN_ID \ + --etherscan-api-key=$BLOCKSCOUT_API_KEY \ + --verifier-url=$BLOCKSCOUT_URL \ + --watch + else + forge verify-contract $1 $2 \ + --chain-id $CHAIN_ID \ + --etherscan-api-key=$BLOCKSCOUT_API_KEY \ + --verifier-url=$BLOCKSCOUT_URL \ + --constructor-args $(cast abi-encode $CONSTRUCTOR_SIG ${@:4}) \ + --watch + fi + fi + if [ "${ETHERSCAN_API_KEY:-}" ]; then + echo ">>> [Etherscan] $2" + if [ -z "${CONSTRUCTOR_SIG:-}" ]; then + forge verify-contract $1 $2 \ + --chain-id $CHAIN_ID \ + --etherscan-api-key=$ETHERSCAN_API_KEY \ + --watch + else + forge verify-contract $1 $2 \ + --chain-id $CHAIN_ID \ + --etherscan-api-key=$ETHERSCAN_API_KEY \ + --constructor-args $(cast abi-encode $CONSTRUCTOR_SIG ${@:4}) \ + --watch + fi + fi + if [ "${TENDERLY_URL:-}" ] && [ "${TENDERLY_API_KEY:-}" ]; then + echo ">>> [Tenderly] $2" + if [ -z "${CONSTRUCTOR_SIG:-}" ]; then + forge verify-contract $1 $2 \ + --chain-id $CHAIN_ID \ + --verifier-url=$TENDERLY_URL \ + --etherscan-api-key=$TENDERLY_API_KEY \ + --watch + else + forge verify-contract $1 $2 \ + --chain-id $CHAIN_ID \ + --verifier-url=$TENDERLY_URL \ + --etherscan-api-key=$TENDERLY_API_KEY \ + --constructor-args $(cast abi-encode $CONSTRUCTOR_SIG ${@:4}) \ + --watch + fi + fi +} + +verify $VALIDATOR $CONTRACT \ + "constructor((address,address,address,address,address,address,address,address,address,address),address,address,address,address)" \ + "($LEBI,$OPI,$SCI,$OMEFI,$LCDMI,$LSBI,$DGFI,$ASRI,$DWI,$MIPSI)" $SUI $LPM $MIPS $CHALLENGER diff --git a/packages/op-tooling/withdrawal/README.md b/packages/op-tooling/withdrawal/README.md new file mode 100644 index 00000000000..91d52e4123e --- /dev/null +++
b/packages/op-tooling/withdrawal/README.md @@ -0,0 +1,182 @@ +# L2 to L1 Withdrawals + +This directory contains tooling for performing L2 to L1 withdrawals of CELO from Celo's L2 networks. The workflow follows the Optimism-style withdrawal process with a 7-day challenge period (the exact challenge period is dictated by `PROOF_MATURITY_DELAY_SECONDS` in the OptimismPortal contract). + +## Supported Networks + +The tooling supports two network configurations: + +- **Sepolia**: L1 (Ethereum Sepolia) ↔ L2 (Celo Sepolia) - *Testnet* +- **Mainnet**: L1 (Ethereum Mainnet) ↔ L2 (Celo Mainnet) + +Set the `NETWORK` environment variable to specify which network to use (`sepolia` or `mainnet`). This variable is **required**. + +## Important Notes + +- **Private Keys**: Always provide private keys without `0x` prefix to all scripts +- **Values**: All VALUE parameters should be specified in wei +- **Timing**: After initiating a withdrawal, you may need to wait up to 1 hour before a proof can be generated + +## Workflow Overview + +1. **Initiate** withdrawal on L2 +2. **Build** withdrawal proof (wait up to 1 hour) +3. **Prove** withdrawal on L1 +4. **Wait** 7 days for challenge period +5. **Finalize** and claim on L1 + +## Step 1: Initiate Withdrawal on L2 + +Initiates the withdrawal process on L2 using the L2_L1_MESSAGE_PASSER contract. + +```sh +RECIPIENT=0x... VALUE=1000000000000000000 PK=123... ./initiate.sh +``` + +**Required Environment Variables:** +- `RECIPIENT`: L1 address that will receive the funds +- `VALUE`: Amount to withdraw in wei +- `PK`: Private key (without 0x prefix) of the sender + +**Optional Environment Variables:** +- `GAS_LIMIT`: Gas limit for the transaction (default: 0 - which means no gas limit) +- `DATA`: Additional data to include (default: "0x00") +- `L2_RPC_URL`: Custom L2 RPC URL + +**Output:** Transaction hash - **save this for the next step!** + +## Step 2: Build Withdrawal Proof + +Builds the withdrawal proof from L2 transaction data.
May require waiting up to 1 hour for the message to become provable. + +```sh +cd build_proof && yarn install && PK=123... TX_HASH=0x... yarn build_proof +``` + +**Required Environment Variables:** +- `PK`: Private key (without 0x prefix) of the sender +- `TX_HASH`: Transaction hash from the initiation step +- `NETWORK`: Network to use (`sepolia` or `mainnet`) + +For detailed information about proof building, see [build_proof README](./build_proof/). + +## Step 3: Prove Withdrawal on L1 + +Submits the withdrawal proof to the Optimism Portal contract on L1. + +```sh +WITHDRAWAL_NONCE=123... SENDER=0x... RECIPIENT=0x... VALUE=1000000000000000000 \ + GAME_INDEX=123... OUTPUT_ROOT_PROOF__VERSION=0x... \ + OUTPUT_ROOT_PROOF__STATE_ROOT=0x... OUTPUT_ROOT_PROOF__MESSAGE_PASSER_STORAGE_ROOT=0x... \ + OUTPUT_ROOT_PROOF__LATEST_BLOCKHASH=0x... WITHDRAWAL_PROOF="[0x...,0x...,0x...]" \ + PK=123... L1_RPC_URL=https://... ./prove.sh +``` + +**Required Environment Variables:** +- `WITHDRAWAL_NONCE`: Nonce from the withdrawal initiation +- `SENDER`: Address that initiated the withdrawal +- `RECIPIENT`: Address that will receive the funds +- `VALUE`: Amount being withdrawn in wei +- `GAME_INDEX`: Game index (output from Step 2) +- `OUTPUT_ROOT_PROOF__VERSION`: Version from proof building +- `OUTPUT_ROOT_PROOF__STATE_ROOT`: State root from proof building +- `OUTPUT_ROOT_PROOF__MESSAGE_PASSER_STORAGE_ROOT`: Storage root from proof building +- `OUTPUT_ROOT_PROOF__LATEST_BLOCKHASH`: Block hash from proof building +- `WITHDRAWAL_PROOF`: Array of proof bytes from proof building +- `PK`: Private key (without 0x prefix) for submitting the proof +- `L1_RPC_URL`: L1 RPC URL to use for the proof submission +- `NETWORK`: Network to use (`sepolia` or `mainnet`) + +**Optional Environment Variables:** +- `GAS_LIMIT`: Gas limit for the transaction (default: 0 - which means no gas limit) +- `DATA`: Additional data to include (default: "0x00") + +## Step 4: Check Withdrawal Status + +Check the 
status of a proven withdrawal. + +```sh +WITHDRAWAL_HASH=0x... PROOF_SUBMITTER=0x... L1_RPC_URL=https://... ./get.sh +``` + +**Required Environment Variables:** +- `WITHDRAWAL_HASH`: Hash of the withdrawal transaction +- `PROOF_SUBMITTER`: Address that submitted the proof +- `L1_RPC_URL`: L1 RPC URL to use for checking status +- `NETWORK`: Network to use (`sepolia` or `mainnet`) + +## Step 5: Wait 7 Days + +The withdrawal must wait 7 days (challenge period) before it can be finalized. + +## Step 6: Check Readiness for Finalization + +Check if the withdrawal is ready to be finalized and claimed. + +```sh +WITHDRAWAL_HASH=0x... PROOF_SUBMITTER=0x... L1_RPC_URL=https://... ./check.sh +``` + +**Required Environment Variables:** +- `WITHDRAWAL_HASH`: Hash of the withdrawal transaction +- `PROOF_SUBMITTER`: Address that submitted the proof +- `L1_RPC_URL`: L1 RPC URL to use for checking readiness +- `NETWORK`: Network to use (`sepolia` or `mainnet`) + +**Output:** +- Reverts with error message if withdrawal has issues +- Returns `0x` (blank output) if withdrawal is ready to claim + +## Step 7: Finalize and Claim Withdrawal + +Completes the withdrawal process and claims the funds on L1. + +```sh +WITHDRAWAL_NONCE=123... SENDER=0x... RECIPIENT=0x... VALUE=1000000000000000000 \ + PK=123... L1_RPC_URL=https://... 
./finalize.sh +``` + +**Required Environment Variables:** +- `WITHDRAWAL_NONCE`: Nonce from the withdrawal initiation +- `SENDER`: Address that initiated the withdrawal +- `RECIPIENT`: Address that will receive the funds +- `VALUE`: Amount being withdrawn in wei +- `PK`: Private key (without 0x prefix) for finalizing +- `L1_RPC_URL`: L1 RPC URL to use for finalizing the withdrawal +- `NETWORK`: Network to use (`sepolia` or `mainnet`) + +**Optional Environment Variables:** +- `GAS_LIMIT`: Gas limit for the transaction (default: 0 - which means no gas limit) +- `DATA`: Additional data to include (default: "0x00") + +## Contract Addresses + +### Network-Specific Contract Addresses + +**Sepolia (L1: Ethereum Sepolia, L2: Celo Sepolia):** +- **L2_L1_MESSAGE_PASSER**: `0x4200000000000000000000000000000000000016` (Celo Sepolia) +- **L1_OPTIMISM_PORTAL**: `0x44ae3d41a335a7d05eb533029917aad35662dcc2` (Ethereum Sepolia) + +**Mainnet (L1: Ethereum Mainnet, L2: Celo Mainnet):** +- **L2_L1_MESSAGE_PASSER**: `0x4200000000000000000000000000000000000016` (Celo Mainnet) +- **L1_OPTIMISM_PORTAL**: `0xc5c5D157928BDBD2ACf6d0777626b6C75a9EAEDC` (Ethereum Mainnet) + +## Troubleshooting + +- **Proof not available**: Wait up to 1 hour after initiation +- **RPC errors**: Ensure your L1_RPC_URL is valid and accessible +- **Gas issues**: Adjust GAS_LIMIT environment variable +- **Private key format**: Ensure PK is provided without 0x prefix +- **Value format**: Ensure VALUE is in wei (not ETH) +- **Network errors**: Ensure NETWORK is set to one of: `sepolia`, `mainnet` +- **Unsupported network**: Check that you're using a supported network configuration + +## Example Withdrawal Proof + +```sh 
+WITHDRAWAL_PROOF="[0xf8918080808080a0231eba9c2bc1784b944714d5260873e3f92b58434c1879123d58f995b342865180a0b3b0303113429f394c506a530c83a8fdbd3125d95b2310b05191cd2dbc978aa8808080a0236e8f61ecde6abfebc6c529441f782f62469d8a2cc47b7aace2c136bd3b1ff080a06babe3fe3879f4972e397c7e516ceb2699945beb318afa0ddee8e7381796f5ff808080,0xf8518080808080a0ea006b1384a4bf0219939e5483e6e82c22d13290d5055e2042541adfb1b47ec380808080a05aa8408d8bac30771c33c39b02167ad094fff70f16e4aa667623d999d04725c9808080808080,0xe2a02005084db35fe36c140bc6d2bc4d520dafa807b5e774c7276c91658a496f59cc01]" +``` + +## Related Documentation + +For the reverse operation (depositing from L1 to L2), see the [deposit README](../deposit/README.md). diff --git a/packages/op-tooling/withdrawal/build_proof/README.md b/packages/op-tooling/withdrawal/build_proof/README.md new file mode 100644 index 00000000000..634998d7724 --- /dev/null +++ b/packages/op-tooling/withdrawal/build_proof/README.md @@ -0,0 +1,144 @@ +# Building Withdrawal Proofs + +This package builds cryptographic proofs required for L2 to L1 withdrawals on Celo's L2 networks. It uses [Viem](https://viem.sh/) to interact with both L1 and L2 networks and generates the necessary proof data for withdrawal finalization. + +## Supported Networks + +The package supports two network configurations: + +- **Sepolia**: L1 (Ethereum Sepolia) ↔ L2 (Celo Sepolia) - *Testnet* +- **Mainnet**: L1 (Ethereum Mainnet) ↔ L2 (Celo Mainnet) + +Set the `NETWORK` environment variable to specify which network to use (`sepolia` or `mainnet`). This variable is **required**. + +## Overview + +When withdrawing assets from Celo L2 to L1, a cryptographic proof must be generated to verify the withdrawal transaction on L1. This package automates the proof building process by: + +1. Fetching withdrawal transaction data from L2 +2. Waiting for the withdrawal to become provable (up to 1 hour) +3. Building the cryptographic proof using Viem's Optimism stack utilities +4. 
Outputting proof parameters for submission to L1 contracts + +## Installation + +Install dependencies using Yarn: + +```sh +yarn install +``` + +## Environment Variables + +The script requires the following environment variables: + +**Required:** +- `PK`: Private key of the account that initiated the withdrawal (without 0x prefix) +- `TX_HASH`: Transaction hash of the withdrawal initiation on L2 (with 0x prefix) +- `NETWORK`: Network to use (`sepolia` or `mainnet`) + + +## Usage + +To build a proof for a withdrawal, run: + +```sh +# For Sepolia (testnet) +NETWORK=sepolia PK=1234567890abcdef... TX_HASH=0x1234567890abcdef... yarn build_proof + +# For Mainnet +NETWORK=mainnet PK=1234567890abcdef... TX_HASH=0x1234567890abcdef... yarn build_proof +``` + +## Waiting Period + +**Important**: After initiating a withdrawal on L2, you must wait up to **1 hour** before the proof can be built. The script will automatically wait for the withdrawal to become provable. + +**Status Messages:** +- `waiting-to-prove`: Withdrawal is not yet ready, wait and retry +- `ready-to-prove`: Withdrawal is ready, proof will be built + +## Output Format + +The script outputs several pieces of information. **Save all data from "Prove Args"** for the next step in the withdrawal process: + +```sh +Receipt: { + transactionHash: "0x...", + blockNumber: 123456, + ... +} +Status: ready-to-prove +Output: { + outputRoot: "0x...", + timestamp: 1234567890, + l2BlockNumber: 123456, + ... +} +Withdrawal: { + nonce: 123, + sender: "0x...", + target: "0x...", + value: 1000000000000000000n, + gasLimit: 0n, + data: "0x", + withdrawalHash: "0x..." +} +Prove Args: { + l2OutputIndex: 123, + outputRootProof: { + latestBlockhash: "0x...", + messagePasserStorageRoot: "0x...", + stateRoot: "0x...", + version: "0x..." + }, + withdrawalProof: [ + "0x...", + "0x...", + "0x..." 
+ ], + withdrawal: { + nonce: 123, + sender: "0x...", + target: "0x...", + value: 1000000000000000000n, + gasLimit: 0n, + data: "0x", + withdrawalHash: "0x..." + } +} +``` + +## Proof Data for Next Step + +The "Prove Args" output contains all the data needed for the `prove.sh` script: + +- `l2OutputIndex` → `GAME_INDEX` +- `outputRootProof.version` → `OUTPUT_ROOT_PROOF__VERSION` +- `outputRootProof.stateRoot` → `OUTPUT_ROOT_PROOF__STATE_ROOT` +- `outputRootProof.messagePasserStorageRoot` → `OUTPUT_ROOT_PROOF__MESSAGE_PASSER_STORAGE_ROOT` +- `outputRootProof.latestBlockhash` → `OUTPUT_ROOT_PROOF__LATEST_BLOCKHASH` +- `withdrawalProof` → `WITHDRAWAL_PROOF` +- `withdrawal.nonce` → `WITHDRAWAL_NONCE` +- `withdrawal.sender` → `SENDER` +- `withdrawal.target` → `RECIPIENT` +- `withdrawal.value` → `VALUE` + +## Contract Addresses + +### Network-Specific Contract Addresses + +**Sepolia (L1: Ethereum Sepolia, L2: Celo Sepolia):** +- **L2L1MessagePasser**: `0x4200000000000000000000000000000000000016` (Celo Sepolia) +- **Portal Contract**: `0x44ae3d41a335a7d05eb533029917aad35662dcc2` (Ethereum Sepolia) + +**Mainnet (L1: Ethereum Mainnet, L2: Celo Mainnet):** +- **L2L1MessagePasser**: `0x4200000000000000000000000000000000000016` (Celo Mainnet) +- **Portal Contract**: `0xc5c5D157928BDBD2ACf6d0777626b6C75a9EAEDC` (Ethereum Mainnet) + +## Troubleshooting + +- **"waiting-to-prove" status**: Wait up to 1 hour after withdrawal initiation +- **Private key format**: Ensure PK is provided without 0x prefix +- **Network errors**: Ensure NETWORK is set to one of: `sepolia`, `mainnet` +- **Unsupported network**: Check that you're using a supported network configuration diff --git a/packages/op-tooling/withdrawal/build_proof/build_proof.ts b/packages/op-tooling/withdrawal/build_proof/build_proof.ts new file mode 100644 index 00000000000..ab17a92ccff --- /dev/null +++ b/packages/op-tooling/withdrawal/build_proof/build_proof.ts @@ -0,0 +1,34 @@ +import { publicClientL1, publicClientL2, 
walletClientL2 } from './config.ts' +import { WITHDRAWAL_TX_HASH } from './env.ts' + +const withdrawalTxHash = WITHDRAWAL_TX_HASH +if (typeof withdrawalTxHash !== 'string' || withdrawalTxHash.startsWith('0x') === false) { + throw new Error('TX_HASH must be a string starting with 0x') +} + +// Wait for the withdrawal initiation transaction receipt. +const receipt = await publicClientL2.waitForTransactionReceipt({ + hash: withdrawalTxHash as `0x${string}`, +}) +console.log('Receipt:', receipt) + +const status = await publicClientL1.getWithdrawalStatus({ + receipt, + targetChain: walletClientL2.chain, +}) +console.log('Status:', status) + +// Wait until the withdrawal is ready to prove. +const { output, withdrawal } = await publicClientL1.waitToProve({ + receipt, + targetChain: walletClientL2.chain, +}) +console.log('Output:', output) +console.log('Withdrawal:', withdrawal) + +// Build parameters to prove the withdrawal on the L1. +const proveArgs = await publicClientL2.buildProveWithdrawal({ + output, + withdrawal, +}) +console.log('Prove Args:', proveArgs) diff --git a/packages/op-tooling/withdrawal/build_proof/celo.ts b/packages/op-tooling/withdrawal/build_proof/celo.ts new file mode 100644 index 00000000000..bf30eefc4e0 --- /dev/null +++ b/packages/op-tooling/withdrawal/build_proof/celo.ts @@ -0,0 +1,49 @@ +import { chainConfig } from 'viem/op-stack' +import { defineChain } from 'viem/utils' + +const sourceId = 1 // mainnet + +export const celo = /*#__PURE__*/ defineChain({ + ...chainConfig, + id: 42_220, + name: 'Celo', + nativeCurrency: { + decimals: 18, + name: 'CELO', + symbol: 'CELO', + }, + rpcUrls: { + default: { http: ['https://forno.celo.org'] }, + }, + blockExplorers: { + default: { + name: 'Celo Explorer', + url: 'https://celoscan.io', + apiUrl: 'https://api.celoscan.io/api', + }, + }, + contracts: { + ...chainConfig.contracts, + multicall3: { + address: '0xcA11bde05977b3631167028862bE2a173976CA11', + blockCreated: 13112599, + }, + portal: {
[sourceId]: { + address: '0xc5c5D157928BDBD2ACf6d0777626b6C75a9EAEDC', + }, + }, + disputeGameFactory: { + [sourceId]: { + address: '0xFbAC162162f4009Bb007C6DeBC36B1dAC10aF683', + }, + }, + l1StandardBridge: { + [sourceId]: { + address: '0x9C4955b92F34148dbcfDCD82e9c9eCe5CF2badfe', + }, + }, + }, + sourceId, + testnet: false, +}) diff --git a/packages/op-tooling/withdrawal/build_proof/config.ts b/packages/op-tooling/withdrawal/build_proof/config.ts new file mode 100644 index 00000000000..f43742b9765 --- /dev/null +++ b/packages/op-tooling/withdrawal/build_proof/config.ts @@ -0,0 +1,72 @@ +import { privateKeyToAccount } from 'viem/accounts' +import { createPublicClient, createWalletClient, http } from 'viem' +import { sepolia, mainnet, celoSepolia } from 'viem/chains' +import { publicActionsL1, publicActionsL2, walletActionsL1, walletActionsL2 } from 'viem/op-stack' +import { PRIVATE_KEY, NETWORK } from './env.ts' +import { celo } from './celo.ts' + +export const account = privateKeyToAccount(`0x${PRIVATE_KEY}`) + +let l1: { public: any; wallet: any } = { + public: undefined, + wallet: undefined, +} +if (NETWORK === 'sepolia') { + l1.public = createPublicClient({ + chain: sepolia, + transport: http(), + }).extend(publicActionsL1()) + + l1.wallet = createWalletClient({ + account, + chain: sepolia, + transport: http(), + }).extend(walletActionsL1()) +} else if (NETWORK === 'mainnet') { + l1.public = createPublicClient({ + chain: mainnet, + transport: http(), + }).extend(publicActionsL1()) + + l1.wallet = createWalletClient({ + account, + chain: mainnet, + transport: http(), + }).extend(walletActionsL1()) +} else { + throw new Error('Unsupported network. 
Supported networks are: sepolia, mainnet') +} +export const publicClientL1 = l1.public +export const walletClientL1 = l1.wallet + +let l2: { public: any; wallet: any } = { + public: undefined, + wallet: undefined, +} +if (NETWORK === 'sepolia') { + l2.public = createPublicClient({ + chain: celoSepolia, + transport: http(), + }).extend(publicActionsL2()) + + l2.wallet = createWalletClient({ + account, + chain: celoSepolia, + transport: http(), + }).extend(walletActionsL2()) +} else if (NETWORK === 'mainnet') { + l2.public = createPublicClient({ + chain: celo, + transport: http(), + }).extend(publicActionsL2()) + + l2.wallet = createWalletClient({ + account, + chain: celo, + transport: http(), + }).extend(walletActionsL2()) +} else { + throw new Error('Unsupported network. Supported networks are: sepolia, mainnet') +} +export const publicClientL2 = l2.public +export const walletClientL2 = l2.wallet diff --git a/packages/op-tooling/withdrawal/build_proof/env.ts b/packages/op-tooling/withdrawal/build_proof/env.ts new file mode 100644 index 00000000000..97ee34bbb67 --- /dev/null +++ b/packages/op-tooling/withdrawal/build_proof/env.ts @@ -0,0 +1,6 @@ +import dotenv from 'dotenv' +dotenv.config() + +export const PRIVATE_KEY = process.env.PK +export const WITHDRAWAL_TX_HASH = process.env.TX_HASH +export const NETWORK = process.env.NETWORK diff --git a/packages/op-tooling/withdrawal/build_proof/package.json b/packages/op-tooling/withdrawal/build_proof/package.json new file mode 100644 index 00000000000..07d980319a4 --- /dev/null +++ b/packages/op-tooling/withdrawal/build_proof/package.json @@ -0,0 +1,18 @@ +{ + "name": "build-withdrawal-proof", + "type": "module", + "version": "1.0.0", + "main": "build_proof.ts", + "license": "MIT", + "dependencies": { + "@types/node": "^24.1.0", + "dotenv": "^17.2.1", + "ts-node": "^10.9.2", + "tsx": "^4.7.0", + "typescript": "^5.8.3", + "viem": "^2.33.1" + }, + "scripts": { + "build_proof": "tsx build_proof.ts" + } +} diff --git 
a/packages/op-tooling/withdrawal/build_proof/tsconfig.json b/packages/op-tooling/withdrawal/build_proof/tsconfig.json new file mode 100644 index 00000000000..860287f8206 --- /dev/null +++ b/packages/op-tooling/withdrawal/build_proof/tsconfig.json @@ -0,0 +1,17 @@ +{ + "compilerOptions": { + "allowImportingTsExtensions": true, + "noEmit": true, + "module": "es2022", + "target": "es2017", + "moduleResolution": "node", + "esModuleInterop": true, + "allowSyntheticDefaultImports": true, + "strict": true, + "skipLibCheck": true, + "forceConsistentCasingInFileNames": true + }, + "ts-node": { + "esm": true + } +} diff --git a/packages/op-tooling/withdrawal/build_proof/yarn.lock b/packages/op-tooling/withdrawal/build_proof/yarn.lock new file mode 100644 index 00000000000..7b00fcf4dd5 --- /dev/null +++ b/packages/op-tooling/withdrawal/build_proof/yarn.lock @@ -0,0 +1,419 @@ +# THIS IS AN AUTOGENERATED FILE. DO NOT EDIT THIS FILE DIRECTLY. +# yarn lockfile v1 + + +"@adraffy/ens-normalize@^1.11.0": + version "1.11.0" + resolved "https://registry.yarnpkg.com/@adraffy/ens-normalize/-/ens-normalize-1.11.0.tgz#42cc67c5baa407ac25059fcd7d405cc5ecdb0c33" + integrity sha512-/3DDPKHqqIqxUULp8yP4zODUY1i+2xvVWsv8A79xGWdCAG+8sb0hRh0Rk2QyOJUnnbyPUAZYcpBuRe3nS2OIUg== + +"@cspotcode/source-map-support@^0.8.0": + version "0.8.1" + resolved "https://registry.yarnpkg.com/@cspotcode/source-map-support/-/source-map-support-0.8.1.tgz#00629c35a688e05a88b1cda684fb9d5e73f000a1" + integrity sha512-IchNf6dN4tHoMFIn/7OE8LWZ19Y6q/67Bmf6vnGREv8RSbBVb9LPJxEcnwrcwX6ixSvaiGoomAUvu4YSxXrVgw== + dependencies: + "@jridgewell/trace-mapping" "0.3.9" + +"@esbuild/aix-ppc64@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/aix-ppc64/-/aix-ppc64-0.25.11.tgz#2ae33300598132cc4cf580dbbb28d30fed3c5c49" + integrity sha512-Xt1dOL13m8u0WE8iplx9Ibbm+hFAO0GsU2P34UNoDGvZYkY8ifSiy6Zuc1lYxfG7svWE2fzqCUmFp5HCn51gJg== + +"@esbuild/android-arm64@0.25.11": + version "0.25.11" + resolved 
"https://registry.yarnpkg.com/@esbuild/android-arm64/-/android-arm64-0.25.11.tgz#927708b3db5d739d6cb7709136924cc81bec9b03" + integrity sha512-9slpyFBc4FPPz48+f6jyiXOx/Y4v34TUeDDXJpZqAWQn/08lKGeD8aDp9TMn9jDz2CiEuHwfhRmGBvpnd/PWIQ== + +"@esbuild/android-arm@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/android-arm/-/android-arm-0.25.11.tgz#571f94e7f4068957ec4c2cfb907deae3d01b55ae" + integrity sha512-uoa7dU+Dt3HYsethkJ1k6Z9YdcHjTrSb5NUy66ZfZaSV8hEYGD5ZHbEMXnqLFlbBflLsl89Zke7CAdDJ4JI+Gg== + +"@esbuild/android-x64@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/android-x64/-/android-x64-0.25.11.tgz#8a3bf5cae6c560c7ececa3150b2bde76e0fb81e6" + integrity sha512-Sgiab4xBjPU1QoPEIqS3Xx+R2lezu0LKIEcYe6pftr56PqPygbB7+szVnzoShbx64MUupqoE0KyRlN7gezbl8g== + +"@esbuild/darwin-arm64@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/darwin-arm64/-/darwin-arm64-0.25.11.tgz#0a678c4ac4bf8717e67481e1a797e6c152f93c84" + integrity sha512-VekY0PBCukppoQrycFxUqkCojnTQhdec0vevUL/EDOCnXd9LKWqD/bHwMPzigIJXPhC59Vd1WFIL57SKs2mg4w== + +"@esbuild/darwin-x64@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/darwin-x64/-/darwin-x64-0.25.11.tgz#70f5e925a30c8309f1294d407a5e5e002e0315fe" + integrity sha512-+hfp3yfBalNEpTGp9loYgbknjR695HkqtY3d3/JjSRUyPg/xd6q+mQqIb5qdywnDxRZykIHs3axEqU6l1+oWEQ== + +"@esbuild/freebsd-arm64@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/freebsd-arm64/-/freebsd-arm64-0.25.11.tgz#4ec1db687c5b2b78b44148025da9632397553e8a" + integrity sha512-CmKjrnayyTJF2eVuO//uSjl/K3KsMIeYeyN7FyDBjsR3lnSJHaXlVoAK8DZa7lXWChbuOk7NjAc7ygAwrnPBhA== + +"@esbuild/freebsd-x64@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/freebsd-x64/-/freebsd-x64-0.25.11.tgz#4c81abd1b142f1e9acfef8c5153d438ca53f44bb" + integrity sha512-Dyq+5oscTJvMaYPvW3x3FLpi2+gSZTCE/1ffdwuM6G1ARang/mb3jvjxs0mw6n3Lsw84ocfo9CrNMqc5lTfGOw== + 
+"@esbuild/linux-arm64@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/linux-arm64/-/linux-arm64-0.25.11.tgz#69517a111acfc2b93aa0fb5eaeb834c0202ccda5" + integrity sha512-Qr8AzcplUhGvdyUF08A1kHU3Vr2O88xxP0Tm8GcdVOUm25XYcMPp2YqSVHbLuXzYQMf9Bh/iKx7YPqECs6ffLA== + +"@esbuild/linux-arm@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/linux-arm/-/linux-arm-0.25.11.tgz#58dac26eae2dba0fac5405052b9002dac088d38f" + integrity sha512-TBMv6B4kCfrGJ8cUPo7vd6NECZH/8hPpBHHlYI3qzoYFvWu2AdTvZNuU/7hsbKWqu/COU7NIK12dHAAqBLLXgw== + +"@esbuild/linux-ia32@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/linux-ia32/-/linux-ia32-0.25.11.tgz#b89d4efe9bdad46ba944f0f3b8ddd40834268c2b" + integrity sha512-TmnJg8BMGPehs5JKrCLqyWTVAvielc615jbkOirATQvWWB1NMXY77oLMzsUjRLa0+ngecEmDGqt5jiDC6bfvOw== + +"@esbuild/linux-loong64@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/linux-loong64/-/linux-loong64-0.25.11.tgz#11f603cb60ad14392c3f5c94d64b3cc8b630fbeb" + integrity sha512-DIGXL2+gvDaXlaq8xruNXUJdT5tF+SBbJQKbWy/0J7OhU8gOHOzKmGIlfTTl6nHaCOoipxQbuJi7O++ldrxgMw== + +"@esbuild/linux-mips64el@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/linux-mips64el/-/linux-mips64el-0.25.11.tgz#b7d447ff0676b8ab247d69dac40a5cf08e5eeaf5" + integrity sha512-Osx1nALUJu4pU43o9OyjSCXokFkFbyzjXb6VhGIJZQ5JZi8ylCQ9/LFagolPsHtgw6himDSyb5ETSfmp4rpiKQ== + +"@esbuild/linux-ppc64@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/linux-ppc64/-/linux-ppc64-0.25.11.tgz#b3a28ed7cc252a61b07ff7c8fd8a984ffd3a2f74" + integrity sha512-nbLFgsQQEsBa8XSgSTSlrnBSrpoWh7ioFDUmwo158gIm5NNP+17IYmNWzaIzWmgCxq56vfr34xGkOcZ7jX6CPw== + +"@esbuild/linux-riscv64@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/linux-riscv64/-/linux-riscv64-0.25.11.tgz#ce75b08f7d871a75edcf4d2125f50b21dc9dc273" + integrity 
sha512-HfyAmqZi9uBAbgKYP1yGuI7tSREXwIb438q0nqvlpxAOs3XnZ8RsisRfmVsgV486NdjD7Mw2UrFSw51lzUk1ww== + +"@esbuild/linux-s390x@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/linux-s390x/-/linux-s390x-0.25.11.tgz#cd08f6c73b6b6ff9ccdaabbd3ff6ad3dca99c263" + integrity sha512-HjLqVgSSYnVXRisyfmzsH6mXqyvj0SA7pG5g+9W7ESgwA70AXYNpfKBqh1KbTxmQVaYxpzA/SvlB9oclGPbApw== + +"@esbuild/linux-x64@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/linux-x64/-/linux-x64-0.25.11.tgz#3c3718af31a95d8946ebd3c32bb1e699bdf74910" + integrity sha512-HSFAT4+WYjIhrHxKBwGmOOSpphjYkcswF449j6EjsjbinTZbp8PJtjsVK1XFJStdzXdy/jaddAep2FGY+wyFAQ== + +"@esbuild/netbsd-arm64@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/netbsd-arm64/-/netbsd-arm64-0.25.11.tgz#b4c767082401e3a4e8595fe53c47cd7f097c8077" + integrity sha512-hr9Oxj1Fa4r04dNpWr3P8QKVVsjQhqrMSUzZzf+LZcYjZNqhA3IAfPQdEh1FLVUJSiu6sgAwp3OmwBfbFgG2Xg== + +"@esbuild/netbsd-x64@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/netbsd-x64/-/netbsd-x64-0.25.11.tgz#f2a930458ed2941d1f11ebc34b9c7d61f7a4d034" + integrity sha512-u7tKA+qbzBydyj0vgpu+5h5AeudxOAGncb8N6C9Kh1N4n7wU1Xw1JDApsRjpShRpXRQlJLb9wY28ELpwdPcZ7A== + +"@esbuild/openbsd-arm64@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/openbsd-arm64/-/openbsd-arm64-0.25.11.tgz#b4ae93c75aec48bc1e8a0154957a05f0641f2dad" + integrity sha512-Qq6YHhayieor3DxFOoYM1q0q1uMFYb7cSpLD2qzDSvK1NAvqFi8Xgivv0cFC6J+hWVw2teCYltyy9/m/14ryHg== + +"@esbuild/openbsd-x64@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/openbsd-x64/-/openbsd-x64-0.25.11.tgz#b42863959c8dcf9b01581522e40012d2c70045e2" + integrity sha512-CN+7c++kkbrckTOz5hrehxWN7uIhFFlmS/hqziSFVWpAzpWrQoAG4chH+nN3Be+Kzv/uuo7zhX716x3Sn2Jduw== + +"@esbuild/openharmony-arm64@0.25.11": + version "0.25.11" + resolved 
"https://registry.yarnpkg.com/@esbuild/openharmony-arm64/-/openharmony-arm64-0.25.11.tgz#b2e717141c8fdf6bddd4010f0912e6b39e1640f1" + integrity sha512-rOREuNIQgaiR+9QuNkbkxubbp8MSO9rONmwP5nKncnWJ9v5jQ4JxFnLu4zDSRPf3x4u+2VN4pM4RdyIzDty/wQ== + +"@esbuild/sunos-x64@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/sunos-x64/-/sunos-x64-0.25.11.tgz#9fbea1febe8778927804828883ec0f6dd80eb244" + integrity sha512-nq2xdYaWxyg9DcIyXkZhcYulC6pQ2FuCgem3LI92IwMgIZ69KHeY8T4Y88pcwoLIjbed8n36CyKoYRDygNSGhA== + +"@esbuild/win32-arm64@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/win32-arm64/-/win32-arm64-0.25.11.tgz#501539cedb24468336073383989a7323005a8935" + integrity sha512-3XxECOWJq1qMZ3MN8srCJ/QfoLpL+VaxD/WfNRm1O3B4+AZ/BnLVgFbUV3eiRYDMXetciH16dwPbbHqwe1uU0Q== + +"@esbuild/win32-ia32@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/win32-ia32/-/win32-ia32-0.25.11.tgz#8ac7229aa82cef8f16ffb58f1176a973a7a15343" + integrity sha512-3ukss6gb9XZ8TlRyJlgLn17ecsK4NSQTmdIXRASVsiS2sQ6zPPZklNJT5GR5tE/MUarymmy8kCEf5xPCNCqVOA== + +"@esbuild/win32-x64@0.25.11": + version "0.25.11" + resolved "https://registry.yarnpkg.com/@esbuild/win32-x64/-/win32-x64-0.25.11.tgz#5ecda6f3fe138b7e456f4e429edde33c823f392f" + integrity sha512-D7Hpz6A2L4hzsRpPaCYkQnGOotdUpDzSGRIv9I+1ITdHROSFUWW95ZPZWQmGka1Fg7W3zFJowyn9WGwMJ0+KPA== + +"@jridgewell/resolve-uri@^3.0.3": + version "3.1.2" + resolved "https://registry.yarnpkg.com/@jridgewell/resolve-uri/-/resolve-uri-3.1.2.tgz#7a0ee601f60f99a20c7c7c5ff0c80388c1189bd6" + integrity sha512-bRISgCIjP20/tbWSPWMEi54QVPRZExkuD9lJL+UIxUKtwVJA8wW1Trb1jMs1RFXo1CBTNZ/5hpC9QvmKWdopKw== + +"@jridgewell/sourcemap-codec@^1.4.10": + version "1.5.4" + resolved "https://registry.yarnpkg.com/@jridgewell/sourcemap-codec/-/sourcemap-codec-1.5.4.tgz#7358043433b2e5da569aa02cbc4c121da3af27d7" + integrity sha512-VT2+G1VQs/9oz078bLrYbecdZKs912zQlkelYpuf+SXF+QvZDYJlbx/LSx+meSAwdDFnF8FVXW92AVjjkVmgFw== + 
+"@jridgewell/trace-mapping@0.3.9": + version "0.3.9" + resolved "https://registry.yarnpkg.com/@jridgewell/trace-mapping/-/trace-mapping-0.3.9.tgz#6534fd5933a53ba7cbf3a17615e273a0d1273ff9" + integrity sha512-3Belt6tdc8bPgAtbcmdtNJlirVoTmEb5e2gC94PnkwEW9jI6CAHUeoG85tjWP5WquqfavoMtMwiG4P926ZKKuQ== + dependencies: + "@jridgewell/resolve-uri" "^3.0.3" + "@jridgewell/sourcemap-codec" "^1.4.10" + +"@noble/ciphers@^1.3.0": + version "1.3.0" + resolved "https://registry.yarnpkg.com/@noble/ciphers/-/ciphers-1.3.0.tgz#f64b8ff886c240e644e5573c097f86e5b43676dc" + integrity sha512-2I0gnIVPtfnMw9ee9h1dJG7tp81+8Ob3OJb3Mv37rx5L40/b0i7djjCVvGOVqc9AEIQyvyu1i6ypKdFw8R8gQw== + +"@noble/curves@1.9.2": + version "1.9.2" + resolved "https://registry.yarnpkg.com/@noble/curves/-/curves-1.9.2.tgz#73388356ce733922396214a933ff7c95afcef911" + integrity sha512-HxngEd2XUcg9xi20JkwlLCtYwfoFw4JGkuZpT+WlsPD4gB/cxkvTD8fSsoAnphGZhFdZYKeQIPCuFlWPm1uE0g== + dependencies: + "@noble/hashes" "1.8.0" + +"@noble/curves@^1.9.1", "@noble/curves@~1.9.0": + version "1.9.6" + resolved "https://registry.yarnpkg.com/@noble/curves/-/curves-1.9.6.tgz#b45ebedca85bb75782f6be7e7f120f0c423c99e0" + integrity sha512-GIKz/j99FRthB8icyJQA51E8Uk5hXmdyThjgQXRKiv9h0zeRlzSCLIzFw6K1LotZ3XuB7yzlf76qk7uBmTdFqA== + dependencies: + "@noble/hashes" "1.8.0" + +"@noble/hashes@1.8.0", "@noble/hashes@^1.8.0", "@noble/hashes@~1.8.0": + version "1.8.0" + resolved "https://registry.yarnpkg.com/@noble/hashes/-/hashes-1.8.0.tgz#cee43d801fcef9644b11b8194857695acd5f815a" + integrity sha512-jCs9ldd7NwzpgXDIf6P3+NrHh9/sD6CQdxHyjQI+h/6rDNo88ypBxxz45UDuZHz9r3tNz7N/VInSVoVdtXEI4A== + +"@scure/base@~1.2.5": + version "1.2.6" + resolved "https://registry.yarnpkg.com/@scure/base/-/base-1.2.6.tgz#ca917184b8231394dd8847509c67a0be522e59f6" + integrity sha512-g/nm5FgUa//MCj1gV09zTJTaM6KBAHqLN907YVQqf7zC49+DcO4B1so4ZX07Ef10Twr6nuqYEH9GEggFXA4Fmg== + +"@scure/bip32@1.7.0", "@scure/bip32@^1.7.0": + version "1.7.0" + resolved 
"https://registry.yarnpkg.com/@scure/bip32/-/bip32-1.7.0.tgz#b8683bab172369f988f1589640e53c4606984219" + integrity sha512-E4FFX/N3f4B80AKWp5dP6ow+flD1LQZo/w8UnLGYZO674jS6YnYeepycOOksv+vLPSpgN35wgKgy+ybfTb2SMw== + dependencies: + "@noble/curves" "~1.9.0" + "@noble/hashes" "~1.8.0" + "@scure/base" "~1.2.5" + +"@scure/bip39@1.6.0", "@scure/bip39@^1.6.0": + version "1.6.0" + resolved "https://registry.yarnpkg.com/@scure/bip39/-/bip39-1.6.0.tgz#475970ace440d7be87a6086cbee77cb8f1a684f9" + integrity sha512-+lF0BbLiJNwVlev4eKelw1WWLaiKXw7sSl8T6FvBlWkdX+94aGJ4o8XjUdlyhTCjd8c+B3KT3JfS8P0bLRNU6A== + dependencies: + "@noble/hashes" "~1.8.0" + "@scure/base" "~1.2.5" + +"@tsconfig/node10@^1.0.7": + version "1.0.11" + resolved "https://registry.yarnpkg.com/@tsconfig/node10/-/node10-1.0.11.tgz#6ee46400685f130e278128c7b38b7e031ff5b2f2" + integrity sha512-DcRjDCujK/kCk/cUe8Xz8ZSpm8mS3mNNpta+jGCA6USEDfktlNvm1+IuZ9eTcDbNk41BHwpHHeW+N1lKCz4zOw== + +"@tsconfig/node12@^1.0.7": + version "1.0.11" + resolved "https://registry.yarnpkg.com/@tsconfig/node12/-/node12-1.0.11.tgz#ee3def1f27d9ed66dac6e46a295cffb0152e058d" + integrity sha512-cqefuRsh12pWyGsIoBKJA9luFu3mRxCA+ORZvA4ktLSzIuCUtWVxGIuXigEwO5/ywWFMZ2QEGKWvkZG1zDMTag== + +"@tsconfig/node14@^1.0.0": + version "1.0.3" + resolved "https://registry.yarnpkg.com/@tsconfig/node14/-/node14-1.0.3.tgz#e4386316284f00b98435bf40f72f75a09dabf6c1" + integrity sha512-ysT8mhdixWK6Hw3i1V2AeRqZ5WfXg1G43mqoYlM2nc6388Fq5jcXyr5mRsqViLx/GJYdoL0bfXD8nmF+Zn/Iow== + +"@tsconfig/node16@^1.0.2": + version "1.0.4" + resolved "https://registry.yarnpkg.com/@tsconfig/node16/-/node16-1.0.4.tgz#0b92dcc0cc1c81f6f306a381f28e31b1a56536e9" + integrity sha512-vxhUy4J8lyeyinH7Azl1pdd43GJhZH/tP2weN8TntQblOY+A0XbT8DJk1/oCPuOOyg/Ja757rG0CgHcWC8OfMA== + +"@types/node@^24.1.0": + version "24.2.0" + resolved "https://registry.yarnpkg.com/@types/node/-/node-24.2.0.tgz#cde712f88c5190006d6b069232582ecd1f94a760" + integrity 
sha512-3xyG3pMCq3oYCNg7/ZP+E1ooTaGB4cG8JWRsqqOYQdbWNY4zbaV0Ennrd7stjiJEFZCaybcIgpTjJWHRfBSIDw== + dependencies: + undici-types "~7.10.0" + +abitype@1.0.8, abitype@^1.0.8: + version "1.0.8" + resolved "https://registry.yarnpkg.com/abitype/-/abitype-1.0.8.tgz#3554f28b2e9d6e9f35eb59878193eabd1b9f46ba" + integrity sha512-ZeiI6h3GnW06uYDLx0etQtX/p8E24UaHHBj57RSjK7YBFe7iuVn07EDpOeP451D06sF27VOz9JJPlIKJmXgkEg== + +acorn-walk@^8.1.1: + version "8.3.4" + resolved "https://registry.yarnpkg.com/acorn-walk/-/acorn-walk-8.3.4.tgz#794dd169c3977edf4ba4ea47583587c5866236b7" + integrity sha512-ueEepnujpqee2o5aIYnvHU6C0A42MNdsIDeqy5BydrkuC5R1ZuUFnm27EeFJGoEHJQgn3uleRvmTXaJgfXbt4g== + dependencies: + acorn "^8.11.0" + +acorn@^8.11.0, acorn@^8.4.1: + version "8.15.0" + resolved "https://registry.yarnpkg.com/acorn/-/acorn-8.15.0.tgz#a360898bc415edaac46c8241f6383975b930b816" + integrity sha512-NZyJarBfL7nWwIq+FDL6Zp/yHEhePMNnnJ0y3qfieCrmNvYct8uvtiV41UvlSe6apAfk0fY1FbWx+NwfmpvtTg== + +arg@^4.1.0: + version "4.1.3" + resolved "https://registry.yarnpkg.com/arg/-/arg-4.1.3.tgz#269fc7ad5b8e42cb63c896d5666017261c144089" + integrity sha512-58S9QDqG0Xx27YwPSt9fJxivjYl432YCwfDMfZ+71RAqUrZef7LrKQZ3LHLOwCS4FLNBplP533Zx895SeOCHvA== + +create-require@^1.1.0: + version "1.1.1" + resolved "https://registry.yarnpkg.com/create-require/-/create-require-1.1.1.tgz#c1d7e8f1e5f6cfc9ff65f9cd352d37348756c333" + integrity sha512-dcKFX3jn0MpIaXjisoRvexIJVEKzaq7z2rZKxf+MSr9TkdmHmsU4m2lcLojrj/FHl8mk5VxMmYA+ftRkP/3oKQ== + +diff@^4.0.1: + version "4.0.2" + resolved "https://registry.yarnpkg.com/diff/-/diff-4.0.2.tgz#60f3aecb89d5fae520c11aa19efc2bb982aade7d" + integrity sha512-58lmxKSA4BNyLz+HHMUzlOEpg09FV+ev6ZMe3vJihgdxzgcwZ8VoEEPmALCZG9LmqfVoNMMKpttIYTVG6uDY7A== + +dotenv@^17.2.1: + version "17.2.1" + resolved "https://registry.yarnpkg.com/dotenv/-/dotenv-17.2.1.tgz#6f32e10faf014883515538dc922a0fb8765d9b32" + integrity sha512-kQhDYKZecqnM0fCnzI5eIv5L4cAe/iRI+HqMbO/hbRdTAeXDG+M9FjipUxNfbARuEg4iHIbhnhs78BCHNbSxEQ== + 
+esbuild@~0.25.0: + version "0.25.11" + resolved "https://registry.yarnpkg.com/esbuild/-/esbuild-0.25.11.tgz#0f31b82f335652580f75ef6897bba81962d9ae3d" + integrity sha512-KohQwyzrKTQmhXDW1PjCv3Tyspn9n5GcY2RTDqeORIdIJY8yKIF7sTSopFmn/wpMPW4rdPXI0UE5LJLuq3bx0Q== + optionalDependencies: + "@esbuild/aix-ppc64" "0.25.11" + "@esbuild/android-arm" "0.25.11" + "@esbuild/android-arm64" "0.25.11" + "@esbuild/android-x64" "0.25.11" + "@esbuild/darwin-arm64" "0.25.11" + "@esbuild/darwin-x64" "0.25.11" + "@esbuild/freebsd-arm64" "0.25.11" + "@esbuild/freebsd-x64" "0.25.11" + "@esbuild/linux-arm" "0.25.11" + "@esbuild/linux-arm64" "0.25.11" + "@esbuild/linux-ia32" "0.25.11" + "@esbuild/linux-loong64" "0.25.11" + "@esbuild/linux-mips64el" "0.25.11" + "@esbuild/linux-ppc64" "0.25.11" + "@esbuild/linux-riscv64" "0.25.11" + "@esbuild/linux-s390x" "0.25.11" + "@esbuild/linux-x64" "0.25.11" + "@esbuild/netbsd-arm64" "0.25.11" + "@esbuild/netbsd-x64" "0.25.11" + "@esbuild/openbsd-arm64" "0.25.11" + "@esbuild/openbsd-x64" "0.25.11" + "@esbuild/openharmony-arm64" "0.25.11" + "@esbuild/sunos-x64" "0.25.11" + "@esbuild/win32-arm64" "0.25.11" + "@esbuild/win32-ia32" "0.25.11" + "@esbuild/win32-x64" "0.25.11" + +eventemitter3@5.0.1: + version "5.0.1" + resolved "https://registry.yarnpkg.com/eventemitter3/-/eventemitter3-5.0.1.tgz#53f5ffd0a492ac800721bb42c66b841de96423c4" + integrity sha512-GWkBvjiSZK87ELrYOSESUYeVIc9mvLLf/nXalMOS5dYrgZq9o5OVkbZAVM06CVxYsCwH9BDZFPlQTlPA1j4ahA== + +fsevents@~2.3.3: + version "2.3.3" + resolved "https://registry.yarnpkg.com/fsevents/-/fsevents-2.3.3.tgz#cac6407785d03675a2a5e1a5305c697b347d90d6" + integrity sha512-5xoDfX+fL7faATnagmWPpbFtwh/R77WmMMqqHGS65C3vvB0YHrgF+B1YmZ3441tMj5n63k0212XNoJwzlhffQw== + +get-tsconfig@^4.7.5: + version "4.13.0" + resolved "https://registry.yarnpkg.com/get-tsconfig/-/get-tsconfig-4.13.0.tgz#fcdd991e6d22ab9a600f00e91c318707a5d9a0d7" + integrity 
sha512-1VKTZJCwBrvbd+Wn3AOgQP/2Av+TfTCOlE4AcRJE72W1ksZXbAx8PPBR9RzgTeSPzlPMHrbANMH3LbltH73wxQ== + dependencies: + resolve-pkg-maps "^1.0.0" + +isows@1.0.7: + version "1.0.7" + resolved "https://registry.yarnpkg.com/isows/-/isows-1.0.7.tgz#1c06400b7eed216fbba3bcbd68f12490fc342915" + integrity sha512-I1fSfDCZL5P0v33sVqeTDSpcstAg/N+wF5HS033mogOVIp4B+oHC7oOCsA3axAbBSGTJ8QubbNmnIRN/h8U7hg== + +make-error@^1.1.1: + version "1.3.6" + resolved "https://registry.yarnpkg.com/make-error/-/make-error-1.3.6.tgz#2eb2e37ea9b67c4891f684a1394799af484cf7a2" + integrity sha512-s8UhlNe7vPKomQhC1qFelMokr/Sc3AgNbso3n74mVPA5LTZwkB9NlXf4XPamLxJE8h0gh73rM94xvwRT2CVInw== + +ox@0.8.6: + version "0.8.6" + resolved "https://registry.yarnpkg.com/ox/-/ox-0.8.6.tgz#7dd666216ee8cda2eb2e5fef3fe4cb20dec3dcad" + integrity sha512-eiKcgiVVEGDtEpEdFi1EGoVVI48j6icXHce9nFwCNM7CKG3uoCXKdr4TPhS00Iy1TR2aWSF1ltPD0x/YgqIL9w== + dependencies: + "@adraffy/ens-normalize" "^1.11.0" + "@noble/ciphers" "^1.3.0" + "@noble/curves" "^1.9.1" + "@noble/hashes" "^1.8.0" + "@scure/bip32" "^1.7.0" + "@scure/bip39" "^1.6.0" + abitype "^1.0.8" + eventemitter3 "5.0.1" + +resolve-pkg-maps@^1.0.0: + version "1.0.0" + resolved "https://registry.yarnpkg.com/resolve-pkg-maps/-/resolve-pkg-maps-1.0.0.tgz#616b3dc2c57056b5588c31cdf4b3d64db133720f" + integrity sha512-seS2Tj26TBVOC2NIc2rOe2y2ZO7efxITtLZcGSOnHHNOQ7CkiUBfw0Iw2ck6xkIhPwLhKNLS8BO+hEpngQlqzw== + +ts-node@^10.9.2: + version "10.9.2" + resolved "https://registry.yarnpkg.com/ts-node/-/ts-node-10.9.2.tgz#70f021c9e185bccdca820e26dc413805c101c71f" + integrity sha512-f0FFpIdcHgn8zcPSbf1dRevwt047YMnaiJM3u2w2RewrB+fob/zePZcrOyQoLMMO7aBIddLcQIEK5dYjkLnGrQ== + dependencies: + "@cspotcode/source-map-support" "^0.8.0" + "@tsconfig/node10" "^1.0.7" + "@tsconfig/node12" "^1.0.7" + "@tsconfig/node14" "^1.0.0" + "@tsconfig/node16" "^1.0.2" + acorn "^8.4.1" + acorn-walk "^8.1.1" + arg "^4.1.0" + create-require "^1.1.0" + diff "^4.0.1" + make-error "^1.1.1" + v8-compile-cache-lib "^3.0.1" + yn 
"3.1.1" + +tsx@^4.7.0: + version "4.20.6" + resolved "https://registry.yarnpkg.com/tsx/-/tsx-4.20.6.tgz#8fb803fd9c1f70e8ccc93b5d7c5e03c3979ccb2e" + integrity sha512-ytQKuwgmrrkDTFP4LjR0ToE2nqgy886GpvRSpU0JAnrdBYppuY5rLkRUYPU1yCryb24SsKBTL/hlDQAEFVwtZg== + dependencies: + esbuild "~0.25.0" + get-tsconfig "^4.7.5" + optionalDependencies: + fsevents "~2.3.3" + +typescript@^5.8.3: + version "5.9.2" + resolved "https://registry.yarnpkg.com/typescript/-/typescript-5.9.2.tgz#d93450cddec5154a2d5cabe3b8102b83316fb2a6" + integrity sha512-CWBzXQrc/qOkhidw1OzBTQuYRbfyxDXJMVJ1XNwUHGROVmuaeiEm3OslpZ1RV96d7SKKjZKrSJu3+t/xlw3R9A== + +undici-types@~7.10.0: + version "7.10.0" + resolved "https://registry.yarnpkg.com/undici-types/-/undici-types-7.10.0.tgz#4ac2e058ce56b462b056e629cc6a02393d3ff350" + integrity sha512-t5Fy/nfn+14LuOc2KNYg75vZqClpAiqscVvMygNnlsHBFpSXdJaYtXMcdNLpl/Qvc3P2cB3s6lOV51nqsFq4ag== + +v8-compile-cache-lib@^3.0.1: + version "3.0.1" + resolved "https://registry.yarnpkg.com/v8-compile-cache-lib/-/v8-compile-cache-lib-3.0.1.tgz#6336e8d71965cb3d35a1bbb7868445a7c05264bf" + integrity sha512-wa7YjyUGfNZngI/vtK0UHAN+lgDCxBPCylVXGp0zu59Fz5aiGtNXaq3DhIov063MorB+VfufLh3JlF2KdTK3xg== + +viem@^2.33.1: + version "2.33.2" + resolved "https://registry.yarnpkg.com/viem/-/viem-2.33.2.tgz#039732775976efb821a2f7c6c6d4cd6029f5af4f" + integrity sha512-/720OaM4dHWs8vXwNpyet+PRERhPaW+n/1UVSCzyb9jkmwwVfaiy/R6YfCFb4v+XXbo8s3Fapa3DM5yCRSkulA== + dependencies: + "@noble/curves" "1.9.2" + "@noble/hashes" "1.8.0" + "@scure/bip32" "1.7.0" + "@scure/bip39" "1.6.0" + abitype "1.0.8" + isows "1.0.7" + ox "0.8.6" + ws "8.18.2" + +ws@8.18.2: + version "8.18.2" + resolved "https://registry.yarnpkg.com/ws/-/ws-8.18.2.tgz#42738b2be57ced85f46154320aabb51ab003705a" + integrity sha512-DMricUmwGZUVr++AEAe2uiVM7UoO9MAVZMDu05UQOaUII0lp+zOzLLU4Xqh/JvTqklB1T4uELaaPBKyjE1r4fQ== + +yn@3.1.1: + version "3.1.1" + resolved "https://registry.yarnpkg.com/yn/-/yn-3.1.1.tgz#1e87401a09d767c1d5eab26a6e4c185182d2eb50" + 
integrity sha512-Ux4ygGWsu2c7isFWe8Yu1YluJmqVhxqK2cLXNQA5AcC3QfbGNpM7fu0Y8b/z16pXLnFxZYvWhd3fhBY9DLmC6Q== diff --git a/packages/op-tooling/withdrawal/check.sh b/packages/op-tooling/withdrawal/check.sh new file mode 100755 index 00000000000..60735f51630 --- /dev/null +++ b/packages/op-tooling/withdrawal/check.sh @@ -0,0 +1,29 @@ +#!/bin/bash +set -euo pipefail + +# Determine network +NETWORK=${NETWORK:-}; [ -z "${NETWORK:-}" ] && echo "Need to set the NETWORK via env" && exit 1; +case $NETWORK in + sepolia) + L1_OPTIMISM_PORTAL=0x44ae3d41a335a7d05eb533029917aad35662dcc2 + ;; + mainnet) + L1_OPTIMISM_PORTAL=0xc5c5D157928BDBD2ACf6d0777626b6C75a9EAEDC + ;; + *) + echo "Unsupported network: $NETWORK" + exit 1 + ;; +esac + +# Required environment variables +WITHDRAWAL_HASH=${WITHDRAWAL_HASH:-}; [ -z "${WITHDRAWAL_HASH:-}" ] && echo "Need to set the WITHDRAWAL_HASH via env" && exit 1; +PROOF_SUBMITTER=${PROOF_SUBMITTER:-}; [ -z "${PROOF_SUBMITTER:-}" ] && echo "Need to set the PROOF_SUBMITTER via env" && exit 1; +L1_RPC_URL=${L1_RPC_URL:-}; [ -z "${L1_RPC_URL:-}" ] && echo "Need to set the L1_RPC_URL via env" && exit 1; + +# Reverts with reason if something is wrong with withdrawal (not proven, not initiated, not ready for claim etc.) 
+cast call $L1_OPTIMISM_PORTAL \ + "checkWithdrawal(bytes32,address)" \ + $WITHDRAWAL_HASH \ + $PROOF_SUBMITTER \ + --rpc-url $L1_RPC_URL diff --git a/packages/op-tooling/withdrawal/finalize.sh b/packages/op-tooling/withdrawal/finalize.sh new file mode 100755 index 00000000000..15ca3ec5adb --- /dev/null +++ b/packages/op-tooling/withdrawal/finalize.sh @@ -0,0 +1,36 @@ +#!/bin/bash +set -euo pipefail + +# Determine network +NETWORK=${NETWORK:-}; [ -z "${NETWORK:-}" ] && echo "Need to set the NETWORK via env" && exit 1; +case $NETWORK in + sepolia) + L1_OPTIMISM_PORTAL=0x44ae3d41a335a7d05eb533029917aad35662dcc2 + ;; + mainnet) + L1_OPTIMISM_PORTAL=0xc5c5D157928BDBD2ACf6d0777626b6C75a9EAEDC + ;; + *) + echo "Unsupported network: $NETWORK" + exit 1 + ;; +esac + +# Required environment variables +WITHDRAWAL_NONCE=${WITHDRAWAL_NONCE:-}; [ -z "${WITHDRAWAL_NONCE:-}" ] && echo "Need to set the WITHDRAWAL_NONCE via env" && exit 1; +SENDER=${SENDER:-}; [ -z "${SENDER:-}" ] && echo "Need to set the SENDER via env" && exit 1; +RECIPIENT=${RECIPIENT:-}; [ -z "${RECIPIENT:-}" ] && echo "Need to set the RECIPIENT via env" && exit 1; +VALUE=${VALUE:-}; [ -z "${VALUE:-}" ] && echo "Need to set the VALUE via env" && exit 1; +PK=${PK:-}; [ -z "${PK:-}" ] && echo "Need to set the PK via env" && exit 1; +L1_RPC_URL=${L1_RPC_URL:-}; [ -z "${L1_RPC_URL:-}" ] && echo "Need to set the L1_RPC_URL via env" && exit 1; + +# Optional environment variables +GAS_LIMIT=${GAS_LIMIT:-0} +DATA=${DATA:-"0x00"} + +# Finalization & claim of withdrawal on L1 +cast send $L1_OPTIMISM_PORTAL \ + "finalizeWithdrawalTransaction((uint256, address, address, uint256, uint256, bytes))" \ + "($WITHDRAWAL_NONCE,$SENDER,$RECIPIENT,$VALUE,$GAS_LIMIT,$DATA)" \ + --private-key $PK \ + --rpc-url $L1_RPC_URL diff --git a/packages/op-tooling/withdrawal/get.sh b/packages/op-tooling/withdrawal/get.sh new file mode 100755 index 00000000000..c679df9ee22 --- /dev/null +++ b/packages/op-tooling/withdrawal/get.sh @@ -0,0 +1,29 
@@ +#!/bin/bash +set -euo pipefail + +# Determine network +NETWORK=${NETWORK:-}; [ -z "${NETWORK:-}" ] && echo "Need to set the NETWORK via env" && exit 1; +case $NETWORK in + sepolia) + L1_OPTIMISM_PORTAL=0x44ae3d41a335a7d05eb533029917aad35662dcc2 + ;; + mainnet) + L1_OPTIMISM_PORTAL=0xc5c5D157928BDBD2ACf6d0777626b6C75a9EAEDC + ;; + *) + echo "Unsupported network: $NETWORK" + exit 1 + ;; +esac + +# Required environment variables +WITHDRAWAL_HASH=${WITHDRAWAL_HASH:-}; [ -z "${WITHDRAWAL_HASH:-}" ] && echo "Need to set the WITHDRAWAL_HASH via env" && exit 1; +PROOF_SUBMITTER=${PROOF_SUBMITTER:-}; [ -z "${PROOF_SUBMITTER:-}" ] && echo "Need to set the PROOF_SUBMITTER via env" && exit 1; +L1_RPC_URL=${L1_RPC_URL:-}; [ -z "${L1_RPC_URL:-}" ] && echo "Need to set the L1_RPC_URL via env" && exit 1; + +# Retrieves stored data about proven withdrawal from L1 +cast call $L1_OPTIMISM_PORTAL \ + "provenWithdrawals(bytes32,address)" \ + $WITHDRAWAL_HASH \ + $PROOF_SUBMITTER \ + --rpc-url $L1_RPC_URL diff --git a/packages/op-tooling/withdrawal/initiate.sh b/packages/op-tooling/withdrawal/initiate.sh new file mode 100755 index 00000000000..250f742b049 --- /dev/null +++ b/packages/op-tooling/withdrawal/initiate.sh @@ -0,0 +1,25 @@ +#!/bin/bash +set -euo pipefail + +# Constants +L2_L1_MESSAGE_PASSER=0x4200000000000000000000000000000000000016 + +# Required environment variables +RECIPIENT=${RECIPIENT:-}; [ -z "${RECIPIENT:-}" ] && echo "Need to set the RECIPIENT via env" && exit 1; +VALUE=${VALUE:-}; [ -z "${VALUE:-}" ] && echo "Need to set the VALUE via env" && exit 1; +PK=${PK:-}; [ -z "${PK:-}" ] && echo "Need to set the PK via env" && exit 1; +L2_RPC_URL=${L2_RPC_URL:-}; [ -z "${L2_RPC_URL:-}" ] && echo "Need to set the L2_RPC_URL via env" && exit 1; + +# Optional environment variables +GAS_LIMIT=${GAS_LIMIT:-0} +DATA=${DATA:-"0x00"} + +# Initiation of withdrawal on L2 +cast send $L2_L1_MESSAGE_PASSER \ + "initiateWithdrawal(address,uint256,bytes)" \ + $RECIPIENT \ + $GAS_LIMIT 
\ + $DATA \ + --value $VALUE \ + --private-key $PK \ + --rpc-url $L2_RPC_URL diff --git a/packages/op-tooling/withdrawal/prove.sh b/packages/op-tooling/withdrawal/prove.sh new file mode 100755 index 00000000000..c997799d898 --- /dev/null +++ b/packages/op-tooling/withdrawal/prove.sh @@ -0,0 +1,48 @@ +#!/bin/bash +set -euo pipefail + +# Determine network +NETWORK=${NETWORK:-}; [ -z "${NETWORK:-}" ] && echo "Need to set the NETWORK via env" && exit 1; +case $NETWORK in + sepolia) + L1_OPTIMISM_PORTAL=0x44ae3d41a335a7d05eb533029917aad35662dcc2 + ;; + mainnet) + L1_OPTIMISM_PORTAL=0xc5c5D157928BDBD2ACf6d0777626b6C75a9EAEDC + ;; + *) + echo "Unsupported network: $NETWORK" + exit 1 + ;; +esac + +# Required environment variables +WITHDRAWAL_NONCE=${WITHDRAWAL_NONCE:-}; [ -z "${WITHDRAWAL_NONCE:-}" ] && echo "Need to set the WITHDRAWAL_NONCE via env" && exit 1; +SENDER=${SENDER:-}; [ -z "${SENDER:-}" ] && echo "Need to set the SENDER via env" && exit 1; +RECIPIENT=${RECIPIENT:-}; [ -z "${RECIPIENT:-}" ] && echo "Need to set the RECIPIENT via env" && exit 1; +VALUE=${VALUE:-}; [ -z "${VALUE:-}" ] && echo "Need to set the VALUE via env" && exit 1; +GAME_INDEX=${GAME_INDEX:-}; [ -z "${GAME_INDEX:-}" ] && echo "Need to set the GAME_INDEX via env" && exit 1; +OUTPUT_ROOT_PROOF__VERSION=${OUTPUT_ROOT_PROOF__VERSION:-}; [ -z "${OUTPUT_ROOT_PROOF__VERSION:-}" ] && echo "Need to set the OUTPUT_ROOT_PROOF__VERSION via env" && exit 1; +OUTPUT_ROOT_PROOF__STATE_ROOT=${OUTPUT_ROOT_PROOF__STATE_ROOT:-}; [ -z "${OUTPUT_ROOT_PROOF__STATE_ROOT:-}" ] && echo "Need to set the OUTPUT_ROOT_PROOF__STATE_ROOT via env" && exit 1; +OUTPUT_ROOT_PROOF__MESSAGE_PASSER_STORAGE_ROOT=${OUTPUT_ROOT_PROOF__MESSAGE_PASSER_STORAGE_ROOT:-}; [ -z "${OUTPUT_ROOT_PROOF__MESSAGE_PASSER_STORAGE_ROOT:-}" ] && echo "Need to set the OUTPUT_ROOT_PROOF__MESSAGE_PASSER_STORAGE_ROOT via env" && exit 1; +OUTPUT_ROOT_PROOF__LATEST_BLOCKHASH=${OUTPUT_ROOT_PROOF__LATEST_BLOCKHASH:-}; [ -z 
"${OUTPUT_ROOT_PROOF__LATEST_BLOCKHASH:-}" ] && echo "Need to set the OUTPUT_ROOT_PROOF__LATEST_BLOCKHASH via env" && exit 1; +WITHDRAWAL_PROOF=${WITHDRAWAL_PROOF:-}; [ -z "${WITHDRAWAL_PROOF:-}" ] && echo "Need to set the WITHDRAWAL_PROOF via env" && exit 1; +PK=${PK:-}; [ -z "${PK:-}" ] && echo "Need to set the PK via env" && exit 1; +L1_RPC_URL=${L1_RPC_URL:-}; [ -z "${L1_RPC_URL:-}" ] && echo "Need to set the L1_RPC_URL via env" && exit 1; + +### Example value of WITHDRAWAL_PROOF: +# WITHDRAWAL_PROOF="[0xf8918080808080a0231eba9c2bc1784b944714d5260873e3f92b58434c1879123d58f995b342865180a0b3b0303113429f394c506a530c83a8fdbd3125d95b2310b05191cd2dbc978aa8808080a0236e8f61ecde6abfebc6c529441f782f62469d8a2cc47b7aace2c136bd3b1ff080a06babe3fe3879f4972e397c7e516ceb2699945beb318afa0ddee8e7381796f5ff808080,0xf8518080808080a0ea006b1384a4bf0219939e5483e6e82c22d13290d5055e2042541adfb1b47ec380808080a05aa8408d8bac30771c33c39b02167ad094fff70f16e4aa667623d999d04725c9808080808080,0xe2a02005084db35fe36c140bc6d2bc4d520dafa807b5e774c7276c91658a496f59cc01]" + +# Optional environment variables +GAS_LIMIT=${GAS_LIMIT:-0} +DATA=${DATA:-"0x00"} + +# Proves withdrawal transaction on L1 +cast send $L1_OPTIMISM_PORTAL \ + "proveWithdrawalTransaction((uint256, address, address, uint256, uint256, bytes), uint256, (bytes32, bytes32, bytes32, bytes32), bytes[])" \ + "($WITHDRAWAL_NONCE,$SENDER,$RECIPIENT,$VALUE,$GAS_LIMIT,$DATA)" \ + $GAME_INDEX \ + "($OUTPUT_ROOT_PROOF__VERSION,$OUTPUT_ROOT_PROOF__STATE_ROOT,$OUTPUT_ROOT_PROOF__MESSAGE_PASSER_STORAGE_ROOT,$OUTPUT_ROOT_PROOF__LATEST_BLOCKHASH)" \ + "$WITHDRAWAL_PROOF" \ + --private-key $PK \ + --rpc-url $L1_RPC_URL diff --git a/packages/protocol/.env.json b/packages/protocol/.env.json index 3efb49ac7d6..0967ef424bc 100644 --- a/packages/protocol/.env.json +++ b/packages/protocol/.env.json @@ -1,3 +1 @@ -{ - "celoScanApiKey": "" -} +{} diff --git a/packages/protocol/.eslintrc.js b/packages/protocol/.eslintrc.js index f10f0cbe039..3b9b132d5c9 
100644 --- a/packages/protocol/.eslintrc.js +++ b/packages/protocol/.eslintrc.js @@ -3,5 +3,7 @@ module.exports = { 'import/no-extraneous-dependencies': 'off', 'no-underscore-dangle': 'off', indent: ['off', 2], + 'dot-notation': 'off', + '@typescript-eslint/dot-notation': ['error', { allowIndexSignaturePropertyAccess: true }], }, } diff --git a/packages/protocol/.gitignore b/packages/protocol/.gitignore index 4a69dc113de..8688b4422d7 100644 --- a/packages/protocol/.gitignore +++ b/packages/protocol/.gitignore @@ -7,10 +7,6 @@ contractPackages.js.map lib/**/*.js lib/**/*.js.map -migrations/**/*.js -migrations/**/*.js.map - -migrations_ts/**/*.js scripts/**/*.js scripts/**/*.js.map @@ -18,6 +14,8 @@ scripts/**/*.js.map test/**/*.js test/**/*.js.map +test-ts/**/*.js + types/**/*.js types/**/*.js.map @@ -26,6 +24,7 @@ scTopics build .tmp .*.tar.gz +out* types/*.js types/contracts/* @@ -38,6 +37,8 @@ deployedGrants.json # Foundry cache/ out/ +out-truffle-compat/ +out-truffle-compat-0.8/ wagmi.config.js wagmi.config.js.map @@ -47,4 +48,12 @@ abis/lib/ # these are copied from contracts-08 folder for purposes of publishing to npm contracts/0.8 -broadcast/ \ No newline at end of file +broadcast/ + +# Libraries files generated by verify-deployed +*-libraries.json + +out-truffle-compat-0.8 +out-truffle-compat + +opencode.json \ No newline at end of file diff --git a/packages/protocol/CHEATSHEET.md b/packages/protocol/CHEATSHEET.md new file mode 100644 index 00000000000..4c421d5a958 --- /dev/null +++ b/packages/protocol/CHEATSHEET.md @@ -0,0 +1,46 @@ +# Cheatsheet + +Covers changes in `package.json` scripts introduced starting with PR [#11369](https://github.com/celo-org/celo-monorepo/pull/11369). This is a temporary document, slated to be removed after the migration away from Truffle is completed, when `package.json` is expected to be further simplified. + +> ... 
- means does not exist in this version + +| Before | After | +| --------------------------------------- | --------------------------------------------- | +| `...` | `build:foundry` | +| `build` | `build` | +| `build:sol` | `build:truffle-sol` | +| `build:ts` | `build:truffle-ts` | +| `devchain` | `devchain` | +| `init-network` | `devchain:init-network` | +| `test:generate-old-devchain-and-build` | `devchain:generate-old-devchain-and-build` | +| `migrate` | `devchain:migrate` | +| `devchain:reset` | `devchain:reset` | +| `check-opcodes` | `release:check-opcodes` | +| `check-versions` | `release:check-versions` | +| `determine-release-version` | `release:determine-release-version` | +| `make-release` | `release:make` | +| `verify-deployed` | `release:verify-deployed` | +| `verify-release` | `release:verify-release` | +| `pull-submodules` | `submodules:pull` | +| `delete-submodules` | `submodules:delete` | +| `compare-git-tags` | `tags:compare` | +| `view-tags` | `tags:view` | +| `...` | `test` | +| `test:coverage` | `test:coverage` | +| `gas` | `test:gas` | +| `test:release-snapshots` | `test:release-snapshots` | +| `test:scripts` | `test:scripts` | +| `test` | `test:truffle` | +| `console` | `truffle:console` | +| `govern` | `truffle:govern` | +| `migrate` | `truffle:migrate` | +| `set_block_gas_limit` | `truffle:set-block-gas-limit` | +| `truffle-verify` | `truffle:verify` | +| `prepare_contracts_and_abis_publishing` | `utils:prepare-contracts-and-abis-publishing` | +| `prepare_devchain_anvil_publishing` | `utils:prepare-devchain-anvil-publishing` | +| `sourcify-publish` | `utils:sourcify-publish` | +| `validate_abis_exports` | `utils:validate-abis-exports` | +| `download-artifacts` | `...` | +| `generate-stabletoken-files` | `...` | +| `revoke` | `...` | +| `upload-artifacts` | `...` | diff --git a/packages/protocol/README.md b/packages/protocol/README.md index 8c96371c8a8..c01091884f2 100644 --- a/packages/protocol/README.md +++ b/packages/protocol/README.md @@ 
-8,12 +8,12 @@ The contents of this package are licensed under the terms of the GNU Lesser Publ ### Initial deployment -See the [testnet helm chart README](../helm-charts/testnet/README.md) for how to expose the RPC endpoint. +See the [testnet helm chart README](https://github.com/celo-org/charts/blob/main/charts/testnet/README.md) for how to expose the RPC endpoint. Then, to deploy contracts to a network run: ```bash -yarn run init-network -n NETWORK +yarn run devchain:init-network -n NETWORK ``` This will deploy the contracts to the network specified in `truffle-config.js` and save the artifacts to `build/NETWORK`. @@ -26,53 +26,15 @@ If a new contract needs to be deployed, create a migration file in the `migratio To apply any new migrations to a network, run: ```bash -yarn run migrate -n NETWORK +yarn run devchain:migrate -n NETWORK ``` -### Accounts - -To give an account some gold, wrapped gold, and stable token, run: - -```bash -yarn run faucet -n NETWORK -a ACCOUNT_ADDRESS -``` - -You can check balances by running: - -```bash -yarn run get-balances -n NETWORK -a ACCOUNT_ADDRESS -``` - -You can run 'onlyOwner' methods via the [MultiSig](contracts/common/MultiSig.sol) by running: - -```bash -yarn run govern -n NETWORK -c "stableToken.setMinter(0x1234)" -``` - -### Build artifacts - -When interacting with one of our Kubernetes-deployed networks, you can download the build artifacts to a local directory using: - -```bash -yarn run download-artifacts -n NAME -``` - -You must run this before interacting with one of these networks to have the build artifacts available locally. - -If you changed the build artifacts (e.g. by running the `init-network`, `migrate`, or `upgrade` script), upload the new build artifacts with: - -```bash -yarn run upload-artifacts -n NAME -``` - -By default, `NAME` will be set as `RELEASE_NAME`, `NAMESPACE_NAME`, `TESTNET_NAME` which you should have used with the same name in prior instructions. 
 If you used separate names for the above, you can customize the run with the `-r -n -t` flags respectively. - ### Console To start a truffle console run: ``` -yarn console -f -n rc1 +yarn truffle:console -f -n rc1 ``` Options: @@ -107,64 +69,128 @@ truffle(rc1)> let exchange = await kit.contracts.getExchange() Warning / TODO: We are migrating our tests to Foundry, so this section may be out of date. For instruction on how to run tests with Foundry see [here](./test-sol/README.md). -To test the smart contracts, run: + +Truffle tests have been deprecated. + +## Verify released smart contracts + +> Etherscan API V1→V2 migration affects Celoscan. Use Foundry ≥1.3.5 for V2 support. + +### Quick Start + +**Parameters:** +- `[ADDRESS]` - Contract address to verify +- `[CONTRACT]` - Contract name (e.g., `Proposals`, `Validators`) +- `[PATH]` - Full path to contract (e.g., `contracts/.../Proposals.sol`) +- `[NETWORK]` - Network slug (celo, celo-sepolia, celo-alfajores, celo-baklava) +- `[CHAIN_ID]` - Network ID (42220=mainnet, 11142220=celo-sepolia, ...) +- `[API_KEY]` - Your Celoscan/Blockscout API key +- `[RPC_URL]` - URL of the RPC endpoint (e.g., `https://forno.celo.org`) +- `[CONSTRUCTOR_HEX]` - Result of encoding `$(cast abi-encode "constructor([SIGNATURE])" [ARGS])` + +### 1. Compile with Foundry + +Use the appropriate profile to match Truffle compilation settings: ```bash -yarn run test -``` +# Solidity 0.5 contracts +FOUNDRY_PROFILE=truffle-compat forge build contracts/.../[CONTRACT].sol -Adding the optional `--gas` flag will print out a report of contract gas usage. + +# Solidity 0.8 contracts +FOUNDRY_PROFILE=truffle-compat8 forge build contracts-0.8/.../[CONTRACT].sol +``` -To test a single smart contract, run: +### 2. 
Verify with Foundry +**Base command:** ```bash -yarn run test ${contract name} +FOUNDRY_PROFILE=[truffle-compat|truffle-compat8] forge verify-contract [ADDRESS] [CONTRACT] \ + --chain-id [CHAIN_ID] \ + --watch ``` -Adding the optional `--gas` flag will print out a report of contract gas usage. +**Platform options:** +- **Celoscan**: `--etherscan-api-key=[API_KEY]` +- **Blockscout**: `--verifier=blockscout --verifier-url=https://[NETWORK].blockscout.com/api/` +- **Sourcify**: `--verifier=sourcify` -For quick test iterations run: +**Examples:** +**Template format:** ```bash -yarn run quicktest +# Celoscan verification in Solidity 0.5 +FOUNDRY_PROFILE=truffle-compat forge verify-contract [ADDRESS] [CONTRACT] \ + --chain-id [CHAIN_ID] --etherscan-api-key=[API_KEY] --watch + +# Blockscout verification in Solidity 0.8 +FOUNDRY_PROFILE=truffle-compat8 forge verify-contract [ADDRESS] [CONTRACT] \ + --chain-id [CHAIN_ID] --verifier=blockscout --verifier-url=https://[NETWORK].blockscout.com/api/ --watch ``` -or for a single contract: +**Real examples:** ```bash -yarn run quicktest ${contract name} -``` +# Celoscan verification - Solidity 0.5 (Celo Mainnet) +FOUNDRY_PROFILE=truffle-compat forge verify-contract 0x8d6677192144292870907e3fa8a5527fe55a7ff6 Governance \ + --chain-id 42220 --etherscan-api-key=YourCeloscanAPIKey --watch -For `quicktest` to work correctly a contract's migration dependencies have to be uncommented in `scripts/bash/backupmigrations.sh`. 
+# Celoscan verification - Solidity 0.5 (Celo Sepolia) +FOUNDRY_PROFILE=truffle-compat forge verify-contract 0x1234567890123456789012345678901234567890 Validators \ + --chain-id 11142220 --etherscan-api-key=YourCeloscanAPIKey --watch -Compared to the normal test command, quicktest will: +# Blockscout verification - Solidity 0.5 (Celo Mainnet) +FOUNDRY_PROFILE=truffle-compat forge verify-contract 0xabcdefabcdefabcdefabcdefabcdefabcdefabcd StableToken \ + --chain-id 42220 --verifier=blockscout --verifier-url=https://celo.blockscout.com/api/ --watch -1. Not run the pretest script of building solidity (will still be run as part of truffle test) and compiling typescript. This works because truffle can run typescript "natively". -2. Only migrate selected migrations as set in `backupmigrations.sh` (you'll likely need at least one compilation step since truffle seems to only run compiled migrations) +# Sourcify verification - Solidity 0.5 (Celo Mainnet) +FOUNDRY_PROFILE=truffle-compat forge verify-contract 0x471ece3750da237f93b8e339c536989b8978a438 LockedGold \ + --chain-id 42220 --verifier=sourcify --watch +``` -## Verify released smart contracts +```bash +# Celoscan verification - Solidity 0.8 (Celo Mainnet) +FOUNDRY_PROFILE=truffle-compat8 forge verify-contract 0x9876543210987654321098765432109876543210 CeloToken \ + --chain-id 42220 --etherscan-api-key=YourCeloscanAPIKey --watch -1. Update CeloScanApi in env.json file -2. Run verification command +# Celoscan verification - Solidity 0.8 (Celo Sepolia) +FOUNDRY_PROFILE=truffle-compat8 forge verify-contract 0xfedcba0987654321fedcba0987654321fedcba09 LockedCelo \ + --chain-id 11142220 --etherscan-api-key=YourCeloscanAPIKey --watch +``` +**Options for exact match verification** (upgrade from partial verified): ```bash -yarn truffle-verify [ContractName]@[Contract address] --network [network] --forno [network rpc url] +--skip-is-verified-check --constructor-args [CONSTRUCTOR_HEX] ``` -example: +### 3. 
Fallback: Truffle + +If Foundry fails: ```bash -yarn truffle-verify MentoFeeHandlerSeller@0x4efa274b7e33476c961065000d58ee09f7921a74 --network mainnet --forno https://forno.celo.org +# Compile all +yarn build +# or for specific contract in Solidity 0.5: +yarn run truffle compile [PATH] --contracts_build_directory=build/contracts +# or for specific contract in Solidity 0.8: +yarn run truffle compile [PATH] --contracts_build_directory=build/contracts-0.8 --config truffle-config0.8.js + +# Verify +yarn truffle:verify [CONTRACT]@[ADDRESS] --network [NETWORK] --forno [RPC_URL] ``` + +### 4. Manual Verification + +If Sourcify verification succeeds but Celoscan/Blockscout fails: +1. Download the standard input JSON from Sourcify +2. Use it to manually verify on Celoscan or Blockscout + ### Possible problems -1. Some of old smart contracts have slightly different bytecode when verified (it is usually just few bytes difference). Some of the smart contracts were originally deployed with version 0.5.8 instead of 0.5.13 even though there is no history trace about this in our monorepo. +1. Some of the old smart contracts have slightly different bytecode when verified (usually just a few bytes of difference). Some of the smart contracts were originally deployed with version 0.5.8 instead of 0.5.13, even though there is no trace of this in our monorepo's history. -2. Bytecode differs because of missing library addresses on CeloScan. Json file that will be manually uploaded to CeloScan needs to have libraries root element updated. Library addresses is possible to get either manually or with command which will generate libraries.json. +2. Bytecode differs because of missing library addresses on CeloScan. The JSON file that is manually uploaded to CeloScan needs its `libraries` root element updated. Library addresses can be obtained either manually or with the following command, which generates `libraries.json`. 
```bash - yarn verify-deployed -n $NETWORK -b $PREVIOUS_RELEASE -f + yarn release:verify-deployed -n $NETWORK -b $PREVIOUS_RELEASE -f ``` ```javascript @@ -293,18 +319,18 @@ Output: The output is a CSV file named `onchain_bytecode_sizes_.csv` To get the list of PRs that changed smart contracts between two releases, run: ```sh -yarn compare-git-tags [git_tag/branch] [git_tag/branch] +yarn tags:compare [git_tag/branch] [git_tag/branch] ``` Example: ```sh -yarn compare-git-tags release/core-contracts/11 release/core-contracts/12 +yarn tags:compare release/core-contracts/11 release/core-contracts/12 ``` Example output: PRs that made these changes: -16442165a Deactivate BlochainParameters Contract on L2 (#11008) +16442165a Deactivate BlockchainParameters Contract on L2 (#11008) 198f6215a SortedLinkedList Foundry Migration (#10846) diff --git a/packages/protocol/RELEASE_PROCESS_FOUNDRY.md b/packages/protocol/RELEASE_PROCESS_FOUNDRY.md new file mode 100644 index 00000000000..275e48cfa0b --- /dev/null +++ b/packages/protocol/RELEASE_PROCESS_FOUNDRY.md @@ -0,0 +1,1094 @@ +# Celo Core Contracts Release Process (Foundry) + +This document describes the release process for Celo Core Contracts using Foundry-based tooling. 
+ +## Table of Contents + +- [Overview](#overview) +- [Versioning](#versioning) +- [GitHub Branching & Tagging](#github-branching--tagging) +- [Release Artifacts](#release-artifacts) +- [Release Commands](#release-commands) +- [Step-by-Step Release Process](#step-by-step-release-process) +- [Release Timeline](#release-timeline) +- [Testing & Verification](#testing--verification) +- [Governance Proposal](#governance-proposal) +- [Communication & Community](#communication--community) +- [Emergency Patches](#emergency-patches) +- [Troubleshooting](#troubleshooting) +- [Additional Resources](#additional-resources) +- [Networks](#networks) +- [Deployer Keys](#deployer-keys) +- [Starting a Local Fork](#starting-a-local-fork) +- [Contract Verification (Block Explorers)](#contract-verification-block-explorers) +- [Environment Variables](#environment-variables) + +## Overview + +The Celo Core Contracts release process ensures that smart contract upgrades are deployed safely through proper versioning, testing, and governance procedures. This guide covers the Foundry-based tooling that replaces the legacy Truffle-based release scripts. + +Core contract releases are rolled out via Celo's on-chain governance system, with approximately four major releases per year. 
+ +### Script Mapping (Truffle → Foundry) + +| Truffle Command | Foundry Equivalent | +|-----------------|-------------------| +| `release:check-versions` | `release:check-versions:foundry` | +| `release:verify-deployed` | `release:verify-deployed:foundry` | +| `release:make` | `release:make:foundry` | +| `release:verify-release` | *(use verify-deployed with --proposal)* | + +## Versioning + +Each Celo core smart contract is versioned independently using **semantic versioning** with a custom extension: + +| Version Type | Description | +|-------------|-------------| +| **STORAGE** | Incompatible storage layout changes | +| **MAJOR** | Incompatible ABI changes | +| **MINOR** | Added functionality (backwards compatible) | +| **PATCH** | Backwards-compatible bug fixes | + +All deployed core contracts implement `getVersionNumber()` which returns `(storage, major, minor, patch)` encoded in Solidity source. Contracts deployed before versioning was added default to version `1.1.0.0`. + +**Important**: If mixins or libraries change, all contracts using them are considered changed and must be redeployed in the next release. + +## GitHub Branching & Tagging + +### Branch Naming Convention + +Release development happens on branches named: +``` +release/core-contracts/${N} +``` +Where `N` is the release version number (e.g., `release/core-contracts/14`). + +## Release Artifacts + +The release process generates and uses several important artifacts. Understanding these is crucial for a successful release. + +### 1. `libraries.json` - Library Address Mapping + +**What it is:** A JSON file containing the deployed addresses of all linked libraries on a network. + +**How it's generated:** Automatically created by `release:verify-deployed:foundry` after successfully verifying on-chain bytecode. 
+ +**When to generate:** +```bash +# Generate libraries.json for Celo Sepolia +yarn release:verify-deployed:foundry -b core-contracts.v${PREVIOUS} -n celo-sepolia -f + +# Generate libraries.json for Mainnet +yarn release:verify-deployed:foundry -b core-contracts.v${PREVIOUS} -n celo -f +``` + +**Output location:** `./libraries.json` (in the `packages/protocol` directory) + +**Example content:** +```json +{ + "Proposals": "0x38afc0dc55415ae27b81c24b5a5fbfe433ebfba8", + "IntegerSortedLinkedList": "0x411b40a81a07fcd3542ce5b3d7e215178c4ca2ef", + "AddressLinkedList": "0xd26d896d258e258eba71ff0873a878ec36538f8d", + "Signatures": "0x69baecd458e7c08b13a18e11887dbb078fb3cbb4", + "AddressSortedLinkedList": "0x4819ad0a0eb1304b1d7bc3afd7818017b52a87ab" +} +``` + +**Why it's needed:** The `release:make:foundry` script uses these addresses to link libraries when deploying new contract implementations. + +### 2. Compatibility Report (`releaseData/versionReports/`) + +**What it is:** A comprehensive JSON report comparing two contract versions, detailing all changes. + +**How it's generated:** Created by `release:check-versions:foundry`. + +**Report structure:** +```json +{ + "oldArtifactsFolder": ["out-core-contracts.v13"], + "newArtifactsFolder": ["out-release_core-contracts_14"], + "report": { + "contracts": { + "ContractName": { + "changes": { + "storage": [], // Storage layout changes + "major": [ // Breaking ABI changes + {"type": "MethodRemoved", "signature": "..."} + ], + "minor": [], // New functionality (backwards compatible) + "patch": [ // Bytecode-only changes + {"type": "DeployedBytecode"} + ] + }, + "versionDelta": { + "storage": "=", // No storage change + "major": "+1", // Major version should increment + "minor": "0", // Minor reset to 0 + "patch": "0" // Patch reset to 0 + } + } + }, + "libraries": { ... } + } +} +``` + +**Storage location:** `./releaseData/versionReports/release${N}-report.json` + +### 3. 
Initialization Data (`releaseData/initializationData/`) + +**What it is:** JSON file containing constructor/initialization arguments for newly deployed contracts. + +**Location:** `./releaseData/initializationData/release${N}.json` + +**Format:** +```json +{ + "ContractName": [arg1, arg2, arg3], + "AnotherContract": ["0xAddress", 12345, true] +} +``` + +**When needed:** Only required when deploying contracts with storage-incompatible changes (new proxy) or entirely new contracts. + +**Example with real data:** +```json +{ + "FeeHandler": [ + "0x000000000000000000000000000000000000ce10", + "0xGoldTokenAddress", + "0xExchangeAddress" + ] +} +``` + +**Note:** If no new contracts need initialization, this file can be empty `{}`. + +### 4. Governance Proposal JSON + +**What it is:** The output from `release:make:foundry` containing all transactions needed for the governance proposal. + +**Generated by:** `release:make:foundry -p ./proposal.json ...` + +**Structure:** +```json +[ + { + "contract": "ContractProxy", + "function": "_setImplementation", + "args": ["0xNewImplementationAddress"], + "value": "0" + }, + { + "contract": "Registry", + "function": "setAddressFor", + "args": ["ContractName", "0xNewProxyAddress"], + "value": "0", + "description": "Registry: ContractName -> 0x..." + } +] +``` + +### Tagging Strategy + +| Stage | Tag Format | Description | +|-------|-----------|-------------| +| Pre-audit | `core-contracts.v${N}.pre-audit` | First commit on release branch before audit | +| Final | `core-contracts.v${N}` | After successful deployment and governance | + +### View Release Tags + +```bash +yarn tags:view +``` + +### Compare Releases + +To see PRs that changed smart contracts between releases: + +```bash +yarn tags:compare release/core-contracts/13 release/core-contracts/14 +``` + +## Release Commands + +The following npm scripts are available in `packages/protocol` for the release process: + +### 1. 
Check Versions (`release:check-versions:foundry`) + +Compares contract versions between two branches/tags, checking storage layout, ABI compatibility, and bytecode changes. **This must be run before `make-release`.** + +```bash +yarn release:check-versions:foundry \ + -a \ + -b \ + -r \ + -l +``` + +**Parameters:** +- `-a`: Old branch/tag containing the currently deployed contracts +- `-b`: New branch/tag containing the release candidate +- `-r`: (Optional) Path to write the compatibility report +- `-l`: (Optional) Path to append logs (default: `/tmp/celo-check-versions.log`) + +**What it checks:** +- Storage layout compatibility (detects slot collisions) +- ABI changes (method additions, removals, signature changes) +- Bytecode differences +- Version number correctness (ensures bumps match change types) + +**Example:** +```bash +yarn release:check-versions:foundry \ + -a core-contracts.v13 \ + -b release/core-contracts/14 \ + -r ./releaseData/versionReports/release14-report.json +``` + +**Expected output on success:** +``` +Success! Actual version numbers match expected +Writing compatibility report to ./releaseData/versionReports/release14-report.json ...Done +``` + +**If version mismatch is detected:** +``` +Version mismatch detected: +{ + "ContractName": { + "actual": { "storage": 1, "major": 2, "minor": 0, "patch": 0 }, + "expected": { "storage": 1, "major": 3, "minor": 0, "patch": 0 } + } +} +``` + +### 2. Verify Deployed Contracts (`release:verify-deployed:foundry`) + +Verifies that on-chain bytecode matches the source code for a given branch/tag. **Also generates `libraries.json`.** + +```bash +yarn release:verify-deployed:foundry \ + -b \ + -n \ + [-f] \ + [-l ] +``` + +**Parameters:** +- `-b`: Branch/tag containing the smart contracts to verify +- `-n`: Network to verify against (`celo`, `celo-sepolia`) +- `-f`: (Optional) Use Forno service to connect to the network +- `-l`: (Optional) Path to append logs + +**What it does:** +1. 
Builds contracts from the specified branch using Foundry +2. Fetches all contract addresses from the on-chain Registry +3. Compares on-chain bytecode against locally compiled bytecode +4. **Writes linked library addresses to `libraries.json`** + +**Example:** +```bash +# Verify on Celo Sepolia testnet (generates libraries.json for Celo Sepolia) +yarn release:verify-deployed:foundry -b core-contracts.v13 -n celo-sepolia -f + +# Verify on Mainnet (generates libraries.json for Mainnet) +yarn release:verify-deployed:foundry -b core-contracts.v13 -n celo -f +``` + +**Expected output:** +``` +Writing logs to /tmp/celo-verify-deployed.log + - Checkout contracts source code at core-contracts.v13 + - Build contract artifacts at out-core-contracts.v13-truffle-compat +... +Success, no bytecode mismatches found! +Writing linked library addresses to libraries.json +``` + +### 3. Make Release (`release:make:foundry`) + +Builds, deploys new contract implementations, and generates a governance proposal JSON. + +```bash +yarn release:make:foundry \ + -b \ + -k \ + -i \ + -l \ + -n \ + -p \ + -r +``` + +**Parameters:** +- `-b`: Branch/tag to build and deploy from +- `-k`: Private key for signing deployment transactions +- `-i`: Path to initialization data JSON (e.g., `./releaseData/initializationData/release14.json`) +- `-l`: Path to library address mappings (`libraries.json`) - **must be generated first!** +- `-n`: Network to deploy to (`celo`, `celo-sepolia`) +- `-p`: Path to write the governance proposal JSON output +- `-r`: Path to the compatibility report from `check-versions` - **must be generated first!** + +**Prerequisites:** +1. Run `release:check-versions:foundry` to generate the compatibility report +2. Run `release:verify-deployed:foundry` to generate `libraries.json` +3. 
Create/update initialization data in `releaseData/initializationData/` + +**Example:** +```bash +# Deploy to Celo Sepolia +yarn release:make:foundry \ + -b release/core-contracts/14 \ + -k $DEPLOYER_PRIVATE_KEY \ + -i ./releaseData/initializationData/release14.json \ + -l ./libraries.json \ + -n celo-sepolia \ + -p ./proposal-celo-sepolia.json \ + -r ./releaseData/versionReports/release14-report.json +``` + +**What it does:** +1. Reads the compatibility report to determine which contracts need deployment +2. Deploys new library implementations (if libraries changed) +3. Deploys new contract implementations +4. Deploys new proxies (only for storage-incompatible changes) +5. Generates governance proposal transactions +6. Writes proposal JSON to the specified output path + +### 4. Check Opcodes (`release:check-opcodes`) + +Scans core contracts for unsafe opcodes (`selfdestruct`, `delegatecall`). + +```bash +yarn release:check-opcodes +``` + +**Run this before any release to ensure no unsafe patterns were introduced.** + +**Expected output:** +``` +Core contracts are safe against selfdestruct+delegatecall vulnerabilities +``` + +### 5. Determine Release Version (`release:determine-release-version`) + +Outputs the release version number from a branch name. + +```bash +yarn release:determine-release-version +``` + +### 6. Build & Test Commands + +```bash +# Build contracts with Foundry +yarn build:foundry + +# Run all Foundry tests +yarn test + +# Run specific test file +forge test --match-path test/SomeContract.t.sol -vvv + +# Clean build artifacts +yarn clean:foundry +``` + +### 7. Tag Management + +```bash +# View all release tags +yarn tags:view + +# Compare changes between releases +yarn tags:compare release/core-contracts/13 release/core-contracts/14 +``` + +## Step-by-Step Release Process + +### Phase 1: Preparation + +1. **Create Release Branch** + ```bash + git checkout master + git pull origin master + git checkout -b release/core-contracts/${N} + ``` + +2. 
**Check for Unsafe Opcodes**
+   ```bash
+   yarn release:check-opcodes
+   ```
+   Ensure no `selfdestruct` or `delegatecall` in core contracts.
+
+3. **Tag Pre-Audit Commit**
+   ```bash
+   git tag core-contracts.v${N}.pre-audit
+   git push origin core-contracts.v${N}.pre-audit
+   ```
+
+4. **Create GitHub Pre-Release**
+   - Navigate to GitHub Releases
+   - Create a new pre-release pointing to `core-contracts.v${N}.pre-audit`
+   - Include release notes and audit submission details
+
+5. **Submit for Audit**
+   - Create GitHub issue tracking audit progress
+   - Submit code to auditors
+   - Draft initial release notes
+
+### Phase 2: Version Check & Report Generation
+
+1. **Generate Compatibility Report**
+   ```bash
+   yarn release:check-versions:foundry \
+     -a core-contracts.v$((N-1)) \
+     -b release/core-contracts/${N} \
+     -r ./releaseData/versionReports/release${N}-report.json
+   ```
+
+   **Output:** `./releaseData/versionReports/release${N}-report.json`
+
+2. **Review the Report**
+
+   Check the generated report for:
+   - **Storage changes**: a non-empty `storage` array requires a STORAGE version bump
+   - **Major changes**: method removals or signature changes require a MAJOR bump
+   - **Minor changes**: new methods require a MINOR bump
+   - **Patch changes**: bytecode-only changes require a PATCH bump
+
+   The script will fail if version numbers don't match the expected changes.
+
+3. **Prepare Initialization Data**
+
+   Create/update `./releaseData/initializationData/release${N}.json`:
+
+   ```bash
+   # Check previous release for format reference
+   cat ./releaseData/initializationData/release$((N-1)).json
+   ```
+
+   Add initialization arguments for any contracts with storage-breaking changes:
+   ```json
+   {
+     "NewContract": ["0xRegistryAddress", 1000, true]
+   }
+   ```
+
+   If no new contracts need initialization, create an empty file:
+   ```json
+   {}
+   ```
+
+### Phase 3: Testnet Deployment (Celo Sepolia)
+
+1.
**Generate Library Addresses for Celo Sepolia** + ```bash + yarn release:verify-deployed:foundry \ + -b core-contracts.v$((N-1)) \ + -n celo-sepolia \ + -f + ``` + + **Output:** `./libraries.json` (contains Celo Sepolia library addresses) + + > **Important:** This step verifies the previous release matches on-chain bytecode AND generates the `libraries.json` needed for deployment. + +2. **Deploy to Celo Sepolia** + ```bash + yarn release:make:foundry \ + -b release/core-contracts/${N} \ + -k $CELO_SEPOLIA_DEPLOYER_KEY \ + -i ./releaseData/initializationData/release${N}.json \ + -l ./libraries.json \ + -n celo-sepolia \ + -p ./proposal-celo-sepolia.json \ + -r ./releaseData/versionReports/release${N}-report.json + ``` + + **Outputs:** + - New contract implementations deployed on Celo Sepolia + - `./proposal-celo-sepolia.json` containing governance transactions + +3. **Submit Governance Proposal on Celo Sepolia** + + Use `celocli` to submit the proposal: + ```bash + celocli governance:propose \ + --jsonTransactions ./proposal-celo-sepolia.json \ + --deposit 100e18 \ + --from $PROPOSER_ADDRESS \ + --node https://forno.celo-sepolia.celo-testnet.org + ``` + + Note the proposal ID from the output. + +4. **Announce on Community Channels** + - Post on Celo Forum (Governance category) + - Announce on Discord `#governance` channel + - Include: proposal ID, GitHub release link, audit report link + +5. **Manual Testing on Celo Sepolia** + + After governance executes, manually verify: + - [ ] CELO transfers work correctly + - [ ] Account registration succeeds + - [ ] Oracle price reporting functions + - [ ] Escrow operations complete + - [ ] Validator registration/deregistration works + - [ ] Election voting functions + - [ ] Governance proposal creation works + - [ ] Locked gold operations work + +### Phase 4: Mainnet Deployment + +1. 
**Generate Library Addresses for Mainnet** + ```bash + yarn release:verify-deployed:foundry \ + -b core-contracts.v$((N-1)) \ + -n celo \ + -f + ``` + + **Output:** `./libraries.json` (now contains Mainnet library addresses) + + > **Warning:** This overwrites the previous `libraries.json`. The Mainnet libraries will have different addresses than Celo Sepolia. + +2. **Deploy to Mainnet** + ```bash + yarn release:make:foundry \ + -b release/core-contracts/${N} \ + -k $MAINNET_DEPLOYER_KEY \ + -i ./releaseData/initializationData/release${N}.json \ + -l ./libraries.json \ + -n celo \ + -p ./proposal-mainnet.json \ + -r ./releaseData/versionReports/release${N}-report.json + ``` + + **Outputs:** + - New contract implementations deployed on Mainnet + - `./proposal-mainnet.json` containing governance transactions + +3. **Submit Mainnet Governance Proposal** + + ```bash + celocli governance:propose \ + --jsonTransactions ./proposal-mainnet.json \ + --deposit 10000e18 \ + --from $PROPOSER_ADDRESS \ + --node https://forno.celo.org + ``` + + > **Best Practice:** Submit on Tuesday to allow full governance cycle before weekend. + +4. **Update Community** + - Update Forum thread with Mainnet proposal ID + - Announce on Discord + +### Phase 5: Finalization + +After governance proposal executes successfully: + +1. **Verify Mainnet Deployment** + ```bash + yarn release:verify-deployed:foundry \ + -b release/core-contracts/${N} \ + -n celo \ + -f + ``` + +2. **Merge Release Branch** (use merge commit, not squash) + ```bash + git checkout master + git merge --no-ff release/core-contracts/${N} + git push origin master + ``` + +3. **Create Final Tag** + ```bash + git tag core-contracts.v${N} + git push origin core-contracts.v${N} + ``` + +4. **Update CircleCI Config** + - Update `RELEASE_TAG` in `.circleci/config.yml` + +5. 
**Create GitHub Release** + - Point to `core-contracts.v${N}` tag + - Include: + - Final release notes + - Link to audit report + - Links to governance proposals (Celo Sepolia and Mainnet) + - Summary of changes + +6. **Archive Release Artifacts** + - Commit the version report to `releaseData/versionReports/` + - Commit initialization data to `releaseData/initializationData/` + +## Release Timeline + +| Day | Action | +|-----|--------| +| **T (Tuesday)** | Create GitHub issue, cut release branch, submit to auditor, draft release notes | +| **T + 1 week** | Audit report arrives, finalize release notes, commit fixes, tag first release candidate, announce on Forum and Discord | +| **T + 2 weeks** | Deploy to Celo Sepolia testnet, submit governance proposal, update Forum/Discord | +| **T + 3 weeks** | Monitor testnet governance, manual testing, address any issues | +| **T + 4 weeks** | Deploy to Mainnet, submit governance proposal, notify community | + +### Governance Timeline + +Once a proposal is submitted, the typical timeline is: +- **24 hours**: Dequeue period +- **24 hours**: Approval period +- **5 days**: Referendum voting period +- **Up to 3 days**: Execution window + +Total: ~8-10 days from proposal submission to execution + +## Testing & Verification + +### Unit Tests + +```bash +# Run all Foundry tests +yarn test + +# Run specific test file +forge test --match-path test/SomeContract.t.sol + +# Run with verbosity for debugging +forge test -vvv +``` + +### Integration Tests + +```bash +# Start local anvil devchain (L2) +yarn anvil-devchain:start-L2 + +# Run integration tests +yarn anvil-devchain:integration-tests + +# Run E2E tests +yarn anvil-devchain:e2e-tests +``` + +### Verification Checks + +The `release:check-versions:foundry` script performs: +- Storage layout compatibility checks +- ABI compatibility verification +- Bytecode change detection +- Version number validation + +### Manual Testing Checklist + +After deploying to Celo Sepolia: +- [ ] CELO 
transfers work correctly +- [ ] Account registration succeeds +- [ ] Oracle price reporting functions +- [ ] Escrow operations complete +- [ ] Validator registration/deregistration works +- [ ] Election voting functions +- [ ] Governance proposal creation works + +## Governance Proposal + +### Proposal JSON Structure + +The `release:make:foundry` script generates a JSON file containing governance proposal transactions: + +```json +[ + { + "contract": "ContractProxy", + "function": "_setImplementation", + "args": ["0x...newImplementationAddress"], + "value": "0" + }, + { + "contract": "Registry", + "function": "setAddressFor", + "args": ["ContractName", "0xNewProxyAddress"], + "value": "0", + "description": "Registry: ContractName -> 0x..." + } +] +``` + +### Transaction Types + +| Type | When Used | Function | +|------|-----------|----------| +| `_setImplementation` | Implementation-only changes | Points proxy to new implementation | +| `_setAndInitializeImplementation` | New proxy with initialization | Sets implementation and calls initialize | +| `setAddressFor` | New proxy deployment | Updates Registry to point to new proxy | + +### Submitting with celocli + +**Celo Sepolia:** +```bash +celocli governance:propose \ + --jsonTransactions ./proposal-celo-sepolia.json \ + --deposit 100e18 \ + --descriptionURL "https://github.com/celo-org/celo-monorepo/releases/tag/core-contracts.v${N}" \ + --from $PROPOSER_ADDRESS \ + --useLedger \ + --node https://forno.celo-sepolia.celo-testnet.org +``` + +**Mainnet:** +```bash +celocli governance:propose \ + --jsonTransactions ./proposal-mainnet.json \ + --deposit 10000e18 \ + --descriptionURL "https://github.com/celo-org/celo-monorepo/releases/tag/core-contracts.v${N}" \ + --from $PROPOSER_ADDRESS \ + --useLedger \ + --node https://forno.celo.org +``` + +### Monitoring Proposal Status + +```bash +# View proposal details +celocli governance:show --proposalID --node + +# View all proposals +celocli governance:list --node +``` + 
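The transaction shape above can be sanity-checked before submitting with `celocli`. A minimal Python sketch, assuming only the minimal transaction fields documented above (`check_proposal` is a hypothetical helper, not part of the release tooling, which may enforce stricter rules):

```python
import json

# Minimal per-transaction shape shown in the proposal JSON examples above.
REQUIRED_KEYS = {"contract", "function", "args", "value"}

def check_proposal(transactions):
    """Return a list of problems found in a list of proposal transactions."""
    problems = []
    for i, tx in enumerate(transactions):
        missing = REQUIRED_KEYS - tx.keys()
        if missing:
            problems.append(f"tx {i}: missing {sorted(missing)}")
        elif not str(tx["value"]).isdigit():
            problems.append(f"tx {i}: value should be a non-negative integer string")
    return problems

# Example proposal mirroring the documented structure (placeholder address).
proposal = json.loads("""
[
  {"contract": "GovernanceProxy",
   "function": "_setImplementation",
   "args": ["0x0000000000000000000000000000000000000001"],
   "value": "0"}
]
""")
print(check_proposal(proposal))  # -> [] (no problems)
```

Running a check like this on `./proposal-mainnet.json` before paying the 10,000 CELO deposit is cheap insurance against a malformed file.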
+### Governance Timeline + +| Phase | Duration | Description | +|-------|----------|-------------| +| Dequeue | 24 hours | Proposal enters queue | +| Approval | 24 hours | Approvers vote to advance | +| Referendum | 5 days | Token holders vote | +| Execution | Up to 3 days | Proposal can be executed | + +**Total time from submission to execution: ~8-10 days** + +## Communication & Community + +### Required Communications + +#### 1. Forum Post (at least 1 week before Celo Sepolia proposal) + +Create a post in the [Celo Forum Governance category](https://forum.celo.org/c/governance/) with: + +**Required Information:** +- Proposer name and background +- Summary of changes (what contracts are affected and why) +- Link to GitHub release branch/tag +- Link to audit report +- Link to compatibility report +- Expected timeline for Celo Sepolia and Mainnet + +**Template:** +```markdown +# Core Contracts Release ${N} + +## Summary +Brief description of what this release includes. + +## Changes +- Contract A: Description of changes (MAJOR version bump) +- Contract B: Description of changes (PATCH version bump) +- ... + +## Links +- GitHub Release: [link] +- Audit Report: [link] +- Compatibility Report: [link] + +## Timeline +- Celo Sepolia Proposal: [date] +- Mainnet Proposal: [date] (pending successful testnet deployment) + +## Testing +Description of testing performed. +``` + +#### 2. Discord Announcements (`#governance` channel) + +**When to post:** +- Release candidate announcement (when branch is cut) +- Celo Sepolia proposal submitted (include proposal ID) +- Mainnet proposal submitted (include proposal ID) +- Governance execution complete + +#### 3. Continuous Updates + +- Respond to community questions on Forum thread +- Post updates if timeline changes +- Announce any issues discovered during testing + +## Emergency Patches + +For urgent fixes between regular releases: + +1. 
**Cherry-pick from Last Release** + ```bash + git checkout -b hotfix/core-contracts/${N}.1 core-contracts.v${N} + git cherry-pick + ``` + +2. **Expedited Process** + - Versioning still required (at minimum PATCH bump) + - Expedited audit for critical fixes + - Faster governance timeline may be requested + +3. **Documentation** + - Document urgency and rationale + - Link to fix commit and any related issues + +## Troubleshooting + +### Build Issues + +```bash +# Clean all Foundry artifacts and rebuild +yarn clean:foundry +yarn build:foundry + +# If submodules are missing +yarn submodules:pull +``` + +### Version Check Failures + +**"Version mismatch detected"** + +The `release:check-versions:foundry` script compares actual version numbers in contracts against expected versions based on detected changes. + +1. Open the generated report and check the `versionDelta` for each contract +2. In the contract's Solidity file, update `getVersionNumber()` to match expected values +3. Re-run the check + +**Example fix:** +```solidity +// Before (incorrect) +function getVersionNumber() external pure returns (uint256, uint256, uint256, uint256) { + return (1, 1, 0, 0); +} + +// After (correct - major change detected) +function getVersionNumber() external pure returns (uint256, uint256, uint256, uint256) { + return (1, 2, 0, 0); // Incremented major version +} +``` + +### Missing libraries.json + +**Error:** `libraries.json not found` or `Library address not found` + +**Solution:** Run verify-deployed first to generate libraries.json: +```bash +yarn release:verify-deployed:foundry -b core-contracts.v${PREVIOUS} -n -f +``` + +### Bytecode Verification Failures + +**"Bytecode mismatch for ContractName"** + +Possible causes: +1. **Wrong branch**: Ensure `-b` flag points to the correct release tag +2. **Compiler version mismatch**: Check `foundry.toml` settings match deployment +3. **Library address mismatch**: Regenerate `libraries.json` +4. 
**Optimizer settings**: Ensure Foundry profile matches original compilation + +**Debug steps:** +```bash +# Check which profile was used +cat foundry.toml | grep -A 10 "\[profile" + +# Rebuild with specific profile +FOUNDRY_PROFILE=truffle-compat forge build +``` + +### Deployment Failures + +**"Insufficient funds"** +- Ensure deployer account has enough CELO for gas +- Mainnet deployments can cost significant gas + +**"Nonce too low"** +- Wait for pending transactions to confirm +- Or use a fresh deployer account + +**"Contract deployment failed"** +- Check gas limit (default is 20M) +- Verify constructor arguments in initialization data +- Check network connectivity + +### Governance Proposal Issues + +**"Deposit too low"** +- Celo Sepolia: minimum 100 CELO +- Mainnet: minimum 10,000 CELO + +**"Invalid transaction format"** +- Verify proposal JSON structure matches expected format +- Check contract names match Registry entries + +### Network Connectivity + +**"Could not connect to network"** + +Use Forno endpoints with the `-f` flag: +```bash +yarn release:verify-deployed:foundry -b -n celo-sepolia -f +``` + +Or set custom RPC: +```bash +export RPC_URL=https://your-custom-rpc.com +``` + +### Common Gotchas + +1. **libraries.json is network-specific**: Always regenerate when switching between Celo Sepolia and Mainnet + +2. **Report must be generated before make-release**: The compatibility report tells make-release which contracts need deployment + +3. **Empty initialization data is valid**: If no new contracts need initialization, use `{}` + +4. **Merge with --no-ff**: Always use merge commits (not squash) to preserve release history + +5. 
**Order matters**: + ``` + check-versions → verify-deployed → make-release → governance + ``` + +## Additional Resources + +- [Foundry Book](https://book.getfoundry.sh/) +- [Celo Governance Documentation](https://docs.celo.org/protocol/governance) +- [Release Data Directory](./releaseData/README.md) +- [Test Documentation](./test-sol/README.md) + +## Networks + +| Network | Chain ID | RPC URL | Use Case | +|---------|----------|---------|----------| +| Celo Mainnet | 42220 | https://forno.celo.org | Production releases | +| Celo Sepolia | 11142220 | https://forno.celo-sepolia.celo-testnet.org | Testnet releases | +| Local Fork | varies | http://127.0.0.1:8545 | Testing releases | + +## Deployer Keys + +Deployer keys are stored in encrypted mnemonic files in the repo root: + +| Network | Mnemonic File | Encrypted File | +|---------|--------------|----------------| +| Celo Sepolia | `.env.mnemonic.celosepolia` | N/A (manual) | +| Mainnet | `.env.mnemonic.mainnet` | `.env.mnemonic.mainnet.enc` | + +### Decrypting Keys (cLabs employees) + +```bash +# Decrypt all mnemonic files using GCP KMS +yarn keys:decrypt +``` + +### Using Keys + +Each mnemonic file exports `DEPLOYER_PRIVATE_KEY`. Source it before running release commands: + +```bash +# For Celo Sepolia +source .env.mnemonic.celosepolia + +# For Mainnet +source .env.mnemonic.mainnet +``` + +Then use `$DEPLOYER_PRIVATE_KEY` in release commands. + +## Starting a Local Fork + +Before testing on a local fork, start Anvil with the required parameters: + +```bash +# Fork Celo Sepolia +anvil --fork-url https://forno.celo-sepolia.celo-testnet.org \ + --code-size-limit 500000 \ + --gas-limit 100000000 + +# Fork Mainnet +anvil --fork-url https://forno.celo.org \ + --code-size-limit 500000 \ + --gas-limit 100000000 +``` + +**Important**: The `--code-size-limit` and `--gas-limit` flags are required for Celo contract deployments due to large contract sizes. 
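When switching between a local fork, Celo Sepolia, and Mainnet, it is easy to point tooling at the wrong endpoint. A small Python sketch of the Networks table above — an illustrative, hypothetical helper (the release scripts resolve networks internally via the `-n` flag):

```python
# Network parameters from the Networks table above.
NETWORKS = {
    "celo": {"chain_id": 42220, "rpc": "https://forno.celo.org"},
    "celo-sepolia": {
        "chain_id": 11142220,
        "rpc": "https://forno.celo-sepolia.celo-testnet.org",
    },
}

def expected_chain_id(network: str) -> int:
    """Chain ID a node (Forno endpoint or local fork of it) should report,
    e.g. via eth_chainId, before any transactions are signed against it."""
    return NETWORKS[network]["chain_id"]

print(expected_chain_id("celo-sepolia"))  # -> 11142220
```

Comparing this value against what the RPC actually reports catches the common mistake of deploying with a Mainnet key against a testnet fork, or vice versa.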
+
+## Contract Verification (Block Explorers)
+
+The release script automatically verifies deployed contracts on:
+- **Blockscout** (https://celo-sepolia.blockscout.com or https://celo.blockscout.com) - No API key required
+- **Celoscan** via Etherscan V2 API (https://celoscan.io) - **API key required** for production networks
+
+### Verification Features
+
+The script handles verification automatically with:
+- **Linked libraries**: Contracts using libraries (e.g., Governance with the Proposals library) are verified with the `--libraries` flag
+- **Foundry profiles**: Sets the `FOUNDRY_PROFILE` environment variable (`truffle-compat` for 0.5.x, `truffle-compat8` for 0.8.x) to ensure bytecode matches
+- **Full compiler version**: Uses the full version with commit hash (e.g., `0.5.14+commit.01f1aaa4`)
+- **Automatic retries**: Up to 6 retries with exponentially increasing delays, capped at 60s (5s, 10s, 20s, 40s, 60s, 60s), to wait out block explorer indexing
+
+### Celoscan API Key (Required for celo-sepolia and mainnet)
+
+The API key is **required by default** for production networks. Get your key from https://etherscan.io/myapikey
+
+**Setup options (in order of precedence):**
+
+1. **CLI flag**: `-a YOUR_API_KEY`
+2. **Environment variable**: `export CELOSCAN_API_KEY=YOUR_API_KEY`
+3. **Config file**: `packages/protocol/.env.json`
+   ```json
+   {
+     "celoScanApiKey": "YOUR_API_KEY"
+   }
+   ```
+
+**Note**: The Etherscan V2 API uses a unified endpoint (`api.etherscan.io`) that works with a single API key for all supported chains, including Celo.
+
+### Skip Verification
+
+To skip verification (e.g., for testing or if you don't have an API key):
+```bash
+yarn release:make:foundry ... -s
+```
+
+Verification is automatically skipped when using a custom RPC URL (local forks).
+
+### Verification Troubleshooting
+
+- **"Address is not a smart-contract"**: Block explorer hasn't indexed the contract yet.
The script waits 30s initially, then automatically retries up to 6 times with exponential backoff (5s, 10s, 20s, 40s, 60s max). +- **"Bytecode mismatch"**: Usually caused by wrong foundry profile. The script now automatically sets `FOUNDRY_PROFILE` based on contract source path. +- **Linked library errors**: The script automatically detects and passes library addresses via `--libraries` flag for contracts that use linked libraries. + +## Environment Variables + +For convenience, set these environment variables: + +```bash +# Deployer keys (keep secure!) +export CELO_SEPOLIA_DEPLOYER_KEY="0x..." +export MAINNET_DEPLOYER_KEY="0x..." + +# Celoscan API key (for contract verification) +export CELOSCAN_API_KEY="YOUR_API_KEY" + +# Optional: Custom RPC URLs +export RPC_URL="https://forno.celo.org" +``` + +--- + +*This document covers the Foundry-based release process. For legacy Truffle-based tooling, see the original scripts in `scripts/bash/`.* diff --git a/packages/protocol/artifacts/Proxy/Readme.md b/packages/protocol/artifacts/Proxy/Readme.md index 0a5f3d8c50f..14f93e72eb7 100644 --- a/packages/protocol/artifacts/Proxy/Readme.md +++ b/packages/protocol/artifacts/Proxy/Readme.md @@ -7,6 +7,6 @@ Leaving a note for future reference: 1. The `proxyInitCode...` file seems to be require the bytecode for `Proxy.sol`. I'm not sure if this is the correct way to do it, but I simply copy/pasted the JSON value at `packages/protocol/out/Proxy.sol/Proxy.json` > `bytecode.object.` which is from the Foundry build artifacts. 1. The `proxyBytecode...` file seems to be require the deployed bytecode for `Proxy.sol`. I'm not sure if this is the correct way to do it, but I simply copy/pasted the JSON value at `packages/protocol/out/Proxy.sol/Proxy.json` > `deployedBytecode.object.` which is from the Foundry build artifacts. 
-Unless the bytecodes in these manual artifacts matches the actual Foundry artifacts, the `test_verifyArtifacts()` test in [`ProxyFactory08.t.sol`](packages/protocol/test-sol/unit/common/ProxyFactory08.t.sol) will fail. +Unless the bytecodes in these manual artifacts match the actual Foundry artifacts, the `test_verifyArtifacts()` test in [`ProxyFactory08.t.sol`](../../test-sol/unit/common/ProxyFactory08.t.sol) will fail. I didn't have time to investigate this further or question why manual artifacts are needed in the first place. But, I'm leaving a note here for future reference. diff --git a/packages/protocol/contractPackages.ts b/packages/protocol/contractPackages.ts index b75d67fa521..3d4a335759f 100644 --- a/packages/protocol/contractPackages.ts +++ b/packages/protocol/contractPackages.ts @@ -10,6 +10,15 @@ export interface ContractPackage { proxiesPath?: string } +export const SOLIDITY_05_PACKAGE = { + path: 'contracts', + contractsFolder: '', + folderPath: '', + name: '0.5', + contracts: [] as string[], // catch-all + truffleConfig: 'truffle-config.js', +} satisfies ContractPackage + export const MENTO_PACKAGE = { path: 'mento-core', contractsFolder: 'contracts', @@ -25,6 +34,7 @@ export const MENTO_PACKAGE = { 'StableToken', 'StableTokenEUR', 'StableTokenBRL', + 'AddressLinkedList', // FIXME: https://github.com/celo-org/celo-monorepo/issues/11684 ], proxyContracts: [ 'ExchangeBRLProxy', @@ -56,6 +66,7 @@ export const SOLIDITY_08_PACKAGE = { 'EpochManager', 'EpochManagerEnabler', 'ScoreManager', + 'AddressLinkedList', // FIXME: https://github.com/celo-org/celo-monorepo/issues/11684 ], proxyContracts: [ 'GasPriceMinimumProxy', diff --git a/packages/protocol/contracts-0.8/common/EpochManager.sol b/packages/protocol/contracts-0.8/common/EpochManager.sol index 11802e3823c..453eb27f3fe 100644 --- a/packages/protocol/contracts-0.8/common/EpochManager.sol +++ b/packages/protocol/contracts-0.8/common/EpochManager.sol @@ -252,7 +252,7 @@ contract EpochManager is 
"Elected accounts and signers of different lengths." ); for (uint i = 0; i < electedAccounts.length; i++) { - address group = validators.getValidatorsGroup(electedAccounts[i]); + address group = validators.getMembershipInLastEpoch(electedAccounts[i]); if (processedGroups[group] == 0) { toProcessGroups++; uint256 groupScore = scoreReader.getGroupScore(group); @@ -339,7 +339,7 @@ contract EpochManager is "Elected accounts and signers of different lengths." ); for (uint i = 0; i < electedAccounts.length; i++) { - address group = validators.getValidatorsGroup(electedAccounts[i]); + address group = validators.getMembershipInLastEpoch(electedAccounts[i]); if (processedGroups[group] == 0) { _toProcessGroups++; uint256 groupScore = scoreReader.getGroupScore(group); @@ -590,7 +590,7 @@ contract EpochManager is * @return Patch version of the contract. */ function getVersionNumber() external pure returns (uint256, uint256, uint256, uint256) { - return (1, 1, 0, 0); + return (1, 1, 0, 2); } /** @@ -699,7 +699,7 @@ contract EpochManager is address(getStableToken()) ); - uint256 CELOequivalent = (numerator * totalRewards) / denominator; + uint256 CELOequivalent = (denominator * totalRewards) / numerator; getCeloUnreleasedTreasury().release( registry.getAddressForOrDie(RESERVE_REGISTRY_ID), CELOequivalent diff --git a/packages/protocol/contracts-0.8/common/SuperBridgeETHWrapper.sol b/packages/protocol/contracts-0.8/common/SuperBridgeETHWrapper.sol new file mode 100644 index 00000000000..f66655785da --- /dev/null +++ b/packages/protocol/contracts-0.8/common/SuperBridgeETHWrapper.sol @@ -0,0 +1,70 @@ +// SPDX-License-Identifier: LGPL-3.0-only +pragma solidity >=0.8.7 <0.8.20; + +import "../../contracts/common/Initializable.sol"; +import "./interfaces/IStandardBridge.sol"; +import "./interfaces/IWETH.sol"; + +contract SuperBridgeETHWrapper { + IWETH public wethLocal; + address public wethAddressRemote; + IStandardBridge public standardBridge; + + event WrappedAndBridged(address 
indexed sender, uint256 amount); + + /** + * @notice Creates the contract with the WETH and Standard Bridge addresses. + * @param _wethAddressLocal The address of the local WETH contract. + * @param _wethAddressRemote The address of the remote WETH contract. + * @param _standardBridgeAddress The address of the Standard Bridge contract. + */ + constructor( + address _wethAddressLocal, + address _wethAddressRemote, + address _standardBridgeAddress + ) { + _setAddresses(_wethAddressLocal, _wethAddressRemote, _standardBridgeAddress); + } + + /** + * @notice Wraps the ETH and bridges it to the recipient. + * @param to The address of the recipient on the other chain. + * @param minGasLimit The minimum gas limit for the bridge transaction. + */ + function wrapAndBridge(address to, uint32 minGasLimit) public payable { + require(msg.value > 0, "No ETH sent"); + + // Wrap the ETH + wethLocal.deposit{ value: msg.value }(); + + // Approve the Standard Bridge to spend the WETH + wethLocal.approve(address(standardBridge), msg.value); + + // Bridge the WETH to the recipient + standardBridge.bridgeERC20To( + address(wethLocal), + address(wethAddressRemote), + to, + msg.value, + minGasLimit, + "" + ); + emit WrappedAndBridged(msg.sender, msg.value); + } + + function _setAddresses( + address _wethAddressLocal, + address _wethAddressRemote, + address _standardBridgeAddress + ) internal { + require( + _wethAddressLocal != address(0) && + _wethAddressRemote != address(0) && + _standardBridgeAddress != address(0), + "Invalid address" + ); + wethLocal = IWETH(_wethAddressLocal); + wethAddressRemote = _wethAddressRemote; + standardBridge = IStandardBridge(_standardBridgeAddress); + } +} diff --git a/packages/protocol/contracts-0.8/common/UsingRegistryV2NoMento.sol b/packages/protocol/contracts-0.8/common/UsingRegistryV2NoMento.sol index 64144c917b3..6d86987f183 100644 --- a/packages/protocol/contracts-0.8/common/UsingRegistryV2NoMento.sol +++ 
b/packages/protocol/contracts-0.8/common/UsingRegistryV2NoMento.sol @@ -4,33 +4,36 @@ pragma solidity >=0.8.0 <0.8.20; // Note: This is not an exact copy of UsingRegistry or UsingRegistryV2 in the contract's folder // because Mento's interfaces still don't support Solidity 0.8 -import "@openzeppelin/contracts8/access/Ownable.sol"; -import "@openzeppelin/contracts8/token/ERC20/IERC20.sol"; - -import "./interfaces/IScoreReader.sol"; - -import "../../contracts/common/interfaces/IAccounts.sol"; -import "../../contracts/common/interfaces/IEpochManager.sol"; -import "../../contracts/common/interfaces/IFeeCurrencyWhitelist.sol"; -import "../../contracts/common/interfaces/IFreezer.sol"; -import "../../contracts/common/interfaces/IRegistry.sol"; -import "../../contracts/common/interfaces/ICeloUnreleasedTreasury.sol"; - -import "../../contracts/governance/interfaces/IElection.sol"; -import "../../contracts/governance/interfaces/IEpochRewards.sol"; -import "../../contracts/governance/interfaces/IGovernance.sol"; -import "../../contracts/governance/interfaces/ILockedGold.sol"; -import "../../contracts/governance/interfaces/ILockedCelo.sol"; -import "../../contracts/governance/interfaces/IValidators.sol"; - -import "../../contracts/identity/interfaces/IRandom.sol"; -import "../../contracts/identity/interfaces/IAttestations.sol"; -import "../../contracts/identity/interfaces/IFederatedAttestations.sol"; - -// import "../../lib/mento-core/contracts/interfaces/IExchange.sol"; -// import "../../lib/mento-core/contracts/interfaces/IReserve.sol"; -// import "../../lib/mento-core/contracts/interfaces/IStableToken.sol"; -import "../../contracts/stability/interfaces/ISortedOracles.sol"; +// OpenZeppelin imports +import { IERC20 } from "@openzeppelin/contracts8/token/ERC20/IERC20.sol"; + +// Common imports +import { IAccounts } from "../../contracts/common/interfaces/IAccounts.sol"; +import { ICeloUnreleasedTreasury } from "../../contracts/common/interfaces/ICeloUnreleasedTreasury.sol"; 
+import { IEpochManagerEnabler } from "../../contracts/common/interfaces/IEpochManagerEnabler.sol"; +import { IEpochManager } from "../../contracts/common/interfaces/IEpochManager.sol"; +import { IFeeHandler } from "../../contracts/common/interfaces/IFeeHandler.sol"; +import { IFreezer } from "../../contracts/common/interfaces/IFreezer.sol"; +import { IRegistry } from "../../contracts/common/interfaces/IRegistry.sol"; +import { IScoreReader } from "./interfaces/IScoreReader.sol"; + +// Governance imports +import { IElection } from "../../contracts/governance/interfaces/IElection.sol"; +import { IEpochRewards } from "../../contracts/governance/interfaces/IEpochRewards.sol"; +import { IGovernance } from "../../contracts/governance/interfaces/IGovernance.sol"; +import { ILockedGold } from "../../contracts/governance/interfaces/ILockedGold.sol"; +import { ILockedCelo } from "../../contracts/governance/interfaces/ILockedCelo.sol"; +import { IValidators } from "../../contracts/governance/interfaces/IValidators.sol"; + +// Identity imports +import { IAttestations } from "../../contracts/identity/interfaces/IAttestations.sol"; +import { IEscrow } from "../../contracts/identity/interfaces/IEscrow.sol"; +import { IFederatedAttestations } from "../../contracts/identity/interfaces/IFederatedAttestations.sol"; +import { IOdisPayments } from "../../contracts/identity/interfaces/IOdisPayments.sol"; +import { IRandom } from "../../contracts/identity/interfaces/IRandom.sol"; + +// Stability imports +import { ISortedOracles } from "../../contracts/stability/interfaces/ISortedOracles.sol"; contract UsingRegistryV2NoMento { address internal constant registryAddress = 0x000000000000000000000000000000000000ce10; @@ -38,45 +41,41 @@ contract UsingRegistryV2NoMento { bytes32 internal constant ACCOUNTS_REGISTRY_ID = keccak256(abi.encodePacked("Accounts")); bytes32 internal constant ATTESTATIONS_REGISTRY_ID = keccak256(abi.encodePacked("Attestations")); - bytes32 internal constant 
DOWNTIME_SLASHER_REGISTRY_ID = - keccak256(abi.encodePacked("DowntimeSlasher")); + bytes32 internal constant BLOCKCHAIN_PARAMETERS_REGISTRY_ID = + keccak256(abi.encodePacked("BlockchainParameters")); + bytes32 internal constant CELO_UNRELEASED_TREASURY_REGISTRY_ID = + keccak256(abi.encodePacked("CeloUnreleasedTreasury")); + bytes32 internal constant GOLD_TOKEN_REGISTRY_ID = keccak256(abi.encodePacked("GoldToken")); + bytes32 internal constant CELO_TOKEN_REGISTRY_ID = keccak256(abi.encodePacked("CeloToken")); bytes32 internal constant DOUBLE_SIGNING_SLASHER_REGISTRY_ID = keccak256(abi.encodePacked("DoubleSigningSlasher")); + bytes32 internal constant DOWNTIME_SLASHER_REGISTRY_ID = + keccak256(abi.encodePacked("DowntimeSlasher")); bytes32 internal constant ELECTION_REGISTRY_ID = keccak256(abi.encodePacked("Election")); - bytes32 internal constant EXCHANGE_REGISTRY_ID = keccak256(abi.encodePacked("Exchange")); - bytes32 internal constant EXCHANGE_EURO_REGISTRY_ID = keccak256(abi.encodePacked("ExchangeEUR")); - bytes32 internal constant EXCHANGE_REAL_REGISTRY_ID = keccak256(abi.encodePacked("ExchangeBRL")); - - bytes32 internal constant FEE_CURRENCY_WHITELIST_REGISTRY_ID = - keccak256(abi.encodePacked("FeeCurrencyWhitelist")); + bytes32 internal constant EPOCH_REWARDS_REGISTRY_ID = keccak256(abi.encodePacked("EpochRewards")); + bytes32 internal constant EPOCH_MANAGER_ENABLER_REGISTRY_ID = + keccak256(abi.encodePacked("EpochManagerEnabler")); + bytes32 internal constant EPOCH_MANAGER_REGISTRY_ID = keccak256(abi.encodePacked("EpochManager")); + bytes32 internal constant ESCROW_REGISTRY_ID = keccak256(abi.encodePacked("Escrow")); bytes32 internal constant FEDERATED_ATTESTATIONS_REGISTRY_ID = keccak256(abi.encodePacked("FederatedAttestations")); + bytes32 internal constant FEE_CURRENCY_DIRECTORY_REGISTRY_ID = + keccak256(abi.encodePacked("FeeCurrencyDirectory")); + bytes32 internal constant FEE_HANDLER_REGISTRY_ID = keccak256(abi.encodePacked("FeeHandler")); bytes32 
internal constant FREEZER_REGISTRY_ID = keccak256(abi.encodePacked("Freezer")); - bytes32 internal constant GOLD_TOKEN_REGISTRY_ID = keccak256(abi.encodePacked("GoldToken")); + bytes32 internal constant GAS_PRICE_MINIMUM_REGISTRY_ID = + keccak256(abi.encodePacked("GasPriceMinimum")); bytes32 internal constant GOVERNANCE_REGISTRY_ID = keccak256(abi.encodePacked("Governance")); bytes32 internal constant GOVERNANCE_SLASHER_REGISTRY_ID = keccak256(abi.encodePacked("GovernanceSlasher")); bytes32 internal constant LOCKED_GOLD_REGISTRY_ID = keccak256(abi.encodePacked("LockedGold")); - bytes32 internal constant RESERVE_REGISTRY_ID = keccak256(abi.encodePacked("Reserve")); + bytes32 internal constant LOCKED_CELO_REGISTRY_ID = keccak256(abi.encodePacked("LockedCelo")); + bytes32 internal constant ODIS_PAYMENT_REGISTRY_ID = keccak256(abi.encodePacked("OdisPayments")); bytes32 internal constant RANDOM_REGISTRY_ID = keccak256(abi.encodePacked("Random")); + bytes32 internal constant SCORE_MANAGER_REGISTRY_ID = keccak256(abi.encodePacked("ScoreManager")); bytes32 internal constant SORTED_ORACLES_REGISTRY_ID = keccak256(abi.encodePacked("SortedOracles")); - bytes32 internal constant STABLE_TOKEN_REGISTRY_ID = keccak256(abi.encodePacked("StableToken")); - bytes32 internal constant STABLE_EURO_TOKEN_REGISTRY_ID = - keccak256(abi.encodePacked("StableTokenEUR")); - bytes32 internal constant STABLE_REAL_TOKEN_REGISTRY_ID = - keccak256(abi.encodePacked("StableTokenBRL")); bytes32 internal constant VALIDATORS_REGISTRY_ID = keccak256(abi.encodePacked("Validators")); - bytes32 internal constant CELO_UNRELEASED_TREASURY_REGISTRY_ID = - keccak256(abi.encodePacked("CeloUnreleasedTreasury")); - - bytes32 internal constant CELO_TOKEN_REGISTRY_ID = keccak256(abi.encodePacked("CeloToken")); - bytes32 internal constant LOCKED_CELO_REGISTRY_ID = keccak256(abi.encodePacked("LockedCelo")); - bytes32 internal constant EPOCH_REWARDS_REGISTRY_ID = keccak256(abi.encodePacked("EpochRewards")); - bytes32 
internal constant EPOCH_MANAGER_ENABLER_REGISTRY_ID = - keccak256(abi.encodePacked("EpochManagerEnabler")); - bytes32 internal constant EPOCH_MANAGER_REGISTRY_ID = keccak256(abi.encodePacked("EpochManager")); - bytes32 internal constant SCORE_MANAGER_REGISTRY_ID = keccak256(abi.encodePacked("ScoreManager")); modifier onlyRegisteredContract(bytes32 identifierHash) { require( @@ -95,35 +94,40 @@ contract UsingRegistryV2NoMento { return IAccounts(registryContract.getAddressForOrDie(ACCOUNTS_REGISTRY_ID)); } - function getAttestations() internal view returns (IAttestations) { - return IAttestations(registryContract.getAddressForOrDie(ATTESTATIONS_REGISTRY_ID)); + function getCeloUnreleasedTreasury() internal view returns (ICeloUnreleasedTreasury) { + return + ICeloUnreleasedTreasury( + registryContract.getAddressForOrDie(CELO_UNRELEASED_TREASURY_REGISTRY_ID) + ); + } + + function getGoldToken() internal view returns (IERC20) { + return IERC20(registryContract.getAddressForOrDie(GOLD_TOKEN_REGISTRY_ID)); + } + + function getCeloToken() internal view returns (IERC20) { + return IERC20(registryContract.getAddressForOrDie(CELO_TOKEN_REGISTRY_ID)); } function getElection() internal view returns (IElection) { return IElection(registryContract.getAddressForOrDie(ELECTION_REGISTRY_ID)); } - // function getExchange() internal view returns (IExchange) { - // return IExchange(registryContract.getAddressForOrDie(EXCHANGE_REGISTRY_ID)); - // } - - // function getExchangeDollar() internal view returns (IExchange) { - // return getExchange(); - // } + function getEpochRewards() internal view returns (IEpochRewards) { + return IEpochRewards(registryContract.getAddressForOrDie(EPOCH_REWARDS_REGISTRY_ID)); + } - // function getExchangeEuro() internal view returns (IExchange) { - // return IExchange(registryContract.getAddressForOrDie(EXCHANGE_EURO_REGISTRY_ID)); - // } + function getEpochManagerEnabler() internal view returns (IEpochManagerEnabler) { + return + 
IEpochManagerEnabler(registryContract.getAddressForOrDie(EPOCH_MANAGER_ENABLER_REGISTRY_ID)); + } - // function getExchangeREAL() internal view returns (IExchange) { - // return IExchange(registryContract.getAddressForOrDie(EXCHANGE_REAL_REGISTRY_ID)); - // } + function getEpochManager() internal view returns (IEpochManager) { + return IEpochManager(registryContract.getAddressForOrDie(EPOCH_MANAGER_REGISTRY_ID)); + } - function getFeeCurrencyWhitelistRegistry() internal view returns (IFeeCurrencyWhitelist) { - return - IFeeCurrencyWhitelist( - registryContract.getAddressForOrDie(FEE_CURRENCY_WHITELIST_REGISTRY_ID) - ); + function getEscrow() internal view returns (IEscrow) { + return IEscrow(registryContract.getAddressForOrDie(ESCROW_REGISTRY_ID)); } function getFederatedAttestations() internal view returns (IFederatedAttestations) { @@ -133,16 +137,12 @@ contract UsingRegistryV2NoMento { ); } - function getFreezer() internal view returns (IFreezer) { - return IFreezer(registryContract.getAddressForOrDie(FREEZER_REGISTRY_ID)); - } - - function getGoldToken() internal view returns (IERC20) { - return IERC20(registryContract.getAddressForOrDie(GOLD_TOKEN_REGISTRY_ID)); + function getFeeHandler() internal view returns (IFeeHandler) { + return IFeeHandler(registryContract.getAddressForOrDie(FEE_HANDLER_REGISTRY_ID)); } - function getCeloToken() internal view returns (IERC20) { - return IERC20(registryContract.getAddressForOrDie(CELO_TOKEN_REGISTRY_ID)); + function getFreezer() internal view returns (IFreezer) { + return IFreezer(registryContract.getAddressForOrDie(FREEZER_REGISTRY_ID)); } function getGovernance() internal view returns (IGovernance) { @@ -157,54 +157,23 @@ contract UsingRegistryV2NoMento { return ILockedCelo(registryContract.getAddressForOrDie(LOCKED_CELO_REGISTRY_ID)); } + function getOdisPayments() internal view returns (IOdisPayments) { + return IOdisPayments(registryContract.getAddressForOrDie(ODIS_PAYMENT_REGISTRY_ID)); + } + function getRandom() 
internal view returns (IRandom) { return IRandom(registryContract.getAddressForOrDie(RANDOM_REGISTRY_ID)); } - // function getReserve() internal view returns (IReserve) { - // return IReserve(registryContract.getAddressForOrDie(RESERVE_REGISTRY_ID)); - // } + function getScoreReader() internal view returns (IScoreReader) { + return IScoreReader(registryContract.getAddressForOrDie(SCORE_MANAGER_REGISTRY_ID)); + } function getSortedOracles() internal view returns (ISortedOracles) { return ISortedOracles(registryContract.getAddressForOrDie(SORTED_ORACLES_REGISTRY_ID)); } - // function getStableToken() internal view returns (IStableToken) { - // return IStableToken(registryContract.getAddressForOrDie(STABLE_TOKEN_REGISTRY_ID)); - // } - - // function getStableDollarToken() internal view returns (IStableToken) { - // return getStableToken(); - // } - - // function getStableEuroToken() internal view returns (IStableToken) { - // return IStableToken(registryContract.getAddressForOrDie(STABLE_EURO_TOKEN_REGISTRY_ID)); - // } - - // function getStableRealToken() internal view returns (IStableToken) { - // return IStableToken(registryContract.getAddressForOrDie(STABLE_REAL_TOKEN_REGISTRY_ID)); - // } - function getValidators() internal view returns (IValidators) { return IValidators(registryContract.getAddressForOrDie(VALIDATORS_REGISTRY_ID)); } - - function getCeloUnreleasedTreasury() internal view returns (ICeloUnreleasedTreasury) { - return - ICeloUnreleasedTreasury( - registryContract.getAddressForOrDie(CELO_UNRELEASED_TREASURY_REGISTRY_ID) - ); - } - - function getEpochRewards() internal view returns (IEpochRewards) { - return IEpochRewards(registryContract.getAddressForOrDie(EPOCH_REWARDS_REGISTRY_ID)); - } - - function getEpochManager() internal view returns (IEpochManager) { - return IEpochManager(registryContract.getAddressForOrDie(EPOCH_MANAGER_REGISTRY_ID)); - } - - function getScoreReader() internal view returns (IScoreReader) { - return 
IScoreReader(registryContract.getAddressForOrDie(SCORE_MANAGER_REGISTRY_ID)); - } } diff --git a/packages/protocol/contracts-0.8/common/interfaces/IStandardBridge.sol b/packages/protocol/contracts-0.8/common/interfaces/IStandardBridge.sol new file mode 100644 index 00000000000..aa2b168e4d3 --- /dev/null +++ b/packages/protocol/contracts-0.8/common/interfaces/IStandardBridge.sol @@ -0,0 +1,28 @@ +// SPDX-License-Identifier: LGPL-3.0-only +pragma solidity >=0.8.7 <0.8.20; + +/** + * @title This interface describes the functions specific to the Superbridge's Standard Bridge contract. + */ +interface IStandardBridge { + /// @notice Sends ERC20 tokens to a receiver's address on the other chain. Note that if the + /// ERC20 token on the other chain does not recognize the local token as the correct + /// pair token, the ERC20 bridge will fail and the tokens will be returned to sender on + /// this chain. + /// @param _localToken Address of the ERC20 on this chain. + /// @param _remoteToken Address of the corresponding token on the remote chain. + /// @param _to Address of the receiver. + /// @param _amount Amount of local tokens to deposit. + /// @param _minGasLimit Minimum amount of gas that the bridge can be relayed with. + /// @param _extraData Extra data to be sent with the transaction. Note that the recipient will + /// not be triggered with this data, but it will be emitted and can be used + /// to identify the transaction. 
+ function bridgeERC20To( + address _localToken, + address _remoteToken, + address _to, + uint256 _amount, + uint32 _minGasLimit, + bytes calldata _extraData + ) external; +} diff --git a/packages/protocol/contracts-0.8/common/interfaces/IWETH.sol b/packages/protocol/contracts-0.8/common/interfaces/IWETH.sol new file mode 100644 index 00000000000..ca03da934eb --- /dev/null +++ b/packages/protocol/contracts-0.8/common/interfaces/IWETH.sol @@ -0,0 +1,17 @@ +// SPDX-License-Identifier: LGPL-3.0-only +pragma solidity >=0.8.7 <0.8.20; + +interface IWETH { + function deposit() external payable; + function withdraw(uint wad) external; + + function totalSupply() external view returns (uint); + + function approve(address guy, uint wad) external returns (bool); + + function transfer(address dst, uint wad) external returns (bool); + + function transferFrom(address src, address dst, uint wad) external returns (bool); + + function allowance(address owner, address spender) external returns (uint256); +} diff --git a/packages/protocol/contracts-0.8/common/linkedlists/SortedLinkedList.sol b/packages/protocol/contracts-0.8/common/linkedlists/SortedLinkedList.sol index 4ce7b23b846..04564aa4b86 100644 --- a/packages/protocol/contracts-0.8/common/linkedlists/SortedLinkedList.sol +++ b/packages/protocol/contracts-0.8/common/linkedlists/SortedLinkedList.sol @@ -160,7 +160,7 @@ library SortedLinkedList { } /** - * @notice Returns the keys of the elements greaterKey than and less than the provided value. + * @notice Returns the keys of the elements greater than and less than the provided value. * @param list A storage pointer to the underlying list. * @param value The element value. * @param lesserKey The key of the element which could be just left of the new value. 
diff --git a/packages/protocol/contracts-0.8/common/test/MockCeloToken.sol b/packages/protocol/contracts-0.8/common/test/MockCeloToken.sol index 0c85c62cf80..e05b7d36c06 100644 --- a/packages/protocol/contracts-0.8/common/test/MockCeloToken.sol +++ b/packages/protocol/contracts-0.8/common/test/MockCeloToken.sol @@ -9,6 +9,7 @@ contract MockCeloToken08 { uint256 public totalSupply_; uint8 public constant decimals = 18; mapping(address => uint256) balances; + uint256 public _totalSupply; function setTotalSupply(uint256 value) external { totalSupply_ = value; diff --git a/packages/protocol/contracts-0.8/governance/Validators.sol b/packages/protocol/contracts-0.8/governance/Validators.sol index c9934e64d5f..17e3e551329 100644 --- a/packages/protocol/contracts-0.8/governance/Validators.sol +++ b/packages/protocol/contracts-0.8/governance/Validators.sol @@ -8,7 +8,6 @@ import "solidity-bytes-utils-8/contracts/BytesLib.sol"; import "../../contracts/governance/interfaces/IValidators.sol"; -import "../../contracts/common/CalledByVm.sol"; import "../../contracts/common/Initializable.sol"; import "../../contracts/common/FixidityLib.sol"; import "../common/linkedlists/AddressLinkedList.sol"; @@ -30,8 +29,7 @@ contract Validators is ReentrancyGuard, Initializable, UsingRegistry, - PrecompilesOverride, - CalledByVm + PrecompilesOverride { using FixidityLib for FixidityLib.Fraction; using AddressLinkedList for LinkedList.List; @@ -115,7 +113,6 @@ contract Validators is struct InitParams { // The number of blocks to delay a ValidatorGroup's commission uint256 commissionUpdateDelay; - uint256 downtimeGracePeriod; } mapping(address => ValidatorGroup) private groups; @@ -124,17 +121,16 @@ contract Validators is address[] private registeredValidators; LockedGoldRequirements public validatorLockedGoldRequirements; LockedGoldRequirements public groupLockedGoldRequirements; - ValidatorScoreParameters private validatorScoreParameters; + ValidatorScoreParameters private 
deprecated_validatorScoreParameters; uint256 public membershipHistoryLength; uint256 public maxGroupSize; // The number of blocks to delay a ValidatorGroup's commission update uint256 public commissionUpdateDelay; uint256 public slashingMultiplierResetPeriod; - uint256 public downtimeGracePeriod; + uint256 public deprecated_downtimeGracePeriod; event MaxGroupSizeSet(uint256 size); event CommissionUpdateDelaySet(uint256 delay); - event ValidatorScoreParametersSet(uint256 exponent, uint256 adjustmentSpeed); event GroupLockedGoldRequirementsSet(uint256 value, uint256 duration); event ValidatorLockedGoldRequirementsSet(uint256 value, uint256 duration); event MembershipHistoryLengthSet(uint256 length); @@ -143,8 +139,6 @@ contract Validators is event ValidatorAffiliated(address indexed validator, address indexed group); event ValidatorDeaffiliated(address indexed validator, address indexed group); event ValidatorEcdsaPublicKeyUpdated(address indexed validator, bytes ecdsaPublicKey); - event ValidatorBlsPublicKeyUpdated(address indexed validator, bytes blsPublicKey); - event ValidatorScoreUpdated(address indexed validator, uint256 score, uint256 epochScore); event ValidatorGroupRegistered(address indexed group, uint256 commission); event ValidatorGroupDeregistered(address indexed group); event ValidatorGroupMemberAdded(address indexed group, address indexed validator); @@ -156,12 +150,6 @@ contract Validators is uint256 activationBlock ); event ValidatorGroupCommissionUpdated(address indexed group, uint256 commission); - event ValidatorEpochPaymentDistributed( - address indexed validator, - uint256 validatorPayment, - address indexed group, - uint256 groupPayment - ); modifier onlySlasher() { require(getLockedGold().isSlasher(msg.sender), "Only registered slasher can call"); @@ -181,8 +169,6 @@ contract Validators is * @param groupRequirementDuration The Locked Gold requirement duration for groups. 
* @param validatorRequirementValue The Locked Gold requirement amount for validators. * @param validatorRequirementDuration The Locked Gold requirement duration for validators. - * @param validatorScoreExponent The exponent used in calculating validator scores. - * @param validatorScoreAdjustmentSpeed The speed at which validator scores are adjusted. * @param _membershipHistoryLength The max number of entries for validator membership history. * @param _maxGroupSize The maximum group size. * update. @@ -194,8 +180,6 @@ contract Validators is uint256 groupRequirementDuration, uint256 validatorRequirementValue, uint256 validatorRequirementDuration, - uint256 validatorScoreExponent, - uint256 validatorScoreAdjustmentSpeed, uint256 _membershipHistoryLength, uint256 _slashingMultiplierResetPeriod, uint256 _maxGroupSize, @@ -205,106 +189,22 @@ contract Validators is setRegistry(registryAddress); setGroupLockedGoldRequirements(groupRequirementValue, groupRequirementDuration); setValidatorLockedGoldRequirements(validatorRequirementValue, validatorRequirementDuration); - setValidatorScoreParameters(validatorScoreExponent, validatorScoreAdjustmentSpeed); setMaxGroupSize(_maxGroupSize); setCommissionUpdateDelay(initParams.commissionUpdateDelay); setMembershipHistoryLength(_membershipHistoryLength); setSlashingMultiplierResetPeriod(_slashingMultiplierResetPeriod); - setDowntimeGracePeriod(initParams.downtimeGracePeriod); - } - - /** - * @notice Updates a validator's score based on its uptime for the epoch. - * @param signer The validator signer of the validator account whose score needs updating. - * @param uptime The Fixidity representation of the validator's uptime, between 0 and 1. - */ - function updateValidatorScoreFromSigner( - address signer, - uint256 uptime - ) external virtual onlyVm onlyL1 { - _updateValidatorScoreFromSigner(signer, uptime); - } - - /** - * @notice Distributes epoch payments to the account associated with `signer` and its group. 
- * @param signer The validator signer of the account to distribute the epoch payment to. - * @param maxPayment The maximum payment to the validator. Actual payment is based on score and - * group commission. - * @return distributeEpochPaymentsFromSigner The total payment paid to the validator and their group. - */ - function distributeEpochPaymentsFromSigner( - address signer, - uint256 maxPayment - ) external virtual onlyVm onlyL1 returns (uint256) { - return _distributeEpochPaymentsFromSigner(signer, maxPayment); } /** * @notice Registers a validator unaffiliated with any validator group. * @param ecdsaPublicKey The ECDSA public key that the validator is using for consensus, should * match the validator signer. 64 bytes. - * @param blsPublicKey The BLS public key that the validator is using for consensus, should pass - * proof of possession. 96 bytes. - * @param blsPop The BLS public key proof-of-possession, which consists of a signature on the - * account address. 48 bytes. * @return True upon success. * @dev Fails if the account is already a validator or validator group. * @dev Fails if the account does not have sufficient Locked Gold. - * @dev Fails on L2. Use registerValidatorNoBls instead. 
   */
-  function registerValidator(
-    bytes calldata ecdsaPublicKey,
-    bytes calldata blsPublicKey,
-    bytes calldata blsPop
-  ) external nonReentrant onlyL1 returns (bool) {
-    address account = getAccounts().validatorSignerToAccount(msg.sender);
-    _isRegistrationAllowed(account);
-    require(!isValidator(account) && !isValidatorGroup(account), "Already registered");
-    uint256 lockedGoldBalance = getLockedGold().getAccountTotalLockedGold(account);
-    require(lockedGoldBalance >= validatorLockedGoldRequirements.value, "Deposit too small");
-    Validator storage validator = validators[account];
-    address signer = getAccounts().getValidatorSigner(account);
-    require(
-      _updateEcdsaPublicKey(validator, account, signer, ecdsaPublicKey),
-      "Error updating ECDSA public key"
-    );
-    // TODO double check blob
-    require(
-      _updateBlsPublicKey(validator, account, blsPublicKey, blsPop),
-      "Error updating BLS public key"
-    );
-    registeredValidators.push(account);
-    updateMembershipHistory(account, address(0));
-    emit ValidatorRegistered(account);
-    return true;
-  }
-
-  /**
-   * @notice Registers a validator unaffiliated with any validator group.
-   * @param ecdsaPublicKey The ECDSA public key that the validator is using for consensus, should
-   * match the validator signer. 64 bytes.
-   * @return True upon success.
-   * @dev Fails if the account is already a validator or validator group.
-   * @dev Fails if the account does not have sufficient Locked Gold.
-   */
-  function registerValidatorNoBls(
-    bytes calldata ecdsaPublicKey
-  ) external nonReentrant onlyL2 returns (bool) {
-    address account = getAccounts().validatorSignerToAccount(msg.sender);
-    _isRegistrationAllowed(account);
-    require(!isValidator(account) && !isValidatorGroup(account), "Already registered");
-    uint256 lockedGoldBalance = getLockedGold().getAccountTotalLockedGold(account);
-    require(lockedGoldBalance >= validatorLockedGoldRequirements.value, "Deposit too small");
-    Validator storage validator = validators[account];
-    address signer = getAccounts().getValidatorSigner(account);
-    require(
-      _updateEcdsaPublicKey(validator, account, signer, ecdsaPublicKey),
-      "Error updating ECDSA public key"
-    );
-    registeredValidators.push(account);
-    updateMembershipHistory(account, address(0));
-    emit ValidatorRegistered(account);
-    return true;
+  function registerValidator(bytes calldata ecdsaPublicKey) external nonReentrant returns (bool) {
+    return registerValidatorNoBls(ecdsaPublicKey);
   }

   /**
@@ -374,28 +274,6 @@ contract Validators is
     return true;
   }

-  /**
-   * @notice Updates a validator's BLS key.
-   * @param blsPublicKey The BLS public key that the validator is using for consensus, should pass
-   * proof of possession. 48 bytes.
-   * @param blsPop The BLS public key proof-of-possession, which consists of a signature on the
-   * account address. 48 bytes.
-   * @return True upon success.
-   */
-  function updateBlsPublicKey(
-    bytes calldata blsPublicKey,
-    bytes calldata blsPop
-  ) external onlyL1 returns (bool) {
-    address account = getAccounts().validatorSignerToAccount(msg.sender);
-    require(isValidator(account), "Not a validator");
-    Validator storage validator = validators[account];
-    require(
-      _updateBlsPublicKey(validator, account, blsPublicKey, blsPop),
-      "Error updating BLS public key"
-    );
-    return true;
-  }
-
   /**
    * @notice Updates a validator's ECDSA key.
    * @param account The address under which the validator is registered.
@@ -443,37 +321,6 @@ contract Validators is
     return true;
   }

-  /**
-   * @notice Updates a validator's ECDSA and BLS keys.
-   * @param account The address under which the validator is registered.
-   * @param signer The address which the validator is using to sign consensus messages.
-   * @param ecdsaPublicKey The ECDSA public key corresponding to `signer`.
-   * @param blsPublicKey The BLS public key that the validator is using for consensus, should pass
-   * proof of possession. 96 bytes.
-   * @param blsPop The BLS public key proof-of-possession, which consists of a signature on the
-   * account address. 48 bytes.
-   * @return True upon success.
-   */
-  function updatePublicKeys(
-    address account,
-    address signer,
-    bytes calldata ecdsaPublicKey,
-    bytes calldata blsPublicKey,
-    bytes calldata blsPop
-  ) external onlyL1 onlyRegisteredContract(ACCOUNTS_REGISTRY_ID) returns (bool) {
-    require(isValidator(account), "Not a validator");
-    Validator storage validator = validators[account];
-    require(
-      _updateEcdsaPublicKey(validator, account, signer, ecdsaPublicKey),
-      "Error updating ECDSA public key"
-    );
-    require(
-      _updateBlsPublicKey(validator, account, blsPublicKey, blsPop),
-      "Error updating BLS public key"
-    );
-    return true;
-  }
-
   /**
    * @notice Registers a validator group with no member validators.
    * @param commission Fixidity representation of the commission this group receives on epoch
@@ -652,26 +499,13 @@ contract Validators is
   */
   function mintStableToEpochManager(
     uint256 amount
-  ) external onlyL2 nonReentrant onlyRegisteredContract(EPOCH_MANAGER_REGISTRY_ID) {
+  ) external nonReentrant onlyRegisteredContract(EPOCH_MANAGER_REGISTRY_ID) {
     require(
       IStableToken(getStableToken()).mint(msg.sender, amount),
       "mint failed to epoch manager"
     );
   }

-  /**
-   * @notice Returns the validator BLS key.
-   * @param signer The account that registered the validator or its authorized signing address.
-   * @return blsPublicKey The validator BLS key.
-   */
-  function getValidatorBlsPublicKeyFromSigner(
-    address signer
-  ) external view returns (bytes memory blsPublicKey) {
-    address account = getAccounts().signerToAccount(signer);
-    require(isValidator(account), "Not a validator");
-    return validators[account].publicKeys.bls;
-  }
-
   function getMembershipHistoryLength() external view returns (uint256) {
     return membershipHistoryLength;
   }
@@ -712,7 +546,6 @@
   * @param account The address of the validator group.
   * @param n The number of members to return.
   * @return The signers of the top n group members for a particular group.
-   * @dev Returns the account instead of signer on L2.
   */
   function getTopGroupValidators(
     address account,
@@ -862,15 +695,6 @@ contract Validators is
     return history.entries[index].group;
   }

-  /**
-   * @notice Returns the parameters that govern how a validator's score is calculated.
-   * @return The exponent that governs how a validator's score is calculated.
-   * @return The adjustment speed that governs how a validator's score is calculated.
-   */
-  function getValidatorScoreParameters() external view returns (uint256, uint256) {
-    return (validatorScoreParameters.exponent, validatorScoreParameters.adjustmentSpeed.unwrap());
-  }
-
   /**
    * @notice Returns the group membership history of a validator.
    * @param account The validator whose membership history to return.
@@ -893,22 +717,6 @@
     return (epochs, membershipGroups, history.lastRemovedFromGroupTimestamp, history.tail);
   }

-  /**
-   * @notice Calculates the aggregate score of a group for an epoch from individual uptimes.
-   * @param uptimes Array of Fixidity representations of the validators' uptimes, between 0 and 1.
-   * @dev group_score = average(uptimes ** exponent)
-   * @return Fixidity representation of the group epoch score between 0 and 1.
-   */
-  function calculateGroupEpochScore(uint256[] calldata uptimes) external view returns (uint256) {
-    require(uptimes.length > 0, "Uptime array empty");
-    require(uptimes.length <= maxGroupSize, "Uptime array larger than maximum group size");
-    FixidityLib.Fraction memory sum;
-    for (uint256 i = 0; i < uptimes.length; i = i.add(1)) {
-      sum = sum.add(FixidityLib.wrap(calculateEpochScore(uptimes[i])));
-    }
-    return sum.divide(FixidityLib.newFixed(uptimes.length)).unwrap();
-  }
-
   /**
    * @notice Returns the maximum number of members a group can add.
    * @return The maximum number of members a group can add.
@@ -937,7 +745,12 @@ contract Validators is
     uint256 score,
     uint256 maxPayment
   ) external view virtual returns (uint256) {
-    require(isValidator(account), "Not a validator");
+    if (!isValidator(account)) {
+      // In the unlikely scenario that the validator is still in the set after it has deaffiliated
+      // skip the payment. This is only possible if the epochs have not been processed for more
+      // than the time defined in validatorLockedGoldRequirements.duration. (currently 60 days)
+      return 0;
+    }
     FixidityLib.Fraction memory scoreFraction = FixidityLib.wrap(score);
     require(scoreFraction.lte(FixidityLib.fixed1()), "Score must be <= 1");
@@ -966,7 +779,36 @@
   * @return Patch version of the contract.
   */
   function getVersionNumber() external pure returns (uint256, uint256, uint256, uint256) {
-    return (1, 3, 0, 1);
+    return (1, 4, 0, 0);
+  }
+
+  /**
+   * @notice Registers a validator unaffiliated with any validator group.
+   * @param ecdsaPublicKey The ECDSA public key that the validator is using for consensus, should
+   * match the validator signer. 64 bytes.
+   * @return True upon success.
+   * @dev Fails if the account is already a validator or validator group.
+   * @dev Fails if the account does not have sufficient Locked Gold.
+   * @dev Will be deprecated in favor of `registerValidator(bytes calldata ecdsaPublicKey)`.
+   */
+  function registerValidatorNoBls(
+    bytes calldata ecdsaPublicKey
+  ) public nonReentrant returns (bool) {
+    address account = getAccounts().validatorSignerToAccount(msg.sender);
+    _isRegistrationAllowed(account);
+    require(!isValidator(account) && !isValidatorGroup(account), "Already registered");
+    uint256 lockedGoldBalance = getLockedGold().getAccountTotalLockedGold(account);
+    require(lockedGoldBalance >= validatorLockedGoldRequirements.value, "Deposit too small");
+    Validator storage validator = validators[account];
+    address signer = getAccounts().getValidatorSigner(account);
+    require(
+      _updateEcdsaPublicKey(validator, account, signer, ecdsaPublicKey),
+      "Error updating ECDSA public key"
+    );
+    registeredValidators.push(account);
+    updateMembershipHistory(account, address(0));
+    emit ValidatorRegistered(account);
+    return true;
   }

   /**
@@ -1005,33 +847,6 @@
     return true;
   }

-  /**
-   * @notice Updates the validator score parameters.
-   * @param exponent The exponent used in calculating the score.
-   * @param adjustmentSpeed The speed at which the score is adjusted.
-   * @return True upon success.
-   */
-  function setValidatorScoreParameters(
-    uint256 exponent,
-    uint256 adjustmentSpeed
-  ) public onlyOwner onlyL1 returns (bool) {
-    require(
-      adjustmentSpeed <= FixidityLib.fixed1().unwrap(),
-      "Adjustment speed cannot be larger than 1"
-    );
-    require(
-      exponent != validatorScoreParameters.exponent ||
-        !FixidityLib.wrap(adjustmentSpeed).equals(validatorScoreParameters.adjustmentSpeed),
-      "Adjustment speed and exponent not changed"
-    );
-    validatorScoreParameters = ValidatorScoreParameters(
-      exponent,
-      FixidityLib.wrap(adjustmentSpeed)
-    );
-    emit ValidatorScoreParametersSet(exponent, adjustmentSpeed);
-    return true;
-  }
-
   /**
    * @notice Updates the Locked Gold requirements for Validator Groups.
    * @param value The per-member amount of Locked Gold required.
@@ -1080,14 +895,6 @@ contract Validators is
     slashingMultiplierResetPeriod = value;
   }

-  /**
-   * @notice Sets the downtimeGracePeriod property if called by owner.
-   * @param value New downtime grace period for calculating epoch scores.
-   */
-  function setDowntimeGracePeriod(uint256 value) public nonReentrant onlyOwner onlyL1 {
-    downtimeGracePeriod = value;
-  }
-
   /**
    * @notice Returns the current locked gold balance requirement for the supplied account.
    * @param account The account that may have to meet locked gold balance requirements.
@@ -1132,28 +939,6 @@
     return history.entries[head].group;
   }

-  /**
-   * @notice Calculates the validator score for an epoch from the uptime value for the epoch.
-   * @param uptime The Fixidity representation of the validator's uptime, between 0 and 1.
-   * @dev epoch_score = uptime ** exponent
-   * @return Fixidity representation of the epoch score between 0 and 1.
-   */
-  function calculateEpochScore(uint256 uptime) public view onlyL1 returns (uint256) {
-    require(uptime <= FixidityLib.fixed1().unwrap(), "Uptime cannot be larger than one");
-    uint256 numerator;
-    uint256 denominator;
-    uptime = Math.min(uptime.add(downtimeGracePeriod), FixidityLib.fixed1().unwrap());
-    (numerator, denominator) = fractionMulExp(
-      FixidityLib.fixed1().unwrap(),
-      FixidityLib.fixed1().unwrap(),
-      uptime,
-      FixidityLib.fixed1().unwrap(),
-      validatorScoreParameters.exponent,
-      18
-    );
-    return FixidityLib.newFixedFraction(numerator, denominator).unwrap();
-  }
-
   /**
    * @notice Returns whether or not an account meets its Locked Gold requirements.
    * @param account The address of the account.
@@ -1238,75 +1023,6 @@
     return validators[account].publicKeys.ecdsa.length > 0;
   }

-  /**
-   * @notice Distributes epoch payments to the account associated with `signer` and its group.
-   * @param signer The validator signer of the validator to distribute the epoch payment to.
-   * @param maxPayment The maximum payment to the validator. Actual payment is based on score and
-   * group commission.
-   * @return The total payment paid to the validator and their group.
-   */
-  function _distributeEpochPaymentsFromSigner(
-    address signer,
-    uint256 maxPayment
-  ) internal returns (uint256) {
-    address account = getAccounts().signerToAccount(signer);
-    require(isValidator(account), "Not a validator");
-    // The group that should be paid is the group that the validator was a member of at the
-    // time it was elected.
-    address group = getMembershipInLastEpoch(account);
-    require(group != address(0), "Validator not registered with a group");
-    // Both the validator and the group must maintain the minimum locked gold balance in order to
-    // receive epoch payments.
-    if (meetsAccountLockedGoldRequirements(account) && meetsAccountLockedGoldRequirements(group)) {
-      FixidityLib.Fraction memory totalPayment = FixidityLib
-        .newFixed(maxPayment)
-        .multiply(validators[account].score)
-        .multiply(groups[group].slashInfo.multiplier);
-      uint256 groupPayment = totalPayment.multiply(groups[group].commission).fromFixed();
-      FixidityLib.Fraction memory remainingPayment = FixidityLib.newFixed(
-        totalPayment.fromFixed().sub(groupPayment)
-      );
-      (address beneficiary, uint256 fraction) = getAccounts().getPaymentDelegation(account);
-      uint256 delegatedPayment = remainingPayment.multiply(FixidityLib.wrap(fraction)).fromFixed();
-      uint256 validatorPayment = remainingPayment.fromFixed().sub(delegatedPayment);
-      IStableToken stableToken = IStableToken(getStableToken());
-      require(stableToken.mint(group, groupPayment), "mint failed to validator group");
-      require(stableToken.mint(account, validatorPayment), "mint failed to validator account");
-      if (fraction != 0) {
-        require(stableToken.mint(beneficiary, delegatedPayment), "mint failed to delegatee");
-      }
-      emit ValidatorEpochPaymentDistributed(account, validatorPayment, group, groupPayment);
-      return totalPayment.fromFixed();
-    } else {
-      return 0;
-    }
-  }
-
-  /**
-   * @notice Updates a validator's score based on its uptime for the epoch.
-   * @param signer The validator signer of the validator whose score needs updating.
-   * @param uptime The Fixidity representation of the validator's uptime, between 0 and 1.
-   * @dev new_score = uptime ** exponent * adjustmentSpeed + old_score * (1 - adjustmentSpeed)
-   */
-  function _updateValidatorScoreFromSigner(address signer, uint256 uptime) internal {
-    address account = getAccounts().signerToAccount(signer);
-    require(isValidator(account), "Not a validator");
-
-    FixidityLib.Fraction memory epochScore = FixidityLib.wrap(calculateEpochScore(uptime));
-    FixidityLib.Fraction memory newComponent = validatorScoreParameters.adjustmentSpeed.multiply(
-      epochScore
-    );
-
-    FixidityLib.Fraction memory currentComponent = FixidityLib.fixed1().subtract(
-      validatorScoreParameters.adjustmentSpeed
-    );
-    currentComponent = currentComponent.multiply(validators[account].score);
-    validators[account].score = FixidityLib.wrap(
-      Math.min(epochScore.unwrap(), newComponent.add(currentComponent).unwrap())
-    );
-    emit ValidatorScoreUpdated(account, validators[account].score.unwrap(), epochScore.unwrap());
-  }
-
   function _isRegistrationAllowed(address account) private {
     require(
       !getElection().allowedToVoteOverMaxNumberOfGroups(account),
@@ -1354,30 +1070,6 @@
     return true;
   }

-  /**
-   * @notice Updates a validator's BLS key.
-   * @param validator The validator whose BLS public key should be updated.
-   * @param account The address under which the validator is registered.
-   * @param blsPublicKey The BLS public key that the validator is using for consensus, should pass
-   * proof of possession. 96 bytes.
-   * @param blsPop The BLS public key proof-of-possession, which consists of a signature on the
-   * account address. 48 bytes.
-   * @return True upon success.
-   */
-  function _updateBlsPublicKey(
-    Validator storage validator,
-    address account,
-    bytes memory blsPublicKey,
-    bytes memory blsPop
-  ) private returns (bool) {
-    require(blsPublicKey.length == 96, "Wrong BLS public key length");
-    require(blsPop.length == 48, "Wrong BLS PoP length");
-    require(checkProofOfPossession(account, blsPublicKey, blsPop), "Invalid BLS PoP");
-    validator.publicKeys.bls = blsPublicKey;
-    emit ValidatorBlsPublicKeyUpdated(account, blsPublicKey);
-    return true;
-  }
-
   /**
    * @notice Updates a validator's ECDSA key.
    * @param validator The validator whose ECDSA public key should be updated.
@@ -1524,9 +1216,7 @@ contract Validators is
   }

   function _sendValidatorPaymentIfNecessary(address validator) private {
-    if (isL2()) {
-      getEpochManager().sendValidatorPayment(validator);
-    }
+    getEpochManager().sendValidatorPayment(validator);
   }

   function _sendValidatorGroupPaymentsIfNecessary(ValidatorGroup storage group) private {
diff --git a/packages/protocol/contracts-0.8/governance/interfaces/IValidatorsInitializer.sol b/packages/protocol/contracts-0.8/governance/interfaces/IValidatorsInitializer.sol
index 21c68bb975b..0d623d033f4 100644
--- a/packages/protocol/contracts-0.8/governance/interfaces/IValidatorsInitializer.sol
+++ b/packages/protocol/contracts-0.8/governance/interfaces/IValidatorsInitializer.sol
@@ -8,8 +8,6 @@ interface IValidatorsInitializer {
     uint256 groupRequirementDuration,
     uint256 validatorRequirementValue,
     uint256 validatorRequirementDuration,
-    uint256 validatorScoreExponent,
-    uint256 validatorScoreAdjustmentSpeed,
     uint256 _membershipHistoryLength,
     uint256 _slashingMultiplierResetPeriod,
     uint256 _maxGroupSize,
@@ -21,6 +19,5 @@ library InitParamsLib {
   struct InitParams {
     // The number of blocks to delay a ValidatorGroup's commission
     uint256 commissionUpdateDelay;
-    uint256 downtimeGracePeriod;
   }
 }
diff --git a/packages/protocol/contracts-0.8/governance/test/EpochRewardsMock.sol b/packages/protocol/contracts-0.8/governance/test/EpochRewardsMock.sol
index df4e8e592a8..3a3ee22277d 100644
--- a/packages/protocol/contracts-0.8/governance/test/EpochRewardsMock.sol
+++ b/packages/protocol/contracts-0.8/governance/test/EpochRewardsMock.sol
@@ -25,9 +25,6 @@ contract EpochRewardsMock08 is IEpochRewards {
     return 0;
   }

-  function isReserveLow() external pure returns (bool) {
-    return false;
-  }
   function calculateTargetEpochRewards()
     external
     view
diff --git a/packages/protocol/contracts-0.8/governance/test/IMockValidators.sol b/packages/protocol/contracts-0.8/governance/test/IMockValidators.sol
index 9dd67b7857f..087150a1ada 100644
--- a/packages/protocol/contracts-0.8/governance/test/IMockValidators.sol
+++ b/packages/protocol/contracts-0.8/governance/test/IMockValidators.sol
@@ -52,8 +52,6 @@ interface IMockValidators {
   function getAccountLockedGoldRequirement(address account) external view returns (uint256);

-  function calculateGroupEpochScore(uint256[] calldata uptimes) external view returns (uint256);
-
   function getGroupsNumMembers(address[] calldata groups) external view returns (uint256[] memory);

   function groupMembershipInEpoch(address addr, uint256, uint256) external view returns (address);
diff --git a/packages/protocol/contracts/CompileExchange.sol b/packages/protocol/contracts/CompileExchange.sol
index fbcb78aa10d..4c04b5be804 100644
--- a/packages/protocol/contracts/CompileExchange.sol
+++ b/packages/protocol/contracts/CompileExchange.sol
@@ -220,11 +220,11 @@ contract CompileExchange is
   /**
    * @notice Allows owner to set the minimum number of reports required
-   * @param newMininumReports The new update minimum number of reports required
+   * @param newMinimumReports The new update minimum number of reports required
   */
-  function setMinimumReports(uint256 newMininumReports) public onlyOwner {
-    minimumReports = newMininumReports;
-    emit MinimumReportsSet(newMininumReports);
+  function setMinimumReports(uint256 newMinimumReports) public onlyOwner {
+    minimumReports = newMinimumReports;
+    emit MinimumReportsSet(newMinimumReports);
   }

   /**
diff --git a/packages/protocol/contracts/common/Accounts.sol b/packages/protocol/contracts/common/Accounts.sol
index edd7a47624d..86d5a875ff1 100644
--- a/packages/protocol/contracts/common/Accounts.sol
+++ b/packages/protocol/contracts/common/Accounts.sol
@@ -11,7 +11,6 @@ import "../common/interfaces/ICeloVersionedContract.sol";
 import "../common/Signatures.sol";
 import "../common/UsingRegistry.sol";
 import "../common/libraries/ReentrancyGuard.sol";
-import "../../contracts-0.8/common/IsL2Check.sol";

 contract Accounts is
@@ -19,8 +18,7 @@ contract Accounts is
   Ownable,
   ReentrancyGuard,
   Initializable,
-  UsingRegistry,
-  IsL2Check
+  UsingRegistry
 {
   using FixidityLib for FixidityLib.Fraction;
   using SafeMath for uint256;
@@ -272,38 +270,6 @@ contract Accounts is
     emit ValidatorSignerAuthorized(msg.sender, signer);
   }

-  /**
-   * @notice Authorizes an address to sign consensus messages on behalf of the account.
-   * @param signer The address of the signing key to authorize.
-   * @param ecdsaPublicKey The ECDSA public key corresponding to `signer`.
-   * @param blsPublicKey The BLS public key that the validator is using for consensus, should pass
-   * proof of possession. 96 bytes.
-   * @param blsPop The BLS public key proof-of-possession, which consists of a signature on the
-   * account address. 48 bytes.
-   * @param v The recovery id of the incoming ECDSA signature.
-   * @param r Output value r of the ECDSA signature.
-   * @param s Output value s of the ECDSA signature.
-   * @dev v, r, s constitute `signer`'s signature on `msg.sender`.
-   */
-  function authorizeValidatorSignerWithKeys(
-    address signer,
-    uint8 v,
-    bytes32 r,
-    bytes32 s,
-    bytes calldata ecdsaPublicKey,
-    bytes calldata blsPublicKey,
-    bytes calldata blsPop
-  ) external nonReentrant {
-    legacyAuthorizeSignerWithSignature(signer, ValidatorSigner, v, r, s);
-    setIndexedSigner(signer, ValidatorSigner);
-
-    require(
-      getValidators().updatePublicKeys(msg.sender, signer, ecdsaPublicKey, blsPublicKey, blsPop),
-      "Failed to update validator keys"
-    );
-    emit ValidatorSignerAuthorized(msg.sender, signer);
-  }
-
   /**
    * @notice Getter for the metadata of multiple accounts.
    * @param accountsToQuery The addresses of the accounts to get the metadata for.
@@ -498,7 +464,7 @@
   * @return Patch version of the contract.
   */
   function getVersionNumber() external pure returns (uint256, uint256, uint256, uint256) {
-    return (1, 1, 4, 2);
+    return (1, 2, 0, 0);
   }

   /**
@@ -580,7 +546,7 @@
   }

   /**
-   * @notice Removes a validator's payment delegation by setting benficiary and
+   * @notice Removes a validator's payment delegation by setting beneficiary and
   * fraction to 0.
   */
   function deletePaymentDelegation() public {
diff --git a/packages/protocol/contracts/common/Blockable.sol b/packages/protocol/contracts/common/Blockable.sol
index 90b0617ea1a..fc5f7a83a79 100644
--- a/packages/protocol/contracts/common/Blockable.sol
+++ b/packages/protocol/contracts/common/Blockable.sol
@@ -9,7 +9,7 @@ import "./interfaces/IBlocker.sol";
 * @dev This contract uses an external IBlocker contract to determine if it is blocked. The owner can set the blocking contract.
 **/
 contract Blockable is IBlockable {
-  // using directly memory slot so contracts can inherit from this contract withtout breaking storage layout
+  // using directly memory slot so contracts can inherit from this contract without breaking storage layout
   bytes32 private constant BLOCKEDBY_POSITION =
     bytes32(uint256(keccak256("blocked_by_position")) - 1);
diff --git a/packages/protocol/contracts/common/GoldToken.sol b/packages/protocol/contracts/common/GoldToken.sol
index f3a9d307958..8445e3c8cb6 100644
--- a/packages/protocol/contracts/common/GoldToken.sol
+++ b/packages/protocol/contracts/common/GoldToken.sol
@@ -6,12 +6,10 @@ import "openzeppelin-solidity/contracts/math/SafeMath.sol";
 import "openzeppelin-solidity/contracts/token/ERC20/IERC20.sol";

 import "./UsingRegistry.sol";
-import "./CalledByVm.sol";
 import "./Initializable.sol";
 import "./interfaces/ICeloToken.sol";
 import "./interfaces/ICeloTokenInitializer.sol";
 import "./interfaces/ICeloVersionedContract.sol";
-import "../../contracts-0.8/common/IsL2Check.sol";

 /**
  * @title ERC20 interface for the CELO token.
@@ -24,13 +22,11 @@ import "../../contracts-0.8/common/IsL2Check.sol";
  */
 contract GoldToken is
   Initializable,
-  CalledByVm,
   UsingRegistry,
   IERC20,
   ICeloToken,
   ICeloTokenInitializer,
-  ICeloVersionedContract,
-  IsL2Check
+  ICeloVersionedContract
 {
   using SafeMath for uint256;

@@ -41,7 +37,7 @@ contract GoldToken is
   string constant SYMBOL = "CELO";
   uint8 constant DECIMALS = 18;
   uint256 constant CELO_SUPPLY_CAP = 1000000000 ether; // 1 billion CELO
-  uint256 internal totalSupply_;
+  uint256 internal deprecated_totalSupply_; // this variable is deprecated
   // solhint-enable state-visibility

   mapping(address => mapping(address => uint256)) internal allowed;
@@ -66,7 +62,6 @@ contract GoldToken is
   * @param registryAddress Address of the Registry contract.
   */
   function initialize(address registryAddress) external initializer {
-    totalSupply_ = 0;
     _transferOwnership(msg.sender);
     setRegistry(registryAddress);
   }
@@ -176,38 +171,6 @@
     return true;
   }

-  /**
-   * @notice Mints new CELO and gives it to 'to'.
-   * @param to The account for which to mint tokens.
-   * @param value The amount of CELO to mint.
-   * @dev This function will be deprecated in L2.
-   */
-  function mint(address to, uint256 value) external onlyL1 onlyVm returns (bool) {
-    if (value == 0) {
-      return true;
-    }
-
-    require(to != address(0), "mint attempted to reserved address 0x0");
-    totalSupply_ = totalSupply_.add(value);
-
-    bool success;
-    (success, ) = TRANSFER.call.value(0).gas(gasleft())(abi.encode(address(0), to, value));
-    require(success, "CELO transfer failed");
-
-    emit Transfer(address(0), to, value);
-    return true;
-  }
-
-  /**
-   * @notice Increases the variable for total amount of CELO in existence.
-   * @param amount The amount to increase counter by
-   * @dev This function will be deprecated in L2. The onlyway to increase
-   * the supply is with the mint function.
-   */
-  function increaseSupply(uint256 amount) external onlyL1 onlyVm {
-    totalSupply_ = totalSupply_.add(amount);
-  }
-
   /**
    * @return The name of the CELO token.
   */
@@ -229,13 +192,6 @@
     return DECIMALS;
   }

-  /**
-   * @return The total amount of CELO in existence, not including what the burn address holds.
-   */
-  function circulatingSupply() external view returns (uint256) {
-    return allocatedSupply().sub(getBurnedAmount()).sub(balanceOf(address(0)));
-  }
-
   /**
    * @notice Gets the amount of owner's CELO allowed to be spent by spender.
    * @param _owner The owner of the CELO.
@@ -254,7 +210,7 @@
   * @return Patch version of the contract.
   */
   function getVersionNumber() external pure returns (uint256, uint256, uint256, uint256) {
-    return (1, 1, 3, 0);
+    return (1, 2, 0, 0);
   }

   /**
@@ -278,22 +234,14 @@
   * @return The total amount of allocated CELO.
   */
   function allocatedSupply() public view returns (uint256) {
-    if (isL2()) {
-      return CELO_SUPPLY_CAP - getCeloUnreleasedTreasury().getRemainingBalanceToRelease();
-    } else {
-      return totalSupply();
-    }
+    return CELO_SUPPLY_CAP - getCeloUnreleasedTreasury().getRemainingBalanceToRelease();
   }

   /**
   * @return The total amount of CELO in existence, including what the burn address holds.
   */
-  function totalSupply() public view returns (uint256) {
-    if (isL2()) {
-      return CELO_SUPPLY_CAP;
-    } else {
-      return totalSupply_;
-    }
+  function totalSupply() external view returns (uint256) {
+    return CELO_SUPPLY_CAP;
   }

   /**
diff --git a/packages/protocol/contracts/common/Permissioned.sol b/packages/protocol/contracts/common/Permissioned.sol
new file mode 100644
index 00000000000..584e3daa369
--- /dev/null
+++ b/packages/protocol/contracts/common/Permissioned.sol
@@ -0,0 +1,12 @@
+pragma solidity ^0.5.13;
+
+contract Permissioned {
+  /**
+   * @notice Modifier that restricts function calls to a specific permitted address.
+   * @param permittedAddress The address that is allowed to call the function.
+   */
+  modifier onlyPermitted(address permittedAddress) {
+    require(msg.sender == permittedAddress, "Only permitted address can call");
+    _;
+  }
+}
diff --git a/packages/protocol/contracts/common/interfaces/IAccounts.sol b/packages/protocol/contracts/common/interfaces/IAccounts.sol
index 286353d6978..b3d1cf4db53 100644
--- a/packages/protocol/contracts/common/interfaces/IAccounts.sol
+++ b/packages/protocol/contracts/common/interfaces/IAccounts.sol
@@ -17,15 +17,6 @@ interface IAccounts {
     bytes32,
     bytes calldata
   ) external;
-  function authorizeValidatorSignerWithKeys(
-    address,
-    uint8,
-    bytes32,
-    bytes32,
-    bytes calldata,
-    bytes calldata,
-    bytes calldata
-  ) external;
   function authorizeAttestationSigner(address, uint8, bytes32, bytes32) external;
   function setEip712DomainSeparator() external;
   function createAccount() external returns (bool);
diff --git a/packages/protocol/contracts/common/interfaces/ICeloToken.sol b/packages/protocol/contracts/common/interfaces/ICeloToken.sol
index 5f7107395db..a18dd5b0470 100644
--- a/packages/protocol/contracts/common/interfaces/ICeloToken.sol
+++ b/packages/protocol/contracts/common/interfaces/ICeloToken.sol
@@ -9,9 +9,9 @@ interface ICeloToken {
   function initialize(address) external;
   function transferWithComment(address, uint256, string calldata) external returns (bool);
   function burn(uint256 value) external returns (bool);
-  function mint(address to, uint256 value) external returns (bool);
   function name() external view returns (string memory);
   function symbol() external view returns (string memory);
   function decimals() external view returns (uint8);
   function allocatedSupply() external view returns (uint256);
+  function totalSupply() external view returns (uint256);
 }
diff --git a/packages/protocol/contracts/common/interfaces/IFeeHandler.sol b/packages/protocol/contracts/common/interfaces/IFeeHandler.sol
index 385998240fe..c13077ed307 100644
--- a/packages/protocol/contracts/common/interfaces/IFeeHandler.sol
+++ b/packages/protocol/contracts/common/interfaces/IFeeHandler.sol
@@ -20,7 +20,7 @@ interface IFeeHandler {
   // calls exchange(tokenAddress), and distribute(tokenAddress)
   function handle(address tokenAddress) external;

-  // main entrypoint for a burn, iterates over token and calles handle
+  // main entrypoint for a burn, iterates over tokens and calls handle
   function handleAll() external;

   // Sends the balance of token at tokenAddress to feesBeneficiary,
diff --git a/packages/protocol/contracts/common/interfaces/IMultiSig.sol b/packages/protocol/contracts/common/interfaces/IMultiSig.sol
new file mode 100644
index 00000000000..f7e37271c4d
--- /dev/null
+++ b/packages/protocol/contracts/common/interfaces/IMultiSig.sol
@@ -0,0 +1,10 @@
+// SPDX-License-Identifier: LGPL-3.0-only
+pragma solidity >=0.5.13 <0.9.0;
+
+interface IMultiSig {
+  function submitTransaction(
+    address destination,
+    uint256 value,
+    bytes calldata data
+  ) external returns (uint256);
+}
diff --git a/packages/protocol/contracts/common/libraries/B12.sol b/packages/protocol/contracts/common/libraries/B12.sol
deleted file mode 100644
index 4d39b1f08dc..00000000000
--- a/packages/protocol/contracts/common/libraries/B12.sol
+++ /dev/null
@@ -1,880 +0,0 @@
-pragma solidity ^0.5.13;
-
-// Included into celo-monorepo from
-// https://github.com/prestwich/b12-sol/blob/main/contracts/B12.sol
-// which is largely based on
-// https://github.com/ralexstokes/deposit-verifier/blob/master/deposit_verifier.sol
-
-import { TypedMemView } from "./TypedMemView.sol";
-
-library B12 {
-  using TypedMemView for bytes;
-  using TypedMemView for bytes29;
-
-  // Fp is a field element with the high-order part stored in `a`.
-  struct Fp {
-    uint256 a;
-    uint256 b;
-  }
-
-  // Fp2 is an extension field element with the coefficient of the
-  // quadratic non-residue stored in `b`, i.e. p = a + i * b
-  struct Fp2 {
-    Fp a;
-    Fp b;
-  }
-
-  // G1Point represents a point on BLS12-377 over Fp with coordinates (X,Y);
-  struct G1Point {
-    Fp X;
-    Fp Y;
-  }
-
-  // G2Point represents a point on BLS12-377 over Fp2 with coordinates (X,Y);
-  struct G2Point {
-    Fp2 X;
-    Fp2 Y;
-  }
-
-  struct G1MultiExpArg {
-    G1Point point;
-    uint256 scalar;
-  }
-
-  struct G2MultiExpArg {
-    G2Point point;
-    uint256 scalar;
-  }
-
-  struct PairingArg {
-    G1Point g1;
-    G2Point g2;
-  }
-
-  // Base field modulus from https://eips.ethereum.org/EIPS/eip-2539#specification
-  uint256 constant BLS12_377_BASE_A = 0x1ae3a4617c510eac63b05c06ca1493b;
-  uint256 constant BLS12_377_BASE_B =
-    0x1a22d9f300f5138f1ef3622fba094800170b5d44300000008508c00000000001;
-
-  function fpModExp(
-    Fp memory base,
-    uint256 exponent,
-    Fp memory modulus
-  ) internal view returns (Fp memory) {
-    uint256 base1 = base.a;
-    uint256 base2 = base.b;
-    uint256 modulus1 = modulus.a;
-    uint256 modulus2 = modulus.b;
-    bytes memory arg = new bytes(3 * 32 + 32 + 64 + 64);
-    bytes memory ret = new bytes(64);
-    uint256 result1;
-    uint256 result2;
-    assembly {
-      // length of base, exponent, modulus
-      mstore(add(arg, 0x20), 0x40)
-      mstore(add(arg, 0x40), 0x20)
-      mstore(add(arg, 0x60), 0x40)
-
-      // assign base, exponent, modulus
-      mstore(add(arg, 0x80), base1)
-      mstore(add(arg, 0xa0), base2)
-      mstore(add(arg, 0xc0), exponent)
-      mstore(add(arg, 0xe0), modulus1)
-      mstore(add(arg, 0x100), modulus2)
-
-      // call the precompiled contract BigModExp (0x05)
-      let success := staticcall(gas, 0x05, add(arg, 0x20), 0x100, add(ret, 0x20), 0x40)
-      switch success
-      case 0 {
-        revert(0x0, 0x0)
-      }
-      default {
-        result1 := mload(add(0x20, ret))
-        result2 := mload(add(0x40, ret))
-      }
-    }
-    return Fp(result1, result2);
-  }
-
-  function fpModExp2(
-    Fp memory base,
-    uint256 idx,
-    uint256 exponent,
-    Fp memory modulus
-  ) internal view returns (Fp memory) {
-    uint256 base1 = base.a;
-    uint256 base2 = base.b;
-    uint256 modulus1 = modulus.a;
-    uint256 modulus2 = modulus.b;
-    bytes memory arg = new bytes(3 * 32 + 62 + 64 + 32 + idx);
-    bytes memory ret = new bytes(64);
-    uint256 result1;
-    uint256 result2;
-    assembly {
-      // length of base, exponent, modulus
-      mstore(add(arg, 0x20), add(0x40, idx))
-      mstore(add(arg, 0x40), 0x20)
-      mstore(add(arg, 0x60), 0x40)
-
-      // assign base, exponent, modulus
-      mstore(add(arg, 0x80), base1)
-      mstore(add(arg, 0xa0), base2)
-      mstore(add(arg, add(idx, 0xc0)), exponent)
-      mstore(add(arg, add(idx, 0xe0)), modulus1)
-      mstore(add(arg, add(idx, 0x100)), modulus2)
-
-      // call the precompiled contract BigModExp (0x05)
-      let success := staticcall(gas, 0x05, add(arg, 0x20), add(idx, 0x100), add(ret, 0x20), 0x40)
-      switch success
-      case 0 {
-        revert(0x0, 0x0)
-      }
-      default {
-        result1 := mload(add(0x20, ret))
-        result2 := mload(add(0x40, ret))
-      }
-    }
-    return Fp(result1, result2);
-  }
-
-  function fpMul(Fp memory a, Fp memory b) internal view returns (Fp memory) {
-    uint256 a1 = uint128(a.b);
-    uint256 a2 = uint128(a.b >> 128);
-    uint256 a3 = uint128(a.a);
-    uint256 a4 = uint128(a.a >> 128);
-    uint256 b1 = uint128(b.b);
-    uint256 b2 = uint128(b.b >> 128);
-    uint256 b3 = uint128(b.a);
-    uint256 b4 = uint128(b.a >> 128);
-    Fp memory res = fpNormal2(Fp(0, a1 * b1), 0);
-    res = fpAdd(res, fpNormal2(fpAdd2(a1 * b2, a2 * b1), 16));
-    res = fpAdd(res, fpNormal2(fpAdd3(a1 * b3, a2 * b2, a3 * b1), 32));
-    res = fpAdd(res, fpNormal2(fpAdd4(a1 * b4, a2 * b3, a3 * b2, a4 * b1), 48));
-    res = fpAdd(res, fpNormal2(fpAdd3(a2 * b4, a3 * b3, a4 * b2), 64));
-    res = fpAdd(res, fpNormal2(fpAdd2(a3 * b4, a4 * b3), 96));
-    res = fpAdd(res, fpNormal2(Fp(0, a4 * b4), 128));
-    return fpNormal(res);
-  }
-
-  function fp2Normal(Fp2 memory a) internal view returns (Fp2 memory) {
-    return Fp2(fpNormal(a.a), fpNormal(a.b));
-  }
-
-  function fp2Mul(Fp2 memory a, Fp2 memory b) internal view returns (Fp2 memory) {
-    Fp memory non_residue = B12.Fp(
-      0x01ae3a4617c510eac63b05c06ca1493b,
-
0x1a22d9f300f5138f1ef3622fba094800170b5d44300000008508bffffffffffc - ); - - Fp memory v0 = fpMul(a.a, b.a); - Fp memory v1 = fpMul(a.b, b.b); - - Fp memory res1 = fpAdd(a.b, a.a); - res1 = fpMul(res1, fpAdd(b.a, b.b)); - res1 = fpSub(res1, v0); - res1 = fpSub(res1, v1); - Fp memory res0 = fpAdd(v0, fpMul(v1, non_residue)); - return Fp2(res0, res1); - } - - function fpNormal2(Fp memory a, uint256 idx) internal view returns (Fp memory) { - return fpModExp2(a, idx, 1, Fp(BLS12_377_BASE_A, BLS12_377_BASE_B)); - } - - function fpNormal(Fp memory a) internal view returns (Fp memory) { - return fpModExp(a, 1, Fp(BLS12_377_BASE_A, BLS12_377_BASE_B)); - } - - function mapToG2( - Fp2 memory x, - Fp2 memory hint1, - Fp2 memory hint2, - bool greatest - ) internal view returns (G2Point memory) { - Fp2 memory one = Fp2(Fp(0, 1), Fp(0, 0)); - Fp2 memory res = fp2Add(fp2Mul(x, fp2Mul(x, x)), one); - Fp2 memory sqhint1 = fp2Mul(hint1, hint1); - Fp2 memory sqhint2 = fp2Mul(hint2, hint2); - require(Fp2Eq(sqhint1, res), "y1 not sqrt"); - require(Fp2Eq(sqhint2, res), "y2 not sqrt"); - require(fp2Gt(hint1, hint2), "y1 not greatest"); - G2Point memory p = G2Point(x, greatest ? hint1 : hint2); - return p; - } - - function mapToG1( - Fp memory x, - Fp memory hint1, - Fp memory hint2, - bool greatest - ) internal view returns (G1Point memory) { - Fp memory one = Fp(0, 1); - Fp memory res = fpAdd(fpModExp(x, 3, Fp(BLS12_377_BASE_A, BLS12_377_BASE_B)), one); - Fp memory sqhint1 = fpModExp(hint1, 2, Fp(BLS12_377_BASE_A, BLS12_377_BASE_B)); - Fp memory sqhint2 = fpModExp(hint2, 2, Fp(BLS12_377_BASE_A, BLS12_377_BASE_B)); - require(FpEq(sqhint1, res), "y1 not sqrt"); - require(FpEq(sqhint2, res), "y2 not sqrt"); - require(fpGt(hint1, hint2), "y1 not greatest"); - return G1Point(x, greatest ? 
hint1 : hint2); - } - - function g1Add( - G1Point memory a, - G1Point memory b, - uint8 precompile, - uint256 gasEstimate - ) internal view returns (G1Point memory c) { - uint256[8] memory input; - input[0] = a.X.a; - input[1] = a.X.b; - input[2] = a.Y.a; - input[3] = a.Y.b; - - input[4] = b.X.a; - input[5] = b.X.b; - input[6] = b.Y.a; - input[7] = b.Y.b; - - bool success; - assembly { - success := staticcall(gasEstimate, precompile, input, 256, input, 128) - // deallocate the input, leaving dirty memory - mstore(0x40, input) - } - - require(success, "g1 add precompile failed"); - c.X.a = input[0]; - c.X.b = input[1]; - c.Y.a = input[2]; - c.Y.b = input[3]; - } - - // Overwrites A - function g1Mul( - G1Point memory a, - uint256 scalar, - uint8 precompile, - uint256 gasEstimate - ) internal view returns (G1Point memory c) { - uint256[5] memory input; - input[0] = a.X.a; - input[1] = a.X.b; - input[2] = a.Y.a; - input[3] = a.Y.b; - - input[4] = scalar; - - bool success; - assembly { - success := staticcall( - gasEstimate, - precompile, - input, - 160, - input, // reuse the memory to avoid growing - 128 - ) - // deallocate the input, leaving dirty memory - mstore(0x40, input) - } - require(success, "g1 mul precompile failed"); - c.X.a = input[0]; - c.X.b = input[1]; - c.Y.a = input[2]; - c.Y.b = input[3]; - } - - function g1MultiExp( - G1MultiExpArg[] memory argVec, - uint8 precompile, - uint256 gasEstimate - ) internal view returns (G1Point memory c) { - uint256[] memory input = new uint256[](argVec.length * 5); - // hate this - for (uint256 i = 0; i < argVec.length; i++) { - input[i * 5 + 0] = argVec[i].point.X.a; - input[i * 5 + 1] = argVec[i].point.X.b; - input[i * 5 + 2] = argVec[i].point.Y.a; - input[i * 5 + 3] = argVec[i].point.Y.b; - input[i * 5 + 4] = argVec[i].scalar; - } - - bool success; - assembly { - success := staticcall( - gasEstimate, - precompile, - add(input, 0x20), - mul(mload(input), 0x20), - add(input, 0x20), - 128 - ) - // deallocate the input, 
leaving dirty memory - mstore(0x40, input) - } - require(success, "g1 multiExp precompile failed"); - c.X.a = input[0]; - c.X.b = input[1]; - c.Y.a = input[2]; - c.Y.b = input[3]; - } - - function g2Add( - G2Point memory a, - G2Point memory b, - uint8 precompile, - uint256 gasEstimate - ) internal view returns (G2Point memory c) { - uint256[16] memory input; - input[0] = a.X.a.a; - input[1] = a.X.a.b; - input[2] = a.X.b.a; - input[3] = a.X.b.b; - - input[4] = a.Y.a.a; - input[5] = a.Y.a.b; - input[6] = a.Y.b.a; - input[7] = a.Y.b.b; - - input[8] = b.X.a.a; - input[9] = b.X.a.b; - input[10] = b.X.b.a; - input[11] = b.X.b.b; - - input[12] = b.Y.a.a; - input[13] = b.Y.a.b; - input[14] = b.Y.b.a; - input[15] = b.Y.b.b; - - bool success; - assembly { - success := staticcall( - gasEstimate, - precompile, - input, - 512, - input, // reuse the memory to avoid growing - 256 - ) - // deallocate the input, leaving dirty memory - mstore(0x40, input) - } - require(success, "g2 add precompile failed"); - c.X.a.a = input[0]; - c.X.a.b = input[1]; - c.X.b.a = input[2]; - c.X.b.b = input[3]; - - c.Y.a.a = input[4]; - c.Y.a.b = input[5]; - c.Y.b.a = input[6]; - c.Y.b.b = input[7]; - } - - // Overwrites A - function g2Mul( - G2Point memory a, - uint256 scalar, - uint8 precompile, - uint256 gasEstimate - ) internal view { - uint256[9] memory input; - - input[0] = a.X.a.a; - input[1] = a.X.a.b; - input[2] = a.X.b.a; - input[3] = a.X.b.b; - - input[4] = a.Y.a.a; - input[5] = a.Y.a.b; - input[6] = a.Y.b.a; - input[7] = a.Y.b.b; - - input[8] = scalar; - - bool success; - assembly { - success := staticcall( - gasEstimate, - precompile, - input, - 288, - input, // reuse the memory to avoid growing - 256 - ) - // deallocate the input, leaving dirty memory - mstore(0x40, input) - } - require(success, "g2 mul precompile failed"); - a.X.a.a = input[0]; - a.X.a.b = input[1]; - a.X.b.a = input[2]; - a.X.b.b = input[3]; - a.Y.a.a = input[4]; - a.Y.a.b = input[5]; - a.Y.b.a = input[6]; - a.Y.b.b = 
input[7]; - } - - function g2MultiExp( - G2MultiExpArg[] memory argVec, - uint8 precompile, - uint256 gasEstimate - ) internal view returns (G2Point memory c) { - uint256[] memory input = new uint256[](argVec.length * 9); - // hate this - for (uint256 i = 0; i < input.length / 9; i += 1) { - uint256 idx = i * 9; - input[idx + 0] = argVec[i].point.X.a.a; - input[idx + 1] = argVec[i].point.X.a.b; - input[idx + 2] = argVec[i].point.X.b.a; - input[idx + 3] = argVec[i].point.X.b.b; - input[idx + 4] = argVec[i].point.Y.a.a; - input[idx + 5] = argVec[i].point.Y.a.b; - input[idx + 6] = argVec[i].point.Y.b.a; - input[idx + 7] = argVec[i].point.Y.b.b; - input[idx + 8] = argVec[i].scalar; - } - - bool success; - assembly { - success := staticcall( - gasEstimate, - precompile, - add(input, 0x20), - mul(mload(input), 0x20), // 288 bytes per arg - add(input, 0x20), // write directly to the already allocated result - 256 - ) - // deallocate the input, leaving dirty memory - mstore(0x40, input) - } - require(success, "g2 multiExp precompile failed"); - c.X.a.a = input[0]; - c.X.a.b = input[1]; - c.X.b.a = input[2]; - c.X.b.b = input[3]; - c.Y.a.a = input[4]; - c.Y.a.b = input[5]; - c.Y.b.a = input[6]; - c.Y.b.b = input[7]; - } - - function pairing( - PairingArg[] memory argVec, - uint8 precompile, - uint256 gasEstimate - ) internal view returns (bool result) { - uint256 len = argVec.length; - uint256[] memory input = new uint256[](len * 12); - - for (uint256 i = 0; i < len; i++) { - uint256 idx = i * 12; - input[idx + 0] = argVec[i].g1.X.a; - input[idx + 1] = argVec[i].g1.X.b; - input[idx + 2] = argVec[i].g1.Y.a; - input[idx + 3] = argVec[i].g1.Y.b; - input[idx + 4] = argVec[i].g2.X.a.a; - input[idx + 5] = argVec[i].g2.X.a.b; - input[idx + 6] = argVec[i].g2.X.b.a; - input[idx + 7] = argVec[i].g2.X.b.b; - input[idx + 8] = argVec[i].g2.Y.a.a; - input[idx + 9] = argVec[i].g2.Y.a.b; - input[idx + 10] = argVec[i].g2.Y.b.a; - input[idx + 11] = argVec[i].g2.Y.b.b; - } - - bool success; - 
assembly { - success := staticcall( - gasEstimate, - precompile, - add(input, 0x20), // the body of the array - mul(384, len), // 384 bytes per arg - mload(0x40), // write to earliest freemem - 32 - ) - result := mload(mload(0x40)) // load what we just wrote - // deallocate the input, leaving dirty memory - mstore(0x40, input) - } - require(success, "pairing precompile failed"); - } - - function fp2Add(Fp2 memory a, Fp2 memory b) internal pure returns (Fp2 memory) { - return Fp2(fpAdd(a.a, b.a), fpAdd(a.b, b.b)); - } - - function FpEq(Fp memory a, Fp memory b) internal pure returns (bool) { - return (a.a == b.a && a.b == b.b); - } - - function fpGt(Fp memory a, Fp memory b) internal pure returns (bool) { - return (a.a > b.a || (a.a == b.a && a.b > b.b)); - } - - function Fp2Eq(Fp2 memory a, Fp2 memory b) internal pure returns (bool) { - return FpEq(a.a, b.a) && FpEq(a.b, b.b); - } - - function fp2Gt(Fp2 memory a, Fp2 memory b) internal pure returns (bool) { - if (FpEq(a.b, b.b)) return fpGt(a.a, b.a); - else return fpGt(a.b, b.b); - } - - function fpAdd2(uint256 a, uint256 b) internal pure returns (Fp memory) { - return fpAdd(Fp(0, a), Fp(0, b)); - } - - function fpAdd3(uint256 a, uint256 b, uint256 c) internal pure returns (Fp memory) { - return fpAdd(Fp(0, a), fpAdd(Fp(0, b), Fp(0, c))); - } - - function fpAdd4(uint256 a, uint256 b, uint256 c, uint256 d) internal pure returns (Fp memory) { - return fpAdd(Fp(0, a), fpAdd(Fp(0, b), fpAdd(Fp(0, c), Fp(0, d)))); - } - - function fpAdd(Fp memory a, Fp memory b) internal pure returns (Fp memory) { - uint256 bb = a.b + b.b; - uint256 aa = a.a + b.a + (bb >= a.b && bb >= b.b ? 0 : 1); - return Fp(aa, bb); - } - - function fpSub(Fp memory a, Fp memory b) internal pure returns (Fp memory) { - Fp memory x = fpAdd(a, Fp(BLS12_377_BASE_A, BLS12_377_BASE_B)); - uint256 bb = x.b - b.b; - uint256 aa = x.a - b.a - (bb <= x.b ? 
0 : 1); - return Fp(aa, bb); - } - - function parsePointGen( - bytes memory h, - uint256 offset - ) internal pure returns (uint256, uint256, uint256) { - uint256 a = 0; - uint256 b = 0; - for (uint256 i = 0; i < 32; i++) { - uint256 byt = uint256(uint8(h[offset + i])); - b = b + (byt << (i * 8)); - } - for (uint256 i = 0; i < 15; i++) { - uint256 byt = uint256(uint8(h[offset + i + 32])); - a = a + (byt << (i * 8)); - } - return (a, b, uint256(uint8(h[offset + 47]))); - } - - function parsePoint(bytes memory h, uint256 offset) internal pure returns (Fp memory, bool) { - (uint256 a, uint256 b, uint256 byt) = parsePointGen(h, offset); - a = a + ((byt & 0x7f) << (15 * 8)); - return (Fp(a, b), byt & 0xa0 != 0); - } - - function parseSimplePoint(bytes memory h, uint256 offset) internal pure returns (Fp memory) { - Fp memory res = Fp(0, 0); - parseSimplePoint(h, offset, res); - return res; - } - - function parseSimplePoint(bytes memory h, uint256 offset, Fp memory p) internal pure { - uint256 a; - uint256 b; - assembly { - a := mload(add(0x20, add(h, offset))) - b := mload(add(0x40, add(h, offset))) - } - p.a = a; - p.b = b; - } - - function parsePoint(bytes memory h) internal pure returns (Fp memory, bool) { - return parsePoint(h, 0); - } - - function parseRandomPoint(bytes memory h) internal pure returns (Fp memory, bool) { - (uint256 a, uint256 b, uint256 byt) = parsePointGen(h, 0); - a = a + ((byt & 0x01) << (15 * 8)); - return (Fp(a, b), byt & 0x02 != 0); - } - - function readFp2(bytes memory h, uint256 offset) internal pure returns (Fp2 memory) { - Fp memory a = parseSimplePoint(h, offset); - Fp memory b = parseSimplePoint(h, 64 + offset); - return Fp2(a, b); - } - - function readFp2(bytes memory h, uint256 offset, Fp2 memory p) internal pure { - parseSimplePoint(h, offset, p.a); - parseSimplePoint(h, 64 + offset, p.b); - } - - function readG2(bytes memory h, uint256 offset) internal pure returns (G2Point memory) { - Fp2 memory a = readFp2(h, offset); - Fp2 memory b 
= readFp2(h, 128 + offset); - return G2Point(a, b); - } - - function readG2(bytes memory h, uint256 offset, G2Point memory p) internal pure { - readFp2(h, offset, p.X); - readFp2(h, 128 + offset, p.Y); - } - - function g1Eq(G1Point memory a, G1Point memory b) internal pure returns (bool) { - return FpEq(a.X, b.X) && FpEq(a.Y, b.Y); - } - - function g1Eq(G2Point memory a, G2Point memory b) internal pure returns (bool) { - return (Fp2Eq(a.X, b.X) && Fp2Eq(a.Y, b.Y)); - } - - function parseFp(bytes memory input, uint256 offset) internal pure returns (Fp memory ret) { - bytes29 ref = input.ref(0).postfix(input.length - offset, 0); - - ret.a = ref.indexUint(0, 32); - ret.b = ref.indexUint(32, 32); - } - - function parseFp2(bytes memory input, uint256 offset) internal pure returns (Fp2 memory ret) { - bytes29 ref = input.ref(0).postfix(input.length - offset, 0); - - ret.a.a = ref.indexUint(0, 32); - ret.a.b = ref.indexUint(32, 32); - ret.b.a = ref.indexUint(64, 32); - ret.b.b = ref.indexUint(96, 32); - } - - function parseCompactFp( - bytes memory input, - uint256 offset - ) internal pure returns (Fp memory ret) { - bytes29 ref = input.ref(0).postfix(input.length - offset, 0); - - ret.a = ref.indexUint(0, 16); - ret.b = ref.indexUint(16, 32); - } - - function parseCompactFp2( - bytes memory input, - uint256 offset - ) internal pure returns (Fp2 memory ret) { - bytes29 ref = input.ref(0).postfix(input.length - offset, 0); - - ret.a.a = ref.indexUint(48, 16); - ret.a.b = ref.indexUint(64, 32); - ret.b.a = ref.indexUint(0, 16); - ret.b.b = ref.indexUint(16, 32); - } - - function parseG1(bytes memory input, uint256 offset) internal pure returns (G1Point memory ret) { - // unchecked sub is safe due to view validity checks - bytes29 ref = input.ref(0).postfix(input.length - offset, 0); - - ret.X.a = ref.indexUint(0, 32); - ret.X.b = ref.indexUint(32, 32); - ret.Y.a = ref.indexUint(64, 32); - ret.Y.b = ref.indexUint(96, 32); - } - - function parseG2(bytes memory input, uint256 
offset) internal pure returns (G2Point memory ret) { - // unchecked sub is safe due to view validity checks - bytes29 ref = input.ref(0).postfix(input.length - offset, 0); - - ret.X.a.a = ref.indexUint(0, 32); - ret.X.a.b = ref.indexUint(32, 32); - ret.X.b.a = ref.indexUint(64, 32); - ret.X.b.b = ref.indexUint(96, 32); - ret.Y.a.a = ref.indexUint(128, 32); - ret.Y.a.b = ref.indexUint(160, 32); - ret.Y.b.a = ref.indexUint(192, 32); - ret.Y.b.b = ref.indexUint(224, 32); - } - - function serializeFp(Fp memory p) internal pure returns (bytes memory) { - return abi.encodePacked(p.a, p.b); - } - - function serializeFp2(Fp2 memory p) internal pure returns (bytes memory) { - return abi.encodePacked(p.a.a, p.a.b, p.b.a, p.b.b); - } - - function serializeG1(G1Point memory p) internal pure returns (bytes memory) { - return abi.encodePacked(p.X.a, p.X.b, p.Y.a, p.Y.b); - } - - function serializeG2(G2Point memory p) internal pure returns (bytes memory) { - return abi.encodePacked(p.X.a.a, p.X.a.b, p.X.b.a, p.X.b.b, p.Y.a.a, p.Y.a.b, p.Y.b.a, p.Y.b.b); - } -} - -library B12_381Lib { - using B12 for B12.G1Point; - using B12 for B12.G2Point; - - uint8 constant G1_ADD = 0xF2; - uint8 constant G1_MUL = 0xF1; - uint8 constant G1_MULTI_EXP = 0xF0; - uint8 constant G2_ADD = 0xEF; - uint8 constant G2_MUL = 0xEE; - uint8 constant G2_MULTI_EXP = 0xED; - uint8 constant PAIRING = 0xEC; - uint8 constant MAP_TO_G1 = 0xEB; - uint8 constant MAP_TO_G2 = 0xEA; - - function mapToG1(B12.Fp memory a) internal view returns (B12.G1Point memory b) { - uint256[2] memory input; - input[0] = a.a; - input[1] = a.b; - - bool success; - uint8 ADDR = MAP_TO_G1; - assembly { - success := staticcall( - 20000, - ADDR, - input, // the body of the array - 64, - b, // write directly to pre-allocated result - 128 - ) - // deallocate the input - mstore(add(input, 0), 0) - mstore(add(input, 0x20), 0) - mstore(0x40, input) - } - } - - function mapToG2(B12.Fp2 memory a) internal view returns (B12.G2Point memory b) { - 
uint256[4] memory input; - input[0] = a.a.a; - input[1] = a.a.b; - input[2] = a.b.a; - input[3] = a.b.b; - - bool success; - uint8 ADDR = MAP_TO_G2; - assembly { - success := staticcall( - 120000, - ADDR, - input, // the body of the array - 128, - b, // write directly to pre-allocated result - 256 - ) - // deallocate the input - mstore(add(input, 0), 0) - mstore(add(input, 0x20), 0) - mstore(add(input, 0x40), 0) - mstore(add(input, 0x60), 0) - mstore(0x40, input) - } - } - - function g1Add( - B12.G1Point memory a, - B12.G1Point memory b - ) internal view returns (B12.G1Point memory c) { - return a.g1Add(b, G1_ADD, 15000); - } - - function g1Mul( - B12.G1Point memory a, - uint256 scalar - ) internal view returns (B12.G1Point memory c) { - return a.g1Mul(scalar, G1_MUL, 50000); - } - - function g1MultiExp( - B12.G1MultiExpArg[] memory argVec - ) internal view returns (B12.G1Point memory c) { - uint256 roughCost = (argVec.length * 12000 * 1200) / 1000; - return B12.g1MultiExp(argVec, G1_MULTI_EXP, roughCost); - } - - function g2Add( - B12.G2Point memory a, - B12.G2Point memory b - ) internal view returns (B12.G2Point memory c) { - return a.g2Add(b, G2_ADD, 20000); - } - - function g2Mul(B12.G2Point memory a, uint256 scalar) internal view { - return a.g2Mul(scalar, G2_MUL, 60000); - } - - function g2MultiExp( - B12.G2MultiExpArg[] memory argVec - ) internal view returns (B12.G2Point memory c) { - uint256 roughCost = (argVec.length * 55000 * 1200) / 1000; - return B12.g2MultiExp(argVec, G2_MULTI_EXP, roughCost); - } - - function pairing(B12.PairingArg[] memory argVec) internal view returns (bool result) { - uint256 roughCost = (23000 * argVec.length) + 115000; - return B12.pairing(argVec, PAIRING, roughCost); - } - - function negativeP1() internal pure returns (B12.G1Point memory p) { - p.X.a = 31827880280837800241567138048534752271; - p.X.b = 88385725958748408079899006800036250932223001591707578097800747617502997169851; - p.Y.a = 22997279242622214937712647648895181298; 
- p.Y.b = 46816884707101390882112958134453447585552332943769894357249934112654335001290; - } -} - -library B12_377Lib { - using B12 for B12.G1Point; - using B12 for B12.G2Point; - - uint8 constant G1_ADD = 0xE9; - uint8 constant G1_MUL = 0xE8; - uint8 constant G1_MULTI_EXP = 0xE7; - uint8 constant G2_ADD = 0xE6; - uint8 constant G2_MUL = 0xE5; - uint8 constant G2_MULTI_EXP = 0xE4; - uint8 constant PAIRING = 0xE3; - - function g1Add( - B12.G1Point memory a, - B12.G1Point memory b - ) internal view returns (B12.G1Point memory c) { - return a.g1Add(b, G1_ADD, 15000); - } - - function g1Mul( - B12.G1Point memory a, - uint256 scalar - ) internal view returns (B12.G1Point memory c) { - return a.g1Mul(scalar, G1_MUL, 50000); - } - - function g1MultiExp( - B12.G1MultiExpArg[] memory argVec - ) internal view returns (B12.G1Point memory c) { - uint256 roughCost = (argVec.length * 12000 * 1200) / 1000; - return B12.g1MultiExp(argVec, G1_MULTI_EXP, roughCost); - } - - function g2Add( - B12.G2Point memory a, - B12.G2Point memory b - ) internal view returns (B12.G2Point memory c) { - return a.g2Add(b, G2_ADD, 20000); - } - - function g2Mul(B12.G2Point memory a, uint256 scalar) internal view { - return a.g2Mul(scalar, G2_MUL, 60000); - } - - function g2MultiExp( - B12.G2MultiExpArg[] memory argVec - ) internal view returns (B12.G2Point memory c) { - uint256 roughCost = (argVec.length * 55000 * 1200) / 1000; - return B12.g2MultiExp(argVec, G2_MULTI_EXP, roughCost); - } - - function pairing(B12.PairingArg[] memory argVec) internal view returns (bool result) { - uint256 roughCost = (55000 * argVec.length) + 65000; - return B12.pairing(argVec, PAIRING, roughCost); - } -} diff --git a/packages/protocol/contracts/common/libraries/CIP20Lib.sol b/packages/protocol/contracts/common/libraries/CIP20Lib.sol deleted file mode 100644 index c8f1af27a04..00000000000 --- a/packages/protocol/contracts/common/libraries/CIP20Lib.sol +++ /dev/null @@ -1,141 +0,0 @@ -// From 
https://github.com/prestwich/cip20-sol -pragma solidity ^0.5.13; - -library CIP20Lib { - uint8 private constant CIP20_ADDRESS = 0xE2; - - uint8 private constant SHA3_256_SELECTOR = 0x00; - uint8 private constant SHA3_512_SELECTOR = 0x01; - uint8 private constant KECCAK_512_SELECTOR = 0x02; - uint8 private constant SHA2_512_SELECTOR = 0x03; - uint8 private constant BLAKE2S_SELECTOR = 0x10; - - // BLAKE2S_DEFAULT_CONFIG = createBlake2sConfig(32, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0) - bytes32 private constant BLAKE2S_DEFAULT_CONFIG = - 0x2000010100000000000000000000000000000000000000000000000000000000; - - // Accepts a fully formed input blob. This should include any config - // options and the preimage, but not the selector. - function executeCIP20( - bytes memory input, - uint8 selector, - uint256 output_len - ) internal view returns (bytes memory) { - uint8 addr = CIP20_ADDRESS; - bytes memory output = new bytes(output_len); - - // To avoid copying the input array (an unbounded cost) we store its - // length on the stack and then replace the length prefix for its - // in-memory representation with the selector. We then replace the - // length in memory after the precompile executes with it. 
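The `BLAKE2S_DEFAULT_CONFIG` constant above is documented as `createBlake2sConfig(32, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0)`. As a minimal sketch (an illustrative model, not part of the diff), the byte layout can be reproduced in Python using only the `writeU8` semantics shown later in `createBlake2sConfig` — digest size, key length, fanout, and depth packed into the four highest-order bytes of a 32-byte word:

```python
def write_u8(cfg: int, offset: int, val: int) -> int:
    # Mirrors CIP20Lib.writeU8: `offset` counts bytes from the most
    # significant end of a 32-byte (256-bit) word.
    assert offset <= 31, "out of bounds write"
    return cfg | (val << (8 * (32 - 1 - offset)))

def blake2s_config(digest_size: int, key_length: int,
                   fanout: int, depth: int) -> int:
    # Only the first four fields matter for the default config;
    # leaf length, node offset, etc. are all zero here.
    cfg = 0
    cfg = write_u8(cfg, 0, digest_size)
    cfg = write_u8(cfg, 1, key_length)
    cfg = write_u8(cfg, 2, fanout)
    cfg = write_u8(cfg, 3, depth)
    return cfg

BLAKE2S_DEFAULT_CONFIG = int(
    "2000010100000000000000000000000000000000000000000000000000000000", 16
)
# digest_size=32 (0x20), key_length=0, fanout=1, depth=1
assert blake2s_config(32, 0, 1, 1) == BLAKE2S_DEFAULT_CONFIG
```

The assertion confirms the constant embedded in the deleted library matches its documented derivation.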
- uint256 len = input.length; - - bool success; - assembly { - mstore(input, selector) // selector - - success := staticcall( - sub(gas, 2000), - addr, - add(input, 0x1F), // location is shifted 1 byte for selector - add(len, 0x01), // length w/ selector - add(output, 0x20), // location - mload(output) // length - ) - - // Restore the input array length prefix - mstore(input, len) - } - - require(success, "CIP-20 hashing failed"); - return output; - } - - function sha3_256(bytes memory input) internal view returns (bytes memory) { - return executeCIP20(input, SHA3_256_SELECTOR, 32); - } - - function sha3_512(bytes memory input) internal view returns (bytes memory) { - return executeCIP20(input, SHA3_512_SELECTOR, 64); - } - - function keccak512(bytes memory input) internal view returns (bytes memory) { - return executeCIP20(input, KECCAK_512_SELECTOR, 64); - } - - function sha2_512(bytes memory input) internal view returns (bytes memory) { - return executeCIP20(input, SHA2_512_SELECTOR, 64); - } - - function blake2sWithConfig( - bytes32 config, - bytes memory key, - bytes memory preimage - ) internal view returns (bytes memory) { - require( - key.length == uint256(config >> (8 * 30)) & 0xff, - "CIP20Lib/blake2sWithConfig - Provided key length does not match key length in config" - ); - bytes memory configuredInput = abi.encodePacked(config, preimage); - return executeCIP20(configuredInput, BLAKE2S_SELECTOR, uint256(uint8(config[0]))); - } - - function blake2s(bytes memory preimage) internal view returns (bytes memory) { - return blake2sWithConfig(BLAKE2S_DEFAULT_CONFIG, hex"", preimage); - } - - function createBlake2sConfig( - uint8 digestSize, - uint8 keyLength, - uint8 fanout, - uint8 depth, - uint32 leafLength, - uint32 nodeOffset, - uint16 xofDigestLength, - uint8 nodeDepth, - uint8 innerLength, - bytes8 salt, - bytes8 personalize - ) internal pure returns (bytes32 config) { - require(keyLength <= 32, "CIP20Lib/createBlake2sConfig -- keyLength must be 32 or 
less"); - config = writeU8(config, 0, digestSize); - config = writeU8(config, 1, keyLength); - - config = writeU8(config, 2, fanout); - config = writeU8(config, 3, depth); - config = writeLEU32(config, 4, leafLength); - config = writeLEU32(config, 8, nodeOffset); - config = writeLEU16(config, 12, xofDigestLength); - config = writeU8(config, 14, nodeDepth); - config = writeU8(config, 15, innerLength); - - config |= bytes32(uint256(uint64(salt))) << (8 * 8); - config |= bytes32(uint256(uint64(personalize))) << (8 * 0); - return config; - } - - // This function relies on alignment mechanics. Explicit conversion to - // `bytes` types shorter than 32 results in left re-alignment. To avoid - // that, we convert the bytes32 to uint256 instead of converting the uint8 - // to a bytes1. - function writeU8(bytes32 b, uint8 offset, uint8 toWrite) private pure returns (bytes32) { - require(offset <= 31, "CIP20Lib/writeU8 -- out of bounds write"); - uint8 shift = 8 * (32 - 1 - offset); - bytes32 res = bytes32(uint256(b) | (uint256(toWrite) << shift)); - return res; - } - - function writeLEU32(bytes32 b, uint8 offset, uint32 toWrite) private pure returns (bytes32) { - b = writeU8(b, offset + 0, uint8(toWrite >> 0)); - b = writeU8(b, offset + 1, uint8(toWrite >> 8)); - b = writeU8(b, offset + 2, uint8(toWrite >> 16)); - b = writeU8(b, offset + 3, uint8(toWrite >> 24)); - return b; - } - - function writeLEU16(bytes32 b, uint8 offset, uint16 toWrite) private pure returns (bytes32) { - b = writeU8(b, offset + 0, uint8(toWrite >> 0)); - b = writeU8(b, offset + 1, uint8(toWrite >> 8)); - return b; - } -} diff --git a/packages/protocol/contracts/common/libraries/TypedMemView.sol b/packages/protocol/contracts/common/libraries/TypedMemView.sol deleted file mode 100644 index d33b001f88c..00000000000 --- a/packages/protocol/contracts/common/libraries/TypedMemView.sol +++ /dev/null @@ -1,830 +0,0 @@ -// SPDX-License-Identifier: MIT OR Apache-2.0 -// Original authors: 
https://github.com/summa-tx/memview-sol -pragma solidity >=0.5.10 <0.8.0; - -import { SafeMathMem } from "./SafeMathMem.sol"; - -library TypedMemView { - using SafeMathMem for uint256; - - // Why does this exist? - // the solidity `bytes memory` type has a few weaknesses. - // 1. You can't index ranges effectively - // 2. You can't slice without copying - // 3. The underlying data may represent any type - // 4. Solidity never deallocates memory, and memory costs grow - // superlinearly - - // By using a memory view instead of a `bytes memory` we get the following - // advantages: - // 1. Slices are done on the stack, by manipulating the pointer - // 2. We can index arbitrary ranges and quickly convert them to stack types - // 3. We can insert type info into the pointer, and typecheck at runtime - - // This makes `TypedMemView` a useful tool for efficient zero-copy - // algorithms. - - // Why bytes29? - // We want to avoid confusion between views, digests, and other common - // types so we chose a large and uncommonly used odd number of bytes - // - // Note that while bytes are left-aligned in a word, integers and addresses - // are right-aligned. This means when working in assembly we have to - // account for the 3 unused bytes on the righthand side - // - // First 5 bytes are a type flag. - // - ff_ffff_fffe is reserved for unknown type. - // - ff_ffff_ffff is reserved for invalid types/errors. - // next 12 are memory address - // next 12 are len - // bottom 3 bytes are empty - - // Assumptions: - // - non-modification of memory. - // - No Solidity updates - // - - wrt free mem point - // - - wrt bytes representation in memory - // - - wrt memory addressing in general - - // Usage: - // - create type constants - // - use `assertType` for runtime type assertions - // - - unfortunately we can't do this at compile time yet :( - // - recommended: implement modifiers that perform type checking - // - - e.g. 
- // - - `uint40 constant MY_TYPE = 3;` - // - - ` modifer onlyMyType(bytes29 myView) { myView.assertType(MY_TYPE); }` - // - instantiate a typed view from a bytearray using `ref` - // - use `index` to inspect the contents of the view - // - use `slice` to create smaller views into the same memory - // - - `slice` can increase the offset - // - - `slice can decrease the length` - // - - must specify the output type of `slice` - // - - `slice` will return a null view if you try to overrun - // - - make sure to explicitly check for this with `notNull` or `assertType` - // - use `equal` for typed comparisons. - - // The null view - bytes29 public constant NULL = hex"ffffffffffffffffffffffffffffffffffffffffffffffffffffffffff"; - // Mask a low uint96 - uint256 constant LOW_12_MASK = 0xffffffffffffffffffffffff; - // Shift constants - uint8 constant SHIFT_TO_LEN = 24; - uint8 constant SHIFT_TO_LOC = 96 + 24; - uint8 constant SHIFT_TO_TYPE = 96 + 96 + 24; - // For nibble encoding - bytes private constant NIBBLE_LOOKUP = "0123456789abcdef"; - - /** - * @notice Copies the referenced memory to a new loc in memory, returning a `bytes` pointing to - * the new memory - * @dev Shortcuts if the pointers are identical, otherwise compares type and digest. - * @param memView The view - * @return ret - The view pointing to the new memory - */ - function clone(bytes29 memView) internal view returns (bytes memory ret) { - uint256 ptr; - uint256 _len = len(memView); - assembly { - // solium-disable-previous-line security/no-inline-assembly - ptr := mload(0x40) // load unused memory pointer - ret := ptr - } - unsafeCopyTo(memView, ptr + 0x20); - assembly { - // solium-disable-previous-line security/no-inline-assembly - mstore(0x40, add(add(ptr, _len), 0x20)) // write new unused pointer - mstore(ptr, _len) // write len of new array (in bytes) - } - } - - /** - * @notice Produce the keccak256 digest of the concatenated contents of multiple views. 
-   * @param memViews The views
-   * @return bytes32 - The keccak256 digest
-   */
-  function joinKeccak(bytes29[] memory memViews) internal view returns (bytes32) {
-    uint256 ptr;
-    assembly {
-      // solium-disable-previous-line security/no-inline-assembly
-      ptr := mload(0x40) // load unused memory pointer
-    }
-    return keccak(unsafeJoin(memViews, ptr));
-  }
-
-  /**
-   * @notice Produce the sha256 digest of the concatenated contents of multiple views.
-   * @param memViews The views
-   * @return bytes32 - The sha256 digest
-   */
-  function joinSha2(bytes29[] memory memViews) internal view returns (bytes32) {
-    uint256 ptr;
-    assembly {
-      // solium-disable-previous-line security/no-inline-assembly
-      ptr := mload(0x40) // load unused memory pointer
-    }
-    return sha2(unsafeJoin(memViews, ptr));
-  }
-
-  /**
-   * @notice Copies all views and joins them into a new bytearray.
-   * @param memViews The views
-   * @return ret - The new byte array
-   */
-  function join(bytes29[] memory memViews) internal view returns (bytes memory ret) {
-    uint256 ptr;
-    assembly {
-      // solium-disable-previous-line security/no-inline-assembly
-      ptr := mload(0x40) // load unused memory pointer
-    }
-
-    bytes29 _newView = unsafeJoin(memViews, ptr + 0x20);
-    uint256 _written = len(_newView);
-    uint256 _footprint = footprint(_newView);
-
-    assembly {
-      // solium-disable-previous-line security/no-inline-assembly
-      // store the length
-      mstore(ptr, _written)
-      // new pointer is old + 0x20 + the footprint of the body
-      mstore(0x40, add(add(ptr, _footprint), 0x20))
-      ret := ptr
-    }
-  }
-
-  /**
-   * @notice Return the sha2 digest of the underlying memory.
-   * @dev We explicitly deallocate memory afterwards.
-   * @param memView The view
-   * @return digest - The sha2 hash of the underlying memory
-   */
-  function sha2(bytes29 memView) internal view returns (bytes32 digest) {
-    uint256 _loc = loc(memView);
-    uint256 _len = len(memView);
-
-    bool res;
-    assembly {
-      // solium-disable-previous-line security/no-inline-assembly
-      let ptr := mload(0x40)
-      res := staticcall(gas, 2, _loc, _len, ptr, 0x20) // sha2
-      digest := mload(ptr)
-    }
-    require(res, "sha2 OOG");
-  }
-
-  /**
-   * @notice Implements bitcoin's hash160 (rmd160(sha2()))
-   * @param memView The pre-image
-   * @return digest - the Digest
-   */
-  function hash160(bytes29 memView) internal view returns (bytes20 digest) {
-    uint256 _loc = loc(memView);
-    uint256 _len = len(memView);
-    bool res;
-    assembly {
-      // solium-disable-previous-line security/no-inline-assembly
-      let ptr := mload(0x40)
-      res := staticcall(gas, 2, _loc, _len, ptr, 0x20) // sha2
-      res := and(res, staticcall(gas, 3, ptr, 0x20, ptr, 0x20)) // rmd160
-      digest := mload(add(ptr, 0xc)) // return value is 0-prefixed.
-    }
-    require(res, "hash160 OOG");
-  }
-
-  /**
-   * @notice Implements bitcoin's hash256 (double sha2)
-   * @param memView A view of the preimage
-   * @return digest - the Digest
-   */
-  function hash256(bytes29 memView) internal view returns (bytes32 digest) {
-    uint256 _loc = loc(memView);
-    uint256 _len = len(memView);
-    bool res;
-    assembly {
-      // solium-disable-previous-line security/no-inline-assembly
-      let ptr := mload(0x40)
-      res := staticcall(gas, 2, _loc, _len, ptr, 0x20) // sha2 #1
-      res := and(res, staticcall(gas, 2, ptr, 0x20, ptr, 0x20)) // sha2 #2
-      digest := mload(ptr)
-    }
-    require(res, "hash256 OOG");
-  }
-
-  /**
-   * @notice Returns the encoded hex character that represents the lower 4 bits of the argument.
- * @param _byte The byte - * @return _char The encoded hex character - */ - function nibbleHex(uint8 _byte) internal pure returns (uint8 _char) { - uint8 _nibble = _byte & 0x0f; // keep bottom 4, 0 top 4 - _char = uint8(NIBBLE_LOOKUP[_nibble]); - } - /** - * @notice Returns a uint16 containing the hex-encoded byte. - * @param _b The byte - * @return encoded - The hex-encoded byte - */ - function byteHex(uint8 _b) internal pure returns (uint16 encoded) { - encoded |= nibbleHex(_b >> 4); // top 4 bits - encoded <<= 8; - encoded |= nibbleHex(_b); // lower 4 bits - } - - /** - * @notice Encodes the uint256 to hex. `first` contains the encoded top 16 bytes. - * `second` contains the encoded lower 16 bytes. - * - * @param _b The 32 bytes as uint256 - * @return first - The top 16 bytes - * @return second - The bottom 16 bytes - */ - function encodeHex(uint256 _b) internal pure returns (uint256 first, uint256 second) { - for (uint8 i = 31; i > 15; i -= 1) { - uint8 _byte = uint8(_b >> (i * 8)); - first |= byteHex(_byte); - if (i != 16) { - first <<= 16; - } - } - - // abusing underflow here =_= - for (uint8 i = 15; i < 255; i -= 1) { - uint8 _byte = uint8(_b >> (i * 8)); - second |= byteHex(_byte); - if (i != 0) { - second <<= 16; - } - } - } - - /** - * @notice Changes the endianness of a uint256. 
- * @dev https://graphics.stanford.edu/~seander/bithacks.html#ReverseParallel - * @param _b The unsigned integer to reverse - * @return v - The reversed value - */ - function reverseUint256(uint256 _b) internal pure returns (uint256 v) { - v = _b; - - // swap bytes - v = - ((v >> 8) & 0x00FF00FF00FF00FF00FF00FF00FF00FF00FF00FF00FF00FF00FF00FF00FF00FF) | - ((v & 0x00FF00FF00FF00FF00FF00FF00FF00FF00FF00FF00FF00FF00FF00FF00FF00FF) << 8); - // swap 2-byte long pairs - v = - ((v >> 16) & 0x0000FFFF0000FFFF0000FFFF0000FFFF0000FFFF0000FFFF0000FFFF0000FFFF) | - ((v & 0x0000FFFF0000FFFF0000FFFF0000FFFF0000FFFF0000FFFF0000FFFF0000FFFF) << 16); - // swap 4-byte long pairs - v = - ((v >> 32) & 0x00000000FFFFFFFF00000000FFFFFFFF00000000FFFFFFFF00000000FFFFFFFF) | - ((v & 0x00000000FFFFFFFF00000000FFFFFFFF00000000FFFFFFFF00000000FFFFFFFF) << 32); - // swap 8-byte long pairs - v = - ((v >> 64) & 0x0000000000000000FFFFFFFFFFFFFFFF0000000000000000FFFFFFFFFFFFFFFF) | - ((v & 0x0000000000000000FFFFFFFFFFFFFFFF0000000000000000FFFFFFFFFFFFFFFF) << 64); - // swap 16-byte long pairs - v = (v >> 128) | (v << 128); - } - - /** - * @notice Return the null view. - * @return bytes29 - The null view - */ - function nullView() internal pure returns (bytes29) { - return NULL; - } - - /** - * @notice Check if the view is null. - * @return bool - True if the view is null - */ - function isNull(bytes29 memView) internal pure returns (bool) { - return memView == NULL; - } - - /** - * @notice Check if the view is not null. - * @return bool - True if the view is not null - */ - function notNull(bytes29 memView) internal pure returns (bool) { - return !isNull(memView); - } - - /** - * @notice Check if the view is of a valid type and points to a valid location - * in memory. - * @dev We perform this check by examining solidity's unallocated memory - * pointer and ensuring that the view's upper bound is less than that. 
- * @param memView The view - * @return ret - True if the view is valid - */ - function isValid(bytes29 memView) internal pure returns (bool ret) { - if (typeOf(memView) == 0xffffffffff) { - return false; - } - uint256 _end = end(memView); - assembly { - // solhint-disable-previous-line no-inline-assembly - ret := iszero(gt(_end, mload(0x40))) - } - } - - /** - * @notice Require that a typed memory view be valid. - * @dev Returns the view for easy chaining. - * @param memView The view - * @return bytes29 - The validated view - */ - function assertValid(bytes29 memView) internal pure returns (bytes29) { - require(isValid(memView), "Validity assertion failed"); - return memView; - } - - /** - * @notice Return true if the memview is of the expected type. Otherwise false. - * @param memView The view - * @param _expected The expected type - * @return bool - True if the memview is of the expected type - */ - function isType(bytes29 memView, uint40 _expected) internal pure returns (bool) { - return typeOf(memView) == _expected; - } - - /** - * @notice Require that a typed memory view has a specific type. - * @dev Returns the view for easy chaining. - * @param memView The view - * @param _expected The expected type - * @return bytes29 - The view with validated type - */ - function assertType(bytes29 memView, uint40 _expected) internal pure returns (bytes29) { - if (!isType(memView, _expected)) { - (, uint256 g) = encodeHex(uint256(typeOf(memView))); - (, uint256 e) = encodeHex(uint256(_expected)); - string memory err = string( - abi.encodePacked("Type assertion failed. Got 0x", uint80(g), ". Expected 0x", uint80(e)) - ); - revert(err); - } - return memView; - } - - /** - * @notice Return an identical view with a different type. 
- * @param memView The view - * @param _newType The new type - * @return newView - The new view with the specified type - */ - function castTo(bytes29 memView, uint40 _newType) internal pure returns (bytes29 newView) { - // then | in the new type - uint256 _typeShift = SHIFT_TO_TYPE; - uint256 _typeBits = 40; - assembly { - // solium-disable-previous-line security/no-inline-assembly - // shift off the top 5 bytes - newView := or(newView, shr(_typeBits, shl(_typeBits, memView))) - newView := or(newView, shl(_typeShift, _newType)) - } - } - - /** - * @notice Instantiate a new memory view. This should generally not be called - * directly. Prefer `ref` wherever possible. - * @dev Instantiate a new memory view. This should generally not be called - * directly. Prefer `ref` wherever possible. - * @param _type The type - * @param _loc The memory address - * @param _len The length - * @return newView - The new view with the specified type, location and length - */ - function build( - uint256 _type, - uint256 _loc, - uint256 _len - ) internal pure returns (bytes29 newView) { - uint256 _end = _loc.add(_len); - assembly { - // solium-disable-previous-line security/no-inline-assembly - if gt(_end, mload(0x40)) { - _end := 0 - } - } - if (_end == 0) { - return NULL; - } - newView = unsafeBuildUnchecked(_type, _loc, _len); - } - - /** - * @notice Instantiate a memory view from a byte array. - * @dev Note that due to Solidity memory representation, it is not possible to - * implement a deref, as the `bytes` type stores its len in memory. 
- * @param arr The byte array - * @param newType The type - * @return bytes29 - The memory view - */ - function ref(bytes memory arr, uint40 newType) internal pure returns (bytes29) { - uint256 _len = arr.length; - - uint256 _loc; - assembly { - // solium-disable-previous-line security/no-inline-assembly - _loc := add(arr, 0x20) // our view is of the data, not the struct - } - - return build(newType, _loc, _len); - } - - /** - * @notice Return the associated type information. - * @param memView The memory view - * @return _type - The type associated with the view - */ - function typeOf(bytes29 memView) internal pure returns (uint40 _type) { - uint256 _shift = SHIFT_TO_TYPE; - assembly { - // solium-disable-previous-line security/no-inline-assembly - _type := shr(_shift, memView) // shift out lower 27 bytes - } - } - - /** - * @notice Optimized type comparison. Checks that the 5-byte type flag is equal. - * @param left The first view - * @param right The second view - * @return bool - True if the 5-byte type flag is equal - */ - function sameType(bytes29 left, bytes29 right) internal pure returns (bool) { - return (left ^ right) >> SHIFT_TO_TYPE == 0; - } - - /** - * @notice Return the memory address of the underlying bytes. - * @param memView The view - * @return _loc - The memory address - */ - function loc(bytes29 memView) internal pure returns (uint96 _loc) { - uint256 _mask = LOW_12_MASK; // assembly can't use globals - uint256 _shift = SHIFT_TO_LOC; - assembly { - // solium-disable-previous-line security/no-inline-assembly - _loc := and(shr(_shift, memView), _mask) - } - } - - /** - * @notice The number of memory words this memory view occupies, rounded up. - * @param memView The view - * @return uint256 - The number of memory words - */ - function words(bytes29 memView) internal pure returns (uint256) { - return uint256(len(memView)).add(31) / 32; - } - - /** - * @notice The in-memory footprint of a fresh copy of the view. 
-   * @param memView The view
-   * @return uint256 - The in-memory footprint of a fresh copy of the view.
-   */
-  function footprint(bytes29 memView) internal pure returns (uint256) {
-    return words(memView) * 32;
-  }
-
-  /**
-   * @notice The number of bytes of the view.
-   * @param memView The view
-   * @return _len - The length of the view
-   */
-  function len(bytes29 memView) internal pure returns (uint96 _len) {
-    uint256 _mask = LOW_12_MASK; // assembly can't use globals
-    uint256 _emptyBits = 24;
-    assembly {
-      // solium-disable-previous-line security/no-inline-assembly
-      _len := and(shr(_emptyBits, memView), _mask)
-    }
-  }
-
-  /**
-   * @notice Returns the endpoint of `memView`.
-   * @param memView The view
-   * @return uint256 - The endpoint of `memView`
-   */
-  function end(bytes29 memView) internal pure returns (uint256) {
-    return loc(memView) + len(memView);
-  }
-
-  /**
-   * @notice Safe slicing without memory modification.
-   * @param memView The view
-   * @param _index The start index
-   * @param _len The length
-   * @param newType The new type
-   * @return bytes29 - The new view
-   */
-  function slice(
-    bytes29 memView,
-    uint256 _index,
-    uint256 _len,
-    uint40 newType
-  ) internal pure returns (bytes29) {
-    uint256 _loc = loc(memView);
-
-    // Ensure it doesn't overrun the view
-    if (_loc.add(_index).add(_len) > end(memView)) {
-      return NULL;
-    }
-
-    _loc = _loc.add(_index);
-    return build(newType, _loc, _len);
-  }
-
-  /**
-   * @notice Shortcut to `slice`. Gets a view representing the first `_len` bytes.
-   * @param memView The view
-   * @param _len The length
-   * @param newType The new type
-   * @return bytes29 - The new view
-   */
-  function prefix(bytes29 memView, uint256 _len, uint40 newType) internal pure returns (bytes29) {
-    return slice(memView, 0, _len, newType);
-  }
-
-  /**
-   * @notice Shortcut to `slice`. Gets a view representing the last `_len` bytes.
- * @param memView The view - * @param _len The length - * @param newType The new type - * @return bytes29 - The new view - */ - function postfix(bytes29 memView, uint256 _len, uint40 newType) internal pure returns (bytes29) { - return slice(memView, uint256(len(memView)).sub(_len), _len, newType); - } - - /** - * @notice Construct an error message for an indexing overrun. - * @param _loc The memory address - * @param _len The length - * @param _index The index - * @param _slice The slice where the overrun occurred - * @return err - The err - */ - function indexErrOverrun( - uint256 _loc, - uint256 _len, - uint256 _index, - uint256 _slice - ) internal pure returns (string memory err) { - (, uint256 a) = encodeHex(_loc); - (, uint256 b) = encodeHex(_len); - (, uint256 c) = encodeHex(_index); - (, uint256 d) = encodeHex(_slice); - err = string( - abi.encodePacked( - "TypedMemView/index - Overran the view. Slice is at 0x", - uint48(a), - " with length 0x", - uint48(b), - ". Attempted to index at offset 0x", - uint48(c), - " with length 0x", - uint48(d), - "." - ) - ); - } - - /** - * @notice Load up to 32 bytes from the view onto the stack. - * @dev Returns a bytes32 with only the `_bytes` highest bytes set. - * This can be immediately cast to a smaller fixed-length byte array. - * To automatically cast to an integer, use `indexUint`. 
- * @param memView The view - * @param _index The index - * @param _bytes The bytes - * @return result - The 32 byte result - */ - function index( - bytes29 memView, - uint256 _index, - uint8 _bytes - ) internal pure returns (bytes32 result) { - if (_bytes == 0) { - return bytes32(0); - } - if (_index.add(_bytes) > len(memView)) { - revert(indexErrOverrun(loc(memView), len(memView), _index, uint256(_bytes))); - } - require(_bytes <= 32, "TypedMemView/index - Attempted to index more than 32 bytes"); - - uint8 bitLength = _bytes * 8; - uint256 _loc = loc(memView); - uint256 _mask = leftMask(bitLength); - assembly { - // solium-disable-previous-line security/no-inline-assembly - result := and(mload(add(_loc, _index)), _mask) - } - } - - /** - * @notice Parse an unsigned integer from the view at `_index`. - * @dev Requires that the view have >= `_bytes` bytes following that index. - * @param memView The view - * @param _index The index - * @param _bytes The bytes - * @return result - The unsigned integer - */ - function indexUint( - bytes29 memView, - uint256 _index, - uint8 _bytes - ) internal pure returns (uint256 result) { - return uint256(index(memView, _index, _bytes)) >> ((32 - _bytes) * 8); - } - - /** - * @notice Parse an unsigned integer from LE bytes. - * @param memView The view - * @param _index The index - * @param _bytes The bytes - * @return result - The unsigned integer - */ - function indexLEUint( - bytes29 memView, - uint256 _index, - uint8 _bytes - ) internal pure returns (uint256 result) { - return reverseUint256(uint256(index(memView, _index, _bytes))); - } - - /** - * @notice Parse an address from the view at `_index`. Requires that the view have >= 20 bytes - * following that index. 
- * @param memView The view - * @param _index The index - * @return address - The address - */ - function indexAddress(bytes29 memView, uint256 _index) internal pure returns (address) { - return address(uint160(indexUint(memView, _index, 20))); - } - - /** - * @notice Return the keccak256 hash of the underlying memory - * @param memView The view - * @return digest - The keccak256 hash of the underlying memory - */ - function keccak(bytes29 memView) internal pure returns (bytes32 digest) { - uint256 _loc = loc(memView); - uint256 _len = len(memView); - assembly { - // solium-disable-previous-line security/no-inline-assembly - digest := keccak256(_loc, _len) - } - } - - /** - * @notice Return true if the underlying memory is equal. Else false. - * @param left The first view - * @param right The second view - * @return bool - True if the underlying memory is equal - */ - function untypedEqual(bytes29 left, bytes29 right) internal pure returns (bool) { - return (loc(left) == loc(right) && len(left) == len(right)) || keccak(left) == keccak(right); - } - - /** - * @notice Return false if the underlying memory is equal. Else true. - * @param left The first view - * @param right The second view - * @return bool - False if the underlying memory is equal - */ - function untypedNotEqual(bytes29 left, bytes29 right) internal pure returns (bool) { - return !untypedEqual(left, right); - } - - /** - * @notice Compares type equality. - * @dev Shortcuts if the pointers are identical, otherwise compares type and digest. - * @param left The first view - * @param right The second view - * @return bool - True if the types are the same - */ - function equal(bytes29 left, bytes29 right) internal pure returns (bool) { - return left == right || (typeOf(left) == typeOf(right) && keccak(left) == keccak(right)); - } - - /** - * @notice Compares type inequality. - * @dev Shortcuts if the pointers are identical, otherwise compares type and digest. 
- * @param left The first view - * @param right The second view - * @return bool - True if the types are not the same - */ - function notEqual(bytes29 left, bytes29 right) internal pure returns (bool) { - return !equal(left, right); - } - - /** - * @notice Copy the view to a location, return an unsafe memory reference - * @dev Super Dangerous direct memory access. - * - * This reference can be overwritten if anything else modifies memory (!!!). - * As such it MUST be consumed IMMEDIATELY. - * This function is private to prevent unsafe usage by callers. - * @param memView The view - * @param _newLoc The new location - * @return written - the unsafe memory reference - */ - function unsafeCopyTo(bytes29 memView, uint256 _newLoc) private view returns (bytes29 written) { - require(notNull(memView), "TypedMemView/copyTo - Null pointer deref"); - require(isValid(memView), "TypedMemView/copyTo - Invalid pointer deref"); - uint256 _len = len(memView); - uint256 _oldLoc = loc(memView); - - uint256 ptr; - bool res; - assembly { - // solium-disable-previous-line security/no-inline-assembly - ptr := mload(0x40) - // revert if we're writing in occupied memory - if gt(ptr, _newLoc) { - revert(0x60, 0x20) // empty revert message - } - - // use the identity precompile to copy - res := staticcall(gas, 4, _oldLoc, _len, _newLoc, _len) - } - require(res, "identity OOG"); - written = unsafeBuildUnchecked(typeOf(memView), _newLoc, _len); - } - - /** - * @notice Join the views in memory, return an unsafe reference to the memory. - * @dev Super Dangerous direct memory access. - * - * This reference can be overwritten if anything else modifies memory (!!!). - * As such it MUST be consumed IMMEDIATELY. - * This function is private to prevent unsafe usage by callers. 
- * @param memViews The views - * @param _location The location in memory to which to copy & concatenate - * @return unsafeView - The conjoined view pointing to the new memory - */ - function unsafeJoin( - bytes29[] memory memViews, - uint256 _location - ) private view returns (bytes29 unsafeView) { - assembly { - // solium-disable-previous-line security/no-inline-assembly - let ptr := mload(0x40) - // revert if we're writing in occupied memory - if gt(ptr, _location) { - revert(0x60, 0x20) // empty revert message - } - } - - uint256 _offset = 0; - for (uint256 i = 0; i < memViews.length; i++) { - bytes29 memView = memViews[i]; - unsafeCopyTo(memView, _location + _offset); - _offset += len(memView); - } - unsafeView = unsafeBuildUnchecked(0, _location, _offset); - } - - /** - * @notice Create a mask with the highest `_len` bits set. - * @param _len The length - * @return mask - The mask - */ - function leftMask(uint8 _len) private pure returns (uint256 mask) { - // ugly. redo without assembly? - assembly { - // solium-disable-previous-line security/no-inline-assembly - mask := sar(sub(_len, 1), 0x8000000000000000000000000000000000000000000000000000000000000000) - } - } - - /** - * @notice Unsafe raw pointer construction. This should generally not be called - * directly. Prefer `ref` wherever possible. - * @dev Unsafe raw pointer construction. This should generally not be called - * directly. Prefer `ref` wherever possible. 
- * @param _type The type - * @param _loc The memory address - * @param _len The length - * @return newView - The new view with the specified type, location and length - */ - function unsafeBuildUnchecked( - uint256 _type, - uint256 _loc, - uint256 _len - ) private pure returns (bytes29 newView) { - uint256 _uint96Bits = 96; - uint256 _emptyBits = 24; - assembly { - // solium-disable-previous-line security/no-inline-assembly - newView := shl(_uint96Bits, or(newView, _type)) // insert type - newView := shl(_uint96Bits, or(newView, _loc)) // insert loc - newView := shl(_emptyBits, or(newView, _len)) // empty bottom 3 bytes - } - } -} diff --git a/packages/protocol/contracts/common/libraries/test/BLS12Passthrough.sol b/packages/protocol/contracts/common/libraries/test/BLS12Passthrough.sol deleted file mode 100644 index d18e108e9fb..00000000000 --- a/packages/protocol/contracts/common/libraries/test/BLS12Passthrough.sol +++ /dev/null @@ -1,333 +0,0 @@ -pragma solidity ^0.5.13; - -import { B12_377Lib, B12_381Lib, B12 } from "../B12.sol"; -import { TypedMemView } from "../TypedMemView.sol"; - -contract Common { - using B12 for B12.G1Point; - using B12 for B12.G2Point; - using B12 for B12.Fp; - using B12 for B12.Fp2; - using B12 for bytes; - - using TypedMemView for bytes; - using TypedMemView for bytes29; - - event MEMDUMP(uint256 a, uint256 b, uint256 c, uint256 d); - - constructor() public {} - - function fpMulTest( - uint256 a1, - uint256 a2, - uint256 b1, - uint256 b2 - ) external view returns (uint256, uint256) { - B12.Fp memory a = B12.Fp(a1, a2); - B12.Fp memory b = B12.Fp(b1, b2); - B12.Fp memory res = B12.fpMul(a, b); - return (res.a, res.b); - } - - function fpNormalTest(uint256 a1, uint256 a2) external view returns (uint256, uint256) { - B12.Fp memory a = B12.Fp(a1, a2); - B12.Fp memory res = B12.fpNormal(a); - return (res.a, res.b); - } - - function fpNormal2Test(uint256 a, uint256 idx) external view returns (uint256, uint256) { - B12.Fp memory res = 
B12.fpNormal2(B12.Fp(0, a), idx); - return (res.a, res.b); - } - - function fp2MulTest( - uint256[] calldata arr - ) external view returns (uint256, uint256, uint256, uint256) { - B12.Fp2 memory x = B12.Fp2(B12.Fp(arr[0], arr[1]), B12.Fp(arr[2], arr[3])); - B12.Fp2 memory y = B12.Fp2(B12.Fp(arr[4], arr[5]), B12.Fp(arr[6], arr[7])); - B12.Fp2 memory res = B12.fp2Mul(x, y); - return (res.a.a, res.a.b, res.b.a, res.b.b); - } - - function testUncompress() external view returns (uint256, uint256) { - B12.Fp memory x = B12.Fp( - 0x008848defe740a67c8fc6225bf87ff54, - 0x85951e2caa9d41bb188282c8bd37cb5cd5481512ffcd394eeab9b16eb21be9ef - ); - B12.Fp memory y1 = B12.Fp( - 0x001cefdc52b4e1eba6d3b6633bf15a76, - 0x5ca326aa36b6c0b5b1db375b6a5124fa540d200dfb56a6e58785e1aaaa63715b - ); - B12.Fp memory y2 = B12.Fp( - 0x01914a69c5102eff1f674f5d30afeec4, - 0xbd7fb348ca3e52d96d182ad44fb82305c2fe3d3634a9591afd82de55559c8ea6 - ); - B12.G1Point memory res = B12.mapToG1(x, y2, y1, true); - return (res.Y.a, res.Y.b); - } - - function testParseG1(bytes calldata arg) external pure returns (uint256[4] memory ret) { - B12.G1Point memory a = arg.parseG1(0); - ret[0] = a.X.a; - ret[1] = a.X.b; - ret[2] = a.Y.a; - ret[3] = a.Y.b; - } - - function testSerializeG1( - uint256 w, - uint256 x, - uint256 y, - uint256 z - ) external pure returns (bytes memory) { - B12.G1Point memory a; - a.X.a = w; - a.X.b = x; - a.Y.a = y; - a.Y.b = z; - - return a.serializeG1(); - } - - function testParseG2(bytes calldata arg) external pure returns (uint256[8] memory ret) { - B12.G2Point memory a = arg.parseG2(0); - ret[0] = a.X.a.a; - ret[1] = a.X.a.b; - ret[2] = a.X.b.a; - ret[3] = a.X.b.b; - ret[4] = a.Y.a.a; - ret[5] = a.Y.a.b; - ret[6] = a.Y.b.a; - ret[7] = a.Y.b.b; - } - - function testSerializeG2( - uint256 xaa, - uint256 xab, - uint256 xba, - uint256 xbb, - uint256 yaa, - uint256 yab, - uint256 yba, - uint256 ybb - ) external pure returns (bytes memory) { - B12.G2Point memory a; - a.X.a.a = xaa; - a.X.a.b = 
xab; - a.X.b.a = xba; - a.X.b.b = xbb; - a.Y.a.a = yaa; - a.Y.a.b = yab; - a.Y.b.a = yba; - a.Y.b.b = ybb; - - return a.serializeG2(); - } - - function testDeserialize(bytes memory h) public pure returns (uint256, uint256, bool) { - (B12.Fp memory p, bool b) = B12.parsePoint(h); - return (p.a, p.b, b); - } - - function dumpMem(uint256 idx) internal { - uint256 a; - uint256 b; - uint256 c; - uint256 d; - - assembly { - a := mload(add(idx, 0x00)) - b := mload(add(idx, 0x20)) - c := mload(add(idx, 0x40)) - d := mload(add(idx, 0x60)) - } - emit MEMDUMP(a, b, c, d); - } - - function executePrecompile( - bytes memory input, - uint8 addr, - uint256 output_len - ) internal view returns (bytes memory output) { - bool success; - assembly { - success := staticcall( - sub(gas, 2000), - addr, - add(input, 0x20), // location - mload(input), // length - add(output, 0x20), // location - output_len // length - ) - mstore(output, output_len) - } - - require(success, "failed"); - } -} - -contract BLS12_381Passthrough is Common { - using B12_381Lib for B12.G1Point; - using B12_381Lib for B12.G2Point; - using B12_381Lib for B12.Fp; - using B12_381Lib for B12.Fp2; - using B12 for B12.G1Point; - using B12 for B12.G2Point; - using B12 for B12.Fp; - using B12 for B12.Fp2; - using B12_381Lib for bytes; - using B12 for bytes; - - using TypedMemView for bytes; - using TypedMemView for bytes29; - - constructor() public {} - - function g1Add(bytes calldata args) external view returns (bytes memory) { - B12.G1Point memory a = args.parseG1(0); - B12.G1Point memory b = args.parseG1(4 * 32); - return a.g1Add(b).serializeG1(); - } - - function g1Mul(bytes calldata args) external view returns (bytes memory) { - B12.G1Point memory a = args.parseG1(0); - uint256 scalar = args.ref(0).indexUint(4 * 32, 32); - return a.g1Mul(scalar).serializeG1(); - } - - function g1MultiExp(bytes calldata args) external view returns (bytes memory) { - bytes29 ref = args.ref(0); - - B12.G1MultiExpArg[] memory input = new 
B12.G1MultiExpArg[](args.length / 160); - - for (uint256 i = 0; i < args.length / 160; i += 1) { - uint256 idx = i * 160; - - input[i].point.X.a = ref.indexUint(idx + 0x00, 32); - input[i].point.X.b = ref.indexUint(idx + 0x20, 32); - input[i].point.Y.a = ref.indexUint(idx + 0x40, 32); - input[i].point.Y.b = ref.indexUint(idx + 0x60, 32); - input[i].scalar = ref.indexUint(idx + 0x80, 32); - } - - return B12_381Lib.g1MultiExp(input).serializeG1(); - } - - function g2Add(bytes calldata args) external view returns (bytes memory) { - B12.G2Point memory a = args.parseG2(0); - B12.G2Point memory b = args.parseG2(8 * 32); - return a.g2Add(b).serializeG2(); - } - - function g2Mul(bytes calldata args) external view returns (bytes memory) { - B12.G2Point memory a = args.parseG2(0); - uint256 scalar = args.ref(0).indexUint(8 * 32, 32); - a.g2Mul(scalar); - return a.serializeG2(); - } - - function g2MultiExp(bytes calldata args) external view returns (bytes memory) { - bytes29 ref = args.ref(0); - - B12.G2MultiExpArg[] memory input = new B12.G2MultiExpArg[](args.length / 288); - - for (uint256 i = 0; i < args.length / 288; i += 1) { - uint256 idx = i * 288; - - input[i].point.X.a.a = ref.indexUint(idx + 0x00, 32); - input[i].point.X.a.b = ref.indexUint(idx + 0x20, 32); - input[i].point.X.b.a = ref.indexUint(idx + 0x40, 32); - input[i].point.X.b.b = ref.indexUint(idx + 0x60, 32); - input[i].point.Y.a.a = ref.indexUint(idx + 0x80, 32); - input[i].point.Y.a.b = ref.indexUint(idx + 0xa0, 32); - input[i].point.Y.b.a = ref.indexUint(idx + 0xc0, 32); - input[i].point.Y.b.b = ref.indexUint(idx + 0xe0, 32); - input[i].scalar = ref.indexUint(idx + 0x100, 32); - } - - return B12_381Lib.g2MultiExp(input).serializeG2(); - } -} - -contract BLS12_377Passthrough is Common { - using B12_377Lib for B12.G1Point; - using B12_377Lib for B12.G2Point; - using B12_377Lib for B12.Fp; - using B12_377Lib for B12.Fp2; - using B12 for B12.G1Point; - using B12 for B12.G2Point; - using B12 for B12.Fp; - 
using B12 for B12.Fp2;
-  using B12_377Lib for bytes;
-  using B12 for bytes;
-
-  using TypedMemView for bytes;
-  using TypedMemView for bytes29;
-
-  constructor() public {}
-
-  function g1Add(bytes calldata args) external view returns (bytes memory) {
-    B12.G1Point memory a = args.parseG1(0);
-    B12.G1Point memory b = args.parseG1(4 * 32);
-    return a.g1Add(b).serializeG1();
-  }
-
-  function g1Mul(bytes calldata args) external view returns (bytes memory) {
-    B12.G1Point memory a = args.parseG1(0);
-    uint256 scalar = args.ref(0).indexUint(4 * 32, 32);
-    return a.g1Mul(scalar).serializeG1();
-  }
-
-  function g1MultiExp(bytes calldata args) external view returns (bytes memory) {
-    bytes29 ref = args.ref(0);
-
-    B12.G1MultiExpArg[] memory input = new B12.G1MultiExpArg[](args.length / 160);
-
-    for (uint256 i = 0; i < args.length / 160; i += 1) {
-      uint256 idx = i * 160;
-
-      input[i].point.X.a = ref.indexUint(idx + 0x00, 32);
-      input[i].point.X.b = ref.indexUint(idx + 0x20, 32);
-      input[i].point.Y.a = ref.indexUint(idx + 0x40, 32);
-      input[i].point.Y.b = ref.indexUint(idx + 0x60, 32);
-      input[i].scalar = ref.indexUint(idx + 0x80, 32);
-    }
-
-    return B12_377Lib.g1MultiExp(input).serializeG1();
-  }
-
-  function g2Add(bytes calldata args) external view returns (bytes memory) {
-    B12.G2Point memory a = args.parseG2(0);
-    B12.G2Point memory b = args.parseG2(8 * 32);
-    return a.g2Add(b).serializeG2();
-  }
-
-  function g2Mul(bytes calldata args) external view returns (bytes memory) {
-    B12.G2Point memory a = args.parseG2(0);
-    uint256 scalar = args.ref(0).indexUint(8 * 32, 32);
-    a.g2Mul(scalar);
-    return a.serializeG2();
-  }
-
-  function g2MultiExp(bytes calldata args) external view returns (bytes memory) {
-    bytes29 ref = args.ref(0);
-
-    B12.G2MultiExpArg[] memory input = new B12.G2MultiExpArg[](args.length / 288);
-
-    for (uint256 i = 0; i < args.length / 288; i += 1) {
-      uint256 idx = i * 288;
-
-      input[i].point.X.a.a = ref.indexUint(idx + 0x00, 32);
-      input[i].point.X.a.b = ref.indexUint(idx + 0x20, 32);
-      input[i].point.X.b.a = ref.indexUint(idx + 0x40, 32);
-      input[i].point.X.b.b = ref.indexUint(idx + 0x60, 32);
-      input[i].point.Y.a.a = ref.indexUint(idx + 0x80, 32);
-      input[i].point.Y.a.b = ref.indexUint(idx + 0xa0, 32);
-      input[i].point.Y.b.a = ref.indexUint(idx + 0xc0, 32);
-      input[i].point.Y.b.b = ref.indexUint(idx + 0xe0, 32);
-      input[i].scalar = ref.indexUint(idx + 0x100, 32);
-    }
-
-    return B12_377Lib.g2MultiExp(input).serializeG2();
-  }
-}
diff --git a/packages/protocol/contracts/common/libraries/test/CIP20Test.sol b/packages/protocol/contracts/common/libraries/test/CIP20Test.sol
deleted file mode 100644
index 1449ac931e5..00000000000
--- a/packages/protocol/contracts/common/libraries/test/CIP20Test.sol
+++ /dev/null
@@ -1,35 +0,0 @@
-pragma solidity ^0.5.13;
-
-import "../CIP20Lib.sol";
-
-contract CIP20Test {
-  using CIP20Lib for bytes;
-
-  function sha3_256(bytes calldata input) external view returns (bytes memory) {
-    return input.sha3_256();
-  }
-
-  function sha3_512(bytes calldata input) external view returns (bytes memory) {
-    return input.sha3_512();
-  }
-
-  function keccak512(bytes calldata input) external view returns (bytes memory) {
-    return input.keccak512();
-  }
-
-  function sha2_512(bytes calldata input) external view returns (bytes memory) {
-    return input.sha2_512();
-  }
-
-  function blake2sWithConfig(
-    bytes32 config,
-    bytes calldata key,
-    bytes calldata preimage
-  ) external view returns (bytes memory) {
-    return CIP20Lib.blake2sWithConfig(config, key, preimage);
-  }
-
-  function blake2s(bytes calldata input) external view returns (bytes memory) {
-    return input.blake2s();
-  }
-}
diff --git a/packages/protocol/contracts/common/linkedlists/AddressLinkedList.sol b/packages/protocol/contracts/common/linkedlists/AddressLinkedList.sol
deleted file mode 100644
index b24aaabc0fd..00000000000
--- a/packages/protocol/contracts/common/linkedlists/AddressLinkedList.sol
+++ /dev/null
@@ -1,106 +0,0 @@
-pragma solidity ^0.5.13;
-
-import "openzeppelin-solidity/contracts/math/SafeMath.sol";
-
-import "./LinkedList.sol";
-
-/**
- * @title Maintains a doubly linked list keyed by address.
- * @dev Following the `next` pointers will lead you to the head, rather than the tail.
- */
-library AddressLinkedList {
-  using LinkedList for LinkedList.List;
-  using SafeMath for uint256;
-
-  /**
-   * @notice Inserts an element into a doubly linked list.
-   * @param list A storage pointer to the underlying list.
-   * @param key The key of the element to insert.
-   * @param previousKey The key of the element that comes before the element to insert.
-   * @param nextKey The key of the element that comes after the element to insert.
-   */
-  function insert(
-    LinkedList.List storage list,
-    address key,
-    address previousKey,
-    address nextKey
-  ) public {
-    list.insert(toBytes(key), toBytes(previousKey), toBytes(nextKey));
-  }
-
-  /**
-   * @notice Inserts an element at the end of the doubly linked list.
-   * @param list A storage pointer to the underlying list.
-   * @param key The key of the element to insert.
-   */
-  function push(LinkedList.List storage list, address key) public {
-    list.insert(toBytes(key), bytes32(0), list.tail);
-  }
-
-  /**
-   * @notice Removes an element from the doubly linked list.
-   * @param list A storage pointer to the underlying list.
-   * @param key The key of the element to remove.
-   */
-  function remove(LinkedList.List storage list, address key) public {
-    list.remove(toBytes(key));
-  }
-
-  /**
-   * @notice Updates an element in the list.
-   * @param list A storage pointer to the underlying list.
-   * @param key The element key.
-   * @param previousKey The key of the element that comes before the updated element.
-   * @param nextKey The key of the element that comes after the updated element.
-   */
-  function update(
-    LinkedList.List storage list,
-    address key,
-    address previousKey,
-    address nextKey
-  ) public {
-    list.update(toBytes(key), toBytes(previousKey), toBytes(nextKey));
-  }
-
-  /**
-   * @notice Returns whether or not a particular key is present in the sorted list.
-   * @param list A storage pointer to the underlying list.
-   * @param key The element key.
-   * @return Whether or not the key is in the sorted list.
-   */
-  function contains(LinkedList.List storage list, address key) public view returns (bool) {
-    return list.elements[toBytes(key)].exists;
-  }
-
-  /**
-   * @notice Returns the N greatest elements of the list.
-   * @param list A storage pointer to the underlying list.
-   * @param n The number of elements to return.
-   * @return The keys of the greatest elements.
-   * @dev Reverts if n is greater than the number of elements in the list.
-   */
-  function headN(LinkedList.List storage list, uint256 n) public view returns (address[] memory) {
-    bytes32[] memory byteKeys = list.headN(n);
-    address[] memory keys = new address[](n);
-    for (uint256 i = 0; i < n; i = i.add(1)) {
-      keys[i] = toAddress(byteKeys[i]);
-    }
-    return keys;
-  }
-
-  /**
-   * @notice Gets all element keys from the doubly linked list.
-   * @param list A storage pointer to the underlying list.
-   * @return All element keys from head to tail.
-   */
-  function getKeys(LinkedList.List storage list) public view returns (address[] memory) {
-    return headN(list, list.numElements);
-  }
-
-  function toBytes(address a) public pure returns (bytes32) {
-    return bytes32(uint256(a) << 96);
-  }
-
-  function toAddress(bytes32 b) public pure returns (address) {
-    return address(uint256(b) >> 96);
-  }
-}
diff --git a/packages/protocol/contracts/governance/Election.sol b/packages/protocol/contracts/governance/Election.sol
index 3db4ec170ae..e9c0552bb12 100644
--- a/packages/protocol/contracts/governance/Election.sol
+++ b/packages/protocol/contracts/governance/Election.sol
@@ -6,7 +6,6 @@ import "openzeppelin-solidity/contracts/ownership/Ownable.sol";
 
 import "./interfaces/IElection.sol";
 import "./interfaces/IValidators.sol";
-import "../common/CalledByVm.sol";
 import "../common/Initializable.sol";
 import "../common/FixidityLib.sol";
 import "../common/linkedlists/AddressSortedLinkedList.sol";
@@ -16,6 +15,7 @@ import "../common/libraries/Heap.sol";
 import "../common/libraries/ReentrancyGuard.sol";
 import "../common/Blockable.sol";
 import "../common/PrecompilesOverride.sol";
+import "../common/Permissioned.sol";
 
 /**
  * @title Manages the validator election process.
@@ -28,8 +28,8 @@ contract Election is
   Initializable,
   UsingRegistry,
   PrecompilesOverride,
-  CalledByVm,
-  Blockable
+  Blockable,
+  Permissioned
 {
   using AddressSortedLinkedList for SortedLinkedList.List;
   using FixidityLib for FixidityLib.Fraction;
@@ -156,20 +156,6 @@ contract Election is
   );
   event EpochRewardsDistributedToVoters(address indexed group, uint256 value);
 
-  /**
-   * @notice - On L1, ensures the function is called via the consensus client.
-   *         - On L2, ensures the function is called by the permitted address.
-   * @param permittedAddress The address permitted to call permissioned
-   * functions on L2.
-   */
-  modifier onlyVmOrPermitted(address permittedAddress) {
-    if (isL2()) require(msg.sender == permittedAddress, "Only permitted address can call");
-    else {
-      require(msg.sender == address(0), "Only VM can call");
-    }
-    _;
-  }
-
   /**
    * @notice Used in place of the constructor to allow the contract to be upgradable via proxy.
    * @param registryAddress The address of the registry core smart contract.
@@ -357,7 +343,7 @@ contract Election is
     uint256 value,
     address lesser,
     address greater
-  ) external onlyVmOrPermitted(registry.getAddressFor(EPOCH_MANAGER_REGISTRY_ID)) {
+  ) external onlyPermitted(registry.getAddressFor(EPOCH_MANAGER_REGISTRY_ID)) {
     _distributeEpochRewards(group, value, lesser, greater);
   }
 
@@ -564,44 +550,6 @@ contract Election is
     return votes.total.eligible.contains(group);
   }
 
-  /**
-   * @notice Returns the amount of rewards that voters for `group` are due at the end of an epoch.
-   * @param group The group to calculate epoch rewards for.
-   * @param totalEpochRewards The total amount of rewards going to all voters.
-   * @param uptimes Array of Fixidity representations of the validators' uptimes, between 0 and 1.
-   * @return The amount of rewards that voters for `group` are due at the end of an epoch.
-   * @dev Eligible groups that have received their maximum number of votes cannot receive more.
-   */
-  function getGroupEpochRewards(
-    address group,
-    uint256 totalEpochRewards,
-    uint256[] calldata uptimes
-  ) external view onlyL1 returns (uint256) {
-    IValidators validators = getValidators();
-    // The group must meet the balance requirements for their voters to receive epoch rewards.
-    if (!validators.meetsAccountLockedGoldRequirements(group) || votes.active.total <= 0) {
-      return 0;
-    }
-
-    FixidityLib.Fraction memory votePortion = FixidityLib.newFixedFraction(
-      votes.active.forGroup[group].total,
-      votes.active.total
-    );
-    FixidityLib.Fraction memory score = FixidityLib.wrap(
-      validators.calculateGroupEpochScore(uptimes)
-    );
-    FixidityLib.Fraction memory slashingMultiplier = FixidityLib.wrap(
-      validators.getValidatorGroupSlashingMultiplier(group)
-    );
-    return
-      FixidityLib
-        .newFixed(totalEpochRewards)
-        .multiply(votePortion)
-        .multiply(score)
-        .multiply(slashingMultiplier)
-        .fromFixed();
-  }
-
   /**
    * @notice Returns the amount of rewards that voters for `group` are due at the end of an epoch.
    * @param group The group to calculate epoch rewards for.
@@ -614,7 +562,7 @@ contract Election is
     address group,
     uint256 totalEpochRewards,
     uint256 groupScore
-  ) external view onlyL2 returns (uint256) {
+  ) external view returns (uint256) {
     IValidators validators = getValidators();
     // The group must meet the balance requirements for their voters to receive epoch rewards.
     if (!validators.meetsAccountLockedGoldRequirements(group) || votes.active.total <= 0) {
@@ -676,7 +624,7 @@ contract Election is
    * @return Patch version of the contract.
    */
   function getVersionNumber() external pure returns (uint256, uint256, uint256, uint256) {
-    return (1, 1, 4, 0);
+    return (1, 2, 0, 0);
   }
 
   /**
@@ -824,100 +772,11 @@ contract Election is
     _electNValidatorSignerOrAccount(minElectableValidators, maxElectableValidators, accounts);
   }
 
-  /**
-   * @notice Returns a list of elected validator with seats allocated to groups via the D'Hondt
-   * method.
-   * @return The list of elected validator signers or accounts depending on input.
-   * @dev See https://en.wikipedia.org/wiki/D%27Hondt_method#Allocation for more information.
-   */
-  function _electNValidatorSignerOrAccount(
-    uint256 minElectableValidators,
-    uint256 maxElectableValidators,
-    bool accounts // accounts or signers
-  ) internal view returns (address[] memory) {
-    // Groups must have at least `electabilityThreshold` proportion of the total votes to be
-    // considered for the election.
-    uint256 requiredVotes = electabilityThreshold
-      .multiply(FixidityLib.newFixed(getTotalVotes()))
-      .fromFixed();
-    // Only consider groups with at least `requiredVotes` but do not consider more groups than the
-    // max number of electable validators.
-    uint256 numElectionGroups = votes.total.eligible.numElementsGreaterThan(
-      requiredVotes,
-      maxElectableValidators
-    );
-
-    address[] memory electionGroups = votes.total.eligible.headN(numElectionGroups);
-    uint256[] memory numMembers = getValidators().getGroupsNumMembers(electionGroups);
-    // Holds the number of members elected for each of the eligible validator groups.
-    uint256[] memory numMembersElected = new uint256[](electionGroups.length);
-    uint256 totalNumMembersElected = 0;
-
-    uint256[] memory keys = new uint256[](electionGroups.length);
-    FixidityLib.Fraction[] memory votesForNextMember = new FixidityLib.Fraction[](
-      electionGroups.length
-    );
-    for (uint256 i = 0; i < electionGroups.length; i = i.add(1)) {
-      keys[i] = i;
-      votesForNextMember[i] = FixidityLib.newFixed(
-        votes.total.eligible.getValue(electionGroups[i])
-      );
-    }
-
-    // Assign a number of seats to each validator group.
-    while (totalNumMembersElected < maxElectableValidators && electionGroups.length > 0) {
-      uint256 groupIndex = keys[0];
-      // All electable validators have been elected.
-      if (votesForNextMember[groupIndex].unwrap() == 0) break;
-      // All members of the group have been elected
-      if (numMembers[groupIndex] <= numMembersElected[groupIndex]) {
-        votesForNextMember[groupIndex] = FixidityLib.wrap(0);
-      } else {
-        // Elect the next member from the validator group
-        numMembersElected[groupIndex] = numMembersElected[groupIndex].add(1);
-        totalNumMembersElected = totalNumMembersElected.add(1);
-        // If there are already n elected members in a group, the votes for the next member
-        // are total votes of group divided by n+1
-        votesForNextMember[groupIndex] = FixidityLib
-          .newFixed(votes.total.eligible.getValue(electionGroups[groupIndex]))
-          .divide(FixidityLib.newFixed(numMembersElected[groupIndex].add(1)));
-      }
-      Heap.heapifyDown(keys, votesForNextMember);
-    }
-    require(totalNumMembersElected >= minElectableValidators, "Not enough elected validators");
-    // Grab the top validators from each group that won seats.
-    address[] memory electedValidators = new address[](totalNumMembersElected);
-    totalNumMembersElected = 0;
-
-    IValidators validators = getValidators();
-
-    for (uint256 i = 0; i < electionGroups.length; i = i.add(1)) {
-      // We use the validating delegate if one is set.
-      address[] memory electedGroupValidators;
-      if (accounts) {
-        electedGroupValidators = validators.getTopGroupValidatorsAccounts(
-          electionGroups[i],
-          numMembersElected[i]
-        );
-      } else {
-        electedGroupValidators = validators.getTopGroupValidators(
-          electionGroups[i],
-          numMembersElected[i]
-        );
-      }
-      for (uint256 j = 0; j < electedGroupValidators.length; j = j.add(1)) {
-        electedValidators[totalNumMembersElected] = electedGroupValidators[j];
-        totalNumMembersElected = totalNumMembersElected.add(1);
-      }
-    }
-    return electedValidators;
-  }
-
   /**
    * @notice Returns get current validator signers using the precompiles.
    * @return List of current validator signers.
    */
-  function getCurrentValidatorSigners() public view onlyL1 returns (address[] memory) {
+  function getCurrentValidatorSigners() public view returns (address[] memory) {
     uint256 n = numberValidatorsInCurrentSet();
     address[] memory res = new address[](n);
     for (uint256 i = 0; i < n; i = i.add(1)) {
@@ -1061,7 +920,7 @@ contract Election is
    * Fundamentally calls `revokePending` and `revokeActive` but only resorts groups once.
    * @param account The account whose votes to `group` should be decremented.
    * @param group The validator group to decrement votes from.
-   * @param maxValue The maxinum number of votes to decrement and revoke.
+   * @param maxValue The maximum number of votes to decrement and revoke.
    * @param lesser The group receiving fewer votes than the group for which the vote was revoked,
    * or 0 if that group has the fewest votes of any validator group.
   * @param greater The group receiving more votes than the group for which the vote was revoked,
@@ -1102,6 +961,95 @@ contract Election is
     return decrementedValue;
   }
 
+  /**
+   * @notice Returns a list of elected validator with seats allocated to groups via the D'Hondt
+   * method.
+   * @return The list of elected validator signers or accounts depending on input.
+   * @dev See https://en.wikipedia.org/wiki/D%27Hondt_method#Allocation for more information.
+   */
+  function _electNValidatorSignerOrAccount(
+    uint256 minElectableValidators,
+    uint256 maxElectableValidators,
+    bool accounts // accounts or signers
+  ) internal view returns (address[] memory) {
+    // Groups must have at least `electabilityThreshold` proportion of the total votes to be
+    // considered for the election.
+    uint256 requiredVotes = electabilityThreshold
+      .multiply(FixidityLib.newFixed(getTotalVotes()))
+      .fromFixed();
+    // Only consider groups with at least `requiredVotes` but do not consider more groups than the
+    // max number of electable validators.
+    uint256 numElectionGroups = votes.total.eligible.numElementsGreaterThan(
+      requiredVotes,
+      maxElectableValidators
+    );
+
+    address[] memory electionGroups = votes.total.eligible.headN(numElectionGroups);
+    uint256[] memory numMembers = getValidators().getGroupsNumMembers(electionGroups);
+    // Holds the number of members elected for each of the eligible validator groups.
+    uint256[] memory numMembersElected = new uint256[](electionGroups.length);
+    uint256 totalNumMembersElected = 0;
+
+    uint256[] memory keys = new uint256[](electionGroups.length);
+    FixidityLib.Fraction[] memory votesForNextMember = new FixidityLib.Fraction[](
+      electionGroups.length
+    );
+    for (uint256 i = 0; i < electionGroups.length; i = i.add(1)) {
+      keys[i] = i;
+      votesForNextMember[i] = FixidityLib.newFixed(
+        votes.total.eligible.getValue(electionGroups[i])
+      );
+    }
+
+    // Assign a number of seats to each validator group.
+    while (totalNumMembersElected < maxElectableValidators && electionGroups.length > 0) {
+      uint256 groupIndex = keys[0];
+      // All electable validators have been elected.
+      if (votesForNextMember[groupIndex].unwrap() == 0) break;
+      // All members of the group have been elected
+      if (numMembers[groupIndex] <= numMembersElected[groupIndex]) {
+        votesForNextMember[groupIndex] = FixidityLib.wrap(0);
+      } else {
+        // Elect the next member from the validator group
+        numMembersElected[groupIndex] = numMembersElected[groupIndex].add(1);
+        totalNumMembersElected = totalNumMembersElected.add(1);
+        // If there are already n elected members in a group, the votes for the next member
+        // are total votes of group divided by n+1
+        votesForNextMember[groupIndex] = FixidityLib
+          .newFixed(votes.total.eligible.getValue(electionGroups[groupIndex]))
+          .divide(FixidityLib.newFixed(numMembersElected[groupIndex].add(1)));
+      }
+      Heap.heapifyDown(keys, votesForNextMember);
+    }
+    require(totalNumMembersElected >= minElectableValidators, "Not enough elected validators");
+    // Grab the top validators from each group that won seats.
+    address[] memory electedValidators = new address[](totalNumMembersElected);
+    totalNumMembersElected = 0;
+
+    IValidators validators = getValidators();
+
+    for (uint256 i = 0; i < electionGroups.length; i = i.add(1)) {
+      // We use the validating delegate if one is set.
+      address[] memory electedGroupValidators;
+      if (accounts) {
+        electedGroupValidators = validators.getTopGroupValidatorsAccounts(
+          electionGroups[i],
+          numMembersElected[i]
+        );
+      } else {
+        electedGroupValidators = validators.getTopGroupValidators(
+          electionGroups[i],
+          numMembersElected[i]
+        );
+      }
+      for (uint256 j = 0; j < electedGroupValidators.length; j = j.add(1)) {
+        electedValidators[totalNumMembersElected] = electedGroupValidators[j];
+        totalNumMembersElected = totalNumMembersElected.add(1);
+      }
+    }
+    return electedValidators;
+  }
+
   /**
    * @notice Increments the number of total votes for `group` by `value`.
    * @param group The validator group whose vote total should be incremented.
diff --git a/packages/protocol/contracts/governance/EpochRewards.sol b/packages/protocol/contracts/governance/EpochRewards.sol
index 63d5495beba..3b63a3cacee 100644
--- a/packages/protocol/contracts/governance/EpochRewards.sol
+++ b/packages/protocol/contracts/governance/EpochRewards.sol
@@ -9,6 +9,7 @@ import "../common/Freezable.sol";
 import "../common/Initializable.sol";
 import "../common/UsingRegistry.sol";
 import "../common/PrecompilesOverride.sol";
+import "../common/Permissioned.sol";
 import "../common/interfaces/ICeloToken.sol";
 import "../common/interfaces/ICeloVersionedContract.sol";
@@ -22,7 +23,8 @@ contract EpochRewards is
   Initializable,
   UsingRegistry,
   PrecompilesOverride,
-  Freezable
+  Freezable,
+  Permissioned
 {
   using FixidityLib for FixidityLib.Fraction;
   using SafeMath for uint256;
@@ -85,14 +87,6 @@ contract EpochRewards is
 
   event TargetVotingYieldUpdated(uint256 fraction);
 
-  modifier onlyVmOrPermitted(address permittedAddress) {
-    if (isL2()) require(msg.sender == permittedAddress, "Only permitted address can call");
-    else {
-      require(msg.sender == address(0), "Only VM can call");
-    }
-    _;
-  }
-
   /**
    * @notice Sets initialized == true on implementation contracts
   * @param test Set to true to skip implementation initialization
@@ -154,33 +148,12 @@ contract EpochRewards is
    */
   function updateTargetVotingYield()
     external
-    onlyVmOrPermitted(registry.getAddressFor(EPOCH_MANAGER_REGISTRY_ID))
+    onlyPermitted(registry.getAddressFor(EPOCH_MANAGER_REGISTRY_ID))
     onlyWhenNotFrozen
   {
     _updateTargetVotingYield();
   }
 
-  /**
-   * @notice Determines if the reserve is low enough to demand a diversion from
-   * the community reward. Targets initial critical ratio of 2 with a linear
-   * decline until 25 years have passed where the critical ratio will be 1.
-   */
-  function isReserveLow() external view returns (bool) {
-    // critical reserve ratio = 2 - time in second / 25 years
-    FixidityLib.Fraction memory timeSinceInitialization = FixidityLib.newFixed(now.sub(startTime));
-    FixidityLib.Fraction memory m = FixidityLib.newFixed(25 * 365 * 1 days);
-    FixidityLib.Fraction memory b = FixidityLib.newFixed(2);
-    FixidityLib.Fraction memory criticalRatio;
-    // Don't let the critical reserve ratio go under 1 after 25 years.
-    if (timeSinceInitialization.gte(m)) {
-      criticalRatio = FixidityLib.fixed1();
-    } else {
-      criticalRatio = b.subtract(timeSinceInitialization.divide(m));
-    }
-    FixidityLib.Fraction memory ratio = FixidityLib.wrap(getReserve().getReserveRatio());
-    return ratio.lte(criticalRatio);
-  }
-
   /**
    * @notice Calculates the per validator epoch payment and the total rewards to voters.
    * @return The per validator epoch reward.
@@ -278,7 +251,7 @@ contract EpochRewards is
    * @return Patch version of the contract.
    */
   function getVersionNumber() external pure returns (uint256, uint256, uint256, uint256) {
-    return (1, 1, 2, 0);
+    return (1, 2, 0, 0);
   }
 
   /**
@@ -451,18 +424,12 @@ contract EpochRewards is
   function getTargetTotalEpochPaymentsInGold() public view returns (uint256) {
     address stableTokenAddress = registry.getAddressForOrDie(STABLE_TOKEN_REGISTRY_ID);
     (uint256 numerator, uint256 denominator) = getSortedOracles().medianRate(stableTokenAddress);
-    if (isL2()) {
-      return
-        getEpochManager()
-          .numberOfElectedInCurrentSet()
-          .mul(targetValidatorEpochPayment)
-          .mul(denominator)
-          .div(numerator);
-    }
     return
-      numberValidatorsInCurrentSet().mul(targetValidatorEpochPayment).mul(denominator).div(
-        numerator
-      );
+      getEpochManager()
+        .numberOfElectedInCurrentSet()
+        .mul(targetValidatorEpochPayment)
+        .mul(denominator)
+        .div(numerator);
   }
 
   /**
diff --git a/packages/protocol/contracts/governance/Governance.sol b/packages/protocol/contracts/governance/Governance.sol
index b4cee913061..2c5fb20a5e5 100644
--- a/packages/protocol/contracts/governance/Governance.sol
+++ b/packages/protocol/contracts/governance/Governance.sol
@@ -193,8 +193,6 @@ contract Governance is
 
   event ParticipationBaselineQuorumFactorSet(uint256 baselineQuorumFactor);
 
-  event HotfixWhitelisted(bytes32 indexed hash, address whitelister);
-
   event HotfixApproved(bytes32 indexed hash, address approver);
 
   event HotfixPrepared(bytes32 indexed hash, uint256 indexed executionLimit);
@@ -682,53 +680,26 @@ contract Governance is
     if (msg.sender == approver) {
       hotfixes[hash].approved = true;
     } else {
-      if (isL2()) {
-        hotfixes[hash].councilApproved = true;
-      } else {
-        revert("Hotfix approval by security council is not available on L1.");
-      }
+      hotfixes[hash].councilApproved = true;
     }
     emit HotfixApproved(hash, msg.sender);
   }
 
   /**
-   * @notice Whitelists the hash of a hotfix transaction(s).
-   * @param hash The abi encoded keccak256 hash of the hotfix transaction(s) to be whitelisted.
-   */
-  function whitelistHotfix(bytes32 hash) external hotfixNotExecuted(hash) onlyL1 {
-    hotfixes[hash].deprecated_whitelisted[msg.sender] = true;
-    emit HotfixWhitelisted(hash, msg.sender);
-  }
-
-  /**
-   * @notice Gives hotfix a prepared epoch for execution on L1.
-   * @notice Gives hotfix a time limit for execution on L2.
+   * @notice Gives hotfix a time limit for execution.
    * @param hash The hash of the hotfix to be prepared.
    */
   function prepareHotfix(bytes32 hash) external hotfixNotExecuted(hash) {
     HotfixRecord storage _currentHotfix = hotfixes[hash];
-    if (isL2()) {
-      uint256 _currentTime = now;
-      require(hotfixExecutionTimeWindow > 0, "Hotfix execution time window not set");
-      require(
-        _currentHotfix.executionTimeLimit == 0,
-        "Hotfix already prepared for this timeframe."
-      );
-      require(_currentHotfix.approved, "Hotfix not approved by approvers.");
-      require(_currentHotfix.councilApproved, "Hotfix not approved by security council.");
-      _currentHotfix.executionTimeLimit = _currentTime.add(hotfixExecutionTimeWindow);
-      emit HotfixPrepared(hash, _currentTime.add(hotfixExecutionTimeWindow));
-    } else {
-      require(isHotfixPassing(hash), "hotfix not whitelisted by 2f+1 validators");
-      uint256 epoch = getEpochNumber();
-      require(
-        _currentHotfix.deprecated_preparedEpoch < epoch,
-        "hotfix already prepared for this epoch"
-      );
-      _currentHotfix.deprecated_preparedEpoch = epoch;
-      emit HotfixPrepared(hash, epoch);
-    }
+    uint256 _currentTime = now;
+    require(hotfixExecutionTimeWindow > 0, "Hotfix execution time window not set");
+    require(_currentHotfix.executionTimeLimit == 0, "Hotfix already prepared for this timeframe.");
+    require(_currentHotfix.approved, "Hotfix not approved by approvers.");
+    require(_currentHotfix.councilApproved, "Hotfix not approved by security council.");
+
+    _currentHotfix.executionTimeLimit = _currentTime.add(hotfixExecutionTimeWindow);
+    emit HotfixPrepared(hash, _currentTime.add(hotfixExecutionTimeWindow));
   }
 
   /**
@@ -747,36 +718,22 @@ contract Governance is
     uint256[] calldata dataLengths,
     bytes32 salt
   ) external {
-    if (isL2()) {
-      bytes32 hash = keccak256(abi.encode(values, destinations, data, dataLengths, salt));
-
-      (
-        bool approved,
-        bool councilApproved,
-        bool executed,
-        uint256 executionTimeLimit
-      ) = getL2HotfixRecord(hash);
-      require(!executed, "hotfix already executed");
-      require(approved, "hotfix not approved");
-      require(councilApproved, "hotfix not approved by security council");
-      require(executionTimeLimit >= now, "Execution time limit has already been reached.");
-      Proposals.makeMem(values, destinations, data, dataLengths, msg.sender, 0).executeMem();
-
-      hotfixes[hash].executed = true;
-      emit HotfixExecuted(hash);
-    } else {
-      bytes32 hash = keccak256(abi.encode(values, destinations, data, dataLengths, salt));
-
-      (bool approved, bool executed, uint256 preparedEpoch) = getL1HotfixRecord(hash);
-      require(!executed, "hotfix already executed");
-      require(approved, "hotfix not approved");
-      require(preparedEpoch == getEpochNumber(), "hotfix must be prepared for this epoch");
+    bytes32 hash = keccak256(abi.encode(values, destinations, data, dataLengths, salt));
 
-      Proposals.makeMem(values, destinations, data, dataLengths, msg.sender, 0).executeMem();
+    (
+      bool approved,
+      bool councilApproved,
+      bool executed,
+      uint256 executionTimeLimit
+    ) = getHotfixRecord(hash);
+    require(!executed, "hotfix already executed");
+    require(approved, "hotfix not approved");
+    require(councilApproved, "hotfix not approved by security council");
+    require(executionTimeLimit >= now, "Execution time limit has already been reached.");
+    Proposals.makeMem(values, destinations, data, dataLengths, msg.sender, 0).executeMem();
 
-      hotfixes[hash].executed = true;
-      emit HotfixExecuted(hash);
-    }
+    hotfixes[hash].executed = true;
+    emit HotfixExecuted(hash);
   }
 
   /**
@@ -1050,7 +1007,7 @@ contract Governance is
    * @return Patch version of the contract.
    */
   function getVersionNumber() external pure returns (uint256, uint256, uint256, uint256) {
-    return (1, 4, 2, 1);
+    return (1, 5, 1, 0);
   }
 
   /**
@@ -1291,73 +1248,14 @@ contract Governance is
   }
 
   /**
-   * @notice Returns number of validators from current set which have whitelisted the given hotfix.
-   * @param hash The abi encoded keccak256 hash of the hotfix transaction.
-   * @return Whitelist tally
-   */
-  function hotfixWhitelistValidatorTally(bytes32 hash) public view onlyL1 returns (uint256) {
-    uint256 tally = 0;
-    uint256 n = numberValidatorsInCurrentSet();
-    IAccounts accounts = getAccounts();
-    for (uint256 i = 0; i < n; i = i.add(1)) {
-      address validatorSigner = validatorSignerAddressFromCurrentSet(i);
-      address validatorAccount = accounts.signerToAccount(validatorSigner);
-      if (
-        isHotfixWhitelistedBy(hash, validatorSigner) ||
-        isHotfixWhitelistedBy(hash, validatorAccount)
-      ) {
-        tally = tally.add(1);
-      }
-    }
-    return tally;
-  }
-
-  /**
-   * @notice Checks if a byzantine quorum of validators has whitelisted the given hotfix.
-   * @param hash The abi encoded keccak256 hash of the hotfix transaction.
-   * @return Whether validator whitelist tally >= validator byzantine quorum
-   */
-  function isHotfixPassing(bytes32 hash) public view onlyL1 returns (bool) {
-    return hotfixWhitelistValidatorTally(hash) >= minQuorumSizeInCurrentSet();
-  }
-
-  /**
-   * @notice Gets information about a L1 hotfix.
-   * @param hash The abi encoded keccak256 hash of the hotfix transaction.
-   * @return Hotfix approved.
-   * @return Hotfix executed.
-   * @return Hotfix preparedEpoch.
-   */
-  function getL1HotfixRecord(bytes32 hash) public view onlyL1 returns (bool, bool, uint256) {
-    return (
-      hotfixes[hash].approved,
-      hotfixes[hash].executed,
-      hotfixes[hash].deprecated_preparedEpoch
-    );
-  }
-
-  /**
-   * @notice Gets information about a L1 hotfix.
-   * @param hash The abi encoded keccak256 hash of the hotfix transaction.
-   * @return Hotfix approved.
-   * @return Hotfix executed.
-   * @return Hotfix preparedEpoch.
-   * @dev Provided for API backwards compatibility. Prefer the explicitly named
-   * `getL1HotfixRecord`/`getL2HotfixRecord` functions.
-   */
-  function getHotfixRecord(bytes32 hash) public view returns (bool, bool, uint256) {
-    return getL1HotfixRecord(hash);
-  }
-
   /**
-   * @notice Gets information about a L2 hotfix.
+   * @notice Gets information about a hotfix.
    * @param hash The abi encoded keccak256 hash of the hotfix transaction.
    * @return Hotfix approved by approver.
    * @return Hotfix approved by SecurityCouncil.
    * @return Hotfix executed.
-   * @return Hotfix exection time limit.
+   * @return Hotfix execution time limit.
    */
-  function getL2HotfixRecord(bytes32 hash) public view onlyL2 returns (bool, bool, bool, uint256) {
+  function getHotfixRecord(bytes32 hash) public view returns (bool, bool, bool, uint256) {
     return (
       hotfixes[hash].approved,
       hotfixes[hash].councilApproved,
@@ -1376,18 +1274,6 @@ contract Governance is
     return queue.contains(proposalId);
   }
 
-  /**
-   * @notice Returns whether given hotfix hash has been whitelisted by given address.
-   * @param hash The abi encoded keccak256 hash of the hotfix transaction(s) to be whitelisted.
-   * @param whitelister Address to check whitelist status of.
-   */
-  function isHotfixWhitelistedBy(
-    bytes32 hash,
-    address whitelister
-  ) public view onlyL1 returns (bool) {
-    return hotfixes[hash].deprecated_whitelisted[whitelister];
-  }
-
   /**
    * @notice Returns whether or not a queued proposal has expired.
    * @param proposalId The ID of the proposal.
@@ -1410,12 +1296,13 @@ contract Governance is
       isQueued(upvotedProposalId) &&
       !isQueuedProposalExpired(upvotedProposalId);
 
+    // Calculate queue upvote weight using locked gold (upvotes use locked gold weight)
+    uint256 queueWeight = 0;
     if (isVotingQueue) {
-      uint256 weight = getLockedGold().getAccountTotalLockedGold(account);
-      return weight;
+      queueWeight = getLockedGold().getAccountTotalLockedGold(account);
     }
 
-    uint256 maxUsed = 0;
+    uint256 maxReferendumUsed = 0;
     for (uint256 index = 0; index < dequeued.length; index = index.add(1)) {
       uint256 proposalId = dequeued[index];
       Proposals.Proposal storage proposal = proposals[proposalId];
@@ -1432,13 +1319,15 @@ contract Governance is
       }
 
       uint256 votesCast = voteRecord.yesVotes.add(voteRecord.noVotes).add(voteRecord.abstainVotes);
-      maxUsed = Math.max(
-        maxUsed,
+      maxReferendumUsed = Math.max(
+        maxReferendumUsed,
         // backward compatibility for transition period - this should be updated later on
         votesCast == 0 ? voteRecord.deprecated_weight : votesCast
      );
     }
-    return maxUsed;
+
+    // Return the maximum of queue upvote weight and referendum votes
+    return Math.max(queueWeight, maxReferendumUsed);
   }
 
   /**
diff --git a/packages/protocol/contracts/governance/GovernanceSlasher.sol b/packages/protocol/contracts/governance/GovernanceSlasher.sol
index 916a7d2499a..dc59521f805 100644
--- a/packages/protocol/contracts/governance/GovernanceSlasher.sol
+++ b/packages/protocol/contracts/governance/GovernanceSlasher.sol
@@ -1,20 +1,21 @@
 pragma solidity ^0.5.13;
 
-import "openzeppelin-solidity/contracts/ownership/Ownable.sol";
-import "openzeppelin-solidity/contracts/math/SafeMath.sol";
+import { Ownable } from "openzeppelin-solidity/contracts/ownership/Ownable.sol";
+import { SafeMath } from "openzeppelin-solidity/contracts/math/SafeMath.sol";
 
-import "../common/Initializable.sol";
-import "../common/UsingRegistry.sol";
-import "./interfaces/IValidators.sol";
-import "../../contracts-0.8/common/IsL2Check.sol";
-import "../common/interfaces/ICeloVersionedContract.sol";
+import { Initializable } from "../common/Initializable.sol";
+import { UsingRegistry } from "../common/UsingRegistry.sol";
+import { ICeloVersionedContract } from "../common/interfaces/ICeloVersionedContract.sol";
+import { IValidators } from "./interfaces/IValidators.sol";
+import { ILockedGold } from "./interfaces/ILockedGold.sol";
+import { IGovernanceSlasher } from "./interfaces/IGovernanceSlasher.sol";
 
 contract GovernanceSlasher is
   Ownable,
   Initializable,
   UsingRegistry,
   ICeloVersionedContract,
-  IsL2Check
+  IGovernanceSlasher
 {
   using SafeMath for uint256;
   // Maps a slashed address to the amount to be slashed.
@@ -23,8 +24,7 @@ contract GovernanceSlasher is
   address internal slasherExecuter;
 
   event SlashingApproved(address indexed account, uint256 amount);
-  event GovernanceSlashPerformed(address indexed account, uint256 amount);
-  event GovernanceSlashL2Performed(address indexed account, address indexed group, uint256 amount);
+  event GovernanceSlashPerformed(address indexed account, address indexed group, uint256 amount);
   event SlasherExecuterSet(address slasherExecuter);
 
   modifier onlyAuthorizedToSlash() {
@@ -50,17 +50,6 @@ contract GovernanceSlasher is
     setRegistry(registryAddress);
   }
 
-  /**
-   * @notice Returns the storage, major, minor, and patch version of the contract.
-   * @return Storage version of the contract.
-   * @return Major version of the contract.
-   * @return Minor version of the contract.
-   * @return Patch version of the contract.
-   */
-  function getVersionNumber() external pure returns (uint256, uint256, uint256, uint256) {
-    return (1, 1, 1, 0);
-  }
-
   function setSlasherExecuter(address _slasherExecuter) external onlyOwner {
     slasherExecuter = _slasherExecuter;
     emit SlasherExecuterSet(_slasherExecuter);
@@ -79,36 +68,50 @@ contract GovernanceSlasher is
 
   /**
    * @notice Calls `LockedGold.slash` on `account` if `account` has an entry in `slashed`.
-   * @param account Account to slash
+   * @param account Account to slash.
+   * @param group Validators group of the account to slash.
    * @param electionLessers Lesser pointers for slashing locked election gold.
   * @param electionGreaters Greater pointers for slashing locked election gold.
   * @param electionIndices Indices of groups voted by slashed account.
   */
  function slash(
    address account,
+    address group,
    address[] calldata electionLessers,
    address[] calldata electionGreaters,
    uint256[] calldata electionIndices
-  ) external onlyL1 returns (bool) {
-    uint256 penalty = slashed[account];
-    require(penalty > 0, "No penalty given by governance");
-    slashed[account] = 0;
-    getLockedGold().slash(
-      account,
-      penalty,
-      address(0),
-      0,
-      electionLessers,
-      electionGreaters,
-      electionIndices
-    );
-    emit GovernanceSlashPerformed(account, penalty);
-    return true;
+  ) external returns (bool) {
+    return slashL2(account, group, electionLessers, electionGreaters, electionIndices);
+  }
+
+  /**
+   * @notice Gets account penalty.
+   * @param account Address that is punished.
+   * @return Amount slashed.
+   */
+  function getApprovedSlashing(address account) external view returns (uint256) {
+    return slashed[account];
+  }
+
+  function getSlasherExecuter() external view returns (address) {
+    return slasherExecuter;
+  }
+
+  /**
+   * @notice Returns the storage, major, minor, and patch version of the contract.
+   * @return Storage version of the contract.
+   * @return Major version of the contract.
+   * @return Minor version of the contract.
+   * @return Patch version of the contract.
+   */
+  function getVersionNumber() external pure returns (uint256, uint256, uint256, uint256) {
+    return (1, 2, 0, 0);
   }
 
   /**
   * @notice Calls `LockedGold.slash` on `account` if `account` has an entry in `slashed`.
-   * @param account Account to slash
+   * @param account Account to slash.
+   * @param group Validators group of the account to slash.
* @param electionLessers Lesser pointers for slashing locked election gold. * @param electionGreaters Greater pointers for slashing locked election gold. * @param electionIndices Indices of groups voted by slashed account. @@ -116,10 +119,10 @@ contract GovernanceSlasher is function slashL2( address account, address group, - address[] calldata electionLessers, - address[] calldata electionGreaters, - uint256[] calldata electionIndices - ) external onlyL2 onlyAuthorizedToSlash returns (bool) { + address[] memory electionLessers, + address[] memory electionGreaters, + uint256[] memory electionIndices + ) public onlyAuthorizedToSlash returns (bool) { uint256 penalty = slashed[account]; require(penalty > 0, "No penalty given by governance"); slashed[account] = 0; @@ -151,20 +154,7 @@ contract GovernanceSlasher is validators.halveSlashingMultiplier(group); } - emit GovernanceSlashL2Performed(account, group, penalty); + emit GovernanceSlashPerformed(account, group, penalty); return true; } - - /** - * @notice Gets account penalty. - * @param account Address that is punished. - * @return Amount slashed. - */ - function getApprovedSlashing(address account) external view returns (uint256) { - return slashed[account]; - } - - function getSlasherExecuter() external view returns (address) { - return slasherExecuter; - } } diff --git a/packages/protocol/contracts/governance/ReleaseGold.sol b/packages/protocol/contracts/governance/ReleaseGold.sol index fbf9b13d30c..70323f51442 100644 --- a/packages/protocol/contracts/governance/ReleaseGold.sol +++ b/packages/protocol/contracts/governance/ReleaseGold.sol @@ -451,7 +451,6 @@ contract ReleaseGold is UsingRegistry, ReentrancyGuard, IReleaseGold, Initializa * @param ecdsaPublicKey The ECDSA public key corresponding to `signer`. * @dev The v,r and s signature should be signed by the authorized signer * key, with the ReleaseGold contract address as the message. - * @dev Function is deprecated on L2. 
*/ function authorizeValidatorSignerWithPublicKey( address payable signer, @@ -467,45 +466,6 @@ contract ReleaseGold is UsingRegistry, ReentrancyGuard, IReleaseGold, Initializa getAccounts().authorizeValidatorSignerWithPublicKey(signer, v, r, s, ecdsaPublicKey); } - /** - * @notice A wrapper function for the authorize validator signer with keys account method. - * @param signer The address of the signing key to authorize. - * @param v The recovery id of the incoming ECDSA signature. - * @param r Output value r of the ECDSA signature. - * @param s Output value s of the ECDSA signature. - * @param ecdsaPublicKey The ECDSA public key corresponding to `signer`. - * @param blsPublicKey The BLS public key that the validator is using for consensus, should pass - * proof of possession. 96 bytes. - * @param blsPop The BLS public key proof-of-possession, which consists of a signature on the - * account address. 48 bytes. - * @dev The v,r and s signature should be signed by the authorized signer - * key, with the ReleaseGold contract address as the message. - * @dev Function is deprecated on L2. - */ - function authorizeValidatorSignerWithKeys( - address payable signer, - uint8 v, - bytes32 r, - bytes32 s, - bytes calldata ecdsaPublicKey, - bytes calldata blsPublicKey, - bytes calldata blsPop - ) external nonReentrant onlyCanValidate onlyWhenInProperState { - // If no previous signer has been authorized, fund the new signer so that tx fees can be paid. - if (getAccounts().getValidatorSigner(address(this)) == address(this)) { - fundSigner(signer); - } - getAccounts().authorizeValidatorSignerWithKeys( - signer, - v, - r, - s, - ecdsaPublicKey, - blsPublicKey, - blsPop - ); - } - /** * @notice A wrapper function for the authorize attestation signer account method. * @param signer The address of the signing key to authorize. 
diff --git a/packages/protocol/contracts/governance/interfaces/IElection.sol b/packages/protocol/contracts/governance/interfaces/IElection.sol index 42fd7a7225f..f5f42a37db3 100644 --- a/packages/protocol/contracts/governance/interfaces/IElection.sol +++ b/packages/protocol/contracts/governance/interfaces/IElection.sol @@ -2,69 +2,109 @@ pragma solidity >=0.5.13 <0.9.0; interface IElection { - function vote(address, uint256, address, address) external returns (bool); - function activate(address) external returns (bool); - function revokeActive(address, uint256, address, address, uint256) external returns (bool); - function revokeAllActive(address, address, address, uint256) external returns (bool); - function revokePending(address, uint256, address, address, uint256) external returns (bool); - function markGroupIneligible(address) external; - function markGroupEligible(address, address, address) external; - function allowedToVoteOverMaxNumberOfGroups(address) external returns (bool); + function vote( + address group, + uint256 value, + address lesser, + address greater + ) external returns (bool); + function activate(address group) external returns (bool); + function revokeActive( + address group, + uint256 value, + address lesser, + address greater, + uint256 index + ) external returns (bool); + function revokeAllActive( + address group, + address lesser, + address greater, + uint256 index + ) external returns (bool); + function revokePending( + address group, + uint256 value, + address lesser, + address greater, + uint256 index + ) external returns (bool); + function markGroupIneligible(address group) external; + function markGroupEligible(address group, address lesser, address greater) external; + function allowedToVoteOverMaxNumberOfGroups(address account) external returns (bool); function forceDecrementVotes( - address, - uint256, - address[] calldata, - address[] calldata, - uint256[] calldata + address account, + uint256 value, + address[] calldata lessers, 
+ address[] calldata greaters, + uint256[] calldata indices ) external returns (uint256); function setAllowedToVoteOverMaxNumberOfGroups(bool flag) external; // only owner - function setElectableValidators(uint256, uint256) external returns (bool); - function setMaxNumGroupsVotedFor(uint256) external returns (bool); - function setElectabilityThreshold(uint256) external returns (bool); + function setElectableValidators(uint256 min, uint256 max) external returns (bool); + function setMaxNumGroupsVotedFor(uint256 _maxNumGroupsVotedFor) external returns (bool); + function setElectabilityThreshold(uint256 threshold) external returns (bool); // only VM - function distributeEpochRewards(address, uint256, address, address) external; + function distributeEpochRewards( + address group, + uint256 value, + address lesser, + address greater + ) external; // view functions function electValidatorSigners() external view returns (address[] memory); function electValidatorAccounts() external view returns (address[] memory); - function electNValidatorSigners(uint256, uint256) external view returns (address[] memory); - function electNValidatorAccounts(uint256, uint256) external view returns (address[] memory); + function electNValidatorSigners( + uint256 minElectableValidators, + uint256 maxElectableValidators + ) external view returns (address[] memory); + function electNValidatorAccounts( + uint256 minElectableValidators, + uint256 maxElectableValidators + ) external view returns (address[] memory); function getElectableValidators() external view returns (uint256, uint256); function getElectabilityThreshold() external view returns (uint256); - function getNumVotesReceivable(address) external view returns (uint256); + function getNumVotesReceivable(address group) external view returns (uint256); function getTotalVotes() external view returns (uint256); function getActiveVotes() external view returns (uint256); - function getTotalVotesByAccount(address) external view returns 
(uint256); - function getPendingVotesForGroupByAccount(address, address) external view returns (uint256); - function getActiveVotesForGroupByAccount(address, address) external view returns (uint256); - function getTotalVotesForGroupByAccount(address, address) external view returns (uint256); - function getActiveVoteUnitsForGroupByAccount(address, address) external view returns (uint256); - function getTotalVotesForGroup(address) external view returns (uint256); - function getActiveVotesForGroup(address) external view returns (uint256); - function getPendingVotesForGroup(address) external view returns (uint256); - function getGroupEligibility(address) external view returns (bool); - function getGroupEpochRewards( - address, - uint256, - uint256[] calldata + function getTotalVotesByAccount(address account) external view returns (uint256); + function getPendingVotesForGroupByAccount( + address group, + address account + ) external view returns (uint256); + function getActiveVotesForGroupByAccount( + address group, + address account + ) external view returns (uint256); + function getTotalVotesForGroupByAccount( + address group, + address account + ) external view returns (uint256); + function getActiveVoteUnitsForGroupByAccount( + address group, + address account ) external view returns (uint256); + function getTotalVotesForGroup(address group) external view returns (uint256); + function getActiveVotesForGroup(address group) external view returns (uint256); + function getPendingVotesForGroup(address group) external view returns (uint256); + function getGroupEligibility(address group) external view returns (bool); function getGroupEpochRewardsBasedOnScore( address group, uint256 totalEpochRewards, uint256 groupScore ) external view returns (uint256); - function getGroupsVotedForByAccount(address) external view returns (address[] memory); + function getGroupsVotedForByAccount(address account) external view returns (address[] memory); function getEligibleValidatorGroups() 
external view returns (address[] memory); function getTotalVotesForEligibleValidatorGroups() external view returns (address[] memory, uint256[] memory); function getCurrentValidatorSigners() external view returns (address[] memory); - function canReceiveVotes(address, uint256) external view returns (bool); - function hasActivatablePendingVotes(address, address) external view returns (bool); + function canReceiveVotes(address group, uint256 value) external view returns (bool); + function hasActivatablePendingVotes(address account, address group) external view returns (bool); function validatorSignerAddressFromCurrentSet(uint256 index) external view returns (address); function numberValidatorsInCurrentSet() external view returns (uint256); function owner() external view returns (address); diff --git a/packages/protocol/contracts/governance/interfaces/IEpochRewards.sol b/packages/protocol/contracts/governance/interfaces/IEpochRewards.sol index 1c32d669933..331348dc18f 100644 --- a/packages/protocol/contracts/governance/interfaces/IEpochRewards.sol +++ b/packages/protocol/contracts/governance/interfaces/IEpochRewards.sol @@ -3,7 +3,6 @@ pragma solidity >=0.5.13 <0.9.0; interface IEpochRewards { function updateTargetVotingYield() external; - function isReserveLow() external view returns (bool); function calculateTargetEpochRewards() external view returns (uint256, uint256, uint256, uint256); function getTargetVotingYieldParameters() external view returns (uint256, uint256, uint256); function getRewardsMultiplierParameters() external view returns (uint256, uint256, uint256); diff --git a/packages/protocol/contracts/governance/interfaces/IGovernance.sol b/packages/protocol/contracts/governance/interfaces/IGovernance.sol index 0d3ebe8d137..e88c0da7527 100644 --- a/packages/protocol/contracts/governance/interfaces/IGovernance.sol +++ b/packages/protocol/contracts/governance/interfaces/IGovernance.sol @@ -2,10 +2,38 @@ pragma solidity >=0.5.13 <0.9.0; interface IGovernance { 
- function removeVotesWhenRevokingDelegatedVotes( - address account, - uint256 maxAmountAllowed - ) external; + // constitution + function setConstitution(address destination, bytes4 functionId, uint256 threshold) external; + function getConstitution(address destination, bytes4 functionId) external view returns (uint256); + + // proposal + function propose( + uint256[] calldata values, + address[] calldata destinations, + bytes calldata data, + uint256[] calldata dataLengths, + string calldata descriptionUrl + ) external payable returns (uint256); + function getProposal( + uint256 proposalId + ) external view returns (address, uint256, uint256, uint256, string memory, uint256, bool); + function proposalCount() external view returns (uint256); + + // upvote + function upvote(uint256 proposalId, uint256 lesser, uint256 greater) external returns (bool); + function getUpvotes(uint256 proposalId) external view returns (uint256); + + // approve + function approve(uint256 proposalId, uint256 index) external returns (bool); + function isApproved(uint256 proposalId) external view returns (bool); + + // voting + // TODO: Enable once we migrate out of 0.5 + // function vote( + // uint256 proposalId, + // uint256 index, + // Proposals.VoteValue value + // ) external returns (bool); function votePartially( uint256 proposalId, uint256 index, @@ -13,15 +41,17 @@ interface IGovernance { uint256 noVotes, uint256 abstainVotes ) external returns (bool); - - function setConstitution(address destination, bytes4 functionId, uint256 threshold) external; - - function isVoting(address) external view returns (bool); + function removeVotesWhenRevokingDelegatedVotes( + address account, + uint256 maxAmountAllowed + ) external; + function isVoting(address account) external view returns (bool); + function getVoteTotals(uint256 proposalId) external view returns (uint256, uint256, uint256); function getAmountOfGoldUsedForVoting(address account) external view returns (uint256); - function 
getProposal( - uint256 proposalId - ) external view returns (address, uint256, uint256, uint256, string memory, uint256, bool); - + // referendum function getReferendumStageDuration() external view returns (uint256); + + // execution + function execute(uint256 proposalId, uint256 index) external returns (bool); } diff --git a/packages/protocol/contracts/governance/interfaces/IGovernanceSlasher.sol b/packages/protocol/contracts/governance/interfaces/IGovernanceSlasher.sol new file mode 100644 index 00000000000..662d7d1eb0d --- /dev/null +++ b/packages/protocol/contracts/governance/interfaces/IGovernanceSlasher.sol @@ -0,0 +1,15 @@ +// SPDX-License-Identifier: LGPL-3.0-only +pragma solidity >=0.5.13 <0.9.0; + +interface IGovernanceSlasher { + function setSlasherExecuter(address _slasherExecuter) external; + function approveSlashing(address account, uint256 penalty) external; + function getApprovedSlashing(address account) external view returns (uint256); + function slash( + address account, + address group, + address[] calldata electionLessers, + address[] calldata electionGreaters, + uint256[] calldata electionIndices + ) external returns (bool); +} diff --git a/packages/protocol/contracts/governance/interfaces/IGovernanceVote.sol b/packages/protocol/contracts/governance/interfaces/IGovernanceVote.sol new file mode 100644 index 00000000000..f2a18f8dac7 --- /dev/null +++ b/packages/protocol/contracts/governance/interfaces/IGovernanceVote.sol @@ -0,0 +1,16 @@ +// SPDX-License-Identifier: LGPL-3.0-only +pragma solidity >=0.5.13 <0.9.0; + +// TODO: Remove and replace with IGovernance.vote after we migrate out of 0.5 +interface IGovernanceVote { + // enum vote + enum VoteValue { + None, + Abstain, + No, + Yes + } + + // voting + function vote(uint256 proposalId, uint256 index, VoteValue value) external returns (bool); +} diff --git a/packages/protocol/contracts/governance/interfaces/ILockedCelo.sol b/packages/protocol/contracts/governance/interfaces/ILockedCelo.sol index 
cfa19772a8d..d2e65a77ee7 100644 --- a/packages/protocol/contracts/governance/interfaces/ILockedCelo.sol +++ b/packages/protocol/contracts/governance/interfaces/ILockedCelo.sol @@ -38,4 +38,6 @@ interface ILockedCelo { function getAccountTotalGovernanceVotingPower(address account) external view returns (uint256); function unlockingPeriod() external view returns (uint256); function getAccountNonvotingLockedCelo(address account) external view returns (uint256); + + function getAccountTotalLockedGold(address account) external view returns (uint256); } diff --git a/packages/protocol/contracts/governance/interfaces/IReleaseGold.sol b/packages/protocol/contracts/governance/interfaces/IReleaseGold.sol index 920f944ea11..15b86fc268c 100644 --- a/packages/protocol/contracts/governance/interfaces/IReleaseGold.sol +++ b/packages/protocol/contracts/governance/interfaces/IReleaseGold.sol @@ -14,15 +14,6 @@ interface IReleaseGold { bytes32, bytes calldata ) external; - function authorizeValidatorSignerWithKeys( - address payable, - uint8, - bytes32, - bytes32, - bytes calldata, - bytes calldata, - bytes calldata - ) external; function authorizeAttestationSigner(address payable, uint8, bytes32, bytes32) external; function revokeActive(address, uint256, address, address, uint256) external; function revokePending(address, uint256, address, address, uint256) external; diff --git a/packages/protocol/contracts/governance/interfaces/IValidators.sol b/packages/protocol/contracts/governance/interfaces/IValidators.sol index 8d771efec6c..822f92baa11 100644 --- a/packages/protocol/contracts/governance/interfaces/IValidators.sol +++ b/packages/protocol/contracts/governance/interfaces/IValidators.sol @@ -2,16 +2,11 @@ pragma solidity >=0.5.13 <0.9.0; interface IValidators { - function registerValidator( - bytes calldata, - bytes calldata, - bytes calldata - ) external returns (bool); + function registerValidator(bytes calldata) external returns (bool); function registerValidatorNoBls(bytes 
calldata ecdsaPublicKey) external returns (bool); function deregisterValidator(uint256) external returns (bool); function affiliate(address) external returns (bool); function deaffiliate() external returns (bool); - function updateBlsPublicKey(bytes calldata, bytes calldata) external returns (bool); function registerValidatorGroup(uint256) external returns (bool); function deregisterValidatorGroup(uint256) external returns (bool); function addMember(address) external returns (bool); @@ -26,44 +21,26 @@ interface IValidators { function setCommissionUpdateDelay(uint256) external; function setMaxGroupSize(uint256) external returns (bool); function setMembershipHistoryLength(uint256) external returns (bool); - function setValidatorScoreParameters(uint256, uint256) external returns (bool); function setGroupLockedGoldRequirements(uint256, uint256) external returns (bool); function setValidatorLockedGoldRequirements(uint256, uint256) external returns (bool); function setSlashingMultiplierResetPeriod(uint256) external; - function setDowntimeGracePeriod(uint256 value) external; // only registered contract function updateEcdsaPublicKey(address, address, bytes calldata) external returns (bool); - function updatePublicKeys( - address, - address, - bytes calldata, - bytes calldata, - bytes calldata - ) external returns (bool); function mintStableToEpochManager(uint256 amount) external; - // only VM - function updateValidatorScoreFromSigner(address, uint256) external; - function distributeEpochPaymentsFromSigner(address, uint256) external returns (uint256); - // only slasher function forceDeaffiliateIfValidator(address) external; function halveSlashingMultiplier(address) external; // view functions function maxGroupSize() external view returns (uint256); - function downtimeGracePeriod() external view returns (uint256); function getCommissionUpdateDelay() external view returns (uint256); - function getValidatorScoreParameters() external view returns (uint256, uint256); function 
getMembershipHistory( address ) external view returns (uint256[] memory, address[] memory, uint256, uint256); - function calculateEpochScore(uint256) external view returns (uint256); - function calculateGroupEpochScore(uint256[] calldata) external view returns (uint256); function getAccountLockedGoldRequirement(address) external view returns (uint256); function meetsAccountLockedGoldRequirements(address) external view returns (bool); - function getValidatorBlsPublicKeyFromSigner(address) external view returns (bytes memory); function getValidator( address account ) external view returns (bytes memory, bytes memory, address, uint256, address); diff --git a/packages/protocol/contracts/governance/test/MockElection.sol b/packages/protocol/contracts/governance/test/MockElection.sol index b3ea2717fe2..690f318a523 100644 --- a/packages/protocol/contracts/governance/test/MockElection.sol +++ b/packages/protocol/contracts/governance/test/MockElection.sol @@ -1,11 +1,9 @@ pragma solidity >=0.5.13 <0.8.20; -import "../../../contracts-0.8/common/IsL2Check.sol"; - /** * @title Holds a list of addresses of validators */ -contract MockElection is IsL2Check { +contract MockElection { mapping(address => bool) public isIneligible; mapping(address => bool) public isEligible; mapping(address => bool) public allowedToVoteOverMaxNumberOfGroups; @@ -78,10 +76,10 @@ contract MockElection is IsL2Check { return 0; } - function electValidatorSigners() external view onlyL1 returns (address[] memory) { + function electValidatorSigners() external view returns (address[] memory) { return electedValidators; } - function electValidators() external view onlyL2 returns (address[] memory) { + function electValidatorAccounts() external view returns (address[] memory) { return electedValidators; } @@ -104,8 +102,4 @@ contract MockElection is IsL2Check { function distributeEpochRewards(address group, uint256 value, address, address) external { distributedEpochRewards[group] = value; } - - function 
electValidatorAccounts() external view returns (address[] memory) { - return electedValidators; - } } diff --git a/packages/protocol/contracts/governance/test/MockGovernance.sol b/packages/protocol/contracts/governance/test/MockGovernance.sol index 936f1a42175..8c9bfe1c496 100644 --- a/packages/protocol/contracts/governance/test/MockGovernance.sol +++ b/packages/protocol/contracts/governance/test/MockGovernance.sol @@ -20,19 +20,22 @@ contract MockGovernance is IGovernance { totalVotes[voter] = votes; } - function removeVotesWhenRevokingDelegatedVotes( - address account, - uint256 maxAmountAllowed - ) external { - removeVotesCalledFor[account] = maxAmountAllowed; + function setConstitution(address, bytes4, uint256) external { + revert("not implemented"); } - function setConstitution(address, bytes4, uint256) external { + function getConstitution(address, bytes4) external view returns (uint256) { revert("not implemented"); } - function votePartially(uint256, uint256, uint256, uint256, uint256) external returns (bool) { - return true; + function propose( + uint256[] calldata values, + address[] calldata destinations, + bytes calldata data, + uint256[] calldata dataLengths, + string calldata descriptionUrl + ) external payable returns (uint256) { + return 0; } function getProposal( @@ -41,6 +44,41 @@ contract MockGovernance is IGovernance { return (address(0), 0, 0, 0, "", 0, false); } + function proposalCount() external view returns (uint256) { + return 0; + } + + function upvote(uint256 proposalId, uint256 lesser, uint256 greater) external returns (bool) { + return true; + } + + function getUpvotes(uint256 proposalId) external view returns (uint256) { + return 0; + } + + function approve(uint256 proposalId, uint256 index) external returns (bool) { + return true; + } + + function isApproved(uint256 proposalId) external view returns (bool) { + return true; + } + + function votePartially(uint256, uint256, uint256, uint256, uint256) external returns (bool) { + return 
true; + } + + function removeVotesWhenRevokingDelegatedVotes( + address account, + uint256 maxAmountAllowed + ) external { + removeVotesCalledFor[account] = maxAmountAllowed; + } + + function getVoteTotals(uint256 proposalId) external view returns (uint256, uint256, uint256) { + return (0, 0, 0); + } + function getAmountOfGoldUsedForVoting(address account) external view returns (uint256) { return totalVotes[account]; } @@ -48,4 +86,8 @@ contract MockGovernance is IGovernance { function getReferendumStageDuration() external view returns (uint256) { return 0; } + + function execute(uint256 proposalId, uint256 index) external returns (bool) { + return true; + } } diff --git a/packages/protocol/contracts/governance/test/MockValidators.sol b/packages/protocol/contracts/governance/test/MockValidators.sol index 9b7f27d4802..5523a8ed00c 100644 --- a/packages/protocol/contracts/governance/test/MockValidators.sol +++ b/packages/protocol/contracts/governance/test/MockValidators.sol @@ -3,7 +3,6 @@ pragma solidity ^0.5.13; import "openzeppelin-solidity/contracts/math/SafeMath.sol"; import "../interfaces/IValidators.sol"; -import "../../../contracts-0.8/common/IsL2Check.sol"; // Mocks Validators, compatible with 0.5 // For forge tests, can be avoided with calls to deployCodeTo @@ -11,7 +10,7 @@ import "../../../contracts-0.8/common/IsL2Check.sol"; /** * @title Holds a list of addresses of validators */ -contract MockValidators is IValidators, IsL2Check { +contract MockValidators is IValidators { using SafeMath for uint256; event HavelSlashingMultiplierHalved(address validator); @@ -35,17 +34,6 @@ contract MockValidators is IValidators, IsL2Check { return true; } - function updatePublicKeys( - address, - address, - bytes calldata, - bytes calldata, - bytes calldata - ) external returns (bool) { - allowOnlyL1(); - return true; - } - function setValidator(address account) external { isValidator[account] = true; } @@ -113,7 +101,6 @@ contract MockValidators is IValidators, 
IsL2Check {
   }

   function getValidatorGroupSlashingMultiplier(address) external view returns (uint256) {
-    allowOnlyL1();
     return FIXED1_UINT;
   }
@@ -129,10 +116,6 @@ contract MockValidators is IValidators, IsL2Check {
     return lockedGoldRequirements[account];
   }

-  function calculateGroupEpochScore(uint256[] calldata uptimes) external view returns (uint256) {
-    return uptimes[0];
-  }
-
   function getGroupsNumMembers(address[] calldata groups) external view returns (uint256[] memory) {
     uint256[] memory numMembers = new uint256[](groups.length);
     for (uint256 i = 0; i < groups.length; i = i.add(1)) {
@@ -169,9 +152,14 @@ contract MockValidators is IValidators, IsL2Check {
     revert("Method not implemented in mock");
   }

+  function registerValidator(bytes calldata) external returns (bool) {
+    revert("Method not implemented in mock");
+  }
+
   function registerValidatorNoBls(bytes calldata) external returns (bool) {
     revert("Method not implemented in mock");
   }
+
   function removeMember(address) external returns (bool) {
     revert("Method not implemented in mock");
   }
@@ -192,14 +180,6 @@ contract MockValidators is IValidators, IsL2Check {
     revert("Method not implemented in mock");
   }

-  function updateBlsPublicKey(bytes calldata, bytes calldata) external returns (bool) {
-    revert("Method not implemented in mock");
-  }
-
-  function setValidatorScoreParameters(uint256, uint256) external returns (bool) {
-    revert("Method not implemented in mock");
-  }
-
   function setValidatorLockedGoldRequirements(uint256, uint256) external returns (bool) {
     revert("Method not implemented in mock");
   }
@@ -208,10 +188,6 @@ contract MockValidators is IValidators, IsL2Check {
     revert("Method not implemented in mock");
   }

-  function setDowntimeGracePeriod(uint256) external {
-    revert("Method not implemented in mock");
-  }
-
   function setCommissionUpdateDelay(uint256) external {
     revert("Method not implemented in mock");
   }
@@ -224,10 +200,6 @@ contract MockValidators is IValidators, IsL2Check {
     revert("Method not implemented in mock");
   }

-  function updateValidatorScoreFromSigner(address, uint256) external {
-    revert("Method not implemented in mock");
-  }
-
   function mintStableToEpochManager(uint256 amount) external {
     mintedStable = mintedStable.add(amount);
   }
@@ -236,18 +208,10 @@ contract MockValidators is IValidators, IsL2Check {
     revert("Method not implemented in mock");
   }

-  function getValidatorScoreParameters() external view returns (uint256, uint256) {
-    revert("Method not implemented in mock");
-  }
-
   function getValidatorLockedGoldRequirements() external view returns (uint256, uint256) {
     revert("Method not implemented in mock");
   }

-  function getValidatorBlsPublicKeyFromSigner(address) external view returns (bytes memory) {
-    revert("Method not implemented in mock");
-  }
-
   function getRegisteredValidators() external view returns (address[] memory) {
     revert("Method not implemented in mock");
   }
@@ -260,8 +224,8 @@ contract MockValidators is IValidators, IsL2Check {
     revert("Method not implemented in mock");
   }

-  function getMembershipInLastEpoch(address) external view returns (address) {
-    revert("Method not implemented in mock");
+  function getMembershipInLastEpoch(address validator) external view returns (address) {
+    return affiliations[validator];
   }

   function getMembershipHistoryLength() external view returns (uint256) {
@@ -272,10 +236,6 @@ contract MockValidators is IValidators, IsL2Check {
     revert("Method not implemented in mock");
   }

-  function calculateEpochScore(uint256) external view returns (uint256) {
-    revert("Method not implemented in mock");
-  }
-
   function deaffiliate() external returns (bool) {
     revert("Method not implemented in mock");
   }
@@ -288,14 +248,6 @@ contract MockValidators is IValidators, IsL2Check {
     revert("Method not implemented in mock");
   }

-  function distributeEpochPaymentsFromSigner(address, uint256) external onlyL1 returns (uint256) {
-    revert("Method not implemented in mock");
-  }
-
-  function downtimeGracePeriod() external view returns (uint256) {
-    revert("Method not implemented in mock");
-  }
-
   function getCommissionUpdateDelay() external view returns (uint256) {
     revert("Method not implemented in mock");
   }
@@ -312,14 +264,6 @@ contract MockValidators is IValidators, IsL2Check {
     epochRewards[account] = reward;
   }

-  function registerValidator(
-    bytes calldata,
-    bytes calldata,
-    bytes calldata
-  ) external returns (bool) {
-    revert("Method not implemented in mock");
-  }
-
   function getMembershipHistory(
     address
   ) external view returns (uint256[] memory, address[] memory, uint256, uint256) {
diff --git a/packages/protocol/contracts/identity/test/AttestationsTest.sol b/packages/protocol/contracts/identity/test/AttestationsTest.sol
index a125ca55628..447741a8f9e 100644
--- a/packages/protocol/contracts/identity/test/AttestationsTest.sol
+++ b/packages/protocol/contracts/identity/test/AttestationsTest.sol
@@ -5,7 +5,7 @@ import "../Attestations.sol";
 /*
  * We need a test contract that behaves like the actual Attestations contract,
  * but mocks the implementations of the validator set getters. Otherwise we
- * couldn't test `request` with the current ganache local testnet.
+ * couldn't test `request` with the testnet.
 */
 contract AttestationsTest is Attestations(true) {
   address[] private __testValidators;
diff --git a/packages/protocol/contracts/stability/SortedOracles.sol b/packages/protocol/contracts/stability/SortedOracles.sol
index 37bfd51f964..db1aefb1136 100644
--- a/packages/protocol/contracts/stability/SortedOracles.sol
+++ b/packages/protocol/contracts/stability/SortedOracles.sol
@@ -188,8 +188,8 @@ contract SortedOracles is

   /**
    * @notice Sets the equivalent token for a token.
-   * @param token The address of the token.
-   * @param equivalentToken The address of the equivalent token.
+   * @param token The address of the token (e.g. USDC).
+   * @param equivalentToken The address of the equivalent token (e.g. cUSD).
    */
   function setEquivalentToken(address token, address equivalentToken) external onlyOwner {
     require(token != address(0), "token address cannot be 0");
diff --git a/packages/protocol/foundry.toml b/packages/protocol/foundry.toml
index c3d8918642d..e85ee87377d 100644
--- a/packages/protocol/foundry.toml
+++ b/packages/protocol/foundry.toml
@@ -1,8 +1,10 @@
+# docs: https://getfoundry.sh/config/reference/default-config
 [profile.default]
 src = 'contracts-0.8'
 out = 'out'
 test = 'test-sol'
 libs = ['lib', 'node_modules']
+optimizer = true

 no_match_test = "skip"

@@ -13,11 +15,16 @@ no_match_test = "skip"
 no_match_path = "{**/test/BLS12Passthrough.sol,**/test/RandomTest.sol,**/test-sol/devchain/**}"

 fs_permissions = [
-  { access = "read", path = "./out"},
-  { access = "read", path = "./migrations_sol/migrationsConfig.json"},
-  { access = "read", path = "./governanceConstitution.json"},
-  { access = "read", path = "./artifacts/"}
- ]
+  { access = "read", path = "./out"},
+  { access = "read", path = "./out-truffle-compat"},
+  { access = "read", path = "./out-truffle-compat-0.8"},
+  { access = "read", path = "./migrations_sol/migrationsConfig.json"},
+  { access = "read", path = "./migrations_sol/"},
+  { access = "read", path = "./governanceConstitution.json"},
+  { access = "read", path = "./artifacts/"},
+  { access = "read", path = "./.tmp/selectors" }
+]
+
 [profile.devchain]
 # Special profile for the tests that require an anvil devchain
 test = 'test-sol/devchain'
@@ -25,3 +32,42 @@ match_path = "**/test-sol/devchain/**"
 no_match_path = "{**/test/BLS12Passthrough.sol,**/test/RandomTest.sol}"

 # See more config options https://github.com/foundry-rs/foundry/tree/master/config
+
+# from https://github.com/celo-org/celo-monorepo/pull/11488/
+
+# Profile to match Truffle configuration for 0.5.13 contracts
+[profile.truffle-compat]
+skip=[
+  "test-sol/**", # skip everything under test-sol
+  ]
+src = 'contracts'
+out = 'out-truffle-compat'
+libs = ['lib', 'node_modules']
+solc_version = '0.5.14'
+ast = true
+optimizer = false
+evm_version = 'istanbul'
+metadata_literal = true
+via_ir = false
+
+[profile.truffle-compat.lint]
+lint_on_build = false
+
+# Profile to match Truffle configuration for 0.8.19 contracts
+[profile.truffle-compat8]
+skip=[
+  "test-sol/**", # skip everything under test-sol
+]
+src = 'contracts-0.8'
+out = 'out-truffle-compat-0.8'
+libs = ['lib', 'node_modules']
+solc_version = '0.8.19'
+ast = true
+optimizer = true
+optimizer_runs = 200
+use_literal_content = true
+via_ir = false
+evm_version = 'paris'
+bytecode_hash = 'ipfs'
+cbor_metadata = true
+# See more config options https://github.com/foundry-rs/foundry/tree/master/crates/config
diff --git a/packages/protocol/governanceConstitution.js b/packages/protocol/governanceConstitution.js
index fa0e0ea0be7..2549eb18110 100644
--- a/packages/protocol/governanceConstitution.js
+++ b/packages/protocol/governanceConstitution.js
@@ -134,7 +134,6 @@ const DefaultConstitution = {
     setGroupLockedGoldRequirements: 0.8,
     setValidatorLockedGoldRequirements: 0.8,
     setSlashingMultiplierResetPeriod: 0.7,
-    setValidatorScoreParameters: 0.7,
     __contractPackage: contractPackages.SOLIDITY_08_PACKAGE,
   },
 }
diff --git a/packages/protocol/governanceConstitution.json b/packages/protocol/governanceConstitution.json
index 2b7e06872b1..669bd85835e 100644
--- a/packages/protocol/governanceConstitution.json
+++ b/packages/protocol/governanceConstitution.json
@@ -40,10 +40,6 @@
   "FederatedAttestations": {
     "default": 600000000000000000000000
   },
-  "FeeCurrencyWhitelist": {
-    "default": 800000000000000000000000,
-    "addToken": 800000000000000000000000
-  },
   "Freezer": {
     "default": 600000000000000000000000,
     "freeze": 600000000000000000000000,
@@ -92,7 +88,7 @@
   "OdisPayments": {
     "default": 600000000000000000000000
   },
-  "proxy": {
+  "Proxy": {
     "_transferOwnership": 900000000000000000000000,
     "_setAndInitializeImplementation": 900000000000000000000000,
     "_setImplementation": 900000000000000000000000
@@ -118,7 +114,6 @@
     "setMembershipHistoryLength": 700000000000000000000000,
     "setGroupLockedGoldRequirements": 800000000000000000000000,
     "setValidatorLockedGoldRequirements": 800000000000000000000000,
-    "setSlashingMultiplierResetPeriod": 700000000000000000000000,
-    "setValidatorScoreParameters": 700000000000000000000000
+    "setSlashingMultiplierResetPeriod": 700000000000000000000000
   }
 }
\ No newline at end of file
diff --git a/packages/protocol/lib/bytecode-foundry.ts b/packages/protocol/lib/bytecode-foundry.ts
new file mode 100644
index 00000000000..e4629bb913c
--- /dev/null
+++ b/packages/protocol/lib/bytecode-foundry.ts
@@ -0,0 +1,207 @@
+/* eslint-disable max-classes-per-file: 0 */
+
+import { NULL_ADDRESS, trimLeading0x } from '@celo/base/lib/address'
+import { Artifact, LinkReferences } from '@celo/protocol/lib/compatibility/internal'
+import { keccak256, toHex } from 'viem'
+
+/*
+ * The Solidity compiler appends a Swarm Hash of compilation metadata to the end
+ * of bytecode. We find this hash based on the specification here:
+ * https://solidity.readthedocs.io/en/develop/metadata.html#encoding-of-the-metadata-hash-in-the-bytecode
+ */
+const CONTRACT_METADATA_REGEXPS = [
+  // 0.5.8
+  'a165627a7a72305820.{64}0029',
+  // 0.5.13
+  'a265627a7a72315820.{64}64736f6c6343.{6}0032',
+  // 0.8.19
+  'a264697066735822.{68}64736f6c6343.{6}0033'
+]
+
+const GENERAL_METADATA_REGEXP = new RegExp(
+  `^(.*)(${CONTRACT_METADATA_REGEXPS.map((r) => '(' + r + ')').join('|')})$`,
+  'i' // Use i flag to make search case insensitive.
+)
+
+export const stripMetadata = (bytecode: string): string => {
+  if (bytecode === '0x') {
+    return '0x'
+  }
+
+  const match = bytecode.match(GENERAL_METADATA_REGEXP)
+  if (match === null) {
+    throw new Error(
+      'Only support stripping metadata from bytecodes generated by solc in the versions listed in CONTRACT_METADATA_REGEXPS'
+    )
+  }
+  return match[1]
+}
+/*
+ * Maps library names to their onchain addresses (formatted without "0x" prefix) and 34-character
+ * linking placeholder hash.
+ */
+export interface LibraryLinks {
+  [name: string]: {
+    address: string,
+    placeholderHash: string
+  }
+}
+
+/*
+ * Unresolved libraries appear as "__$<hash>$__" in bytecode output by
+ * solc. The length of the entire string is 40 characters (accounting for the 20
+ * bytes of the address that should be substituted in).
+ * The hash is the first 34 characters of the keccak256 hash of the fully qualified library name,
+ * i.e. `sourcePath:libraryName`.
+ * See https://docs.soliditylang.org/en/v0.8.13/using-the-compiler.html#library-linking
+ */
+export const getPlaceholderHash = (name: string): string => {
+  const hash = keccak256(toHex(name))
+  return hash.slice(2, 2 + 34)
+}
+
+export const linkLibraries = (bytecode: string, libraryLinks: LibraryLinks): string => {
+  Object.keys(libraryLinks).forEach((libraryName) => {
+    const linkString = `__\\$${libraryLinks[libraryName].placeholderHash}\\$__`
+    // Use g flag to iterate through for all occurrences.
+    bytecode = bytecode.replace(RegExp(linkString, 'g'), libraryLinks[libraryName].address)
+  })
+
+  return bytecode
+}
+
+const ADDRESS_LENGTH = 40
+const PUSH20_OPCODE = '73'
+/*
+ * To check that a library isn't being called directly, the Solidity
+ * compiler starts a library's bytecode with a comparison of the current
+ * address with the address the library was deployed to (it has to differ
+ * to ensure the library is being called with CALLCODE or DELEGATECALL
+ * instead of a regular CALL).
+ * The address is only known at contract construction time, so
+ * the compiler's output contains a placeholder 0-address, while the onchain
+ * bytecode has the correct address inserted.
+ * Reference: https://solidity.readthedocs.io/en/v0.5.12/contracts.html#call-protection-for-libraries
+ */
+export const verifyAndStripLibraryPrefix = (bytecode: string, address = NULL_ADDRESS) => {
+  if (bytecode.slice(2, 4) !== PUSH20_OPCODE) {
+    throw new Error(`Library bytecode doesn't start with address load`)
+  } else if (bytecode.slice(4, 4 + ADDRESS_LENGTH) !== trimLeading0x(address).toLowerCase()) {
+    throw new Error(`Library bytecode loads unexpected address at start`)
+  }
+
+  return bytecode.slice(4 + ADDRESS_LENGTH, bytecode.length)
+}
+
+/*
+ * Stores info about libraries linked in an artifact.
+ * Specifically, for each library, it stores:
+ * - `positions`: an array of indices into the `deployedBytecode` string indicating the locations
+ *   where the library should be linked.
+ * - `placeholderHash`: the hexadecimal 34-character hash used as a placeholder before linking.
+ */
+export class ArtifactLibraryLinking {
+  links: {
+    [library: string]: {
+      positions: number[]
+      placeholderHash: string
+    }
+  }
+
+  constructor(artifact: Artifact) {
+    this.links = {}
+    if (typeof artifact.deployedBytecode !== 'string') {
+      this.parseLibraryPositions(artifact.deployedBytecode.linkReferences)
+    } else {
+      throw new Error(`Unexpected artifact type`)
+    }
+  }
+
+  private parseLibraryPositions(references: LinkReferences) {
+    Object.keys(references).forEach(sourceFile => {
+      const libraryLinks = references[sourceFile]
+      Object.keys(libraryLinks).forEach(library => {
+        libraryLinks[library].forEach(reference => {
+          this.addPositionFromByte(library, sourceFile, reference.start, reference.length)
+        })
+      })
+    })
+  }
+
+  private addPositionFromByte(library: string, sourceFile: string, startByte: number, length: number) {
+    if (length !== 20) {
+      throw new Error(`Unexpected library link length for ${library} at ${startByte}: ${length}`)
+    }
+
+    if (!this.links[library]) {
+      this.links[library] = {
+        positions: [],
+        placeholderHash: getPlaceholderHash(`${sourceFile}:${library}`)
+      }
+    }
+
+    /*
+     * The `linkReferences` `start` value refers to the byte index in the deployed bytecode.
+     * We will be using the position as an index into the hex string representing the bytecode
+     * so we need to convert:
+     * - Multiply by 2 because every byte takes up two hex characters.
+     * - Add 2 to account for the "0x" characters at the start of the bytecode hex string.
+     */
+    this.links[library].positions.push(startByte * 2 + 2)
+  }
+}
+
+/*
+ * Stores information about linked libraries that is necessary for linking.
+ * Specifically, for each library:
+ * - `address`: the on-chain address.
+ * - `placeholderHash`: the 34-character hexadecimal placeholder hash.
+ */
+export class LibraryLinkingInfo {
+  info: LibraryLinks
+
+  constructor() {
+    this.info = {}
+  }
+
+  /*
+   * Collects and/or checks addresses of linked libraries in a contract, given its deployed bytecode
+   * and the expected library linking positions.
+   */
+  collect = (bytecode: string, artifactLinking: ArtifactLibraryLinking, contractName?: string): string[] => {
+    const errors: string[] = []
+    Object.keys(artifactLinking.links).forEach((library) => {
+      artifactLinking.links[library].positions.forEach((position) => {
+        const extractedAddress = bytecode.slice(position, position + ADDRESS_LENGTH)
+        console.log(`  Extracted from ${contractName ?? 'unknown'} onchain bytecode at position ${position}: ${extractedAddress}`)
+        if (!this.addAddress(library, artifactLinking.links[library].placeholderHash, extractedAddress)) {
+          const msg = `Mismatched addresses for ${library} at position ${position} in ${contractName ?? 'unknown'}: expected ${this.info[library].address}, got ${extractedAddress}`
+          errors.push(msg)
+        }
+      })
+    })
+    return errors
+  }
+
+  getAddressMapping = (): { [library: string]: string } => {
+    const mapping = {}
+    Object.keys(this.info).map(library => {
+      mapping[library] = this.info[library].address
+    })
+    return mapping
+  }
+
+  /*
+   * Tries to add a library name -> address mapping. If the library has already
+   * had an address added, checks that the new address matches the old one.
+   */
+  private addAddress(library: string, placeholderHash: string, address: string): boolean {
+    if (!this.info[library]) {
+      this.info[library] = {
+        address,
+        placeholderHash
+      }
+    }
+    return this.info[library].address === address
+  }
+}
diff --git a/packages/protocol/lib/bytecode.ts b/packages/protocol/lib/bytecode.ts
index c4a215790b8..960dc75482b 100644
--- a/packages/protocol/lib/bytecode.ts
+++ b/packages/protocol/lib/bytecode.ts
@@ -29,7 +29,7 @@ export const stripMetadata = (bytecode: string): string => {
   const match = bytecode.match(GENERAL_METADATA_REGEXP)
   if (match === null) {
     throw new Error(
-      'Only support stripping metadata from bytecodes generated by solc up to v0.5.13 with no experimental features.'
+      'Only support stripping metadata from bytecodes generated by solc in the versions listed in CONTRACT_METADATA_REGEXPS'
    )
   }
   return match[1]
@@ -52,7 +52,7 @@ const padForLink = (name: string): string => {
 export const linkLibraries = (bytecode: string, libraryLinks: LibraryLinks): string => {
   Object.keys(libraryLinks).forEach((libraryName) => {
     const linkString = padForLink(libraryName)
-    // Use g flag to iterate through for all occurences.
+    // Use g flag to iterate through for all occurrences.
     bytecode = bytecode.replace(RegExp(linkString, 'g'), libraryLinks[libraryName])
   })

@@ -92,7 +92,7 @@ export class LibraryPositions {
   */
   constructor(bytecode: string) {
     this.positions = {}
-    // Use g flag to iterate through for all occurences.
+    // Use g flag to iterate through for all occurrences.
     const libraryLinkRegExp = new RegExp(LibraryPositions.libraryLinkRegExpString, 'g')
     let match = libraryLinkRegExp.exec(bytecode)
     while (match != null) {
@@ -122,7 +122,9 @@ export class LibraryAddresses {
     Object.keys(libraryPositions.positions).forEach((library) =>
       libraryPositions.positions[library].forEach((position) => {
         if (!this.addAddress(library, bytecode.slice(position, position + ADDRESS_LENGTH))) {
-          throw new Error(`Mismatched addresses for ${library} at ${position}`)
+          const logMessage = `Mismatched addresses for ${library} at ${position}`
+          throw new Error(logMessage)
+          // console.log(logMessage)
         }
       })
     )
diff --git a/packages/protocol/lib/compatibility/ast-code.ts b/packages/protocol/lib/compatibility/ast-code.ts
index 9c97633965b..0aa948a125d 100644
--- a/packages/protocol/lib/compatibility/ast-code.ts
+++ b/packages/protocol/lib/compatibility/ast-code.ts
@@ -5,7 +5,7 @@ import {
   MethodMutabilityChange, MethodRemovedChange, MethodReturnChange,
   MethodVisibilityChange, NewContractChange
 } from '@celo/protocol/lib/compatibility/change'
-import { makeZContract } from '@celo/protocol/lib/compatibility/internal'
+import { Artifact, getArtifactByName, getContractName, makeZContract } from '@celo/protocol/lib/compatibility/internal'
 import {
   BuildArtifacts,
   Contract as ZContract
@@ -237,9 +237,13 @@ function generateASTCompatibilityReport(oldContract: ZContract, oldArtifacts: Bu
   const report = doASTCompatibilityReport(contractName, oldAST, newAST)
   // Check deployed byte code change
-  if (stripMetadata(oldContract.schema.deployedBytecode) !== stripMetadata(newContract.schema.deployedBytecode)) {
+  const oldBytecodeStripped = stripMetadata(oldContract.schema.deployedBytecode)
+  const newBytecodeStripped = stripMetadata(newContract.schema.deployedBytecode)
+
+  if (oldBytecodeStripped !== newBytecodeStripped) {
     report.push(new DeployedBytecodeChange(contractName))
   }
+
   return report
 }
@@ -256,21 +260,100 @@ export function reportASTIncompatibilities(
   newArtifactsSets: BuildArtifacts[]): ASTCodeCompatibilityReport {
   let out: ASTCodeCompatibilityReport[] = []
+
+  // Helper function to get compiler version from artifacts
+  const getCompilerVersion = (artifacts: BuildArtifacts): string => {
+    const firstArtifact: Artifact | undefined = artifacts.listArtifacts()[0]
+    // Truffle artifacts have .compiler.version at top level
+    if (firstArtifact?.compiler?.version) {
+      return firstArtifact.compiler.version
+    }
+    // Foundry artifacts have .metadata.compiler.version
+    if (firstArtifact?.metadata?.compiler?.version) {
+      return firstArtifact.metadata.compiler.version
+    }
+    // Fallback: try to determine from artifact content
+    return 'unknown'
+  }
+
+  // Process each new artifacts set and find matching old artifacts by compiler version
   for (const newArtifacts of newArtifactsSets) {
-    const reports = newArtifacts.listArtifacts()
-      .map((newArtifact) => {
+    const newCompilerVersion = getCompilerVersion(newArtifacts)
+    console.log(`[INFO] Processing new artifacts with compiler version: ${newCompilerVersion}`)
+
+    // Find matching old artifacts with same compiler version
+    let matchingOldArtifacts: BuildArtifacts | null = null
+    for (const oldArtifacts of oldArtifactsSet) {
+      const oldCompilerVersion = getCompilerVersion(oldArtifacts)
+      if (oldCompilerVersion === newCompilerVersion) {
+        matchingOldArtifacts = oldArtifacts
+        console.log(`[INFO] Found matching old artifacts with compiler version: ${oldCompilerVersion}`)
+        break
+      }
+    }

-        for (const oldArtifacts of oldArtifactsSet) {
-          const oldArtifact = oldArtifacts.getArtifactByName(newArtifact.contractName)
+    if (matchingOldArtifacts) {
+      // Compare contracts from same compiler version
+      const reports = newArtifacts.listArtifacts()
+        .filter((newArtifact) => {
+          // Matches all Truffle project artifacts (core contracts and test resource contracts)
+          const truffleProjectContractPathPattern = /^project:/
+          // Matches Foundry core contracts
+          const foundryCoreContractPathPattern = /^contracts(-0\.8)?\//
+          // Matches Foundry test resource contracts
+          const foundryTestContractPathPattern = /^test-ts/
+          const path = newArtifact.ast.absolutePath
+          return truffleProjectContractPathPattern.test(path) || foundryCoreContractPathPattern.test(path) || foundryTestContractPathPattern.test(path)
+        })
+        .map((newArtifact) => {
+          const newContractName = getContractName(newArtifact)
+          const oldArtifact = getArtifactByName(newContractName, matchingOldArtifacts!)
           if (oldArtifact) {
-            return generateASTCompatibilityReport(makeZContract(oldArtifact), oldArtifacts, makeZContract(newArtifact), newArtifacts)
+            return generateASTCompatibilityReport(makeZContract(oldArtifact), matchingOldArtifacts!, makeZContract(newArtifact), newArtifacts)
+          } else {
+            // Contract doesn't exist in old artifacts of same version
+            console.log(`[INFO] New contract detected: ${newContractName} (compiler: ${newCompilerVersion})`)
+            return generateASTCompatibilityReport(null, matchingOldArtifacts!, makeZContract(newArtifact), newArtifacts)
           }
-        }
+        })
+      out = [...out, ...reports]
+    } else {
+      // No matching old artifacts found - treat all contracts as new
+      console.log(`[INFO] No matching old artifacts found for compiler version: ${newCompilerVersion}`)
+      const fallbackOldArtifacts = oldArtifactsSet.length > 0 ? oldArtifactsSet[0] : null
+      if (fallbackOldArtifacts) {
+        const reports = newArtifacts.listArtifacts()
+          .map((newArtifact) => {
+            const newContractName = getContractName(newArtifact)
+            console.log(`[INFO] New contract (no matching old version): ${newContractName} (compiler: ${newCompilerVersion})`)
+            return generateASTCompatibilityReport(null, fallbackOldArtifacts!, makeZContract(newArtifact), newArtifacts)
+          })
+        out = [...out, ...reports]
+      } else {
+        console.log(`[WARNING] No old artifacts available for fallback comparison`)
+      }
+    }
+  }

-        return generateASTCompatibilityReport(null, oldArtifactsSet[0], makeZContract(newArtifact), newArtifacts)
-      })
-    out = [...out, ...reports]
+  // Check for potentially removed contracts by looking for old artifacts without matching new artifacts
+  for (const oldArtifacts of oldArtifactsSet) {
+    const oldCompilerVersion = getCompilerVersion(oldArtifacts)
+
+    // Find if there's a matching new artifacts set
+    let hasMatchingNewArtifacts = false
+    for (const newArtifacts of newArtifactsSets) {
+      const newCompilerVersion = getCompilerVersion(newArtifacts)
+      if (oldCompilerVersion === newCompilerVersion) {
+        hasMatchingNewArtifacts = true
+        break
+      }
+    }
+    if (!hasMatchingNewArtifacts) {
+      console.log(`[INFO] Old artifacts with compiler ${oldCompilerVersion} have no matching new artifacts - contracts may have been removed`)
+      const potentiallyRemovedContracts = oldArtifacts.listArtifacts().map(artifact => getContractName(artifact))
+      console.log(`[INFO] Potentially removed contracts: ${potentiallyRemovedContracts.join(', ')}`)
+    }
   }
   return mergeReports(out)
diff --git a/packages/protocol/lib/compatibility/ast-layout.ts b/packages/protocol/lib/compatibility/ast-layout.ts
index a2897e9ae2d..cbd591ba4cc 100644
--- a/packages/protocol/lib/compatibility/ast-layout.ts
+++ b/packages/protocol/lib/compatibility/ast-layout.ts
@@ -1,38 +1,11 @@
-import { Contract as Web3Contract } from '@celo/connect';
-import { Artifact, TypeInfo } from '@celo/protocol/lib/compatibility/internal';
+import { Artifact, TypeInfo, makeZContract, getContractName, getArtifactByName } from '@celo/protocol/lib/compatibility/internal';
 import {
   BuildArtifacts,
   Operation,
   StorageLayoutInfo,
-  Contract as ZContract,
   compareStorageLayouts,
   getStorageLayout
 } from '@openzeppelin/upgrades';
-const Web3 = require('web3')
-
-const web3 = new Web3(null)
-
-// getStorageLayout needs an oz-sdk Contract class instance. This class is a
-// subclass of Contract from web3-eth-contract, with an added .schema member and
-// several methods.
-//
-// Couldn't find an easy way of getting one just from contract artifacts. But
-// for getStorageLayout we really only need .schema.ast and .schema.contractName.
-const addSchemaForLayoutChecking = (web3Contract: Web3Contract, artifact: any): ZContract => {
-  // @ts-ignore
-  const contract = web3Contract as Contract
-  // @ts-ignore
-  contract.schema = {}
-  contract.schema.ast = artifact.ast
-  contract.schema.contractName = artifact.contractName
-  return contract
-}
-
-const makeZContract = (artifact: any): ZContract => {
-  const contract = new web3.eth.Contract(artifact.abi)
-
-  return addSchemaForLayoutChecking(contract, artifact)
-}

 export const getLayout = (artifact: Artifact, artifacts: BuildArtifacts) => {
   const contract = makeZContract(artifact)
@@ -200,15 +173,15 @@ export const generateCompatibilityReport = (oldArtifact: Artifact, oldArtifacts:
   const structsReport = generateStructsCompatibilityReport(oldLayout, newLayout)

   if (!layoutReport.compatible) {
-    console.log(newArtifact.contractName, "layoutReport incompatible", JSON.stringify(layoutReport.errors));
+    console.log(getContractName(newArtifact), "layoutReport incompatible", JSON.stringify(layoutReport.errors));
   }

   if (!structsReport.compatible) {
-    console.log(newArtifact.contractName, "structsReport incompatible", JSON.stringify(structsReport.errors));
+    console.log(getContractName(newArtifact), "structsReport incompatible", JSON.stringify(structsReport.errors));
   }

   return {
-    contract: newArtifact.contractName,
+    contract: getContractName(newArtifact),
     compatible: layoutReport.compatible && structsReport.compatible,
     errors: layoutReport.errors.concat(structsReport.errors),
     expanded: structsReport.expanded
@@ -218,10 +191,20 @@ export const generateCompatibilityReport = (oldArtifact: Artifact, oldArtifacts:
 export const reportLayoutIncompatibilities = (oldArtifactsSet: BuildArtifacts[], newArtifactsSets: BuildArtifacts[]): ASTStorageCompatibilityReport[] => {
   let out: ASTStorageCompatibilityReport[] = []
   for (const newArtifacts of newArtifactsSets) {
-    const reports = newArtifacts.listArtifacts().map((newArtifact) => {
-
+    const reports = newArtifacts.listArtifacts()
+      .filter((newArtifact: any) => {
+        // Matches all Truffle project artifacts (core contracts and test resource contracts)
+        const truffleProjectContractPathPattern = /^project:/
+        // Matches Foundry core contracts
+        const foundryCoreContractPathPattern = /^contracts(-0\.8)?\//
+        // Matches Foundry test resource contracts
+        const foundryTestContractPathPattern = /^test-ts/
+        const path = newArtifact.ast.absolutePath
+        return truffleProjectContractPathPattern.test(path) || foundryCoreContractPathPattern.test(path) || foundryTestContractPathPattern.test(path)
+      })
+      .map((newArtifact: any) => {
       for (const oldArtifacts of oldArtifactsSet) {
-        const oldArtifact = oldArtifacts.getArtifactByName(newArtifact.contractName)
+        const oldArtifact: any = getArtifactByName(getContractName(newArtifact), oldArtifacts)
         if (oldArtifact !== undefined) {
           return generateCompatibilityReport(oldArtifact, oldArtifacts, newArtifact, newArtifacts)
         }
@@ -230,7 +213,7 @@ export const reportLayoutIncompatibilities = (oldArtifactsSet: BuildArtifacts[],
       // Generate an empty report for new contracts, which are, by definition, backwards
       // compatible.
       return {
-        contract: newArtifact.contractName,
+        contract: getContractName(newArtifact),
         compatible: true,
         errors: []
       }
diff --git a/packages/protocol/lib/compatibility/ast-version.ts b/packages/protocol/lib/compatibility/ast-version.ts
index 30b77d51fbc..780c8957a25 100644
--- a/packages/protocol/lib/compatibility/ast-version.ts
+++ b/packages/protocol/lib/compatibility/ast-version.ts
@@ -1,5 +1,5 @@
 /* eslint-disable max-classes-per-file: 0 */
-import { Artifact } from '@celo/protocol/lib/compatibility/internal';
+import { Artifact, getContractName } from '@celo/protocol/lib/compatibility/internal';
 import { ContractVersion, ContractVersionChecker, ContractVersionCheckerIndex, ContractVersionDelta, ContractVersionDeltaIndex, ContractVersionIndex, DEFAULT_VERSION_STRING } from '@celo/protocol/lib/compatibility/version';
 import { Address as EJSAddress } from "@ethereumjs/util";
 import { VM } from "@ethereumjs/vm";
@@ -11,12 +11,12 @@ const abi = require('ethereumjs-abi')
  * A mapping {contract name => {@link ContractVersion}}.
  */
 export class ASTContractVersions {
-  static fromArtifacts = async (artifactsSet: BuildArtifacts[]): Promise<ASTContractVersions> => {
+  static fromArtifacts = async (artifactsSet: BuildArtifacts[], newLinking: boolean): Promise<ASTContractVersions> => {
     const contracts = {}
     for (const artifacts of artifactsSet) {
-      await Promise.all(artifacts.listArtifacts().filter(c => !isLibrary(c.contractName, [artifacts])).map(async (artifact) => {
-        contracts[artifact.contractName] = await getContractVersion(artifact)
+      await Promise.all(artifacts.listArtifacts().filter(c => !isLibrary(getContractName(c), [artifacts])).map(async (artifact) => {
+        contracts[getContractName(artifact)] = await getContractVersion(artifact, newLinking)
       }))
     }
@@ -30,15 +30,19 @@ export class ASTContractVersions {
  * Gets the version of a contract by calling Contract.getVersionNumber() on
  * the contract deployed bytecode.
  *
+ * If `newLinking` is true, expects `__$<hash>$__` style linking. Otherwise, `______`
+ * * If the contract version cannot be retrieved, returns version 1.1.0.0 by default. */ -export async function getContractVersion(artifact: Artifact): Promise { +export async function getContractVersion(artifact: Artifact, newLinking: boolean): Promise { const vm = await VM.create(); - const bytecode = artifact.deployedBytecode + // @ts-ignore + const bytecode = artifact.deployedBytecode.object || artifact.deployedBytecode const data = '0x' + abi.methodID('getVersionNumber', []).toString('hex') const nullAddress = '0000000000000000000000000000000000000000' - // Artificially link all libraries to the null address. - const linkedBytecode = bytecode.split(/[_]+[A-Za-z0-9]+[_]+/).join(nullAddress) + const compilerLinkRegex = newLinking ? /__\$[a-f0-9]{34}\$__/g : /__[A-Za-z0-9_]{36}__/g + const linkedBytecode = bytecode.split(compilerLinkRegex).join(nullAddress) const result = await vm.evm.runCall({ to: new EJSAddress(Buffer.from(nullAddress, 'hex')), caller: new EJSAddress(Buffer.from(nullAddress, 'hex')), @@ -57,9 +61,9 @@ export async function getContractVersion(artifact: Artifact): Promise => { - const oldVersions = await ASTContractVersions.fromArtifacts(oldArtifactsSet) - const newVersions = await ASTContractVersions.fromArtifacts(newArtifactsSet) + static create = async (oldArtifactsSet: BuildArtifacts[], newArtifactsSet: BuildArtifacts[], expectedVersionDeltas: ContractVersionDeltaIndex, newLinking: boolean): Promise => { + const oldVersions = await ASTContractVersions.fromArtifacts(oldArtifactsSet, newLinking) + const newVersions = await ASTContractVersions.fromArtifacts(newArtifactsSet, newLinking) const contracts = {} Object.keys(newVersions.contracts).map((contract: string) => { const versionDelta = expectedVersionDeltas[contract] === undefined ? 
ContractVersionDelta.fromChanges(false, false, false, false) : expectedVersionDeltas[contract] diff --git a/packages/protocol/lib/compatibility/ignored-contracts-v9.ts b/packages/protocol/lib/compatibility/ignored-contracts-v9.ts index 5b0f6b790d0..cf193ed7296 100644 --- a/packages/protocol/lib/compatibility/ignored-contracts-v9.ts +++ b/packages/protocol/lib/compatibility/ignored-contracts-v9.ts @@ -17,12 +17,21 @@ export const ignoredContractsV9Only = [ // Between CR9 and CR10, a Mento upgrade MU03 also upgraded SortedOracles. For the purposes of our compatibility tests, we use the Mento version of the contract in CR10, so that we're comparing the most recent pre-CR10 contracts with the CR10 versions. ] -export function getReleaseVersion(tag: string) { - const regexp = /core-contracts.v(?.*[0-9])/gm - const matches = regexp.exec(tag) +export function getReleaseVersion(tag: string): number { + // Support two formats: + // 1. Tag format: core-contracts.vX (e.g., core-contracts.v14) + // 2. Branch format: release/core-contracts/X (e.g., release/core-contracts/15) + const tagRegexp = /core-contracts\.v(?\d+)/ + const branchRegexp = /release\/core-contracts\/(?\d+)/ + + let matches = tagRegexp.exec(tag) + if (!matches) { + matches = branchRegexp.exec(tag) + } + const version = parseInt(matches?.groups?.version ?? '0', 10) - if ((version) == 0) { - throw `Tag doesn't have the correct format ${tag}` + if (version === 0) { + throw new Error(`Tag "${tag}" doesn't match expected format. 
Use: core-contracts.vX or release/core-contracts/X`) } return version } diff --git a/packages/protocol/lib/compatibility/internal.ts b/packages/protocol/lib/compatibility/internal.ts index d2b3cc5fc36..6265a677f03 100644 --- a/packages/protocol/lib/compatibility/internal.ts +++ b/packages/protocol/lib/compatibility/internal.ts @@ -1,36 +1,106 @@ -import { Contract as ZContract } from '@openzeppelin/upgrades' +import { BuildArtifacts, Contract as ZContract } from '@openzeppelin/upgrades' const Web3 = require('web3') const web3 = new Web3(null) +// Foundry build artifacts do not have a `.contractName` field, so we get it from the +// `ContractDefinition` expression in the AST. +const getContractNameFromDefinition = (artifact: Artifact): string => { + for (let i = 0; i < artifact.ast.nodes.length; i++) { + const node = artifact.ast.nodes[i] + if (node.nodeType === 'ContractDefinition') { + return node.name + } + } + console.error("Name not found in artifact AST") + return '' +} + +export const getContractName = (artifact: Artifact): string => { + if (artifact.contractName) { + return artifact.contractName + } else { + return getContractNameFromDefinition(artifact) + } +} + +export const getArtifactByName = (contractName: string, artifacts: BuildArtifacts): Artifact => { + return artifacts.listArtifacts().find(artifact => + getContractName(artifact) === contractName + ) +} + +export const getBytecode = (artifact: Artifact): string => { + if (typeof artifact.bytecode === "string") { + return artifact.bytecode + } else { + return artifact.bytecode.object + } +} + +export const getDeployedBytecode = (artifact: Artifact): string => { + if (typeof artifact.deployedBytecode === "string") { + return artifact.deployedBytecode + } else { + return artifact.deployedBytecode.object + } +} + +export const getSourceFile = (artifact: Artifact): string => { + if (typeof (artifact as any).metadata === "object") { + return Object.keys((artifact as any).metadata.sources)[0] + } else { + 
throw new Error("Artifact does not have metadata") + } +} + // getStorageLayout needs an oz-sdk Contract class instance. This class is a // subclass of Contract from web3-eth-contract, with an added .schema member and // several methods. // // Couldn't find an easy way of getting one just from contract artifacts. But // for getStorageLayout we really only need .schema.ast and .schema.contractName. -export function makeZContract(artifact: any): ZContract { +export function makeZContract(artifact: Artifact): ZContract { const web3Contract = new web3.eth.Contract(artifact.abi) // @ts-ignore const contract = web3Contract as Contract // @ts-ignore contract.schema = {} contract.schema.ast = artifact.ast - contract.schema.contractName = artifact.contractName - contract.schema.deployedBytecode = artifact.deployedBytecode + contract.contractName = getContractName(artifact) + contract.schema.contractName = contract.contractName + if (typeof artifact.deployedBytecode === "string") { + contract.schema.deployedBytecode = artifact.deployedBytecode + } else { + contract.schema.deployedBytecode = artifact.deployedBytecode.object + } return contract } +export interface LinkReference { + start: number + length: number +} + +export interface LibraryLinkReference { + [library: string]: LinkReference[] +} + +export interface LinkReferences { + [sourcePath: string]: LibraryLinkReference +} + // Inlined from OpenZeppelin SDK since its not exported. 
export interface Artifact { abi: any[] ast: any - bytecode: string + bytecode: (string | { object: string, linkReferences: LinkReferences }) compiler: any contractName: string - deployedBytecode: string + deployedBytecode: (string | { object: string, linkReferences: LinkReferences }) deployedSourceMap: string fileName: string legacyAST?: any + metadata?: { compiler?: { version?: string } } // Foundry artifact metadata networks: any schemaVersion: string source: string @@ -59,4 +129,3 @@ export interface StorageInfo { path?: string; contract?: string; } - diff --git a/packages/protocol/lib/compatibility/report.ts b/packages/protocol/lib/compatibility/report.ts index ac50e34ac9c..d5eee6d6664 100644 --- a/packages/protocol/lib/compatibility/report.ts +++ b/packages/protocol/lib/compatibility/report.ts @@ -6,7 +6,7 @@ import { ASTCodeCompatibilityReport } from '@celo/protocol/lib/compatibility/ast import { ASTStorageCompatibilityReport } from '@celo/protocol/lib/compatibility/ast-layout' import { categorize, Categorizer, ChangeType } from '@celo/protocol/lib/compatibility/categorizer' import { Change } from '@celo/protocol/lib/compatibility/change' -import { makeZContract } from '@celo/protocol/lib/compatibility/internal' +import { makeZContract, getArtifactByName } from '@celo/protocol/lib/compatibility/internal' import { ContractVersionDelta, ContractVersionDeltaIndex } from '@celo/protocol/lib/compatibility/version' /** * Value object holding all uncategorized storage and code reports. 
@@ -125,7 +125,7 @@ export interface ASTVersionedReportIndex { export const isLibrary = (contract: string, artifactsSet: BuildArtifacts[]) => { for (const artifacts of artifactsSet){ - const artifact = artifacts.getArtifactByName(contract) + const artifact = getArtifactByName(contract, artifacts) if (artifact === undefined){ // EAFP // the library may be in another package @@ -172,7 +172,9 @@ export class ASTVersionedReport { contracts: {}, libraries: {} } - Object.keys(changesByContract).forEach((contract: string) => { + // Sort contract names alphabetically to ensure consistent ordering + // regardless of whether Truffle or Foundry artifacts are used + Object.keys(changesByContract).sort().forEach((contract: string) => { if (isLibrary(contract, artifactsSet)) { reportIndex.libraries[contract] = changesByContract[contract] } else { diff --git a/packages/protocol/lib/compatibility/utils.ts b/packages/protocol/lib/compatibility/utils.ts index c8b5d5cc80d..0e8a77557ba 100644 --- a/packages/protocol/lib/compatibility/utils.ts +++ b/packages/protocol/lib/compatibility/utils.ts @@ -6,8 +6,7 @@ import { ASTDetailedVersionedReport, ASTReports } from '@celo/protocol/lib/compa import { linkedLibraries } from '@celo/protocol/migrationsConfig'; import { BuildArtifacts, Contracts, getBuildArtifacts } from '@openzeppelin/upgrades'; import { readJsonSync } from 'fs-extra'; - - +import { globSync } from 'glob' /** * Backward compatibility report, based on both the abstract syntax tree analysis of @@ -80,3 +79,47 @@ export function instantiateArtifacts(buildDirectory: string): BuildArtifacts { process.exit(10002) } } + +function listForgeBuildArtifacts(buildDirectory: string): string[] { + const buildInfoPathPattern = /build-info/ + const coreContractPathPattern = /contracts(-0\.8)?\// + const nonFoundryDependencyPathPattern = /lib\/(?!celo)/ + const foundryTestContractPathPattern = /test-ts\// + const pathPatterns = [ coreContractPathPattern, nonFoundryDependencyPathPattern, 
foundryTestContractPathPattern ] + const artifactsGlobPattern = `${buildDirectory}/**/*.json` + const allArtifactPaths = globSync(artifactsGlobPattern) + const coreContracts = allArtifactPaths.filter(artifactPath => { + if (artifactPath.match(buildInfoPathPattern)) { + return false + } + const artifact = readJsonSync(artifactPath) + const sourcePath = artifact.ast.absolutePath + return pathPatterns.some((pattern: RegExp) => sourcePath.match(pattern)) + }) + + return coreContracts +} + +interface CompilerArtifactsIndex { + [index: string]: string[] +} + +function splitArtifactsByCompiler(artifactPaths: string[]): CompilerArtifactsIndex { + const artifactsIndex: CompilerArtifactsIndex = {} + artifactPaths.forEach(artifactPath => { + const artifact = readJsonSync(artifactPath) + const version = artifact.metadata.compiler.version + if (!artifactsIndex[version]) { + artifactsIndex[version] = [] + } + artifactsIndex[version].push(artifactPath) + }) + + return artifactsIndex +} + +export function instantiateArtifactsFromForge(buildDirectory: string): BuildArtifacts[] { + const artifactPaths = listForgeBuildArtifacts(buildDirectory) + const artifactsIndex: CompilerArtifactsIndex = splitArtifactsByCompiler(artifactPaths) + return Object.keys(artifactsIndex).map(compiler => new BuildArtifacts(artifactsIndex[compiler])) +} diff --git a/packages/protocol/lib/compatibility/verify-bytecode-foundry.ts b/packages/protocol/lib/compatibility/verify-bytecode-foundry.ts new file mode 100644 index 00000000000..b23f4039895 --- /dev/null +++ b/packages/protocol/lib/compatibility/verify-bytecode-foundry.ts @@ -0,0 +1,340 @@ +/* eslint-disable no-console: 0 */ +import { ensureLeading0x } from '@celo/base/lib/address' +import { + ArtifactLibraryLinking, + LibraryLinkingInfo, + linkLibraries, + stripMetadata, + verifyAndStripLibraryPrefix, +} from '@celo/protocol/lib/bytecode-foundry' +import { getArtifactByName, getContractName, getDeployedBytecode } from 
'@celo/protocol/lib/compatibility/internal' +import { verifyProxyStorageProofFoundry } from '@celo/protocol/lib/proxy-utils' +import { ProposalTx } from '@celo/protocol/scripts/truffle/make-release' +import { BuildArtifacts } from '@openzeppelin/upgrades' +import { ignoredContractsV9, ignoredContractsV9Only } from './ignored-contracts-v9' + +export interface RegistryLookup { + getAddressForString: (name: string) => Promise +} + +export interface ProxyLookup { + getImplementation: (address: string) => Promise +} + +export interface ChainLookup { + getCode: (address: string) => Promise + encodeFunctionCall: (abi: any, args: any[]) => string + getProof: (address: string, slots: string[]) => Promise +} + +let ignoredContracts = [ + // This contract is not proxied + 'TransferWhitelist', + + // These contracts are not in the Registry (before release 1) + 'ReserveSpenderMultiSig', + 'GovernanceApproverMultiSig', + + // These contracts live in monorepo but are not part of the core protocol + 'CeloFeeCurrencyAdapterOwnable', + 'FeeCurrencyAdapter', + 'FeeCurrencyAdapterOwnable', +] + +interface VerificationContext { + artifacts: BuildArtifacts[] + libraryLinkingInfo: LibraryLinkingInfo + registry: RegistryLookup + governanceAddress: string + proposal: ProposalTx[] + proxyLookup: ProxyLookup + chainLookup: ChainLookup + network: string +} + +export interface InitializationData { + [contractName: string]: any[] +} + +const ContractNameExtractorRegex = new RegExp(/(.*)Proxy/) +const ZERO_ADDRESS = '0x0000000000000000000000000000000000000000' + +const getArtifact = (contractName: string, context: VerificationContext) => { + return context.artifacts.map(a => getArtifactByName(contractName, a)).find(a => a) +} + +// Checks if the given transaction is a repointing of the Proxy for the given +// contract. 
+const isProxyRepointTransaction = (tx: ProposalTx) => + tx.contract.endsWith('Proxy') && + (tx.function === '_setImplementation' || tx.function === '_setAndInitializeImplementation') + +export const isProxyRepointAndInitializeTransaction = (tx: ProposalTx) => + tx.contract.endsWith('Proxy') && tx.function === '_setAndInitializeImplementation' + +export const isProxyRepointAndInitForIdTransaction = (tx: ProposalTx, registryId: string) => + tx.contract === registryId && isProxyRepointAndInitializeTransaction(tx) + +const isProxyRepointForIdTransaction = (tx: ProposalTx, contract: string) => + tx.contract === `${contract}Proxy` && isProxyRepointTransaction(tx) + +const isImplementationChanged = (contract: string, proposal: ProposalTx[]): boolean => + proposal.some((tx: ProposalTx) => isProxyRepointForIdTransaction(tx, contract)) + +const getProposedImplementationAddress = (contract: string, proposal: ProposalTx[]) => + proposal.find((tx: ProposalTx) => isProxyRepointForIdTransaction(tx, contract)).args[0] + +// Checks if the given transaction is a repointing of the Registry entry for the +// given registryId. 
+const isRegistryRepointTransaction = (tx: ProposalTx) => + tx.contract === `Registry` && tx.function === 'setAddressFor' + +const isRegistryRepointForIdTransaction = (tx: ProposalTx, registryId: string) => + isRegistryRepointTransaction(tx) && tx.args[0] === registryId + +const isProxyChanged = (contract: string, proposal: ProposalTx[]): boolean => + proposal.some((tx) => isRegistryRepointForIdTransaction(tx, contract)) + +export const getProposedProxyAddress = (contract: string, proposal: ProposalTx[]): string => { + const relevantTx = proposal.find((tx) => isRegistryRepointForIdTransaction(tx, contract)) + return relevantTx.args[1] +} + +const getSourceBytecodeFromArtifacts = (contract: string, artifacts: BuildArtifacts[]): string => + stripMetadata(getDeployedBytecode(artifacts.map(a => getArtifactByName(contract, a)).find(a => a))) + +const getSourceBytecode = (contract: string, context: VerificationContext): string => + getSourceBytecodeFromArtifacts(contract, context.artifacts) + +const getOnchainBytecode = async (address: string, context: VerificationContext) => { + const code = await context.chainLookup.getCode(address) + if (!code || code === '0x') { + throw new Error(`No bytecode found at address ${address}`) + } + return stripMetadata(code) +} + +const isLibrary = (contract: string, context: VerificationContext) => { + const answer = Object.keys(context.libraryLinkingInfo.info).includes(contract) + return answer +} + +interface QueueEntry { + contract: string + requiredBy?: string +} + +const dfsStep = async (queue: QueueEntry[], visited: Set, context: VerificationContext, errors: string[], verifiedLibraries: Set) => { + const { contract, requiredBy } = queue.pop() + const artifact = getArtifact(contract, context) + const isLib = isLibrary(contract, context) + const kind = isLib ? 
'Library' : 'Contract' + // mark current DFS node as visited + visited.add(contract) + + if (requiredBy) { + console.log(`\nVerifying ${kind} ${contract} (required by ${requiredBy})`) + } else { + console.log(`\nVerifying ${kind} ${contract}`) + } + + try { + // check proxy deployment + if (isProxyChanged(contract, context.proposal)) { + const proxyAddress = getProposedProxyAddress(contract, context.proposal) + // ganache does not support eth_getProof + if ( + context.network !== 'development' && + !(await verifyProxyStorageProofFoundry(context.chainLookup, proxyAddress, context.governanceAddress)) + ) { + const msg = `Proposed ${contract}Proxy has impure storage` + console.log(` ❌ ${msg}`) + errors.push(msg) + return + } + + const onchainProxyBytecode = await getOnchainBytecode(proxyAddress, context) + const sourceProxyBytecode = getSourceBytecode(`${contract}Proxy`, context) + if (onchainProxyBytecode !== sourceProxyBytecode) { + const msg = `Proposed ${contract}Proxy does not match compiled proxy bytecode` + console.log(` ❌ ${msg}`) + errors.push(msg) + return + } + } + + // check implementation deployment + const sourceBytecode = getSourceBytecode(contract, context) + const sourceArtifactLinking = new ArtifactLibraryLinking(artifact) + + let implementationAddress: string + if (isImplementationChanged(contract, context.proposal)) { + implementationAddress = getProposedImplementationAddress(contract, context.proposal) + } else if (isProxyChanged(contract, context.proposal)) { + const proxyAddress = getProposedProxyAddress(contract, context.proposal) + implementationAddress = await context.proxyLookup.getImplementation(proxyAddress) + } else if (isLib) { + implementationAddress = ensureLeading0x(context.libraryLinkingInfo.info[contract].address) + } else { + const proxyAddress = await context.registry.getAddressForString(contract) + if (proxyAddress === ZERO_ADDRESS) { + console.log(` ⏭️ ${contract} is not in registry - skipping`) + return + } + 
implementationAddress = await context.proxyLookup.getImplementation(proxyAddress) + } + + let onchainBytecode = await getOnchainBytecode(implementationAddress, context) + const collectErrors = context.libraryLinkingInfo.collect(onchainBytecode, sourceArtifactLinking, contract) + if (collectErrors.length > 0) { + for (const err of collectErrors) { + console.log(` ❌ ${err}`) + errors.push(err) + } + } + + let linkedSourceBytecode = linkLibraries(sourceBytecode, context.libraryLinkingInfo.info) + + // normalize library bytecodes + if (isLib) { + linkedSourceBytecode = verifyAndStripLibraryPrefix(linkedSourceBytecode) + onchainBytecode = verifyAndStripLibraryPrefix(onchainBytecode, implementationAddress) + } + + if (onchainBytecode !== linkedSourceBytecode) { + const msg = `${kind} ${contract} (at ${implementationAddress}): onchain and compiled bytecodes do not match` + console.log(` ❌ ${msg}`) + errors.push(msg) + } else { + console.log(` ✅ ${kind} ${contract} matches (at ${implementationAddress})`) + if (isLib) { + verifiedLibraries.add(contract) + } + } + + // push unvisited libraries to DFS queue + const unvisitedLibraries = Object.keys(sourceArtifactLinking.links).filter((library) => !visited.has(library)) + queue.push(...unvisitedLibraries.map((library) => ({ contract: library, requiredBy: contract }))) + } catch (err) { + const msg = `${kind} ${contract}: ${err.message}` + console.log(` ❌ ${msg}`) + errors.push(msg) + } +} + +const assertValidProposalTransactions = (proposal: ProposalTx[]) => { + const invalidTransactions = proposal.filter( + (tx) => !isProxyRepointTransaction(tx) && !isRegistryRepointTransaction(tx) + ) + if (invalidTransactions.length > 0) { + throw new Error(`Proposal contains invalid release transactions ${invalidTransactions}`) + } + + console.info('Proposal contains only valid release transactions!') +} + +const assertValidInitializationData = ( + artifacts: BuildArtifacts[], + proposal: ProposalTx[], + chainLookup: ChainLookup, + 
initializationData: InitializationData +) => { + const initializingProposals = proposal.filter(isProxyRepointAndInitializeTransaction) + const contractsInitialized = new Set() + for (const proposalTx of initializingProposals) { + const contractName = ContractNameExtractorRegex.exec(proposalTx.contract)[1] + + if (!initializationData[contractName]) { + throw new Error( + `Initialization Data for ${contractName} could not be found in reference file` + ) + } + + const contract = artifacts.map(a => getArtifactByName(contractName, a)).find(a => a) + const initializeAbi = contract.abi.find( + (abi: any) => abi.type === 'function' && abi.name === 'initialize' + ) + const args = initializationData[contractName] + const callData = chainLookup.encodeFunctionCall(initializeAbi, args) + + if (callData.toLowerCase() !== proposalTx.args[1].toLowerCase()) { + throw new Error( + `Initialization Data for ${contractName} in proposal does not match reference file ${initializationData[contractName]}` + ) + } + + contractsInitialized.add(contractName) + } + + for (const referenceContractName of Object.keys(initializationData)) { + if (!contractsInitialized.has(referenceContractName)) { + throw new Error( + `Reference file has initialization data for ${referenceContractName}, but proposal does not specify initialization` + ) + } + } + + console.info('Initialization Data was verified!') +} + +/* + * This function will visit all contracts in `contracts` as well as any + * linked libraries and verify that the compiled and linked source code matches + * the deployed bytecode registered or proposed. 
+ */ +export const verifyBytecodes = async ( + contracts: string[], + artifacts: BuildArtifacts[], + registry: RegistryLookup, + proposal: ProposalTx[], + proxyLookup: ProxyLookup, + chainLookup: ChainLookup, + initializationData: InitializationData = {}, + version?: number, + network = 'development' +) => { + assertValidProposalTransactions(proposal) + assertValidInitializationData(artifacts, proposal, chainLookup, initializationData) + + const compiledContracts = Array.prototype.concat.apply([], artifacts.map(a => a.listArtifacts())).map((a) => getContractName(a)) + + if (version > 9) { + ignoredContracts = [...ignoredContracts, ...ignoredContractsV9] + } else if (version == 9) { + ignoredContracts = [...ignoredContracts, ...ignoredContractsV9, ...ignoredContractsV9Only] + } + + const filteredContracts = contracts.filter( + (contract) => !ignoredContracts.includes(contract) + ).filter( + (contract) => compiledContracts.includes(contract) + ) + + const queue: QueueEntry[] = filteredContracts.map((contract) => ({ contract })) + const visited: Set = new Set(filteredContracts) + + const governanceAddress = await registry.getAddressForString('Governance') + const context: VerificationContext = { + artifacts, + libraryLinkingInfo: new LibraryLinkingInfo(), + registry, + governanceAddress, + proposal, + proxyLookup, + chainLookup, + network, + } + + const errors: string[] = [] + const verifiedLibraries: Set = new Set() + + while (queue.length > 0) { + await dfsStep(queue, visited, context, errors, verifiedLibraries) + } + + if (errors.length > 0) { + throw new Error(errors.join('\n')) + } + + return { libraryLinkingInfo: context.libraryLinkingInfo, verifiedLibraries } +} diff --git a/packages/protocol/lib/compatibility/verify-bytecode.ts b/packages/protocol/lib/compatibility/verify-bytecode.ts index 56d6d9d0ee8..cc6f989f795 100644 --- a/packages/protocol/lib/compatibility/verify-bytecode.ts +++ b/packages/protocol/lib/compatibility/verify-bytecode.ts @@ -7,14 +7,19 @@ 
import { stripMetadata, verifyAndStripLibraryPrefix, } from '@celo/protocol/lib/bytecode' +import { MENTO_PACKAGE, SOLIDITY_05_PACKAGE, SOLIDITY_08_PACKAGE } from '@celo/protocol/contractPackages' import { verifyProxyStorageProof } from '@celo/protocol/lib/proxy-utils' import { ProposalTx } from '@celo/protocol/scripts/truffle/make-release' -import { ZERO_ADDRESS } from '@celo/protocol/test/constants' import { BuildArtifacts } from '@openzeppelin/upgrades' import { ProxyInstance, RegistryInstance } from 'types' import Web3 from 'web3' import { ignoredContractsV9, ignoredContractsV9Only } from './ignored-contracts-v9' +const contracts08Set = new Set(SOLIDITY_08_PACKAGE.contracts) +const mentoContractsSet = new Set(MENTO_PACKAGE.contracts) + +type ArtifactsMap = Record + let ignoredContracts = [ // This contract is not proxied 'TransferWhitelist', @@ -30,7 +35,7 @@ let ignoredContracts = [ ] interface VerificationContext { - artifacts: BuildArtifacts[] + artifactsMap: ArtifactsMap libraryAddresses: LibraryAddresses registry: RegistryInstance governanceAddress: string @@ -44,6 +49,7 @@ interface InitializationData { } const ContractNameExtractorRegex = new RegExp(/(.*)Proxy/) +const ZERO_ADDRESS = '0x0000000000000000000000000000000000000000' // Checks if the given transaction is a repointing of the Proxy for the given // contract. 
@@ -82,11 +88,14 @@ export const getProposedProxyAddress = (contract: string, proposal: ProposalTx[] return relevantTx.args[1] } -const getSourceBytecodeFromArtifacts = (contract: string, artifacts: BuildArtifacts[]): string => - stripMetadata(artifacts.map(a => a.getArtifactByName(contract)).find(a => a).deployedBytecode) +const getArtifactsForContract = (contract: string, artifactsMap: ArtifactsMap): BuildArtifacts => { + if (contracts08Set.has(contract)) return artifactsMap[SOLIDITY_08_PACKAGE.name] + if (mentoContractsSet.has(contract)) return artifactsMap[MENTO_PACKAGE.name] + return artifactsMap[SOLIDITY_05_PACKAGE.name] +} const getSourceBytecode = (contract: string, context: VerificationContext): string => - getSourceBytecodeFromArtifacts(contract, context.artifacts) + stripMetadata(getArtifactsForContract(contract, context.artifactsMap).getArtifactByName(contract).deployedBytecode) const getOnchainBytecode = async (address: string, context: VerificationContext) => stripMetadata(await context.web3.eth.getCode(address)) @@ -137,19 +146,51 @@ const dfsStep = async (queue: string[], visited: Set, context: Verificat implementationAddress = await proxy._getImplementation() } + console.log(`Verifying ${contract} at ${implementationAddress}`) let onchainBytecode = await getOnchainBytecode(implementationAddress, context) context.libraryAddresses.collect(onchainBytecode, sourceLibraryPositions) let linkedSourceBytecode = linkLibraries(sourceBytecode, context.libraryAddresses.addresses) - // normalize library bytecodes - if (isLibrary(contract, context)) { - linkedSourceBytecode = verifyAndStripLibraryPrefix(linkedSourceBytecode) - onchainBytecode = verifyAndStripLibraryPrefix(onchainBytecode, implementationAddress) + try { + if (isLibrary(contract, context)) { + linkedSourceBytecode = verifyAndStripLibraryPrefix(linkedSourceBytecode) + onchainBytecode = verifyAndStripLibraryPrefix(onchainBytecode, implementationAddress) + } + } catch(e) { + const logMessage = `Error 
verifying library prefix for ${contract} at ${implementationAddress}: ${e}` + throw new Error(logMessage) + // console.log(logMessage) } if (onchainBytecode !== linkedSourceBytecode) { - throw new Error(`${contract}'s onchain and compiled bytecodes do not match`) + // AddressLinkedList exists in both Mento (0.5) and 0.8 packages because Validators (0.8) + // links it, but it was deployed with the 0.5 compiler. Try the Mento artifacts as fallback. + // See https://github.com/celo-org/celo-monorepo/issues/11684 + if (contract === 'AddressLinkedList' && isLibrary(contract, context)) { + const mentoArtifacts = context.artifactsMap[MENTO_PACKAGE.name] + if (mentoArtifacts) { + let mentoBytecode = stripMetadata(mentoArtifacts.getArtifactByName(contract).deployedBytecode) + mentoBytecode = linkLibraries(mentoBytecode, context.libraryAddresses.addresses) + try { + mentoBytecode = verifyAndStripLibraryPrefix(mentoBytecode) + } catch { /* ignore */ } + + if (onchainBytecode === mentoBytecode) { + console.warn(`\n⚠️ WARNING: ${contract} was deployed with Solidity 0.5 instead of 0.8!`) + console.warn(`⚠️ This is a known issue caused by Validators linking AddressLinkedList.`) + console.warn(`⚠️ See https://github.com/celo-org/celo-monorepo/issues/11684\n`) + } else { + console.log("onchainBytecode", onchainBytecode) + console.log("linkedSourceBytecode", linkedSourceBytecode) + throw new Error(`${contract}'s onchain and compiled bytecodes do not match (tried both 0.5 and 0.8)`) + } + } + } else { + console.log("onchainBytecode", onchainBytecode) + console.log("linkedSourceBytecode", linkedSourceBytecode) + throw new Error(`${contract}'s onchain and compiled bytecodes do not match`) + } } else { console.log( `${isLibrary(contract, context) ? 
'Library' : 'Contract' @@ -200,7 +241,7 @@ const assertValidInitializationData = ( if (callData.toLowerCase() !== proposalTx.args[1].toLowerCase()) { throw new Error( - `Intialization Data for ${contractName} in proposal does not match reference file ${initializationData[contractName]}` + `Initialization Data for ${contractName} in proposal does not match reference file ${initializationData[contractName]}` ) } @@ -225,7 +266,7 @@ const assertValidInitializationData = ( */ export const verifyBytecodes = async ( contracts: string[], - artifacts: BuildArtifacts[], + artifactsMap: ArtifactsMap, registry: RegistryInstance, proposal: ProposalTx[], Proxy: Truffle.Contract, @@ -234,10 +275,11 @@ export const verifyBytecodes = async ( version?: number, network = 'development' ) => { + const allArtifacts = Object.values(artifactsMap) assertValidProposalTransactions(proposal) - assertValidInitializationData(artifacts, proposal, _web3, initializationData) + assertValidInitializationData(allArtifacts, proposal, _web3, initializationData) - const compiledContracts = Array.prototype.concat.apply([], artifacts.map(a => a.listArtifacts())).map((a) => a.contractName) + const compiledContracts = Array.prototype.concat.apply([], allArtifacts.map(a => a.listArtifacts())).map((a) => a.contractName) if (version > 9) { ignoredContracts = [...ignoredContracts, ...ignoredContractsV9] @@ -258,7 +300,7 @@ export const verifyBytecodes = async ( const governanceAddress = await registry.getAddressForString('Governance') const context: VerificationContext = { - artifacts, + artifactsMap, libraryAddresses: new LibraryAddresses(), registry, governanceAddress, diff --git a/packages/protocol/lib/fed-attestations-utils.ts b/packages/protocol/lib/fed-attestations-utils.ts deleted file mode 100644 index dee70eb5678..00000000000 --- a/packages/protocol/lib/fed-attestations-utils.ts +++ /dev/null @@ -1,44 +0,0 @@ -import { Address } from '@celo/utils/lib/address'; -import { generateTypedDataHash, 
structHash } from '@celo/utils/lib/sign-typed-data-utils'; -import { parseSignatureWithoutPrefix } from '@celo/utils/lib/signatureUtils'; -import { registerAttestation as getTypedData } from '@celo/utils/lib/typed-data-constructors'; -import { - bufferToHex -} from '@ethereumjs/util'; - -export const getSignatureForAttestation = async ( - identifier: string, - issuer: string, - account: string, - issuedOn: number, - signer: string, - chainId: number, - contractAddress: string -) => { - const typedData = getTypedData(chainId, contractAddress, { identifier,issuer,account, signer, issuedOn}) - - const signature = await new Promise((resolve, reject) => { - web3.currentProvider.send( - { - method: 'eth_signTypedData', - params: [signer, typedData], - }, - (error: Error, resp: {result: string}) => { - if (error) { - reject(error) - } else { - resolve(resp.result) - } - } - ) - }) - - const messageHash = bufferToHex(generateTypedDataHash(typedData)) - const parsedSignature = parseSignatureWithoutPrefix(messageHash, signature, signer) - return parsedSignature -} - -export const getDomainDigest = (contractAddress: Address) => { - const typedData = getTypedData(1, contractAddress) - return bufferToHex(structHash('EIP712Domain', typedData.domain, typedData.types)) -} \ No newline at end of file diff --git a/packages/protocol/lib/memview.sol b/packages/protocol/lib/memview.sol deleted file mode 160000 index 79a08cb25aa..00000000000 --- a/packages/protocol/lib/memview.sol +++ /dev/null @@ -1 +0,0 @@ -Subproject commit 79a08cb25aac047d81c67c5422c9b55abfac8635 diff --git a/packages/protocol/lib/openzeppelin-contracts8 b/packages/protocol/lib/openzeppelin-contracts8 index b53c43242fc..54b3f14346d 160000 --- a/packages/protocol/lib/openzeppelin-contracts8 +++ b/packages/protocol/lib/openzeppelin-contracts8 @@ -1 +1 @@ -Subproject commit b53c43242fc9c0e435b66178c3847c4a1b417cc1 +Subproject commit 54b3f14346da01ba0d159114b399197fea8b7cda diff --git 
a/packages/protocol/lib/proxy-utils.ts b/packages/protocol/lib/proxy-utils.ts index a24bd1fe688..b62e5d3d8ac 100644 --- a/packages/protocol/lib/proxy-utils.ts +++ b/packages/protocol/lib/proxy-utils.ts @@ -1,9 +1,9 @@ -import { Address, bufferToHex, hexToBuffer } from '@celo/base/lib/address' -import { SecureTrie } from 'merkle-patricia-tree' -import { encode as rlpEncode } from 'rlp' -import { ProxyInstance } from 'types' -import Web3 from 'web3' -import { retryTx } from './web3-utils' +import { Address, bufferToHex, hexToBuffer } from '@celo/base/lib/address'; +import { SecureTrie } from 'merkle-patricia-tree'; +import { encode as rlpEncode } from 'rlp'; +import { ProxyInstance } from 'types'; +import Web3 from 'web3'; +import { retryTx } from './web3-utils'; // from Proxy.sol @@ -31,6 +31,23 @@ export async function verifyProxyStorageProof(web3: Web3, proxy: string, owner: return proof.storageHash === bufferToHex(trie.root) } +interface ProofLookup { + getProof: (address: string, slots: string[]) => Promise +} + +export async function verifyProxyStorageProofFoundry(proofLookup: ProofLookup, proxy: string, owner: string) { + const proof = await proofLookup.getProof( + proxy, + [OWNER_POSITION, IMPLEMENTATION_POSITION] + ) + + const trie = new SecureTrie() + await trie.put(hexToBuffer(OWNER_POSITION), rlpEncode(owner)) + + // @ts-ignore + return proof.storageHash === bufferToHex(trie.root) +} + export async function setAndInitializeImplementation( web3: Web3, proxy: ProxyInstance, @@ -44,7 +61,6 @@ export async function setAndInitializeImplementation( ) { try { - const callData = web3.eth.abi.encodeFunctionCall(initializerAbi, args) if (txOptions.from != null) { // The proxied contract needs to be funded prior to initialization @@ -69,6 +85,6 @@ export async function setAndInitializeImplementation( return retryTx(proxy._setAndInitializeImplementation, [implementationAddress, callData as any]) } } catch (error) { - console.log("errror", error); + 
console.log("error", error); } } diff --git a/packages/protocol/lib/registry-utils.ts b/packages/protocol/lib/registry-utils.ts index 36777a976ef..47f4bcf9df2 100644 --- a/packages/protocol/lib/registry-utils.ts +++ b/packages/protocol/lib/registry-utils.ts @@ -2,11 +2,10 @@ * Be careful when adding to this file or relying on this file. * The verification tooling uses the CeloContractName enum as a * source of truth for what contracts are considered "core" and - * need to be checked for backwards compatability and bytecode on + * need to be checked for backwards compatibility and bytecode on * an environment. */ -import { ContractPackage, MENTO_PACKAGE, SOLIDITY_08_PACKAGE } from "../contractPackages"; export const celoRegistryAddress = '0x000000000000000000000000000000000000ce10' @@ -52,44 +51,4 @@ export enum CeloContractName { TransferWhitelist = 'TransferWhitelist', UniswapFeeHandlerSeller = 'UniswapFeeHandlerSeller', Validators = 'Validators', -} - -export const usesRegistry = [ - CeloContractName.Reserve, - CeloContractName.StableToken, -] - -export const hasEntryInRegistry: ContractPackage[] = [ - { - name: "default", - contracts: [ - CeloContractName.Accounts, - CeloContractName.Attestations, - CeloContractName.BlockchainParameters, - CeloContractName.DoubleSigningSlasher, - CeloContractName.DowntimeSlasher, - CeloContractName.Election, - CeloContractName.Escrow, - CeloContractName.FederatedAttestations, - CeloContractName.FeeCurrencyWhitelist, - CeloContractName.Freezer, - CeloContractName.GoldToken, //TODO: Update when contract name is changed. 
- CeloContractName.GovernanceSlasher, - CeloContractName.OdisPayments, - CeloContractName.Random, - CeloContractName.SortedOracles, - ] - }, - SOLIDITY_08_PACKAGE - , - { - ...MENTO_PACKAGE, - // not all Mentro contracts are supposed to be in the Registry - contracts: [ - CeloContractName.Exchange, - CeloContractName.GrandaMento, - CeloContractName.Reserve, - CeloContractName.StableToken, - ], - } -] +} \ No newline at end of file diff --git a/packages/protocol/lib/test-utils.ts b/packages/protocol/lib/test-utils.ts index 5fc4e7b09e1..0f2b375d9c9 100644 --- a/packages/protocol/lib/test-utils.ts +++ b/packages/protocol/lib/test-utils.ts @@ -1,126 +1,38 @@ -import { ArtifactsSingleton } from '@celo/protocol/lib/artifactsSingleton'; -import { hasEntryInRegistry, usesRegistry } from '@celo/protocol/lib/registry-utils'; -import { getParsedSignatureOfAddress } from '@celo/protocol/lib/signing-utils'; -import { getDeployedProxiedContract } from '@celo/protocol/lib/web3-utils'; -import { config } from '@celo/protocol/migrationsConfig'; -import { privateKeyToAddress } from '@celo/utils/lib/address'; -import { soliditySha3 } from '@celo/utils/lib/solidity'; -import BigNumber from 'bignumber.js'; import chai from 'chai'; import chaiSubset from 'chai-subset'; // eslint-disable-next-line: ordered-imports import { spawn, SpawnOptions } from 'child_process'; -import { keccak256 } from 'ethereum-cryptography/keccak'; -import { GovernanceApproverMultiSigInstance, GovernanceInstance, LockedGoldInstance, ProxyInstance, RegistryInstance, UsingRegistryInstance } from 'types'; -import Web3 from 'web3'; -import { ContractPackage, MENTO_PACKAGE } from '../contractPackages'; /* eslint:disabled ordered-imports: 0 */ -import { fromFixed } from '@celo/utils/lib/fixidity'; -import { bufferToHex, toBuffer } from '@ethereumjs/util'; -import { utf8ToBytes } from 'ethereum-cryptography/utils'; -import { AccountsInstance } from 'types'; - - -import BN = require('bn.js') - -const isNumber = (x: any) 
=> - typeof x === 'number' || (BN as any).isBN(x) || BigNumber.isBigNumber(x) chai.use(chaiSubset) // hard coded in ganache export const EPOCH = 100 -export function stripHexEncoding(hexString: string) { - return hexString.substring(0, 2) === '0x' ? hexString.substring(2) : hexString -} -export function assertContainSubset(superset: any, subset: any) { - const assert2: any = chai.assert - return assert2.containSubset(superset, subset) +async function isPortOpen(host: string, port: number) { + return (await execCmd('nc', ['-z', host, port.toString()], { silent: true })) === 0 } -export async function jsonRpc(web3: Web3, method: string, params: any[] = []): Promise { - return new Promise((resolve, reject) => { - if (typeof web3.currentProvider !== 'string') { - web3.currentProvider.send( - { - jsonrpc: '2.0', - method, - params, - // salt id generation, milliseconds might not be - // enough to generate unique ids - id: new Date().getTime() + Math.floor(Math.random() * (1 + 100 - 1)), - }, - // @ts-ignore - (err: any, result: any) => { - if (err) { - return reject(err) - } - return resolve(result) - } - ) - } else { - reject(new Error('Invalid Provider')) + +function execCmd(cmd: string, args: string[], options?: SpawnOptions & { silent?: boolean }) { + return new Promise(async (resolve, reject) => { + const { silent, ...spawnOptions } = options || { silent: false } + if (!silent) { + console.debug('$ ' + [cmd].concat(args).join(' ')) } + const process = spawn(cmd, args, { ...spawnOptions, stdio: silent ? 
'ignore' : 'inherit' }) + process.on('close', (code) => { + try { + resolve(code) + } catch (error) { + reject(error) + } + }) }) } -export async function timeTravel(seconds: number, web3: Web3) { - await jsonRpc(web3, 'evm_increaseTime', [seconds]) - await jsonRpc(web3, 'evm_mine', []) -} - -export async function mineBlocks(blocks: number, web3: Web3) { - for (let i = 0; i < blocks; i++) { - await jsonRpc(web3, 'evm_mine', []) - } -} - -export async function currentEpochNumber(web3: Web3, epochSize: number = EPOCH) { - const blockNumber = await web3.eth.getBlockNumber() - - return getEpochNumberOfBlock(blockNumber, epochSize) -} - -export function getEpochNumberOfBlock(blockNumber: number, epochSize: number = EPOCH) { - // Follows GetEpochNumber from celo-blockchain/blob/master/consensus/istanbul/utils.go - const epochNumber = Math.floor(blockNumber / epochSize) - if (blockNumber % epochSize === 0) { - return epochNumber - } else { - return epochNumber + 1 - } -} - -// Follows GetEpochFirstBlockNumber from celo-blockchain/blob/master/consensus/istanbul/utils.go -export function getFirstBlockNumberForEpoch(epochNumber: number, epochSize: number = EPOCH) { - if (epochNumber === 0) { - // No first block for epoch 0 - return 0 - } - return (epochNumber - 1) * epochSize + 1 -} - -export async function mineToNextEpoch(web3: Web3, epochSize: number = EPOCH) { - const blockNumber = await web3.eth.getBlockNumber() - const epochNumber = await currentEpochNumber(web3, epochSize) - const blocksUntilNextEpoch = getFirstBlockNumberForEpoch(epochNumber + 1, epochSize) - blockNumber - await mineBlocks(blocksUntilNextEpoch, web3) -} - -export async function assertBalance(address: string, balance: BigNumber) { - const block = await web3.eth.getBlock('latest') - const web3balance = new BigNumber(await web3.eth.getBalance(address)) - if (isSameAddress(block.miner, address)) { - const blockReward = web3.utils.toWei(new BN(2), 'ether') as BigNumber - expectBigNumberInRange(web3balance, 
balance.plus(blockReward)) - } else { - expectBigNumberInRange(web3balance, balance) - } -} - export const assertThrowsAsync = async (promise: any, errorMessage: string = '') => { let failed = false try { @@ -129,60 +41,30 @@ export const assertThrowsAsync = async (promise: any, errorMessage: string = '') failed = true } - assert.strictEqual(true, failed, errorMessage) + chai.assert.strictEqual(true, failed, errorMessage) } -export async function assertTransactionRevertWithReason(promise: any, expectedRevertReason: string = '') { - try { - await promise - assert.fail('Expected transaction to revert') - } catch (error) { - // Only ever tested with ganache. - // When it's a transaction (eg a non-view send call), error.message has a shape like: - // 'StatusError: Transaction: ${transactionHash} exited with an error (status 0). Reason given: ${revertMessage}.' - // Therefore we try to search for `${expectedRevertReason}`. - const revertFound: boolean = - error.message.search(expectedRevertReason) >= 0 - const msg: string = - expectedRevertReason === '' ? `Expected "StatusError", got ${error} instead` : `Expected ${expectedRevertReason}, got ${error} instead` - assert(revertFound, msg) - } +function delay(time) { + return new Promise((resolve) => setTimeout(resolve, time)) } -export async function assertTransactionRevertWithoutReason(promise: any, errorMessage: string = '') { - // When a transaction reverts without a reason, error.message has a shape like: - // 'Transaction: ${transactionHash} exited with an error (status 0).' - try { - await promise - assert.fail('Expected transaction to revert') - } catch (error) { - const revertFound: boolean = - error.message.search('exited with an error [(]status 0[)]') >= 0 - const msg: string = - errorMessage === '' ? 
`Expected "StatusError", got ${error} instead` : errorMessage - assert(revertFound, msg) - } -} -// TODO: Use assertRevert directly from openzeppelin-solidity -// Note that errorMessage is not the expected revert message, but the -// message that is provided if there is no revert. -export async function assertRevert(promise: any, errorMessage: string = '') { - // Only ever tested with ganache. - // When it's a view call, error.message has a shape like: - // `Error: VM Exception while processing transaction: revert ${expectedRevertReason}` - try { - await promise - assert.fail('Expected transaction to revert') - } catch (error) { - const revertFound: boolean = - error.message.search('VM Exception while processing transaction: revert') >= 0 - const msg: string = - errorMessage === '' ? `Expected "revert", got ${error} instead` : errorMessage - assert(revertFound, msg) - } +export async function waitForPortOpen(host: string, port: number, seconds: number) { + console.info(`Waiting for ${host}:${port} to open for ${seconds}s`); + const deadline = Date.now() + seconds * 1000 + do { + if (await isPortOpen(host, port)) { + await delay(60000) // extra 60s just to give ganache extra time to startup + console.info(`Port ${host}:${port} opened`) + return true + } + } while (Date.now() < deadline) + console.info("Port was not opened in time"); + return false } + + export async function exec(command: string, args: string[]) { console.info(`Running: ${command} ${args}`) return new Promise((resolve, reject) => { @@ -209,549 +91,5 @@ export async function exec(command: string, args: string[]) { }) } -function execCmd(cmd: string, args: string[], options?: SpawnOptions & { silent?: boolean }) { - return new Promise(async (resolve, reject) => { - const { silent, ...spawnOptions } = options || { silent: false } - if (!silent) { - console.debug('$ ' + [cmd].concat(args).join(' ')) - } - const process = spawn(cmd, args, { ...spawnOptions, stdio: silent ? 
'ignore' : 'inherit' }) - process.on('close', (code) => { - try { - resolve(code) - } catch (error) { - reject(error) - } - }) - }) -} - -async function isPortOpen(host: string, port: number) { - return (await execCmd('nc', ['-z', host, port.toString()], { silent: true })) === 0 -} - -export async function waitForPortOpen(host: string, port: number, seconds: number) { - console.info(`Waiting for ${host}:${port} to open for ${seconds}s`); - const deadline = Date.now() + seconds * 1000 - do { - if (await isPortOpen(host, port)) { - await delay(60000) // extra 60s just to give ganache extra time to startup - console.info(`Port ${host}:${port} opened`) - return true - } - } while (Date.now() < deadline) - console.info("Port was not opened in time"); - return false -} - -function delay(time) { - return new Promise((resolve) => setTimeout(resolve, time)) -} -type ProxiedContractGetter = ( - contractName: string, - type: string, - contractPackage: ContractPackage, - ) => Promise -type ContractGetter = ( - contractName: string, - contractPackage?: ContractPackage, - ) => Promise - - -export const assertProxiesSet = async (getContract: ProxiedContractGetter) => { - for (const contractList of proxiedContracts) { - for (const contractName of contractList.contracts) { - const contract = await getContract(contractName, 'contract', contractList.__contractPackage) - const proxy: ProxyInstance = await getContract(contractName, 'proxy', contractList.__contractPackage) - assert.strictEqual( - contract.address.toLowerCase(), - (await proxy._getImplementation()).toLowerCase(), - contractName + 'Proxy not pointing to the ' + contractName + ' implementation' - ) - } - } -} - -export const assertContractsRegistered = async (getContract: any) => { - const registry: RegistryInstance = await getContract('Registry') - for (const proxyPackage of hasEntryInRegistry) { - for (const contractName of proxyPackage.contracts) { - const contract: Truffle.ContractInstance = await 
getContract(contractName, proxyPackage) - assert.strictEqual( - contract.address.toLowerCase(), - (await registry.getAddressFor(soliditySha3(contractName))).toLowerCase(), - 'Registry does not have the correct information for ' + contractName - ) - } - } -} - -export const assertRegistryAddressesSet = async (getContract: ContractGetter) => { - const registry: RegistryInstance = await getContract('Registry') - for (const contractName of usesRegistry) { - const contract: UsingRegistryInstance = await getContract(contractName, MENTO_PACKAGE) - assert.strictEqual( - registry.address.toLowerCase(), - (await contract.registry()).toLowerCase(), - 'Registry address is not set properly in ' + contractName - ) - } -} - -// This function is currently not in use, it should be converted to assertContractsOwnedByGovernance -export const assertContractsOwnedByMultiSig = async (getContract: any) => { - const multiSigAddress = (await getContract('MultiSig', 'proxiedContract')).address - for (const contractList of ownedContracts) { - for (const contractName of contractList.contracts) { - const contractOwner: string = await (await getContract(contractName, 'proxiedContract', contractList.__contractPackage)).owner() - assert.strictEqual(contractOwner, multiSigAddress, contractName + ' is not owned by the MultiSig') - } - } - - for (const contractList of proxiedContracts) { - for (const contractName of contractList.contracts) { - const proxyOwner = await (await getContract(contractName, 'proxy', contractList.__contractPackage))._getOwner() - assert.strictEqual(proxyOwner, multiSigAddress, contractName + 'Proxy is not owned by the MultiSig') - }} -} - -export const assertFloatEquality = ( - a: BigNumber, - b: BigNumber, - errorMessage: string, - epsilon = new BigNumber(0.00000001) -) => { - assert(a.minus(b).abs().comparedTo(epsilon) === -1, errorMessage) -} - -export function assertLogMatches2( - log: Truffle.TransactionLog, - expected: { event: string; args: Record } -) { - 
assertLogMatches(log, expected.event, expected.args) -} - -export function assertLogMatches( - log: Truffle.TransactionLog, - event: string, - args: Record -) { - assert.strictEqual(log.event, event, `Log event name doesn\'t match`) - assertObjectWithBNEqual(log.args, args, (arg) => `Event ${event}, arg: ${arg} do not match`) -} - -// Compares objects' properties, using assertBNEqual to compare BN fields. -// Extracted out of previous `assertLogMatches`. -export function assertObjectWithBNEqual( - actual: object, - expected: Record, - fieldErrorMsg: (field?: string) => string -) { - const objectFields = Object.keys(actual) - .filter((k) => k !== '__length__' && isNaN(parseInt(k, 10))) - .sort() - - assert.deepStrictEqual(objectFields, Object.keys(expected).sort(), `Argument names do not match`) - for (const k of objectFields) { - if (typeof expected[k] === 'function') { - expected[k](actual[k], fieldErrorMsg(k)) - } else if (isNumber(actual[k]) || isNumber(expected[k])) { - assertEqualBN(actual[k], expected[k], fieldErrorMsg(k)) - } else if (Array.isArray(actual[k])) { - const actualArray = actual[k] as [] - const expectedArray = expected[k] as [] - if (actualArray.length === expectedArray.length - && actualArray.every(actualValue => isNumber(actualValue)) - && expectedArray.every(expectedValue => isNumber(expectedValue))) { - // if this is array of BNs, deepEqual will not work - // since it is not able to compare number/string/BN - // with each other and we have to compare it manually - for (let i = 0; i < actualArray.length; i++) { - assertEqualBN(actualArray[i], expectedArray[i], fieldErrorMsg(k)) - } - } else { - assert.deepStrictEqual(actual[k], expected[k], fieldErrorMsg(k)) - } - } - else { - assert.strictEqual(actual[k], expected[k], fieldErrorMsg(k)) - } - } -} - -export function assertBNArrayEqual( - actualArray: any[], - expectedArray: any[] -) { - assert(Array.isArray(actualArray), `Actual is not an array`) - assert(Array.isArray(expectedArray), 
`Expected is not an array`) - assert(actualArray.length === expectedArray.length, `Different array sizes; actual: ${actualArray.length} expected: ${expectedArray.length}`) - assert(actualArray.every(actualValue => isNumber(actualValue)) - && expectedArray.every(expectedValue => isNumber(expectedValue)), - `Expected all elements to be numbers`) - - for (let i = 0; i < actualArray.length; i++) { - assertEqualBN(actualArray[i], expectedArray[i]) - } -} - - -export function assertEqualBN( - actual: number | BN | BigNumber, - expected: number | BN | BigNumber, - msg?: string -) { - assert( - web3.utils.toBN(actual).eq(web3.utils.toBN(expected)), - `expected ${expected.toString(10)} and got ${actual.toString(10)}. ${msg || ''}` - ) -} - -export function assertAlmostEqualBN( - actual: number | BN | BigNumber, - expected: number | BN | BigNumber, - margin: number | BN | BigNumber, - msg?: string -) { - const diff = web3.utils.toBN(actual).sub(web3.utils.toBN(expected)).abs() - assert( - web3.utils.toBN(margin).gte(diff), - `expected ${expected.toString(10)} to be within ${margin.toString(10)} of ${actual.toString( - 10 - )}. ${msg || ''}` - ) -} - -export function assertEqualDpBN( - value: number | BN | BigNumber, - expected: number | BN | BigNumber, - decimals: number, - msg?: string -) { - const valueDp = new BigNumber(value.toString()).dp(decimals) - const expectedDp = new BigNumber(expected.toString()).dp(decimals) - assert( - valueDp.isEqualTo(expectedDp), - `expected ${expectedDp.toString()} and got ${valueDp.toString()}. 
${msg || ''}` - ) -} - -export function assertEqualBNArray( - value: number[] | BN[] | BigNumber[], - expected: number[] | BN[] | BigNumber[], - msg?: string -) { - assert.strictEqual(value.length, expected.length, msg) - value.forEach((x, i) => assertEqualBN(x, expected[i])) -} - -export function assertGtBN( - value: number | BN | BigNumber, - expected: number | BN | BigNumber, - msg?: string -) { - assert( - web3.utils.toBN(value).gt(web3.utils.toBN(expected)), - `expected ${value.toString()} to be greater than ${expected.toString()}. ${msg || - ''}` - ) -} - -export function assertGteBN( - value: number | BN | BigNumber, - expected: number | BN | BigNumber, - msg?: string -) { - assert( - web3.utils.toBN(value).gte(web3.utils.toBN(expected)), - `expected ${value.toString()} to be greater than or equal to ${expected.toString()}. ${ - msg || '' - }` - ) -} - -export const isSameAddress = (minerAddress, otherAddress) => { - return minerAddress.toLowerCase() === otherAddress.toLowerCase() -} - -// TODO(amy): Pull this list from the build artifacts instead -const proxiedContracts = [{ - contracts: [ - 'Attestations', - 'Escrow', - 'GoldToken', - 'Registry', - 'SortedOracles', - - ] - }, - { - contracts: [ - 'Reserve', - 'StableToken', - ], - __contractPackage: MENTO_PACKAGE - } -] - -// TODO(asa): Pull this list from the build artifacts instead -const ownedContracts = [{ - contracts: [ - 'Attestations', - 'Escrow', - 'Registry', - 'SortedOracles', - ] - },{ - contracts: [ - 'Reserve', - 'Exchange', - 'StableToken' - ], - __contractPackage: MENTO_PACKAGE - } -] - -export function getOffsetForMinerSelection( - blockhash: string, - index: number, - verifierBlockWindowSize: number -): number { - const selectedVerifierBlockOffsets = new Set() - - let hash: any = new BN(blockhash.replace('0x', ''), 16) - let verifierBlockOffset = 0 - let currentVerification = 0 - const mod = new BN(verifierBlockWindowSize) - while (currentVerification <= index) { - hash =
keccak256(hash) - verifierBlockOffset = new BN(hash).mod(mod).toNumber() - if (!selectedVerifierBlockOffsets.has(verifierBlockOffset)) { - selectedVerifierBlockOffsets.add(verifierBlockOffset) - currentVerification++ - } - } - - return verifierBlockOffset -} - -export const assertSameAddress = (value: string, expected: string, msg?: string) => { - assert.strictEqual(expected.toLowerCase(), value.toLowerCase(), msg) -} - -export function createMatcher(assertFn: (value: A, expected: A, msg?: string) => void) { - return (expected: A) => (value: A, msg?: string) => { - assertFn(value, expected, msg) - } -} - -export const matchAddress = createMatcher(assertSameAddress) - -export const matchAny = () => { - // nothing -} - -export default { - assertContainSubset, - assertRevert, - timeTravel, - isSameAddress, -} - -export async function addressMinedLatestBlock(address: string) { - const block = await web3.eth.getBlock('latest') - return isSameAddress(block.miner, address) -} - -enum VoteValue { - None = 0, - Abstain, - No, - Yes, -} - -export async function assumeOwnershipWithTruffle(contractsToOwn: string[], to: string, dequeuedIndex: number = 0, contractPackage?:ContractPackage) { - const governance: GovernanceInstance = await getDeployedProxiedContract('Governance', artifacts) - const lockedGold: LockedGoldInstance = await getDeployedProxiedContract('LockedGold', artifacts) - const multiSig: GovernanceApproverMultiSigInstance = await getDeployedProxiedContract( - 'GovernanceApproverMultiSig', - artifacts - ) - const registry: RegistryInstance = await getDeployedProxiedContract('Registry', artifacts) - // Enough to pass the governance proposal unilaterally (and then some). - const tenMillionCELO = '10000000000000000000000000' - // @ts-ignore - await lockedGold.lock({ value: tenMillionCELO }) - // Any contract's `transferOwnership` function will work here as the function signatures are all the same. 
- // @ts-ignore - const transferOwnershipData = Buffer.from(stripHexEncoding(registry.contract.methods.transferOwnership(to).encodeABI()), 'hex') - const proposalTransactions = await Promise.all( - contractsToOwn.map(async (contractName: string) => { - - const artifactsInstance = ArtifactsSingleton.getInstance(contractPackage, artifacts) - - const contractAddress = (await getDeployedProxiedContract(contractName, artifactsInstance)).address - - return { - value: 0, - destination: contractAddress, - data: transferOwnershipData, - } - }) - ) - await governance.propose( - proposalTransactions.map((tx: any) => tx.value), - proposalTransactions.map((tx: any) => tx.destination), - // @ts-ignore - Buffer.concat(proposalTransactions.map((tx: any) => tx.data)), - proposalTransactions.map((tx: any) => tx.data.length), - 'URL', - // @ts-ignore: TODO(mcortesi) fix typings for TransactionDetails - { value: web3.utils.toWei(config.governance.minDeposit.toString(), 'ether') } - ) - - const proposalId = (await governance.proposalCount()).toNumber() - - await timeTravel(config.governance.dequeueFrequency, web3) - // @ts-ignore - const txData = governance.contract.methods.approve(proposalId, dequeuedIndex).encodeABI() - await multiSig.submitTransaction(governance.address, 0, txData) - await timeTravel(config.governance.approvalStageDuration, web3) - await governance.vote(proposalId, dequeuedIndex, VoteValue.Yes) - await timeTravel(config.governance.referendumStageDuration, web3) - await governance.execute(proposalId, dequeuedIndex) -} - -/* - * Helpers for verification - */ -export enum KeyOffsets { - VALIDATING_KEY_OFFSET, - ATTESTING_KEY_OFFSET, - NEW_VALIDATING_KEY_OFFSET, - VOTING_KEY_OFFSET, -} - -// Private keys of each of the 10 miners, in the same order as their addresses in 'accounts'. 
-export const accountPrivateKeys: string[] = [ - '0xf2f48ee19680706196e2e339e5da3491186e0c4c5030670656b0e0164837257d', - '0x5d862464fe9303452126c8bc94274b8c5f9874cbd219789b3eb2128075a76f72', - '0xdf02719c4df8b9b8ac7f551fcb5d9ef48fa27eef7a66453879f4d8fdc6e78fb1', - '0xff12e391b79415e941a94de3bf3a9aee577aed0731e297d5cfa0b8a1e02fa1d0', - '0x752dd9cf65e68cfaba7d60225cbdbc1f4729dd5e5507def72815ed0d8abc6249', - '0xefb595a0178eb79a8df953f87c5148402a224cdf725e88c0146727c6aceadccd', - '0x83c6d2cc5ddcf9711a6d59b417dc20eb48afd58d45290099e5987e3d768f328f', - '0xbb2d3f7c9583780a7d3904a2f55d792707c345f21de1bacb2d389934d82796b2', - '0xb2fd4d29c1390b71b8795ae81196bfd60293adf99f9d32a0aff06288fcdac55f', - '0x23cb7121166b9a2f93ae0b7c05bde02eae50d64449b2cbb42bc84e9d38d6cc89', -] - -export const getDerivedKey = (offset: number, address: string, accounts: string[]) => { - const pKey = accountPrivateKeys[accounts.indexOf(address)] - const aKey = Buffer.from(pKey.slice(2), 'hex') - aKey.write((aKey[0] + offset).toString(16)) - return '0x' + aKey.toString('hex') -} - -export const unlockAndAuthorizeKey = async ( - offset: number, - authorizeFn: any, - account: string, - accounts: string[] -) => { - const key = getDerivedKey(offset, account, accounts) - const addr = privateKeyToAddress(key) - // @ts-ignore - await web3.eth.personal.importRawKey(key, 'passphrase') - await web3.eth.personal.unlockAccount(addr, 'passphrase', 1000000) - - const signature = await getParsedSignatureOfAddress(web3, account, addr) - await authorizeFn(addr, signature.v, signature.r, signature.s, { - from: account, - }) - - return addr -} - -export const authorizeAndGenerateVoteSigner = async (accountsInstance: AccountsInstance, account: string, accounts: string[]) => { - const roleHash = keccak256(utf8ToBytes('celo.org/core/vote')) - const role = bufferToHex(toBuffer(roleHash)) - - const signer = await unlockAndAuthorizeKey( - KeyOffsets.VALIDATING_KEY_OFFSET, - accountsInstance.authorizeVoteSigner, - account, - 
accounts - ) - // fund signer - await web3.eth.sendTransaction({ - from: accounts[9], - to: signer, - value: web3.utils.toWei('1', 'ether'), - }) - - await accountsInstance.completeSignerAuthorization(account, role, { from: signer }) - - return signer; -} - -export async function createAndAssertDelegatorDelegateeSigners(accountsInstance: AccountsInstance, accounts: string[], delegator: string, delegatee?: string) { - let delegatorSigner - let delegateeSigner; - - if (delegator != null) { - delegatorSigner = await authorizeAndGenerateVoteSigner( - accountsInstance, - delegator, - accounts - ) - assert.notEqual(delegator, delegatorSigner) - assert.equal(await accountsInstance.voteSignerToAccount(delegatorSigner), delegator) - } - - if (delegatee != null) { - delegateeSigner = await authorizeAndGenerateVoteSigner( - accountsInstance, - delegatee, - accounts - ) - assert.notEqual(delegatee, delegateeSigner) - assert.equal(await accountsInstance.voteSignerToAccount(delegateeSigner), delegatee) - } - return [delegatorSigner, delegateeSigner] -} - -export async function assertDelegatorDelegateeAmounts( - delegator: string, - delegatee: string, - percent: number, - amount: number, - lockedGold: LockedGoldInstance -) { - const [fraction, currentAmount] = await lockedGold.getDelegatorDelegateeInfo( - delegator, - delegatee - ) - assertEqualBN(fromFixed(fraction).multipliedBy(100), percent) - assertEqualBN(currentAmount, amount) -} - -export function expectBigNumberInRange(real: BigNumber, - expected: BigNumber, - range: BigNumber = new BigNumber("10000000000000000") // gas - ) { - expect( - real.plus(range).gte(expected), - `Number ${real.toString()} is not in range <${expected.minus(range).toString()}, ${expected - .plus(range) - .toString()}>` - ).to.be.true; - expect( - real.minus(range).lte(expected), - `Number ${real.toString()} is not in range <${expected.minus(range).toString()}, ${expected - .plus(range) - .toString()}>` - ).to.be.true; -} diff --git
a/packages/protocol/lib/web3-utils.ts b/packages/protocol/lib/web3-utils.ts index 45ec50e22ba..c0bbad72cb8 100644 --- a/packages/protocol/lib/web3-utils.ts +++ b/packages/protocol/lib/web3-utils.ts @@ -156,7 +156,7 @@ export async function setInitialProxyImplementation< const Contract = wrappedArtifacts.require(contractName) // getProxy function supports the case the proxy is in a different package - // which is the case for GasPriceMimimum + // which is the case for GasPriceMinimum const ContractProxy = wrappedArtifacts.getProxy(contractName, artifacts) await Contract.detectNetwork() diff --git a/packages/protocol/migrationsConfig.js b/packages/protocol/migrationsConfig.js index 79045a72c5d..dda550068c8 100644 --- a/packages/protocol/migrationsConfig.js +++ b/packages/protocol/migrationsConfig.js @@ -223,16 +223,11 @@ const DefaultConfig = { // MUST BE KEPT IN SYNC WITH MEMBERSHIP HISTORY LENGTH duration: 60 * DAY, }, - validatorScoreParameters: { - exponent: 10, - adjustmentSpeed: 0.1, - }, // MUST BE KEPT IN SYNC WITH VALIDATOR LOCKED GOLD DURATION membershipHistoryLength: 60, commissionUpdateDelay: (3 * DAY) / 5, // Approximately 3 days with 5s block times maxGroupSize: 5, slashingPenaltyResetPeriod: 30 * DAY, - downtimeGracePeriod: 0, // Register cLabs groups to contain an initial set of validators to run test networks. validatorKeys: [], @@ -312,97 +307,6 @@ const NetworkConfigs = { initialBalance: 100000000, // CELO }, }, - baklava: { - downtimeSlasher: { - slashableDowntime: (8 * HOUR) / 5, // ~8 hours - }, - election: { - minElectableValidators: 20, - frozen: false, - }, - epochRewards: { - targetVotingYieldParameters: { - initial: 0.00016, - }, - frozen: false, - }, - exchange: { - frozen: false, - }, - goldToken: { - frozen: false, - }, - governance: { - // Set to be able to complete a proposal in about a day, but give everyone a chance to participate. 
- dequeueFrequency: 4 * HOUR, - approvalStageDuration: 4 * HOUR, - referendumStageDuration: DAY, - executionStageDuration: WEEK, - participationBaseline: 1 / 200, // Very low participation requirement given it's a testnet. - concurrentProposals: 3, - minDeposit: 100, // 100 cGLD - participationBaselineFloor: 1 / 100, - participationBaselineUpdateFactor: 1 / 5, - participationBaselineQuorumFactor: 1, - }, - governanceApproverMultiSig: { - // 1/5 multisig - signatories: [ - '0xb04778c00A8e30F59bFc91DD74853C4f32F34E54', // Google Cloud IAM managed account - '0x32830A3f65DF98aFCFA18bAd35009Aa51163D606', // Individual signer - '0x7c593219ad21e172c1fdc6bfdc359699fa428adb', // Individual signer - '0x31af68f73fb93815b3eB9a6FA76e63113De5f733', // Individual signer - '0x47fE4b9fFDB9712fC5793B1b5E86d96a4664cf02', // Individual signer - ], - numRequiredConfirmations: 1, - numInternalRequiredConfirmations: 2, - }, - lockedGold: { - unlockingPeriod: 6 * HOUR, // 1/12 of the Mainnet period. - }, - reserve: { - initialBalance: 100000000, // CELO - frozenAssetsStartBalance: 80000000, // Matches Mainnet after CGP-6 - frozenAssetsDays: 182, // 3x Mainnet thawing rate - }, - reserveSpenderMultiSig: { - // 1/3 multisig - signatories: [ - '0x62C6a0446BbD7f6260108dD538d88E8b53128a90', // Google Cloud IAM managed account - '0x49eFFA2ceF5FccA5540f421d6b28e76184cc0fDF', // Individual signer - '0x4550F1576fAC966Ac8b9F42e1D5D66D3A14dD8D3', // Individual signer - ], - numRequiredConfirmations: 1, - numInternalRequiredConfirmations: 2, - }, - stableToken: { - // Don't set an initial gold price before oracles start to report.
- goldPrice: null, - oracles: [ - '0xd71fea6b92d3f21f659152589223385a7329bb11', - '0x1e477fc9b6a49a561343cd16b2c541930f5da7d2', - '0x460b3f8d3c203363bb65b1a18d89d4ffb6b0c981', - '0x3b522230c454ca9720665d66e6335a72327291e8', - '0x0AFe167600a5542d10912f4A07DFc4EEe0769672', - '0x412ebe7859e9aa71ff5ce4038596f6878c359c96', - '0xbbfe73df8b346b3261b19ac91235888aba36d68c', - '0x02b1d1bea682fcab4448c0820f5db409cce4f702', - '0xe90f891710f625f18ecbf1e02efb4fd1ab236a10', - '0x28c52c722df87ed11c5d7665e585e84aa93d7964', - ], - frozen: false, - }, - validators: { - groupLockedGoldRequirements: { - duration: 15 * DAY, // 1/12 of the Mainnet duration. - }, - validatorLockedGoldRequirements: { - duration: 5 * DAY, // 1/12 of the Mainnet duration. - }, - membershipHistoryLength: 15, // Number of epochs in the group lockup period. - votesRatioOfLastVsFirstGroup: 1.0, - }, - }, alfajores: { downtimeSlasher: { slashableDowntime: (8 * HOUR) / 5, // ~8 hours diff --git a/packages/protocol/migrations_sol/Migration.s.sol b/packages/protocol/migrations_sol/Migration.s.sol index 27751459fdf..d8ef1ec5adb 100644 --- a/packages/protocol/migrations_sol/Migration.s.sol +++ b/packages/protocol/migrations_sol/Migration.s.sol @@ -2,62 +2,64 @@ pragma solidity >=0.8.7 <0.8.20; // Note: This script should not include any cheatcode so that it can run in production +// Foundry-08 imports import { Script } from "forge-std-8/Script.sol"; // Foundry imports -import "forge-std/console.sol"; -import "forge-std/StdJson.sol"; +import { console } from "forge-std/console.sol"; +import { stdJson } from "forge-std/StdJson.sol"; // Helper contract imports -import "@migrations-sol/HelperInterFaces.sol"; +import { IReserveInitializer, IReserve, IStableTokenInitialize, IExchangeInitializer, IExchange, IReserveSpenderMultiSig } from "@migrations-sol/HelperInterFaces.sol"; import { MigrationsConstants } from "@migrations-sol/constants.sol"; -import "@openzeppelin/contracts8/utils/math/Math.sol"; // Core contract imports 
on Solidity 0.5
-import "@celo-contracts/common/interfaces/IProxy.sol";
-import "@celo-contracts/common/interfaces/IProxyFactory.sol";
-import "@celo-contracts/common/interfaces/IRegistry.sol";
-import "@celo-contracts/common/interfaces/IRegistryInitializer.sol";
-import "@celo-contracts/common/interfaces/IFreezer.sol";
-import "@celo-contracts/common/interfaces/IFreezerInitializer.sol";
-import "@celo-contracts/common/interfaces/ICeloTokenInitializer.sol";
-import "@celo-contracts/common/interfaces/IAccountsInitializer.sol";
-import "@celo-contracts/common/interfaces/IFeeHandlerSellerInitializer.sol";
-import "@celo-contracts/common/interfaces/IFeeHandler.sol";
-import "@celo-contracts/common/interfaces/IFeeHandlerInitializer.sol";
-import "@celo-contracts/common/interfaces/IFeeCurrencyWhitelist.sol";
-import "@celo-contracts/common/interfaces/IAccounts.sol";
-import "@celo-contracts/common/interfaces/IEpochManagerEnabler.sol";
-import "@celo-contracts/governance/interfaces/ILockedGoldInitializer.sol";
-import "@celo-contracts-8/governance/interfaces/IValidatorsInitializer.sol";
-import "@celo-contracts/governance/interfaces/IElectionInitializer.sol";
-import "@celo-contracts/governance/interfaces/IEpochRewardsInitializer.sol";
-import "@celo-contracts/governance/interfaces/IBlockchainParametersInitializer.sol";
-import "@celo-contracts/governance/interfaces/IGovernanceSlasherInitializer.sol";
-import "@celo-contracts/governance/interfaces/IDoubleSigningSlasherInitializer.sol";
-import "@celo-contracts/governance/interfaces/IDowntimeSlasherInitializer.sol";
-import "@celo-contracts/governance/interfaces/IGovernanceApproverMultiSigInitializer.sol";
-import "@celo-contracts/governance/interfaces/IGovernanceInitializer.sol";
-import "@celo-contracts/governance/interfaces/ILockedGold.sol";
-import "@celo-contracts/governance/interfaces/IGovernance.sol";
-import "@celo-contracts/identity/interfaces/IRandomInitializer.sol";
-import "@celo-contracts/identity/interfaces/IEscrowInitializer.sol";
-import "@celo-contracts/identity/interfaces/IOdisPaymentsInitializer.sol";
-import "@celo-contracts/identity/interfaces/IFederatedAttestationsInitializer.sol";
-import "@celo-contracts/stability/interfaces/ISortedOraclesInitializer.sol";
-import "@celo-contracts/stability/interfaces/ISortedOracles.sol";
+import { IProxy } from "@celo-contracts/common/interfaces/IProxy.sol";
+import { IProxyFactory } from "@celo-contracts/common/interfaces/IProxyFactory.sol";
+import { IRegistry } from "@celo-contracts/common/interfaces/IRegistry.sol";
+import { IRegistryInitializer } from "@celo-contracts/common/interfaces/IRegistryInitializer.sol";
+import { IFreezer } from "@celo-contracts/common/interfaces/IFreezer.sol";
+import { IFreezerInitializer } from "@celo-contracts/common/interfaces/IFreezerInitializer.sol";
+import { ICeloTokenInitializer } from "@celo-contracts/common/interfaces/ICeloTokenInitializer.sol";
+import { IAccountsInitializer } from "@celo-contracts/common/interfaces/IAccountsInitializer.sol";
+import { IFeeHandlerSellerInitializer } from "@celo-contracts/common/interfaces/IFeeHandlerSellerInitializer.sol";
+import { IFeeHandler } from "@celo-contracts/common/interfaces/IFeeHandler.sol";
+import { IFeeHandlerInitializer } from "@celo-contracts/common/interfaces/IFeeHandlerInitializer.sol";
+import { IFeeCurrencyWhitelist } from "@celo-contracts/common/interfaces/IFeeCurrencyWhitelist.sol";
+import { IAccounts } from "@celo-contracts/common/interfaces/IAccounts.sol";
+import { IEpochManagerEnabler } from "@celo-contracts/common/interfaces/IEpochManagerEnabler.sol";
+import { ILockedGoldInitializer } from "@celo-contracts/governance/interfaces/ILockedGoldInitializer.sol";
+import { IValidatorsInitializer } from "@celo-contracts-8/governance/interfaces/IValidatorsInitializer.sol";
+import { IElectionInitializer } from "@celo-contracts/governance/interfaces/IElectionInitializer.sol";
+import { IEpochRewardsInitializer } from "@celo-contracts/governance/interfaces/IEpochRewardsInitializer.sol";
+import { IBlockchainParametersInitializer } from "@celo-contracts/governance/interfaces/IBlockchainParametersInitializer.sol";
+import { IGovernanceSlasherInitializer } from "@celo-contracts/governance/interfaces/IGovernanceSlasherInitializer.sol";
+import { IDoubleSigningSlasherInitializer } from "@celo-contracts/governance/interfaces/IDoubleSigningSlasherInitializer.sol";
+import { IDowntimeSlasherInitializer } from "@celo-contracts/governance/interfaces/IDowntimeSlasherInitializer.sol";
+import { IGovernanceApproverMultiSigInitializer } from "@celo-contracts/governance/interfaces/IGovernanceApproverMultiSigInitializer.sol";
+import { IGovernanceInitializer } from "@celo-contracts/governance/interfaces/IGovernanceInitializer.sol";
+import { ILockedGold } from "@celo-contracts/governance/interfaces/ILockedGold.sol";
+import { IGovernance } from "@celo-contracts/governance/interfaces/IGovernance.sol";
+import { IRandomInitializer } from "@celo-contracts/identity/interfaces/IRandomInitializer.sol";
+import { IEscrowInitializer } from "@celo-contracts/identity/interfaces/IEscrowInitializer.sol";
+import { IOdisPaymentsInitializer } from "@celo-contracts/identity/interfaces/IOdisPaymentsInitializer.sol";
+import { IFederatedAttestationsInitializer } from "@celo-contracts/identity/interfaces/IFederatedAttestationsInitializer.sol";
+import { ISortedOraclesInitializer } from "@celo-contracts/stability/interfaces/ISortedOraclesInitializer.sol";
+import { ISortedOracles } from "@celo-contracts/stability/interfaces/ISortedOracles.sol";

 // Core contract imports on Solidity 0.8
-import "@celo-contracts-8/common/interfaces/IFeeCurrencyDirectoryInitializer.sol";
-import "@celo-contracts-8/common/interfaces/IGasPriceMinimumInitializer.sol";
-import "@celo-contracts-8/common/interfaces/ICeloUnreleasedTreasuryInitializer.sol";
-import "@celo-contracts-8/common/interfaces/IEpochManagerEnablerInitializer.sol";
-import "@celo-contracts-8/common/interfaces/IEpochManagerInitializer.sol";
-import "@celo-contracts-8/common/interfaces/IScoreManagerInitializer.sol";
-import "@celo-contracts-8/common/interfaces/IFeeCurrencyDirectory.sol";
-import "@celo-contracts-8/common/UsingRegistry.sol";
-
-import "@test-sol/utils/SECP256K1.sol";
+import { IFeeCurrencyDirectoryInitializer } from "@celo-contracts-8/common/interfaces/IFeeCurrencyDirectoryInitializer.sol";
+import { IGasPriceMinimumInitializer } from "@celo-contracts-8/common/interfaces/IGasPriceMinimumInitializer.sol";
+import { ICeloUnreleasedTreasuryInitializer } from "@celo-contracts-8/common/interfaces/ICeloUnreleasedTreasuryInitializer.sol";
+import { IEpochManagerEnablerInitializer } from "@celo-contracts-8/common/interfaces/IEpochManagerEnablerInitializer.sol";
+import { IEpochManagerInitializer } from "@celo-contracts-8/common/interfaces/IEpochManagerInitializer.sol";
+import { IScoreManagerInitializer } from "@celo-contracts-8/common/interfaces/IScoreManagerInitializer.sol";
+import { IFeeCurrencyDirectory } from "@celo-contracts-8/common/interfaces/IFeeCurrencyDirectory.sol";
+import { UsingRegistry } from "@celo-contracts-8/common/UsingRegistry.sol";
+
+// Test imports
+import { ISECP256K1 } from "@test-sol/utils/SECP256K1.sol";
+import { ConstitutionHelper } from "@test-sol/utils/ConstitutionHelper.sol";

 contract ForceTx {
   // event to trigger so a tx can be processed
@@ -76,13 +78,19 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
   struct InitParamsTunnel {
     // The number of blocks to delay a ValidatorGroup's commission
     uint256 commissionUpdateDelay;
-    uint256 downtimeGracePeriod;
+  }
+
+  enum SolidityVersions {
+    SOLIDITY_05,
+    SOLIDITY_08
   }

   IProxyFactory proxyFactory;
   uint256 proxyNonce = 0;
+  ConstitutionHelper.ConstitutionEntry[] internal constitutionEntries;
+
   event Result(bytes);

   function create2deploy(bytes32 salt, bytes memory initCode) internal returns (address) {
@@ -117,6 +125,7 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
   function deployCodeTo(string memory what, address where) internal {
     deployCodeTo(what, "", 0, where);
   }
+
   function deployCodeTo(string memory what, bytes memory args, address where) internal {
     deployCodeTo(what, args, 0, where);
   }
@@ -132,13 +141,27 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
     registry.setAddressFor(contractName, proxyAddress);
   }

+  function getSolidityVersionPath(
+    SolidityVersions version
+  ) public pure returns (string memory versionPath) {
+    if (version == SolidityVersions.SOLIDITY_05) {
+      return "out-truffle-compat/";
+    } else {
+      return "out-truffle-compat-0.8/";
+    }
+    revert("Solidity version not supported");
+  }
+
   function setImplementationOnProxy(
     IProxy proxy,
     string memory contractName,
-    bytes memory initializeCalldata
+    bytes memory initializeCalldata,
+    SolidityVersions solidityVersion
   ) public {
+    string memory versionString = getSolidityVersionPath(solidityVersion);
+
     bytes memory implementationBytecode = vm.getCode(
-      string.concat("out/", contractName, ".sol/", contractName, ".json")
+      string.concat(versionString, contractName, ".sol/", contractName, ".json")
     );
     bool testingDeployment = false;
     bytes memory initialCode = abi.encodePacked(
@@ -156,14 +179,15 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
   function deployProxiedContract(
     string memory contractName,
     address toProxy,
-    bytes memory initializeCalldata
+    bytes memory initializeCalldata,
+    SolidityVersions solidityVersion
   ) public {
     console.log("Deploying: ", contractName);
     deployCodeTo("Proxy.sol", abi.encode(false), toProxy);
     IProxy proxy = IProxy(toProxy);
     console.log(" Proxy deployed to:", toProxy);
-    setImplementationOnProxy(proxy, contractName, initializeCalldata);
+    setImplementationOnProxy(proxy, contractName, initializeCalldata, solidityVersion);
     addToRegistry(contractName, address(proxy));
     console.log(" Done deploying:", contractName);
     console.log("------------------------------");
@@ -171,7 +195,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
   function deployProxiedContract(
     string memory contractName,
-    bytes memory initializeCalldata
+    bytes memory initializeCalldata,
+    SolidityVersions solidityVersion
   ) public returns (address proxyAddress) {
     console.log("Deploying: ", contractName);
@@ -186,7 +211,7 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
     IProxy proxy = IProxy(proxyAddress);
     console.log(" Proxy deployed to:", address(proxy));
-    setImplementationOnProxy(proxy, contractName, initializeCalldata);
+    setImplementationOnProxy(proxy, contractName, initializeCalldata, solidityVersion);
     addToRegistry(contractName, address(proxy));
     console.log(" Done deploying:", contractName);
@@ -203,7 +228,7 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
     string memory json = vm.readFile("./migrations_sol/migrationsConfig.json");
     proxyFactory = IProxyFactory(
-      create2deploy(0, vm.getCode("./out/ProxyFactory.sol/ProxyFactory.json"))
+      create2deploy(0, vm.getCode("./out-truffle-compat/ProxyFactory.sol/ProxyFactory.json"))
     );
     // Proxy for Registry is already set, just deploy implementation
@@ -270,7 +295,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
     setImplementationOnProxy(
       IProxy(REGISTRY_ADDRESS),
       "Registry",
-      abi.encodeWithSelector(IRegistryInitializer.initialize.selector)
+      abi.encodeWithSelector(IRegistryInitializer.initialize.selector),
+      SolidityVersions.SOLIDITY_05
     );
     // set registry in registry itself
     console.log("Owner of the Registry Proxy is", IProxy(REGISTRY_ADDRESS)._getOwner());
@@ -281,21 +307,24 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
   function migrateFreezer() public {
     deployProxiedContract(
       "Freezer",
-      abi.encodeWithSelector(IFreezerInitializer.initialize.selector)
+      abi.encodeWithSelector(IFreezerInitializer.initialize.selector),
+      SolidityVersions.SOLIDITY_05
     );
   }

   function migrateFeeCurrencyWhitelist() public {
     deployProxiedContract(
       "FeeCurrencyWhitelist",
-      abi.encodeWithSelector(IFeeCurrencyWhitelist.initialize.selector)
+      abi.encodeWithSelector(IFeeCurrencyWhitelist.initialize.selector),
+      SolidityVersions.SOLIDITY_05
     );
   }

   function migrateFeeCurrencyDirectory() public {
     deployProxiedContract(
       "FeeCurrencyDirectory",
-      abi.encodeWithSelector(IFeeCurrencyDirectoryInitializer.initialize.selector)
+      abi.encodeWithSelector(IFeeCurrencyDirectoryInitializer.initialize.selector),
+      SolidityVersions.SOLIDITY_08
     );
   }
@@ -303,7 +332,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
     // TODO change pre-funded addresses to make it match circulation supply
     address celoProxyAddress = deployProxiedContract(
       "GoldToken",
-      abi.encodeWithSelector(ICeloTokenInitializer.initialize.selector, REGISTRY_ADDRESS)
+      abi.encodeWithSelector(ICeloTokenInitializer.initialize.selector, REGISTRY_ADDRESS),
+      SolidityVersions.SOLIDITY_05
     );
     addToRegistry("CeloToken", celoProxyAddress);
@@ -320,7 +350,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
     );
     deployProxiedContract(
       "SortedOracles",
-      abi.encodeWithSelector(ISortedOraclesInitializer.initialize.selector, reportExpirySeconds)
+      abi.encodeWithSelector(ISortedOraclesInitializer.initialize.selector, reportExpirySeconds),
+      SolidityVersions.SOLIDITY_05
     );
   }
@@ -348,7 +379,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
         targetDensity,
         adjustmentSpeed,
         baseFeeOpCodeActivationBlock
-      )
+      ),
+      SolidityVersions.SOLIDITY_08
     );
   }
@@ -372,7 +404,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
         owners,
         required,
         internalRequired
-      )
+      ),
+      SolidityVersions.SOLIDITY_05
     );
   }
@@ -413,7 +446,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
         assetAllocationWeights,
         tobinTax,
         tobinTaxReserveRatio
-      )
+      ),
+      SolidityVersions.SOLIDITY_05
     );

     // TODO this should be a transfer from the deployer rather than a deal
@@ -455,7 +489,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
         initialBalanceAddresses,
         initialBalanceValues,
         exchangeIdentifier
-      )
+      ),
+      SolidityVersions.SOLIDITY_05
     );

     if (frozen) {
@@ -478,7 +513,7 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
     /*
     Arbitrary intrinsic gas number take from existing `FeeCurrencyDirectory.t.sol` tests
     Source: https://github.com/celo-org/celo-monorepo/blob/2cec07d43328cf4216c62491a35eacc4960fffb6/packages/protocol/test-sol/common/FeeCurrencyDirectory.t.sol#L27
-     */
+    */
     uint256 mockIntrinsicGas = 21000;

     IFeeCurrencyDirectory(registry.getAddressForStringOrDie("FeeCurrencyDirectory"))
@@ -549,7 +584,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
         reserveFraction,
         updateFrequency,
         minimumReports
-      )
+      ),
+      SolidityVersions.SOLIDITY_05
     );

     bool frozen = abi.decode(json.parseRaw(".exchange.frozen"), (bool));
@@ -563,7 +599,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
   function migrateAccount() public {
     address accountsProxyAddress = deployProxiedContract(
       "Accounts",
-      abi.encodeWithSelector(IAccountsInitializer.initialize.selector, REGISTRY_ADDRESS)
+      abi.encodeWithSelector(IAccountsInitializer.initialize.selector, REGISTRY_ADDRESS),
+      SolidityVersions.SOLIDITY_05
     );

     IAccounts(accountsProxyAddress).setEip712DomainSeparator();
@@ -578,7 +615,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
         ILockedGoldInitializer.initialize.selector,
         REGISTRY_ADDRESS,
         unlockingPeriod
-      )
+      ),
+      SolidityVersions.SOLIDITY_05
     );
     addToRegistry("LockedCelo", LockedCeloProxyAddress);
@@ -601,14 +639,6 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
       json.parseRaw(".validators.validatorLockedGoldRequirements.duration"),
       (uint256)
     );
-    uint256 validatorScoreExponent = abi.decode(
-      json.parseRaw(".validators.validatorScoreParameters.exponent"),
-      (uint256)
-    );
-    uint256 validatorScoreAdjustmentSpeed = abi.decode(
-      json.parseRaw(".validators.validatorScoreParameters.adjustmentSpeed"),
-      (uint256)
-    );
     uint256 membershipHistoryLength = abi.decode(
       json.parseRaw(".validators.membershipHistoryLength"),
       (uint256)
@@ -622,14 +652,9 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
       json.parseRaw(".validators.commissionUpdateDelay"),
       (uint256)
     );
-    uint256 downtimeGracePeriod = abi.decode(
-      json.parseRaw(".validators.downtimeGracePeriod"),
-      (uint256)
-    );

     InitParamsTunnel memory initParamsTunnel = InitParamsTunnel({
-      commissionUpdateDelay: commissionUpdateDelay,
-      downtimeGracePeriod: downtimeGracePeriod
+      commissionUpdateDelay: commissionUpdateDelay
     });

     deployProxiedContract(
@@ -641,13 +666,12 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
         groupRequirementDuration,
         validatorRequirementValue,
         validatorRequirementDuration,
-        validatorScoreExponent,
-        validatorScoreAdjustmentSpeed,
         membershipHistoryLength,
         slashingMultiplierResetPeriod,
         maxGroupSize,
         initParamsTunnel
-      )
+      ),
+      SolidityVersions.SOLIDITY_08
     );
   }
@@ -678,7 +702,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
         maxElectableValidators,
         maxNumGroupsVotedFor,
         electabilityThreshold
-      )
+      ),
+      SolidityVersions.SOLIDITY_05
     );
   }
@@ -744,7 +769,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
         communityRewardFraction,
         carbonOffsettingPartner,
         carbonOffsettingFraction
-      )
+      ),
+      SolidityVersions.SOLIDITY_05
     );

     bool frozen = abi.decode(json.parseRaw(".epochRewards.frozen"), (bool));
@@ -762,12 +788,20 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {

     deployProxiedContract(
       "Random",
-      abi.encodeWithSelector(IRandomInitializer.initialize.selector, randomnessBlockRetentionWindow)
+      abi.encodeWithSelector(
+        IRandomInitializer.initialize.selector,
+        randomnessBlockRetentionWindow
+      ),
+      SolidityVersions.SOLIDITY_05
     );
   }

   function migrateEscrow() public {
-    deployProxiedContract("Escrow", abi.encodeWithSelector(IEscrowInitializer.initialize.selector));
+    deployProxiedContract(
+      "Escrow",
+      abi.encodeWithSelector(IEscrowInitializer.initialize.selector),
+      SolidityVersions.SOLIDITY_05
+    );
   }

   function migrateBlockchainParameters(string memory json) public {
@@ -788,14 +822,16 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
         gasForNonGoldCurrencies,
         gasLimit,
         lookbackWindow
-      )
+      ),
+      SolidityVersions.SOLIDITY_05
     );
   }

   function migrateGovernanceSlasher() public {
     deployProxiedContract(
       "GovernanceSlasher",
-      abi.encodeWithSelector(IGovernanceSlasherInitializer.initialize.selector, REGISTRY_ADDRESS)
+      abi.encodeWithSelector(IGovernanceSlasherInitializer.initialize.selector, REGISTRY_ADDRESS),
+      SolidityVersions.SOLIDITY_05
     );

     getLockedGold().addSlasher("GovernanceSlasher");
@@ -812,7 +848,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
         REGISTRY_ADDRESS,
         penalty,
         reward
-      )
+      ),
+      SolidityVersions.SOLIDITY_05
     );

     getLockedGold().addSlasher("DoubleSigningSlasher");
@@ -834,7 +871,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
         penalty,
         reward,
         slashableDowntime
-      )
+      ),
+      SolidityVersions.SOLIDITY_05
     );

     getLockedGold().addSlasher("DowntimeSlasher");
@@ -859,14 +897,16 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
         owners,
         required,
         internalRequired
-      )
+      ),
+      SolidityVersions.SOLIDITY_05
     );
   }

   function migrateFederatedAttestations() public {
     deployProxiedContract(
       "FederatedAttestations",
-      abi.encodeWithSelector(IFederatedAttestationsInitializer.initialize.selector)
+      abi.encodeWithSelector(IFederatedAttestationsInitializer.initialize.selector),
+      SolidityVersions.SOLIDITY_05
     );
   }
@@ -881,7 +921,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
         REGISTRY_ADDRESS,
         tokenAddresses,
         minimumReports
-      )
+      ),
+      SolidityVersions.SOLIDITY_05
     );
   }
@@ -896,7 +937,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
         REGISTRY_ADDRESS,
         tokenAddresses,
         minimumReports
-      )
+      ),
+      SolidityVersions.SOLIDITY_05
     );
   }
@@ -919,7 +961,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
         handlers,
         newLimits,
         newMaxSlippages
-      )
+      ),
+      SolidityVersions.SOLIDITY_05
     );

     IFeeHandler(feeHandlerProxyAddress).addToken(
@@ -931,7 +974,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
   function migrateOdisPayments() public {
     deployProxiedContract(
       "OdisPayments",
-      abi.encodeWithSelector(IOdisPaymentsInitializer.initialize.selector)
+      abi.encodeWithSelector(IOdisPaymentsInitializer.initialize.selector),
+      SolidityVersions.SOLIDITY_05
     );
   }
@@ -941,21 +985,24 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
       abi.encodeWithSelector(
         ICeloUnreleasedTreasuryInitializer.initialize.selector,
         REGISTRY_ADDRESS
-      )
+      ),
+      SolidityVersions.SOLIDITY_08
     );
   }

   function migrateEpochManagerEnabler() public {
     deployProxiedContract(
       "EpochManagerEnabler",
-      abi.encodeWithSelector(IEpochManagerEnablerInitializer.initialize.selector, REGISTRY_ADDRESS)
+      abi.encodeWithSelector(IEpochManagerEnablerInitializer.initialize.selector, REGISTRY_ADDRESS),
+      SolidityVersions.SOLIDITY_08
     );
   }

   function migrateScoreManager() public {
     deployProxiedContract(
       "ScoreManager",
-      abi.encodeWithSelector(IScoreManagerInitializer.initialize.selector)
+      abi.encodeWithSelector(IScoreManagerInitializer.initialize.selector),
+      SolidityVersions.SOLIDITY_08
     );
   }
@@ -971,7 +1018,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
         IEpochManagerInitializer.initialize.selector,
         REGISTRY_ADDRESS,
         newEpochDuration
-      )
+      ),
+      SolidityVersions.SOLIDITY_08
     );
   }
@@ -1029,7 +1077,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
         participationFloor,
         baselineUpdateFactor,
         baselineQuorumFactor
-      )
+      ),
+      SolidityVersions.SOLIDITY_05
     );

     _setConstitution(governanceProxyAddress, json);
@@ -1045,54 +1094,45 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
      // BlockchainParameters ownership transitioned to governance in a follow-up script.?
      for (uint256 i = 0; i < contractsInRegistry.length; i++) {
        string memory contractToTransfer = contractsInRegistry[i];
-        console.log("Transfering ownership of: ", contractToTransfer);
+        console.log("Transferring ownership of: ", contractToTransfer);
        IProxy proxy = IProxy(registry.getAddressForStringOrDie(contractToTransfer));
        proxy._transferOwnership(governanceAddress);
      }
    }
  }

-  function _setConstitution(address governanceAddress, string memory json) public {
-    bool skipSetConstitution = abi.decode(json.parseRaw(".governance.skipSetConstitution"), (bool));
-    IGovernance governance = IGovernance(governanceAddress);
-    string memory constitutionJson = vm.readFile("./governanceConstitution.json");
-    string[] memory contractsKeys = vm.parseJsonKeys(constitutionJson, "$");
-
-    if (!skipSetConstitution) {
-      for (uint256 i = 0; i < contractsKeys.length; i++) {
-        // TODO need to handle the special case for "proxy"
-        string memory contractName = contractsKeys[i];
-
-        // TODO make helper function for string comparison
-        if (keccak256(abi.encodePacked(contractName)) == keccak256(abi.encodePacked("proxy"))) {
-          continue;
-        }
-
-        console.log(string.concat("Setting constitution thresholds for: ", contractName));
-        registry = IRegistry(REGISTRY_ADDRESS);
-
-        address contractAddress = registry.getAddressForString(contractName);
+  function _setConstitution(address _governanceAddress, string memory _json) public {
+    bool skipSetConstitution_ = abi.decode(
+      _json.parseRaw(".governance.skipSetConstitution"),
+      (bool)
+    );
+    IGovernance governance_ = IGovernance(_governanceAddress);
+    registry = IRegistry(REGISTRY_ADDRESS);

-        string[] memory functionNames = vm.parseJsonKeys(
-          constitutionJson,
-          string.concat(".", contractName)
+    if (!skipSetConstitution_) {
+      // read constitution
+      ConstitutionHelper.readConstitution(constitutionEntries, registry, vm);
+
+      // loop over & set constitution
+      for (uint256 i = 0; i < constitutionEntries.length; i++) {
+        ConstitutionHelper.ConstitutionEntry memory entry_ = constitutionEntries[i];
+        console.log(
+          "Setting constitution for contract: ",
+          entry_.contractName,
+          " on function: ",
+          entry_.functionName
         );
-        for (uint256 j = 0; j < functionNames.length; j++) {
-          string memory functionName = functionNames[j];
-          console.log(
-            string.concat(" Setting constitution thresholds for function : ", functionName)
+
+        if (entry_.contractAddress != address(0)) {
+          governance_.setConstitution(
+            entry_.contractAddress,
+            entry_.functionSelector,
+            entry_.threshold
          );
-          bytes4 functionHash = bytes4(keccak256(bytes(functionName)));
-          uint256 threshold = abi.decode(
-            constitutionJson.parseRaw(string.concat(".", contractName, ".", functionName)),
-            (uint256)
+        } else {
+          revert(
+            string.concat("Contract address is invalid to set constitution: ", entry_.contractName)
          );
-
-          if (contractAddress != address(0)) {
-            // TODO fix this case, it should never be zero
-            // contract key is likely wrong
-            governance.setConstitution(contractAddress, functionHash, threshold);
-          }
        }
      }
    }
@@ -1113,21 +1153,8 @@ contract Migration is Script, UsingRegistry, MigrationsConstants {
     lockGold(amountToLock);
     address accountAddress = (new ForceTx()).identity();
-    // these blobs are not checked in the contract
-    // TODO make this configurable
-    bytes memory newBlsPublicKey = abi.encodePacked(
-      bytes32(0x0101010101010101010101010101010101010101010101010101010101010102),
-      bytes32(0x0202020202020202020202020202020202020202020202020202020202020203),
-      bytes32(0x0303030303030303030303030303030303030303030303030303030303030304)
-    );
-    bytes memory newBlsPop = abi.encodePacked(
-      bytes16(0x04040404040404040404040404040405),
-      bytes16(0x05050505050505050505050505050506),
-      bytes16(0x06060606060606060606060606060607)
-    );
     (bytes memory ecdsaPubKey, , , ) = _generateEcdsaPubKeyWithSigner(accountAddress, validatorKey);
-    getValidators().registerValidator(ecdsaPubKey, newBlsPublicKey, newBlsPop);
+    getValidators().registerValidatorNoBls(ecdsaPubKey);
     getValidators().affiliate(groupToAffiliate);

     console.log("Done registering validators");
diff --git a/packages/protocol/migrations_sol/MigrationL2.s.sol b/packages/protocol/migrations_sol/MigrationL2.s.sol
index 72c80815eb1..b0c1f9de7af 100644
--- a/packages/protocol/migrations_sol/MigrationL2.s.sol
+++ b/packages/protocol/migrations_sol/MigrationL2.s.sol
@@ -5,6 +5,7 @@ import { MigrationsConstants } from "@migrations-sol/constants.sol";

 // Foundry imports
 import "forge-std/console.sol";
+import "forge-std/StdJson.sol";

 import "@celo-contracts/common/FixidityLib.sol";
 import "@celo-contracts-8/common/UsingRegistry.sol";
@@ -12,15 +13,22 @@ import "@celo-contracts/common/interfaces/IEpochManagerEnabler.sol";

 contract MigrationL2 is Script, MigrationsConstants, UsingRegistry {
   using FixidityLib for FixidityLib.Fraction;
+  using stdJson for string;

   /**
    * Entry point of the script
    */
   function run() external {
+    string memory json = vm.readFile("./migrations_sol/migrationsConfig.json");
     vm.startBroadcast(DEPLOYER_ACCOUNT);

     setupUsingRegistry();
-    dealToCeloUnreleasedTreasury();
+
+    dealToCeloUnreleasedTreasuryAndReserve(json);
+
+    vm.stopBroadcast();
+
+    vm.startBroadcast(DEPLOYER_ACCOUNT);

     initializeEpochManagerSystem();
@@ -32,14 +40,14 @@ contract MigrationL2 is Script, MigrationsConstants, UsingRegistry {
     setRegistry(REGISTRY_ADDRESS);
   }

-  function dealToCeloUnreleasedTreasury() public {
+  function dealToCeloUnreleasedTreasuryAndReserve(string memory json) public {
     vm.deal(address(getCeloUnreleasedTreasury()), L2_INITIAL_STASH_BALANCE);
+    uint256 initialBalance = abi.decode(json.parseRaw(".reserve.initialBalance"), (uint256));
+    vm.deal(registry.getAddressForOrDie(RESERVE_REGISTRY_ID), initialBalance);
   }

   function initializeEpochManagerSystem() public {
     console.log("Initializing EpochManager system");
-    address[] memory firstElected = getValidators().getRegisteredValidators();
-    IEpochManager epochManager = getEpochManager();
     address epochManagerEnablerAddress = registry.getAddressForOrDie(
       EPOCH_MANAGER_ENABLER_REGISTRY_ID
     );
diff --git a/packages/protocol/migrations_sol/constants.sol b/packages/protocol/migrations_sol/constants.sol
index c47449a5e1c..5fd57f90ac6 100644
--- a/packages/protocol/migrations_sol/constants.sol
+++ b/packages/protocol/migrations_sol/constants.sol
@@ -7,7 +7,9 @@ contract MigrationsConstants is TestConstants {
   address constant DEPLOYER_ACCOUNT = 0xf39Fd6e51aad88F6F4ce6aB8827279cffFb92266;

   // List of contracts that are expected to be in Registry.sol
-  string[27] contractsInRegistry = [
+  // TODO: Probably should be synced with Migration.s.sol
+  // TODO: Change to be automatically populated and length calculated
+  string[26] contractsInRegistry = [
     "Accounts",
     "BlockchainParameters",
     "CeloUnreleasedTreasury",
@@ -20,20 +22,102 @@ contract MigrationsConstants is TestConstants {
     "EpochManager",
     "Escrow",
     "FederatedAttestations",
-    "FeeCurrencyWhitelist",
     "FeeCurrencyDirectory",
-    "Freezer",
     "FeeHandler",
+    "Freezer",
     "Governance",
     "GovernanceSlasher",
     "LockedGold",
     "OdisPayments",
     "Random",
-    "Registry",
+    "Registry", // FIXME: Should Registry be inside Registry?
+    "ScoreManager",
     "SortedOracles",
-    "UniswapFeeHandlerSeller",
-    "MentoFeeHandlerSeller",
     "Validators",
-    "ScoreManager"
+    "MentoFeeHandlerSeller",
+    "UniswapFeeHandlerSeller"
   ];
+
+  string[26] contractsInRegistryPath = [
+    string.concat("out-truffle-compat/", "Accounts", ".sol/", "Accounts", ".json"),
+    string.concat(
+      "out-truffle-compat/",
+      "BlockchainParameters",
+      ".sol/",
+      "BlockchainParameters",
+      ".json"
+    ),
+    string.concat(
+      "out-truffle-compat-0.8/",
+      "CeloUnreleasedTreasury",
+      ".sol/",
+      "CeloUnreleasedTreasury",
+      ".json"
+    ),
+    string.concat("out-truffle-compat/", "CeloToken", ".sol/", "CeloToken", ".json"),
+    string.concat(
+      "out-truffle-compat/",
+      "DoubleSigningSlasher",
+      ".sol/",
+      "DoubleSigningSlasher",
+      ".json"
+    ),
+    string.concat("out-truffle-compat/", "DowntimeSlasher", ".sol/", "DowntimeSlasher", ".json"),
+    string.concat("out-truffle-compat/", "Election", ".sol/", "Election", ".json"),
+    string.concat("out-truffle-compat/", "EpochRewards", ".sol/", "EpochRewards", ".json"),
+    string.concat(
+      "out-truffle-compat-0.8/",
+      "EpochManagerEnabler",
+      ".sol/",
+      "EpochManagerEnabler",
+      ".json"
+    ),
+    string.concat("out-truffle-compat-0.8/", "EpochManager", ".sol/", "EpochManager", ".json"),
+    string.concat("out-truffle-compat/", "Escrow", ".sol/", "Escrow", ".json"),
+    string.concat(
+      "out-truffle-compat/",
+      "FederatedAttestations",
+      ".sol/",
+      "FederatedAttestations",
+      ".json"
+    ),
+    string.concat(
+      "out-truffle-compat-0.8/",
+      "FeeCurrencyDirectory",
+      ".sol/",
+      "FeeCurrencyDirectory",
+      ".json"
+    ),
+    string.concat("out-truffle-compat/", "FeeHandler", ".sol/", "FeeHandler", ".json"),
+    string.concat("out-truffle-compat/", "Freezer", ".sol/", "Freezer", ".json"),
+    string.concat("out-truffle-compat/", "Governance", ".sol/", "Governance", ".json"),
+    string.concat(
+      "out-truffle-compat/",
+      "GovernanceSlasher",
+      ".sol/",
+      "GovernanceSlasher",
+      ".json"
+    ),
+    string.concat("out-truffle-compat/", "LockedGold", ".sol/", "LockedGold", ".json"),
+    string.concat("out-truffle-compat/", "OdisPayments", ".sol/", "OdisPayments", ".json"),
+    string.concat("out-truffle-compat/", "Random", ".sol/", "Random", ".json"),
+    string.concat("out-truffle-compat/", "Registry", ".sol/", "Registry", ".json"),
+    string.concat("out-truffle-compat-0.8/", "ScoreManager", ".sol/", "ScoreManager", ".json"),
+    string.concat("out-truffle-compat/", "SortedOracles", ".sol/", "SortedOracles", ".json"),
+    string.concat("out-truffle-compat/", "Validators", ".sol/", "Validators", ".json"),
+    string.concat(
+      "out-truffle-compat/",
+      "MentoFeeHandlerSeller",
+      ".sol/",
+      "MentoFeeHandlerSeller",
+      ".json"
+    ),
+    string.concat(
+      "out-truffle-compat/",
+      "UniswapFeeHandlerSeller",
+      ".sol/",
+      "UniswapFeeHandlerSeller",
+      ".json"
+    )
+  ];
 }
diff --git a/packages/protocol/migrations_sol/migrationsConfig.json b/packages/protocol/migrations_sol/migrationsConfig.json
index ac1bdafe623..ef86124a464 100644
--- a/packages/protocol/migrations_sol/migrationsConfig.json
+++ b/packages/protocol/migrations_sol/migrationsConfig.json
@@ -73,10 +73,6 @@
       "duration": 5184000,
       "duration_help": "60 * DAY"
     },
-    "validatorScoreParameters": {
-      "exponent": 10,
-      "adjustmentSpeed": 100000000000000000000000
-    },
     "membershipHistoryLength": 60,
     "commissionUpdateDelay": 51840,
     "commissionUpdateDelay_help": "(3 * DAY) / 5",
@@ -84,7 +80,6 @@
     "maxGroupSize": 2,
     "slashingMultiplierResetPeriod": 2592000,
     "slashingPenaltyResetPeriod_help": "30 * DAY",
-    "downtimeGracePeriod": 0,
     "validatorKeys": [],
     "attestationKeys": [],
     "groupName": "cLabs",
diff --git a/packages/protocol/migrations_ts/00_initial_migration.ts b/packages/protocol/migrations_ts/00_initial_migration.ts
deleted file mode 100644
index 735072f14cc..00000000000
--- a/packages/protocol/migrations_ts/00_initial_migration.ts
+++ /dev/null
@@ -1,15 +0,0 @@
-/* tslint:disable no-console */
-import { ArtifactsSingleton } from '../lib/artifactsSingleton'
-import { networks } from '../truffle-config.js'
-
-module.exports = async (deployer: any, network: any) => {
-  const Migrations = artifacts.require('./Migrations.sol')
-  deployer.deploy(Migrations)
-
-  const currentNetwork = { ...networks[network], name: network }
-
-  console.log('Current network is', JSON.stringify(currentNetwork))
-  // Instad of setting this in a singleton, it could have been set in every migration
-  // but it would have required quite a lot of refactoring
-  ArtifactsSingleton.setNetwork(currentNetwork)
-}
diff --git a/packages/protocol/migrations_ts/01_libraries.ts b/packages/protocol/migrations_ts/01_libraries.ts
deleted file mode 100644
index 6dd5b4047e7..00000000000
--- a/packages/protocol/migrations_ts/01_libraries.ts
+++ /dev/null
@@ -1,21 +0,0 @@
-import { SOLIDITY_08_PACKAGE } from '@celo/protocol/contractPackages'
-import { ArtifactsSingleton } from '@celo/protocol/lib/artifactsSingleton'
-import { makeTruffleContractForMigration } from '@celo/protocol/lib/web3-utils'
-import { linkedLibraries } from '@celo/protocol/migrationsConfig'
-
-module.exports = (deployer: any) => {
-  Object.keys(linkedLibraries).forEach((lib: string) => {
-    const artifacts08 = ArtifactsSingleton.getInstance(SOLIDITY_08_PACKAGE, artifacts)
-
-    for (const contractName of SOLIDITY_08_PACKAGE.contracts) {
-      makeTruffleContractForMigration(contractName, SOLIDITY_08_PACKAGE, web3)
-    }
-
-    const Library = artifacts08.require(lib, artifacts)
-    deployer.deploy(Library)
-    const Contracts = linkedLibraries[lib].map((contract: string) =>
-      artifacts08.require(contract, artifacts)
-    )
-    deployer.link(Library, Contracts)
-  })
-}
diff --git a/packages/protocol/migrations_ts/02_registry.ts b/packages/protocol/migrations_ts/02_registry.ts
deleted file mode 100644
index 1b7448007d1..00000000000
--- a/packages/protocol/migrations_ts/02_registry.ts
+++ /dev/null
@@ -1,31 +0,0 @@
-import { build_directory, config } from '@celo/protocol/migrationsConfig'
-import { RegistryInstance } from 'types'
-import { setInitialProxyImplementation } from '../lib/web3-utils'
-
-const Artifactor = require('@truffle/artifactor')
-
-const name = 'Registry'
-const Contract = artifacts.require(name)
-const ContractProxy = artifacts.require(name + 'Proxy')
-
-module.exports = (deployer: any, _networkName: string, _accounts: string[]) => {
-  // eslint-disable-next-line: no-console
-  console.info('Deploying Registry')
-  deployer.deploy(ContractProxy)
-  deployer.deploy(Contract, false)
-  deployer.then(async () => {
-    const networkId = await web3.eth.net.getId()
-    // Hack to create build artifact.
-    const artifact = ContractProxy._json
-    artifact.networks[networkId] = {
-      address: config.registry.predeployedProxyAddress,
-      // @ts-ignore
-      transactionHash: '0x',
-    }
-    const contractsDir = build_directory + '/contracts'
-    const artifactor = new Artifactor(contractsDir)
-
-    await artifactor.save(artifact)
-    await setInitialProxyImplementation(web3, artifacts, name)
-  })
-}
diff --git a/packages/protocol/migrations_ts/03_freezer.ts b/packages/protocol/migrations_ts/03_freezer.ts
deleted file mode 100644
index 17d9b171c72..00000000000
--- a/packages/protocol/migrations_ts/03_freezer.ts
+++ /dev/null
@@ -1,14 +0,0 @@
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import { deploymentForCoreContract } from '@celo/protocol/lib/web3-utils'
-import { FreezerInstance } from 'types'
-
-const initializeArgs = async (): Promise => {
-  return []
-}
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.Freezer,
-  initializeArgs
-)
diff --git a/packages/protocol/migrations_ts/03_whitelist.ts b/packages/protocol/migrations_ts/03_whitelist.ts
deleted file mode 100644
index 135c3f6764b..00000000000
--- a/packages/protocol/migrations_ts/03_whitelist.ts
+++ /dev/null
@@ -1,9 +0,0 @@
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import { deploymentForCoreContract } from '@celo/protocol/lib/web3-utils'
-import { FeeCurrencyWhitelistInstance } from 'types'
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.FeeCurrencyWhitelist
-)
diff --git a/packages/protocol/migrations_ts/04_goldtoken.ts b/packages/protocol/migrations_ts/04_goldtoken.ts
deleted file mode 100644
index 5b1d9a1e321..00000000000
--- a/packages/protocol/migrations_ts/04_goldtoken.ts
+++ /dev/null
@@ -1,29 +0,0 @@
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import {
-  deploymentForCoreContract,
-  getDeployedProxiedContract,
-} from '@celo/protocol/lib/web3-utils'
-import { config } from '@celo/protocol/migrationsConfig'
-import { FreezerInstance, GoldTokenInstance, RegistryInstance } from 'types'
-
-const initializeArgs = async () => {
-  return [config.registry.predeployedProxyAddress]
-}
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.GoldToken,
-  initializeArgs,
-  async (goldToken: GoldTokenInstance) => {
-    if (config.goldToken.frozen) {
-      const freezer: FreezerInstance = await getDeployedProxiedContract(
-        'Freezer',
-        artifacts
-      )
-      await freezer.freeze(goldToken.address)
-    }
-    const registry = await getDeployedProxiedContract('Registry', artifacts)
-    await registry.setAddressFor(CeloContractName.CeloToken, goldToken.address)
-  }
-)
diff --git a/packages/protocol/migrations_ts/05_sortedoracles.ts b/packages/protocol/migrations_ts/05_sortedoracles.ts
deleted file mode 100644
index 8bdacad01c9..00000000000
--- a/packages/protocol/migrations_ts/05_sortedoracles.ts
+++ /dev/null
@@ -1,11 +0,0 @@
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import { deploymentForCoreContract } from '@celo/protocol/lib/web3-utils'
-import { config } from '@celo/protocol/migrationsConfig'
-import { SortedOraclesInstance } from 'types'
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.SortedOracles,
-  async () => [config.oracles.reportExpiry]
-)
diff --git a/packages/protocol/migrations_ts/06_gaspriceminimum.ts b/packages/protocol/migrations_ts/06_gaspriceminimum.ts
deleted file mode 100644
index 5f19ea971d9..00000000000
--- a/packages/protocol/migrations_ts/06_gaspriceminimum.ts
+++ /dev/null
@@ -1,25 +0,0 @@
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import { deploymentForCoreContract } from '@celo/protocol/lib/web3-utils'
-import { config } from '@celo/protocol/migrationsConfig'
-import { toFixed } from '@celo/utils/lib/fixidity'
-import { GasPriceMinimumInstance } from 'types/08'
-import { SOLIDITY_08_PACKAGE } from '../contractPackages'
-
-const initializeArgs = async (): Promise => {
-  return [
-    config.registry.predeployedProxyAddress,
-    config.gasPriceMinimum.minimumFloor,
-    toFixed(config.gasPriceMinimum.targetDensity).toString(),
-    toFixed(config.gasPriceMinimum.adjustmentSpeed).toString(),
-    config.gasPriceMinimum.baseFeeOpCodeActivationBlock,
-  ]
-}
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.GasPriceMinimum,
-  initializeArgs,
-  undefined,
-  SOLIDITY_08_PACKAGE
-)
diff --git a/packages/protocol/migrations_ts/07_reserve_spender_multisig.ts b/packages/protocol/migrations_ts/07_reserve_spender_multisig.ts
deleted file mode 100644
index b51f727043d..00000000000
--- a/packages/protocol/migrations_ts/07_reserve_spender_multisig.ts
+++ /dev/null
@@ -1,32 +0,0 @@
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import {
-  deploymentForProxiedContract,
-  transferOwnershipOfProxy,
-} from '@celo/protocol/lib/web3-utils'
-import { config } from '@celo/protocol/migrationsConfig'
-import { ReserveSpenderMultiSigInstance } from 'types/mento'
-import { MENTO_PACKAGE } from '../contractPackages'
-import { ArtifactsSingleton } from '../lib/artifactsSingleton'
-
-const initializeArgs = async (): Promise => {
-  return [
-    config.reserveSpenderMultiSig.signatories,
-    config.reserveSpenderMultiSig.numRequiredConfirmations,
config.reserveSpenderMultiSig.numInternalRequiredConfirmations, - ] -} - -module.exports = deploymentForProxiedContract( - web3, - artifacts, - CeloContractName.ReserveSpenderMultiSig, - initializeArgs, - async (reserveSpenderMultiSig: ReserveSpenderMultiSigInstance) => { - await transferOwnershipOfProxy( - CeloContractName.ReserveSpenderMultiSig, - reserveSpenderMultiSig.address, - ArtifactsSingleton.getInstance(MENTO_PACKAGE) - ) - }, - MENTO_PACKAGE -) diff --git a/packages/protocol/migrations_ts/08_reserve.ts b/packages/protocol/migrations_ts/08_reserve.ts deleted file mode 100644 index 8c0b1efce37..00000000000 --- a/packages/protocol/migrations_ts/08_reserve.ts +++ /dev/null @@ -1,88 +0,0 @@ -import { CeloContractName } from '@celo/protocol/lib/registry-utils' -import { - deploymentForCoreContract, - getDeployedProxiedContract, -} from '@celo/protocol/lib/web3-utils' -import { config } from '@celo/protocol/migrationsConfig' -import { toFixed } from '@celo/utils/lib/fixidity' -import { RegistryInstance } from 'types' -import { ReserveInstance, ReserveSpenderMultiSigInstance } from 'types/mento' -import Web3 from 'web3' -import { MENTO_PACKAGE } from '../contractPackages' -import { ArtifactsSingleton } from '../lib/artifactsSingleton' - -import Web3Utils = require('web3-utils') - -const truffle = require('@celo/protocol/truffle-config.js') - -const initializeArgs = async (): Promise< - [string, number, string, number, number, string[], string[], string, string] -> => { - const registry: RegistryInstance = await getDeployedProxiedContract( - 'Registry', - artifacts - ) - return [ - registry.address, - config.reserve.tobinTaxStalenessThreshold, - config.reserve.dailySpendingRatio, - 0, // frozenGold cannot be set until the reserve us funded - 0, // frozenGold cannot be set until the reserve us funded - config.reserve.assetAllocationSymbols.map((assetSymbol) => - Web3Utils.padRight(Web3Utils.utf8ToHex(assetSymbol), 64) - ), - 
config.reserve.assetAllocationWeights.map((assetWeight) => toFixed(assetWeight).toFixed()), - config.reserve.tobinTax, - config.reserve.tobinTaxReserveRatio, - ] -} - -module.exports = deploymentForCoreContract( - web3, - artifacts, - CeloContractName.Reserve, - initializeArgs, - async (reserve: ReserveInstance, web3: Web3, networkName: string) => { - config.reserve.spenders.forEach(async (spender) => { - console.info(`Marking ${spender} as a Reserve spender`) - await reserve.addSpender(spender) - }) - config.reserve.otherAddresses.forEach(async (otherAddress) => { - console.info(`Marking ${otherAddress} as an "otherReserveAddress"`) - await reserve.addOtherReserveAddress(otherAddress) - }) - - if (config.reserve.initialBalance) { - const network: any = truffle.networks[networkName] - - const block = await web3.eth.getBlock('latest') - const nextGasPrice = Math.ceil(block.baseFeePerGas) - - await web3.eth.sendTransaction({ - from: network.from, - to: reserve.address, - value: web3.utils.toWei(config.reserve.initialBalance.toString(), 'ether').toString(), - // @ts-ignore: typing not available https://github.com/web3/web3.js/issues/6123#issuecomment-1568250373 - type: 0, - gasPrice: nextGasPrice, - }) - - if (config.reserve.frozenAssetsStartBalance && config.reserve.frozenAssetsDays) { - console.info('Setting frozen asset parameters on the Reserve') - await reserve.setFrozenGold( - config.reserve.frozenAssetsStartBalance, - config.reserve.frozenAssetsDays - ) - } - } - - const reserveSpenderMultiSig: ReserveSpenderMultiSigInstance = - await getDeployedProxiedContract( - CeloContractName.ReserveSpenderMultiSig, - ArtifactsSingleton.getInstance(MENTO_PACKAGE) - ) - console.info(`Marking ${reserveSpenderMultiSig.address} as a reserve spender`) - await reserve.addSpender(reserveSpenderMultiSig.address) - }, - MENTO_PACKAGE -) diff --git a/packages/protocol/migrations_ts/09_01_stableToken_EUR.ts b/packages/protocol/migrations_ts/09_01_stableToken_EUR.ts deleted file mode 
100644 index a8c67aa4fdb..00000000000 --- a/packages/protocol/migrations_ts/09_01_stableToken_EUR.ts +++ /dev/null @@ -1,88 +0,0 @@ -import { ensureLeading0x, eqAddress, NULL_ADDRESS } from '@celo/base/lib/address' -import { CeloContractName } from '@celo/protocol/lib/registry-utils' -import { - deploymentForCoreContract, - getDeployedProxiedContract, -} from '@celo/protocol/lib/web3-utils' -import { config } from '@celo/protocol/migrationsConfig' -import { toFixed } from '@celo/utils/lib/fixidity' -import { FeeCurrencyWhitelistInstance, FreezerInstance, SortedOraclesInstance } from 'types' -import { ReserveInstance, StableTokenEURInstance } from 'types/mento' -import Web3 from 'web3' -import { MENTO_PACKAGE } from '../contractPackages' -import { ArtifactsSingleton } from '../lib/artifactsSingleton' - -const truffle = require('@celo/protocol/truffle-config.js') - -const initializeArgs = async (): Promise => { - const rate = toFixed(config.stableTokenEUR.inflationRate) - return [ - config.stableTokenEUR.tokenName, - config.stableTokenEUR.tokenSymbol, - config.stableTokenEUR.decimals, - config.registry.predeployedProxyAddress, - rate.toString(), - config.stableTokenEUR.inflationPeriod, - config.stableTokenEUR.initialBalances.addresses, - config.stableTokenEUR.initialBalances.values, - 'ExchangeEUR', - ] -} - -// TODO make this general (do it!) 
-module.exports = deploymentForCoreContract( - web3, - artifacts, - CeloContractName.StableTokenEUR, - initializeArgs, - async (stableToken: StableTokenEURInstance, _web3: Web3, networkName: string) => { - if (config.stableTokenEUR.frozen) { - const freezer: FreezerInstance = await getDeployedProxiedContract( - 'Freezer', - artifacts - ) - await freezer.freeze(stableToken.address) - } - const sortedOracles: SortedOraclesInstance = - await getDeployedProxiedContract('SortedOracles', artifacts) - - for (const oracle of config.stableTokenEUR.oracles) { - console.info(`Adding ${oracle} as an Oracle for StableToken (EUR)`) - await sortedOracles.addOracle(stableToken.address, ensureLeading0x(oracle)) - } - - const goldPrice = config.stableTokenEUR.goldPrice - if (goldPrice) { - const fromAddress = truffle.networks[networkName].from - const isOracle = config.stableTokenEUR.oracles.some((o) => eqAddress(o, fromAddress)) - if (!isOracle) { - console.warn( - `Gold price specified in migration but ${fromAddress} not explicitly authorized as oracle, authorizing...` - ) - await sortedOracles.addOracle(stableToken.address, ensureLeading0x(fromAddress)) - } - console.info('Reporting price of StableToken (EUR) to oracle') - await sortedOracles.report( - stableToken.address, - toFixed(goldPrice), - NULL_ADDRESS, - NULL_ADDRESS - ) - const reserve: ReserveInstance = await getDeployedProxiedContract( - 'Reserve', - ArtifactsSingleton.getInstance(MENTO_PACKAGE) - ) - console.info('Adding StableToken (EUR) to Reserve') - await reserve.addToken(stableToken.address) - } - - console.info('Whitelisting StableToken (EUR) as a fee currency') - const feeCurrencyWhitelist: FeeCurrencyWhitelistInstance = - await getDeployedProxiedContract( - 'FeeCurrencyWhitelist', - artifacts - ) - await feeCurrencyWhitelist.addToken(stableToken.address) - }, - MENTO_PACKAGE -) diff --git a/packages/protocol/migrations_ts/09_02_stableToken_BRL.ts b/packages/protocol/migrations_ts/09_02_stableToken_BRL.ts 
deleted file mode 100644 index 2e42f9ff353..00000000000 --- a/packages/protocol/migrations_ts/09_02_stableToken_BRL.ts +++ /dev/null @@ -1,88 +0,0 @@ -import { ensureLeading0x, eqAddress, NULL_ADDRESS } from '@celo/base/lib/address' -import { CeloContractName } from '@celo/protocol/lib/registry-utils' -import { - deploymentForCoreContract, - getDeployedProxiedContract, -} from '@celo/protocol/lib/web3-utils' -import { config } from '@celo/protocol/migrationsConfig' -import { toFixed } from '@celo/utils/lib/fixidity' -import { FeeCurrencyWhitelistInstance, FreezerInstance, SortedOraclesInstance } from 'types' -import { ReserveInstance, StableTokenBRLInstance } from 'types/mento' -import Web3 from 'web3' -import { MENTO_PACKAGE } from '../contractPackages' -import { ArtifactsSingleton } from '../lib/artifactsSingleton' - -const truffle = require('@celo/protocol/truffle-config.js') - -const initializeArgs = async (): Promise => { - const rate = toFixed(config.stableTokenBRL.inflationRate) - return [ - config.stableTokenBRL.tokenName, - config.stableTokenBRL.tokenSymbol, - config.stableTokenBRL.decimals, - config.registry.predeployedProxyAddress, - rate.toString(), - config.stableTokenBRL.inflationPeriod, - config.stableTokenBRL.initialBalances.addresses, - config.stableTokenBRL.initialBalances.values, - 'ExchangeBRL', - ] -} - -// TODO make this general -module.exports = deploymentForCoreContract( - web3, - artifacts, - CeloContractName.StableTokenBRL, - initializeArgs, - async (stableToken: StableTokenBRLInstance, _web3: Web3, networkName: string) => { - if (config.stableTokenBRL.frozen) { - const freezer: FreezerInstance = await getDeployedProxiedContract( - 'Freezer', - artifacts - ) - await freezer.freeze(stableToken.address) - } - const sortedOracles: SortedOraclesInstance = - await getDeployedProxiedContract('SortedOracles', artifacts) - - for (const oracle of config.stableTokenBRL.oracles) { - console.info(`Adding ${oracle} as an Oracle for StableToken (BRL)`) 
- await sortedOracles.addOracle(stableToken.address, ensureLeading0x(oracle)) - } - - const goldPrice = config.stableTokenBRL.goldPrice - if (goldPrice) { - const fromAddress = truffle.networks[networkName].from - const isOracle = config.stableTokenBRL.oracles.some((o) => eqAddress(o, fromAddress)) - if (!isOracle) { - console.warn( - `Gold price specified in migration but ${fromAddress} not explicitly authorized as oracle, authorizing...` - ) - await sortedOracles.addOracle(stableToken.address, ensureLeading0x(fromAddress)) - } - console.info('Reporting price of StableToken (BRL) to oracle') - await sortedOracles.report( - stableToken.address, - toFixed(goldPrice), - NULL_ADDRESS, - NULL_ADDRESS - ) - const reserve: ReserveInstance = await getDeployedProxiedContract( - 'Reserve', - ArtifactsSingleton.getInstance(MENTO_PACKAGE) - ) - console.info('Adding StableToken (BRL) to Reserve') - await reserve.addToken(stableToken.address) - } - - console.info('Whitelisting StableToken (BRL) as a fee currency') - const feeCurrencyWhitelist: FeeCurrencyWhitelistInstance = - await getDeployedProxiedContract( - 'FeeCurrencyWhitelist', - artifacts - ) - await feeCurrencyWhitelist.addToken(stableToken.address) - }, - MENTO_PACKAGE -) diff --git a/packages/protocol/migrations_ts/09_0_stabletoken_USD.ts b/packages/protocol/migrations_ts/09_0_stabletoken_USD.ts deleted file mode 100644 index 44cc2d52d5a..00000000000 --- a/packages/protocol/migrations_ts/09_0_stabletoken_USD.ts +++ /dev/null @@ -1,88 +0,0 @@ -import { ensureLeading0x, eqAddress, NULL_ADDRESS } from '@celo/base/lib/address' -import { CeloContractName } from '@celo/protocol/lib/registry-utils' -import { - deploymentForCoreContract, - getDeployedProxiedContract, -} from '@celo/protocol/lib/web3-utils' -import { config } from '@celo/protocol/migrationsConfig' -import { toFixed } from '@celo/utils/lib/fixidity' -import { FeeCurrencyWhitelistInstance, FreezerInstance, SortedOraclesInstance } from 'types' -import { 
ReserveInstance, StableTokenInstance } from 'types/mento' -import Web3 from 'web3' -import { MENTO_PACKAGE } from '../contractPackages' -import { ArtifactsSingleton } from '../lib/artifactsSingleton' - -const truffle = require('@celo/protocol/truffle-config.js') - -const initializeArgs = async (): Promise => { - const rate = toFixed(config.stableToken.inflationRate) - return [ - config.stableToken.tokenName, - config.stableToken.tokenSymbol, - config.stableToken.decimals, - config.registry.predeployedProxyAddress, - rate.toString(), - config.stableToken.inflationPeriod, - config.stableToken.initialBalances.addresses, - config.stableToken.initialBalances.values, - 'Exchange', // USD - ] -} - -module.exports = deploymentForCoreContract( - web3, - artifacts, - CeloContractName.StableToken, - initializeArgs, - async (stableToken: StableTokenInstance, _web3: Web3, networkName: string) => { - if (config.stableToken.frozen) { - const freezer: FreezerInstance = await getDeployedProxiedContract( - 'Freezer', - artifacts - ) - await freezer.freeze(stableToken.address) - } - - const sortedOracles: SortedOraclesInstance = - await getDeployedProxiedContract('SortedOracles', artifacts) - - for (const oracle of config.stableToken.oracles) { - console.info(`Adding ${oracle} as an Oracle for StableToken (USD)`) - await sortedOracles.addOracle(stableToken.address, ensureLeading0x(oracle)) - } - - const goldPrice = config.stableToken.goldPrice - if (goldPrice) { - const fromAddress = truffle.networks[networkName].from - const isOracle = config.stableToken.oracles.some((o) => eqAddress(o, fromAddress)) - if (!isOracle) { - console.warn( - `Gold price specified in migration but ${fromAddress} not explicitly authorized as oracle, authorizing...` - ) - await sortedOracles.addOracle(stableToken.address, ensureLeading0x(fromAddress)) - } - console.info('Reporting price of StableToken (USD) to oracle') - await sortedOracles.report( - stableToken.address, - toFixed(goldPrice), - 
NULL_ADDRESS, - NULL_ADDRESS - ) - const reserve: ReserveInstance = await getDeployedProxiedContract( - 'Reserve', - ArtifactsSingleton.getInstance(MENTO_PACKAGE) - ) - console.info('Adding StableToken (USD) to Reserve') - await reserve.addToken(stableToken.address) - } - - console.info('Whitelisting StableToken (USD) as a fee currency') - const feeCurrencyWhitelist: FeeCurrencyWhitelistInstance = - await getDeployedProxiedContract( - 'FeeCurrencyWhitelist', - artifacts - ) - await feeCurrencyWhitelist.addToken(stableToken.address) - }, - MENTO_PACKAGE -) diff --git a/packages/protocol/migrations_ts/10_01_exchange_EUR.ts b/packages/protocol/migrations_ts/10_01_exchange_EUR.ts deleted file mode 100644 index aeb69eb208d..00000000000 --- a/packages/protocol/migrations_ts/10_01_exchange_EUR.ts +++ /dev/null @@ -1,47 +0,0 @@ -import { CeloContractName } from '@celo/protocol/lib/registry-utils' -import { - deploymentForCoreContract, - getDeployedProxiedContract, -} from '@celo/protocol/lib/web3-utils' -import { config } from '@celo/protocol/migrationsConfig' -import { toFixed } from '@celo/utils/lib/fixidity' -import { FreezerInstance } from 'types' -import { ExchangeEURInstance, ReserveInstance } from 'types/mento' -import { MENTO_PACKAGE } from '../contractPackages' -import { ArtifactsSingleton } from '../lib/artifactsSingleton' - -const initializeArgs = async (): Promise => { - return [ - config.registry.predeployedProxyAddress, - CeloContractName.StableTokenEUR, - toFixed(config.exchange.spread).toString(), - toFixed(config.exchange.reserveFraction).toString(), - config.exchange.updateFrequency, - config.exchange.minimumReports, - ] -} - -module.exports = deploymentForCoreContract( - web3, - artifacts, - CeloContractName.ExchangeEUR, - initializeArgs, - async (exchange: ExchangeEURInstance) => { - if (config.exchange.frozen) { - const freezer: FreezerInstance = await getDeployedProxiedContract( - 'Freezer', - artifacts - ) - await freezer.freeze(exchange.address) - } 
- - const reserve: ReserveInstance = await getDeployedProxiedContract( - 'Reserve', - ArtifactsSingleton.getInstance(MENTO_PACKAGE) - ) - // cUSD doesn't need to be added as it is currently harcoded in Reserve.sol - await reserve.addExchangeSpender(exchange.address) - await exchange.activateStable() - }, - MENTO_PACKAGE -) diff --git a/packages/protocol/migrations_ts/10_02_exchange_BRL.ts b/packages/protocol/migrations_ts/10_02_exchange_BRL.ts deleted file mode 100644 index c4f4155b11d..00000000000 --- a/packages/protocol/migrations_ts/10_02_exchange_BRL.ts +++ /dev/null @@ -1,47 +0,0 @@ -import { CeloContractName } from '@celo/protocol/lib/registry-utils' -import { - deploymentForCoreContract, - getDeployedProxiedContract, -} from '@celo/protocol/lib/web3-utils' -import { config } from '@celo/protocol/migrationsConfig' -import { toFixed } from '@celo/utils/lib/fixidity' -import { FreezerInstance } from 'types' -import { ExchangeBRLInstance, ReserveInstance } from 'types/mento' -import { MENTO_PACKAGE } from '../contractPackages' -import { ArtifactsSingleton } from '../lib/artifactsSingleton' - -const initializeArgs = async (): Promise => { - return [ - config.registry.predeployedProxyAddress, - CeloContractName.StableTokenBRL, - toFixed(config.exchange.spread).toString(), - toFixed(config.exchange.reserveFraction).toString(), - config.exchange.updateFrequency, - config.exchange.minimumReports, - ] -} - -module.exports = deploymentForCoreContract( - web3, - artifacts, - CeloContractName.ExchangeBRL, - initializeArgs, - async (exchange: ExchangeBRLInstance) => { - if (config.exchange.frozen) { - const freezer: FreezerInstance = await getDeployedProxiedContract( - 'Freezer', - artifacts - ) - await freezer.freeze(exchange.address) - } - - const reserve: ReserveInstance = await getDeployedProxiedContract( - 'Reserve', - ArtifactsSingleton.getInstance(MENTO_PACKAGE) - ) - // cUSD doesn't need to be added as it is currently harcoded in Reserve.sol - await 
reserve.addExchangeSpender(exchange.address) - await exchange.activateStable() - }, - MENTO_PACKAGE -) diff --git a/packages/protocol/migrations_ts/10_0_exchange_USD.ts b/packages/protocol/migrations_ts/10_0_exchange_USD.ts deleted file mode 100644 index 22e2ced22fe..00000000000 --- a/packages/protocol/migrations_ts/10_0_exchange_USD.ts +++ /dev/null @@ -1,39 +0,0 @@ -import { CeloContractName } from '@celo/protocol/lib/registry-utils' -import { - deploymentForCoreContract, - getDeployedProxiedContract, -} from '@celo/protocol/lib/web3-utils' -import { config } from '@celo/protocol/migrationsConfig' -import { toFixed } from '@celo/utils/lib/fixidity' -import { FreezerInstance } from 'types' -import { ExchangeInstance } from 'types/mento' -import { MENTO_PACKAGE } from '../contractPackages' - -const initializeArgs = async (): Promise => { - return [ - config.registry.predeployedProxyAddress, - CeloContractName.StableToken, - toFixed(config.exchange.spread).toString(), - toFixed(config.exchange.reserveFraction).toString(), - config.exchange.updateFrequency, - config.exchange.minimumReports, - ] -} - -module.exports = deploymentForCoreContract( - web3, - artifacts, - CeloContractName.Exchange, - initializeArgs, - async (exchange: ExchangeInstance) => { - if (config.exchange.frozen) { - const freezer: FreezerInstance = await getDeployedProxiedContract( - 'Freezer', - artifacts - ) - await freezer.freeze(exchange.address) - } - await exchange.activateStable() - }, - MENTO_PACKAGE -) diff --git a/packages/protocol/migrations_ts/11_accounts.ts b/packages/protocol/migrations_ts/11_accounts.ts deleted file mode 100644 index d72f57ad6b9..00000000000 --- a/packages/protocol/migrations_ts/11_accounts.ts +++ /dev/null @@ -1,24 +0,0 @@ -import { CeloContractName } from '@celo/protocol/lib/registry-utils' -import { - deploymentForCoreContract, - getDeployedProxiedContract, -} from '@celo/protocol/lib/web3-utils' -import { AccountsInstance, RegistryInstance } from 'types' - -const 
initializeArgs = async (): Promise<[string]> => { - const registry: RegistryInstance = await getDeployedProxiedContract( - 'Registry', - artifacts - ) - return [registry.address] -} - -module.exports = deploymentForCoreContract( - web3, - artifacts, - CeloContractName.Accounts, - initializeArgs, - async (accounts: AccountsInstance) => { - await accounts.setEip712DomainSeparator() - } -) diff --git a/packages/protocol/migrations_ts/12_lockedgold.ts b/packages/protocol/migrations_ts/12_lockedgold.ts deleted file mode 100644 index 4404ec9fbe9..00000000000 --- a/packages/protocol/migrations_ts/12_lockedgold.ts +++ /dev/null @@ -1,22 +0,0 @@ -import { CeloContractName } from '@celo/protocol/lib/registry-utils' -import { - deploymentForCoreContract, - getDeployedProxiedContract, -} from '@celo/protocol/lib/web3-utils' -import { config } from '@celo/protocol/migrationsConfig' -import { LockedGoldInstance, RegistryInstance } from 'types' - -const initializeArgs = async (): Promise => { - return [config.registry.predeployedProxyAddress, config.lockedGold.unlockingPeriod] -} - -module.exports = deploymentForCoreContract( - web3, - artifacts, - CeloContractName.LockedGold, - initializeArgs, - async (lockedGold: LockedGoldInstance) => { - const registry = await getDeployedProxiedContract('Registry', artifacts) - await registry.setAddressFor(CeloContractName.LockedCelo, lockedGold.address) - } -) diff --git a/packages/protocol/migrations_ts/13_validators.ts b/packages/protocol/migrations_ts/13_validators.ts deleted file mode 100644 index fc29dfdd635..00000000000 --- a/packages/protocol/migrations_ts/13_validators.ts +++ /dev/null @@ -1,34 +0,0 @@ -import { CeloContractName } from '@celo/protocol/lib/registry-utils' -import { deploymentForCoreContract } from '@celo/protocol/lib/web3-utils' -import { config } from '@celo/protocol/migrationsConfig' -import { toFixed } from '@celo/utils/lib/fixidity' -import { ValidatorsInstance } from 'types/08' -import { SOLIDITY_08_PACKAGE } 
from '../contractPackages' - -const initializeArgs = async (): Promise => { - return [ - config.registry.predeployedProxyAddress, - config.validators.groupLockedGoldRequirements.value, - config.validators.groupLockedGoldRequirements.duration, - config.validators.validatorLockedGoldRequirements.value, - config.validators.validatorLockedGoldRequirements.duration, - config.validators.validatorScoreParameters.exponent, - toFixed(config.validators.validatorScoreParameters.adjustmentSpeed).toFixed(), - config.validators.membershipHistoryLength, - config.validators.slashingPenaltyResetPeriod, - config.validators.maxGroupSize, - { - commissionUpdateDelay: config.validators.commissionUpdateDelay, - downtimeGracePeriod: config.validators.downtimeGracePeriod, - }, - ] -} - -module.exports = deploymentForCoreContract( - web3, - artifacts, - CeloContractName.Validators, - initializeArgs, - undefined, - SOLIDITY_08_PACKAGE -) diff --git a/packages/protocol/migrations_ts/14_election.ts b/packages/protocol/migrations_ts/14_election.ts deleted file mode 100644 index 94bc55add3a..00000000000 --- a/packages/protocol/migrations_ts/14_election.ts +++ /dev/null @@ -1,22 +0,0 @@ -import { CeloContractName } from '@celo/protocol/lib/registry-utils' -import { deploymentForCoreContract } from '@celo/protocol/lib/web3-utils' -import { config } from '@celo/protocol/migrationsConfig' -import { toFixed } from '@celo/utils/lib/fixidity' -import { ElectionInstance } from 'types' - -const initializeArgs = async (): Promise => { - return [ - config.registry.predeployedProxyAddress, - config.election.minElectableValidators, - config.election.maxElectableValidators, - config.election.maxVotesPerAccount, - toFixed(config.election.electabilityThreshold).toFixed(), - ] -} - -module.exports = deploymentForCoreContract( - web3, - artifacts, - CeloContractName.Election, - initializeArgs -) diff --git a/packages/protocol/migrations_ts/15_epoch_rewards.ts b/packages/protocol/migrations_ts/15_epoch_rewards.ts 
deleted file mode 100644 index 0c6db1fd743..00000000000 --- a/packages/protocol/migrations_ts/15_epoch_rewards.ts +++ /dev/null @@ -1,41 +0,0 @@ -import { CeloContractName } from '@celo/protocol/lib/registry-utils' -import { - deploymentForCoreContract, - getDeployedProxiedContract, -} from '@celo/protocol/lib/web3-utils' -import { config } from '@celo/protocol/migrationsConfig' -import { toFixed } from '@celo/utils/lib/fixidity' -import { EpochRewardsInstance, FreezerInstance } from 'types' - -const initializeArgs = async (): Promise => { - return [ - config.registry.predeployedProxyAddress, - toFixed(config.epochRewards.targetVotingYieldParameters.initial).toFixed(), - toFixed(config.epochRewards.targetVotingYieldParameters.max).toFixed(), - toFixed(config.epochRewards.targetVotingYieldParameters.adjustmentFactor).toFixed(), - toFixed(config.epochRewards.rewardsMultiplierParameters.max).toFixed(), - toFixed(config.epochRewards.rewardsMultiplierParameters.adjustmentFactors.underspend).toFixed(), - toFixed(config.epochRewards.rewardsMultiplierParameters.adjustmentFactors.overspend).toFixed(), - toFixed(config.epochRewards.targetVotingGoldFraction).toFixed(), - config.epochRewards.maxValidatorEpochPayment, - toFixed(config.epochRewards.communityRewardFraction).toFixed(), - config.epochRewards.carbonOffsettingPartner, - toFixed(config.epochRewards.carbonOffsettingFraction).toFixed(), - ] -} - -module.exports = deploymentForCoreContract( - web3, - artifacts, - CeloContractName.EpochRewards, - initializeArgs, - async (epochRewards: EpochRewardsInstance) => { - if (config.epochRewards.frozen) { - const freezer: FreezerInstance = await getDeployedProxiedContract( - 'Freezer', - artifacts - ) - await freezer.freeze(epochRewards.address) - } - } -) diff --git a/packages/protocol/migrations_ts/16_random.ts b/packages/protocol/migrations_ts/16_random.ts deleted file mode 100644 index 0069a349c9a..00000000000 --- a/packages/protocol/migrations_ts/16_random.ts +++ /dev/null @@ 
-1,15 +0,0 @@
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import { deploymentForCoreContract } from '@celo/protocol/lib/web3-utils'
-import { config } from '@celo/protocol/migrationsConfig'
-import { RandomInstance } from 'types'
-
-const initializeArgs = async (_: string): Promise => {
-  return [config.random.randomnessBlockRetentionWindow]
-}
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.Random,
-  initializeArgs
-)
diff --git a/packages/protocol/migrations_ts/17_attestations.ts b/packages/protocol/migrations_ts/17_attestations.ts
deleted file mode 100644
index 7abb69f26c7..00000000000
--- a/packages/protocol/migrations_ts/17_attestations.ts
+++ /dev/null
@@ -1,38 +0,0 @@
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import {
-  convertToContractDecimals,
-  deploymentForCoreContract,
-  getDeployedProxiedContract,
-} from '@celo/protocol/lib/web3-utils'
-import { config } from '@celo/protocol/migrationsConfig'
-import { AttestationsInstance } from 'types'
-import { StableTokenInstance } from 'types/mento'
-import { MENTO_PACKAGE } from '../contractPackages'
-import { ArtifactsSingleton } from '../lib/artifactsSingleton'
-
-const initializeArgs = async (): Promise<[string, string, string, string, string[], string[]]> => {
-  const stableToken: StableTokenInstance = await getDeployedProxiedContract(
-    'StableToken',
-    ArtifactsSingleton.getInstance(MENTO_PACKAGE)
-  )
-
-  const attestationFee = await convertToContractDecimals(
-    config.attestations.attestationRequestFeeInDollars,
-    stableToken
-  )
-  return [
-    config.registry.predeployedProxyAddress,
-    config.attestations.attestationExpiryBlocks.toString(),
-    config.attestations.selectIssuersWaitBlocks.toString(),
-    config.attestations.maxAttestations.toString(),
-    [stableToken.address],
-    [attestationFee.toString()],
-  ]
-}
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.Attestations,
-  initializeArgs
-)
diff --git a/packages/protocol/migrations_ts/18_escrow.ts b/packages/protocol/migrations_ts/18_escrow.ts
deleted file mode 100644
index 643258b7f12..00000000000
--- a/packages/protocol/migrations_ts/18_escrow.ts
+++ /dev/null
@@ -1,10 +0,0 @@
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import { deploymentForCoreContract } from '@celo/protocol/lib/web3-utils'
-import { EscrowInstance } from 'types'
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.Escrow,
-  async () => []
-)
diff --git a/packages/protocol/migrations_ts/19_blockchainparams.ts b/packages/protocol/migrations_ts/19_blockchainparams.ts
deleted file mode 100644
index f64538a5109..00000000000
--- a/packages/protocol/migrations_ts/19_blockchainparams.ts
+++ /dev/null
@@ -1,19 +0,0 @@
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import { deploymentForCoreContract } from '@celo/protocol/lib/web3-utils'
-import { config } from '@celo/protocol/migrationsConfig'
-import { BlockchainParametersInstance } from 'types'
-
-const initializeArgs = async (_: string): Promise => {
-  return [
-    config.blockchainParameters.gasForNonGoldCurrencies,
-    config.blockchainParameters.deploymentBlockGasLimit,
-    config.blockchainParameters.uptimeLookbackWindow,
-  ]
-}
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.BlockchainParameters,
-  initializeArgs
-)
diff --git a/packages/protocol/migrations_ts/20_governance_slasher.ts b/packages/protocol/migrations_ts/20_governance_slasher.ts
deleted file mode 100644
index 474950d7b21..00000000000
--- a/packages/protocol/migrations_ts/20_governance_slasher.ts
+++ /dev/null
@@ -1,26 +0,0 @@
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import {
-  deploymentForCoreContract,
-  getDeployedProxiedContract,
-} from '@celo/protocol/lib/web3-utils'
-import { config } from '@celo/protocol/migrationsConfig'
-import { GovernanceSlasherInstance, LockedGoldInstance } from 'types'
-
-const initializeArgs = async (_: string): Promise => {
-  return [config.registry.predeployedProxyAddress]
-}
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.GovernanceSlasher,
-  initializeArgs,
-  async () => {
-    console.info('Adding GovernanceSlasher contract as slasher.')
-    const lockedGold: LockedGoldInstance = await getDeployedProxiedContract(
-      'LockedGold',
-      artifacts
-    )
-    await lockedGold.addSlasher(CeloContractName.GovernanceSlasher)
-  }
-)
diff --git a/packages/protocol/migrations_ts/21_double_signing_slasher.ts b/packages/protocol/migrations_ts/21_double_signing_slasher.ts
deleted file mode 100644
index 31ffcfee98d..00000000000
--- a/packages/protocol/migrations_ts/21_double_signing_slasher.ts
+++ /dev/null
@@ -1,30 +0,0 @@
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import {
-  deploymentForCoreContract,
-  getDeployedProxiedContract,
-} from '@celo/protocol/lib/web3-utils'
-import { config } from '@celo/protocol/migrationsConfig'
-import { DoubleSigningSlasherInstance, LockedGoldInstance } from 'types'
-
-const initializeArgs = async (_: string): Promise => {
-  return [
-    config.registry.predeployedProxyAddress,
-    config.doubleSigningSlasher.penalty,
-    config.doubleSigningSlasher.reward,
-  ]
-}
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.DoubleSigningSlasher,
-  initializeArgs,
-  async () => {
-    console.info('Adding DoubleSigningSlasher contract as slasher.')
-    const lockedGold: LockedGoldInstance = await getDeployedProxiedContract(
-      'LockedGold',
-      artifacts
-    )
-    await lockedGold.addSlasher(CeloContractName.DoubleSigningSlasher)
-  }
-)
diff --git a/packages/protocol/migrations_ts/22_downtime_slasher.ts b/packages/protocol/migrations_ts/22_downtime_slasher.ts
deleted file mode 100644
index ec729e0107e..00000000000
--- a/packages/protocol/migrations_ts/22_downtime_slasher.ts
+++ /dev/null
@@ -1,31 +0,0 @@
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import {
-  deploymentForCoreContract,
-  getDeployedProxiedContract,
-} from '@celo/protocol/lib/web3-utils'
-import { config } from '@celo/protocol/migrationsConfig'
-import { DowntimeSlasherInstance, LockedGoldInstance } from 'types'
-
-const initializeArgs = async (_: string): Promise => {
-  return [
-    config.registry.predeployedProxyAddress,
-    config.downtimeSlasher.penalty,
-    config.downtimeSlasher.reward,
-    config.downtimeSlasher.slashableDowntime,
-  ]
-}
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.DowntimeSlasher,
-  initializeArgs,
-  async () => {
-    console.info('Adding DowntimeSlasher contract as slasher.')
-    const lockedGold: LockedGoldInstance = await getDeployedProxiedContract(
-      'LockedGold',
-      artifacts
-    )
-    await lockedGold.addSlasher(CeloContractName.DowntimeSlasher)
-  }
-)
diff --git a/packages/protocol/migrations_ts/23_governance_approver_multisig.ts b/packages/protocol/migrations_ts/23_governance_approver_multisig.ts
deleted file mode 100644
index 52dd4b191e5..00000000000
--- a/packages/protocol/migrations_ts/23_governance_approver_multisig.ts
+++ /dev/null
@@ -1,29 +0,0 @@
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import {
-  deploymentForProxiedContract,
-  transferOwnershipOfProxy,
-} from '@celo/protocol/lib/web3-utils'
-import { config } from '@celo/protocol/migrationsConfig'
-import { GovernanceApproverMultiSigInstance } from 'types'
-
-const initializeArgs = async (): Promise => {
-  return [
-    config.governanceApproverMultiSig.signatories,
-    config.governanceApproverMultiSig.numRequiredConfirmations,
-    config.governanceApproverMultiSig.numInternalRequiredConfirmations,
-  ]
-}
-
-module.exports = deploymentForProxiedContract(
-  web3,
-  artifacts,
-  CeloContractName.GovernanceApproverMultiSig,
-  initializeArgs,
-  async (governanceApproverMultiSig: GovernanceApproverMultiSigInstance) => {
-    await transferOwnershipOfProxy(
-      CeloContractName.GovernanceApproverMultiSig,
-      governanceApproverMultiSig.address,
-      artifacts
-    )
-  }
-)
diff --git a/packages/protocol/migrations_ts/24_grandamento.ts b/packages/protocol/migrations_ts/24_grandamento.ts
deleted file mode 100644
index 1d676b51eca..00000000000
--- a/packages/protocol/migrations_ts/24_grandamento.ts
+++ /dev/null
@@ -1,41 +0,0 @@
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import {
-  deploymentForCoreContract,
-  getDeployedProxiedContract,
-} from '@celo/protocol/lib/web3-utils'
-import { config } from '@celo/protocol/migrationsConfig'
-import { toFixed } from '@celo/utils/lib/fixidity'
-import { GrandaMentoInstance, ReserveInstance } from 'types/mento'
-import { MENTO_PACKAGE } from '../contractPackages'
-import { ArtifactsSingleton } from '../lib/artifactsSingleton'
-
-const initializeArgs = async (): Promise => {
-  return [
-    config.registry.predeployedProxyAddress,
-    config.grandaMento.approver,
-    toFixed(config.grandaMento.maxApprovalExchangeRateChange).toString(),
-    toFixed(config.grandaMento.spread).toString(),
-    config.grandaMento.vetoPeriodSeconds,
-  ]
-}
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.GrandaMento,
-  initializeArgs,
-  async (grandaMento: GrandaMentoInstance) => {
-    // Add as a spender of the Reserve
-    const reserve: ReserveInstance = await getDeployedProxiedContract(
-      'Reserve',
-      ArtifactsSingleton.getInstance(MENTO_PACKAGE)
-    )
-    await reserve.addExchangeSpender(grandaMento.address)
-
-    for (const stableToken of Object.keys(config.grandaMento.stableTokenExchangeLimits)) {
-      const { min, max } = config.grandaMento.stableTokenExchangeLimits[stableToken]
-      await grandaMento.setStableTokenExchangeLimits(stableToken, min, max)
-    }
-  },
-  MENTO_PACKAGE
-)
diff --git a/packages/protocol/migrations_ts/25_federated_attestations.ts b/packages/protocol/migrations_ts/25_federated_attestations.ts
deleted file mode 100644
index 43bace75b06..00000000000
--- a/packages/protocol/migrations_ts/25_federated_attestations.ts
+++ /dev/null
@@ -1,14 +0,0 @@
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import { deploymentForCoreContract } from '@celo/protocol/lib/web3-utils'
-import { FederatedAttestationsInstance } from 'types'
-
-const initializeArgs = async () => {
-  return []
-}
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.FederatedAttestations,
-  initializeArgs
-)
diff --git a/packages/protocol/migrations_ts/26_00_mento_fee_handler_seller.ts b/packages/protocol/migrations_ts/26_00_mento_fee_handler_seller.ts
deleted file mode 100644
index ffae1b7434b..00000000000
--- a/packages/protocol/migrations_ts/26_00_mento_fee_handler_seller.ts
+++ /dev/null
@@ -1,15 +0,0 @@
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import { deploymentForCoreContract } from '@celo/protocol/lib/web3-utils'
-import { config } from '@celo/protocol/migrationsConfig'
-import { MentoFeeHandlerSellerInstance } from 'types'
-
-const initializeArgs = async () => {
-  return [config.registry.predeployedProxyAddress, [], []]
-}
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.MentoFeeHandlerSeller,
-  initializeArgs
-)
diff --git a/packages/protocol/migrations_ts/26_01_uniswap_fee_handler_seller.ts b/packages/protocol/migrations_ts/26_01_uniswap_fee_handler_seller.ts
deleted file mode 100644
index 5e1e192687c..00000000000
--- a/packages/protocol/migrations_ts/26_01_uniswap_fee_handler_seller.ts
+++ /dev/null
@@ -1,15 +0,0 @@
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import { deploymentForCoreContract } from '@celo/protocol/lib/web3-utils'
-import { config } from '@celo/protocol/migrationsConfig'
-import { UniswapFeeHandlerSellerInstance } from 'types'
-
-const initializeArgs = async () => {
-  return [config.registry.predeployedProxyAddress, [], []]
-}
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.UniswapFeeHandlerSeller,
-  initializeArgs
-)
diff --git a/packages/protocol/migrations_ts/26_100_fee_currency_directory.ts b/packages/protocol/migrations_ts/26_100_fee_currency_directory.ts
deleted file mode 100644
index 604d4276a89..00000000000
--- a/packages/protocol/migrations_ts/26_100_fee_currency_directory.ts
+++ /dev/null
@@ -1,52 +0,0 @@
-import { ArtifactsSingleton } from '@celo/protocol/lib/artifactsSingleton'
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import {
-  deploymentForCoreContract,
-  getDeployedProxiedContract,
-} from '@celo/protocol/lib/web3-utils'
-import { SortedOraclesInstance, StableTokenInstance } from '@celo/protocol/types/typechain-mento'
-import { FeeCurrencyDirectoryInstance } from 'types/08'
-import { MENTO_PACKAGE, SOLIDITY_08_PACKAGE } from '../contractPackages'
-
-const initializeArgs = async (): Promise => {
-  return []
-}
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.FeeCurrencyDirectory,
-  initializeArgs,
-  async (feeCurrencyDirectory: FeeCurrencyDirectoryInstance, _web3: Web3, networkName: string) => {
-    const sortedOracles = await getDeployedProxiedContract(
-      'SortedOracles',
-      artifacts
-    )
-
-    for (const token of ['StableToken', 'StableTokenEUR', 'StableTokenBRL']) {
-      const stableToken: StableTokenInstance =
-        await getDeployedProxiedContract(
-          token,
-          ArtifactsSingleton.getInstance(MENTO_PACKAGE)
-        )
-      console.log(
-        'setting currency config for',
-        token,
-        'with address',
-        stableToken.address,
-        'and adapter address',
-        sortedOracles.address,
-        'on network',
-        networkName
-      )
-      await feeCurrencyDirectory.setCurrencyConfig(stableToken.address, sortedOracles.address, 1)
-    }
-
-    console.log(
-      'Fee currency directory deployed and registered!!!',
-      feeCurrencyDirectory.address,
-      networkName
-    )
-  },
-  SOLIDITY_08_PACKAGE
-)
diff --git a/packages/protocol/migrations_ts/26_101_score_manager.ts b/packages/protocol/migrations_ts/26_101_score_manager.ts
deleted file mode 100644
index 93a74670975..00000000000
--- a/packages/protocol/migrations_ts/26_101_score_manager.ts
+++ /dev/null
@@ -1,17 +0,0 @@
-import { SOLIDITY_08_PACKAGE } from '@celo/protocol/contractPackages'
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import { deploymentForCoreContract } from '@celo/protocol/lib/web3-utils'
-import { ScoreManagerInstance } from 'types/08'
-
-const initializeArgs = async (): Promise => {
-  return []
-}
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.ScoreManager,
-  initializeArgs,
-  undefined,
-  SOLIDITY_08_PACKAGE
-)
diff --git a/packages/protocol/migrations_ts/26_102_epoch_manager_enabler.ts b/packages/protocol/migrations_ts/26_102_epoch_manager_enabler.ts
deleted file mode 100644
index 6a9a3d9220f..00000000000
--- a/packages/protocol/migrations_ts/26_102_epoch_manager_enabler.ts
+++ /dev/null
@@ -1,18 +0,0 @@
-import { SOLIDITY_08_PACKAGE } from '@celo/protocol/contractPackages'
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import { deploymentForCoreContract } from '@celo/protocol/lib/web3-utils'
-import { config } from '@celo/protocol/migrationsConfig'
-import { EpochManagerEnablerInstance } from 'types/08'
-
-const initializeArgs = async (): Promise => {
-  return [config.registry.predeployedProxyAddress]
-}
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.EpochManagerEnabler,
-  initializeArgs,
-  undefined,
-  SOLIDITY_08_PACKAGE
-)
diff --git a/packages/protocol/migrations_ts/26_103_epoch_manager.ts b/packages/protocol/migrations_ts/26_103_epoch_manager.ts
deleted file mode 100644
index 4956ebcc139..00000000000
--- a/packages/protocol/migrations_ts/26_103_epoch_manager.ts
+++ /dev/null
@@ -1,18 +0,0 @@
-import { SOLIDITY_08_PACKAGE } from '@celo/protocol/contractPackages'
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import { deploymentForCoreContract } from '@celo/protocol/lib/web3-utils'
-import { config } from '@celo/protocol/migrationsConfig'
-import { EpochManagerInstance } from 'types/08'
-
-const initializeArgs = async (): Promise => {
-  return [config.registry.predeployedProxyAddress, config.epochManager.newEpochDuration]
-}
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.EpochManager,
-  initializeArgs,
-  undefined,
-  SOLIDITY_08_PACKAGE
-)
diff --git a/packages/protocol/migrations_ts/26_99_fee_handler.ts b/packages/protocol/migrations_ts/26_99_fee_handler.ts
deleted file mode 100644
index a9b283db903..00000000000
--- a/packages/protocol/migrations_ts/26_99_fee_handler.ts
+++ /dev/null
@@ -1,47 +0,0 @@
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import {
-  deploymentForCoreContract,
-  getDeployedProxiedContract,
-} from '@celo/protocol/lib/web3-utils'
-import { config } from '@celo/protocol/migrationsConfig'
-import { toFixed } from '@celo/utils/lib/fixidity'
-import { FeeHandlerInstance, MentoFeeHandlerSellerInstance } from 'types'
-import { StableTokenInstance } from 'types/mento'
-import { MENTO_PACKAGE } from '../contractPackages'
-import { ArtifactsSingleton } from '../lib/artifactsSingleton'
-
-const initializeArgs = async () => {
-  return [
-    config.registry.predeployedProxyAddress,
-    config.feeHandler.beneficiaryAddress,
-    toFixed(config.feeHandler.burnFraction).toString(),
-    [],
-    [],
-    [],
-    [],
-  ]
-}
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.FeeHandler,
-  initializeArgs,
-  async (feeHandler: FeeHandlerInstance) => {
-    for (const token of ['StableToken', 'StableTokenEUR', 'StableTokenBRL']) {
-      const stableToken: StableTokenInstance =
-        await getDeployedProxiedContract(
-          token,
-          ArtifactsSingleton.getInstance(MENTO_PACKAGE)
-        )
-
-      const mentoFeeHandlerSeller: MentoFeeHandlerSellerInstance =
-        await getDeployedProxiedContract(
-          CeloContractName.MentoFeeHandlerSeller,
-          artifacts
-        )
-
-      await feeHandler.addToken(stableToken.address, mentoFeeHandlerSeller.address)
-    }
-  }
-)
diff --git a/packages/protocol/migrations_ts/27_odispayments.ts b/packages/protocol/migrations_ts/27_odispayments.ts
deleted file mode 100644
index 947b95214d0..00000000000
--- a/packages/protocol/migrations_ts/27_odispayments.ts
+++ /dev/null
@@ -1,14 +0,0 @@
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import { deploymentForCoreContract } from '@celo/protocol/lib/web3-utils'
-import { OdisPaymentsInstance } from 'types'
-
-const initializeArgs = async () => {
-  return []
-}
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.OdisPayments,
-  initializeArgs
-)
diff --git a/packages/protocol/migrations_ts/28_celo_unreleased_treasury.ts b/packages/protocol/migrations_ts/28_celo_unreleased_treasury.ts
deleted file mode 100644
index 81f0c6d1f7d..00000000000
--- a/packages/protocol/migrations_ts/28_celo_unreleased_treasury.ts
+++ /dev/null
@@ -1,25 +0,0 @@
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import {
-  deploymentForCoreContract,
-  getDeployedProxiedContract,
-} from '@celo/protocol/lib/web3-utils'
-import { RegistryInstance } from '@celo/protocol/types'
-import { CeloUnreleasedTreasuryInstance } from '@celo/protocol/types/08'
-import { SOLIDITY_08_PACKAGE } from '../contractPackages'
-
-const initializeArgs = async (): Promise<[string]> => {
-  const registry: RegistryInstance = await getDeployedProxiedContract(
-    'Registry',
-    artifacts
-  )
-  return [registry.address]
-}
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.CeloUnreleasedTreasury,
-  initializeArgs,
-  undefined,
-  SOLIDITY_08_PACKAGE
-)
diff --git a/packages/protocol/migrations_ts/29_governance.ts b/packages/protocol/migrations_ts/29_governance.ts
deleted file mode 100644
index 634f711a5ac..00000000000
--- a/packages/protocol/migrations_ts/29_governance.ts
+++ /dev/null
@@ -1,159 +0,0 @@
-import { constitution } from '@celo/protocol/governanceConstitution'
-import { CeloContractName } from '@celo/protocol/lib/registry-utils'
-import {
-  deploymentForCoreContract,
-  getDeployedProxiedContract,
-  getFunctionSelectorsForContractProxy,
-  transferOwnershipOfProxyAndImplementation,
-} from '@celo/protocol/lib/web3-utils'
-import { config } from '@celo/protocol/migrationsConfig'
-import { toFixed } from '@celo/utils/lib/fixidity'
-import { GovernanceApproverMultiSigInstance, GovernanceInstance } from 'types'
-import { MENTO_PACKAGE, SOLIDITY_08_PACKAGE } from '../contractPackages'
-
-import { ArtifactsSingleton } from '../lib/artifactsSingleton'
-
-const initializeArgs = async (networkName: string): Promise => {
-  const governanceApproverMultiSig: GovernanceApproverMultiSigInstance =
-    await getDeployedProxiedContract(
-      CeloContractName.GovernanceApproverMultiSig,
-      artifacts
-    )
-  const networkFrom: string = require('@celo/protocol/truffle-config.js').networks[networkName].from
-  const approver: string = config.governanceApproverMultiSig.useMultiSig
-    ? governanceApproverMultiSig.address
-    : networkFrom
-
-  return [
-    config.registry.predeployedProxyAddress,
-    approver,
-    config.governance.concurrentProposals,
-    web3.utils.toWei(config.governance.minDeposit.toString(), 'ether'),
-    config.governance.queueExpiry,
-    config.governance.dequeueFrequency,
-    config.governance.referendumStageDuration,
-    config.governance.executionStageDuration,
-    toFixed(config.governance.participationBaseline).toString(),
-    toFixed(config.governance.participationBaselineFloor).toString(),
-    toFixed(config.governance.participationBaselineUpdateFactor).toString(),
-    toFixed(config.governance.participationBaselineQuorumFactor).toString(),
-  ]
-}
-
-module.exports = deploymentForCoreContract(
-  web3,
-  artifacts,
-  CeloContractName.Governance,
-  initializeArgs,
-  async (governance: GovernanceInstance) => {
-    if (!config.governance.skipSetConstitution) {
-      console.info('Setting constitution thresholds')
-      const constitutionContractNames = Object.keys(constitution).filter(
-        (contractName) => contractName !== 'proxy'
-      )
-
-      for (const contractName of constitutionContractNames) {
-        console.info(`\tSetting constitution thresholds for ${contractName}`)
-
-        const artifactsObject = ArtifactsSingleton.getInstance(
-          constitution[contractName].__contractPackage,
-          artifacts
-        )
-
-        const contract = await getDeployedProxiedContract(
-          contractName,
-          artifactsObject
-        )
-
-        const selectors = getFunctionSelectorsForContractProxy(
-          contract,
-          artifactsObject.getProxy(contractName, artifacts),
-          web3
-        )
-
-        selectors.default = ['0x00000000']
-        const thresholds = { ...constitution.proxy, ...constitution[contractName] }
-
-        const tresholdKeys = Object.keys(thresholds).filter(
-          (method) => method !== '__contractPackage'
-        )
-
-        for (const func of tresholdKeys) {
-          await Promise.all(
-            selectors[func].map((selector) =>
-              governance.setConstitution(contract.address, selector, toFixed(thresholds[func]))
-            )
-          )
-        }
-      }
-    }
-
-    // This list probably needs a refactor
-    const proxyAndImplementationOwnedByGovernance = [
-      {
-        contracts: [
-          'Accounts',
-          'Attestations',
-          // BlockchainParameters ownership transitioned to governance in a follow-up script.
-          // 'BlockchainParameters',
-          'DoubleSigningSlasher',
-          'DowntimeSlasher',
-          'Election',
-          'EpochRewards',
-          'Escrow',
-          'FederatedAttestations',
-          'FeeCurrencyWhitelist',
-          'Freezer',
-          'FeeHandler',
-          'GoldToken',
-          'Governance',
-          'GovernanceSlasher',
-          'LockedGold',
-          'OdisPayments',
-          'Random',
-          'Registry',
-          'SortedOracles',
-        ],
-      },
-      {
-        contracts: [
-          'Exchange',
-          'ExchangeEUR',
-          'ExchangeBRL',
-          'GrandaMento',
-          'Reserve',
-          'StableToken',
-          'StableTokenEUR',
-          'StableTokenBRL',
-        ],
-        __contractPackage: MENTO_PACKAGE,
-      },
-      {
-        contracts: [
-          'GasPriceMinimum',
-          'Validators',
-          'EpochManager',
-          'ScoreManager',
-          'EpochManagerEnabler',
-        ],
-        __contractPackage: SOLIDITY_08_PACKAGE,
-      },
-    ]
-
-    if (!config.governance.skipTransferOwnership) {
-      for (const contractPackage of proxyAndImplementationOwnedByGovernance) {
-        const artifactsInstance = ArtifactsSingleton.getInstance(
-          contractPackage.__contractPackage,
-          artifacts
-        )
-        for (const contractName of contractPackage.contracts) {
-          await transferOwnershipOfProxyAndImplementation(
-            contractName,
-            governance.address,
-            artifactsInstance
-          )
-        }
-      }
-    }
-  }
-)
diff --git a/packages/protocol/migrations_ts/30_elect_validators.ts b/packages/protocol/migrations_ts/30_elect_validators.ts
deleted file mode 100644
index 17d35a0b6c7..00000000000
--- a/packages/protocol/migrations_ts/30_elect_validators.ts
+++ /dev/null
@@ -1,408 +0,0 @@
-import { NULL_ADDRESS } from '@celo/base/lib/address'
-import { CeloTxObject } from '@celo/connect'
-import { getBlsPoP, getBlsPublicKey } from '@celo/cryptographic-utils/lib/bls'
-import { SOLIDITY_08_PACKAGE } from '@celo/protocol/contractPackages'
-import { ArtifactsSingleton } from '@celo/protocol/lib/artifactsSingleton'
-import {
-  getDeployedProxiedContract,
-  sendTransactionWithPrivateKey,
-} from '@celo/protocol/lib/web3-utils'
-import { config } from '@celo/protocol/migrationsConfig'
-import { privateKeyToAddress, privateKeyToPublicKey } from '@celo/utils/lib/address'
-import { toFixed } from '@celo/utils/lib/fixidity'
-import { signMessage } from '@celo/utils/lib/signatureUtils'
-import { BigNumber } from 'bignumber.js'
-import { AccountsInstance, ElectionInstance, LockedGoldInstance } from 'types'
-import { ValidatorsInstance } from 'types/08'
-import Web3 from 'web3'
-
-const truffle = require('@celo/protocol/truffle-config.js')
-const bip39 = require('bip39')
-const hdkey = require('ethereumjs-wallet/hdkey')
-
-function ganachePrivateKey(num) {
-  const seed = bip39.mnemonicToSeedSync(truffle.networks.development.mnemonic)
-  const hdk = hdkey.fromMasterSeed(seed)
-  const addrNode = hdk.derivePath("m/44'/60'/0'/0/" + num) // m/44'/60'/0'/0/0 is derivation path for the first account. m/44'/60'/0'/0/1 is the derivation path for the second account and so on
-  return addrNode.getWallet().getPrivateKey().toString('hex')
-}
-
-function serializeKeystore(keystore: any) {
-  return Buffer.from(JSON.stringify(keystore)).toString('base64')
-}
-
-let isGanache = false
-
-// Will include Ganache private keys for accounts 7-9, used for group keys
-let extraKeys = []
-
-async function sendTransaction(
-  web3: Web3,
-  tx: CeloTxObject | null,
-  privateKey: string,
-  txArgs: any
-) {
-  if (isGanache) {
-    const from = privateKeyToAddress(privateKey)
-    if (tx == null) {
-      await web3.eth.sendTransaction({ ...txArgs, from })
-    } else {
-      await tx.send({ ...txArgs, from, gasLimit: '10000000' })
-    }
-  } else {
-    await sendTransactionWithPrivateKey(web3, tx, privateKey, txArgs)
-  }
-}
-
-async function lockGold(
-  accounts: AccountsInstance,
-  lockedGold: LockedGoldInstance,
-  value: BigNumber,
-  privateKey: string
-) {
-  // @ts-ignore
-  const createAccountTx = accounts.contract.methods.createAccount()
-  await sendTransaction(web3, createAccountTx, privateKey, {
-    to: accounts.address,
-  })
-
-  // @ts-ignore
-  const lockTx = lockedGold.contract.methods.lock()
-
-  await sendTransaction(web3, lockTx, privateKey, {
-    to: lockedGold.address,
-    value: value.toString(10),
-  })
-}
-
-function createAccountOrUseFromGanache() {
-  if (isGanache) {
-    const privateKey = extraKeys.pop()
-    return { address: privateKeyToAddress(privateKey), privateKey }
-  } else {
-    return web3.eth.accounts.create()
-  }
-}
-
-async function registerValidatorGroup(
-  name: string,
-  accounts: AccountsInstance,
-  lockedGold: LockedGoldInstance,
-  validators: ValidatorsInstance,
-  privateKey: string,
-  lockedGoldValue: BigNumber
-) {
-  // Validators can't also be validator groups, so we create a new account to register the
-  // validator group with, and set the name of the group account to the private key of this account
-  // encrypted with the private key of the first validator, so that the group private key
-  // can be recovered.
-  const account = createAccountOrUseFromGanache()
-
-  // We do not use web3 provided by Truffle since the eth.accounts.encrypt behaves differently
-  // in the version we use elsewhere.
-  const encryptionWeb3 = new Web3('http://localhost:8545')
-  const encryptedPrivateKey = encryptionWeb3.eth.accounts.encrypt(account.privateKey, privateKey)
-  const encodedKey = serializeKeystore(encryptedPrivateKey)
-
-  // Add a premium to cover tx fees
-  const v = lockedGoldValue.times(1.01).integerValue()
-
-  console.info(` - send funds ${v} to group address ${account.address}}`)
-  await sendTransaction(web3, null, privateKey, {
-    to: account.address,
-    value: v,
-  })
-
-  console.info(` - lock gold`)
-  await lockGold(accounts, lockedGold, lockedGoldValue, account.privateKey)
-
-  console.info(` - setName`)
-  // @ts-ignore
-  const setNameTx = accounts.contract.methods.setName(`${name} ${encodedKey}`)
-  await sendTransaction(web3, setNameTx, account.privateKey, {
-    to: accounts.address,
-  })
-
-  console.info(` - registerValidatorGroup`)
-  // @ts-ignore
-  const tx = validators.contract.methods.registerValidatorGroup(
-    toFixed(config.validators.commission).toString()
-  )
-
-  await sendTransaction(web3, tx, account.privateKey, {
-    to: validators.address,
-  })
-
-  return account
-}
-
-async function registerValidator(
-  accounts: AccountsInstance,
-  lockedGold: LockedGoldInstance,
-  validators: ValidatorsInstance,
-  validatorPrivateKey: string,
-  attestationKey: string,
-  groupAddress: string,
-  index: number,
-  networkName: string
-) {
-  const valName = `CLabs Validator #${index} on ${networkName}`
-
-  console.info(` - lockGold ${valName}`)
-  await lockGold(
-    accounts,
-    lockedGold,
-    config.validators.validatorLockedGoldRequirements.value,
-    validatorPrivateKey
-  )
-
-  console.info(` - setName ${valName}`)
-
-  // @ts-ignore
-  const setNameTx = accounts.contract.methods.setName(valName)
-  await sendTransaction(web3, setNameTx, validatorPrivateKey, {
-    to: accounts.address,
-  })
-
-  console.info(` - registerValidator ${valName}`)
-  const publicKey = privateKeyToPublicKey(validatorPrivateKey)
-  const blsPublicKey = getBlsPublicKey(validatorPrivateKey)
-  const blsPoP = getBlsPoP(privateKeyToAddress(validatorPrivateKey), validatorPrivateKey)
-
-  // @ts-ignore
-  const registerTx = validators.contract.methods.registerValidator(publicKey, blsPublicKey, blsPoP)
-
-  await sendTransaction(web3, registerTx, validatorPrivateKey, {
-    to: validators.address,
-  })
-
-  console.info(` - affiliate ${valName}`)
-
-  // @ts-ignore
-  const affiliateTx = validators.contract.methods.affiliate(groupAddress)
-
-  await sendTransaction(web3, affiliateTx, validatorPrivateKey, {
-    to: validators.address,
-  })
-
-  console.info(` - setAccountDataEncryptionKey ${valName}`)
-
-  // @ts-ignore
-  const registerDataEncryptionKeyTx = accounts.contract.methods.setAccountDataEncryptionKey(
-    privateKeyToPublicKey(validatorPrivateKey)
-  )
-
-  await sendTransaction(web3, registerDataEncryptionKeyTx, validatorPrivateKey, {
-    to: accounts.address,
-  })
-
-  if (!isGanache) {
-    // Authorize the attestation signer
-    const attestationKeyAddress = privateKeyToAddress(attestationKey)
-    console.info(` - authorizeAttestationSigner ${valName}->${attestationKeyAddress}`)
-    const message = web3.utils.soliditySha3({
-      type: 'address',
-      value: privateKeyToAddress(validatorPrivateKey),
-    })
-    const signature = signMessage(message, attestationKey, attestationKeyAddress)
-
-    // @ts-ignore
-    const registerAttestationKeyTx = accounts.contract.methods.authorizeAttestationSigner(
-      attestationKeyAddress,
-      signature.v,
-      signature.r,
-      signature.s
-    )
-
-    await sendTransaction(web3, registerAttestationKeyTx, validatorPrivateKey, {
-      to: accounts.address,
-    })
-  }
-
-  console.info(` - done ${valName}`)
-  return
-}
-
-module.exports = async (_deployer: any, networkName: string) => {
-  const artifacts08 = ArtifactsSingleton.getInstance(SOLIDITY_08_PACKAGE, artifacts)
-
-  const accounts: AccountsInstance = await getDeployedProxiedContract(
-    'Accounts',
-    artifacts
-  )
-
-  const validators: ValidatorsInstance = await getDeployedProxiedContract(
-    'Validators',
-    artifacts08
-  )
-
-  const lockedGold: LockedGoldInstance = await getDeployedProxiedContract(
-    'LockedGold',
-    artifacts
-  )
-
-  const election: ElectionInstance = await getDeployedProxiedContract(
-    'Election',
-    artifacts
-  )
-
-  if (networkName === 'development') {
-    isGanache = true
-    const addr0 = privateKeyToAddress('0x' + ganachePrivateKey(0))
-    for (let i = 10; i < 36; i++) {
-      const key = '0x' + ganachePrivateKey(i)
-      const addr = privateKeyToAddress(key)
-      // @ts-ignore
-      await web3.eth.personal.importRawKey(key, 'passphrase')
-      await web3.eth.personal.unlockAccount(addr, 'passphrase', 1000000)
-      await web3.eth.sendTransaction({ from: addr0, to: addr, value: new BigNumber(11000e18) })
-    }
-    config.validators.validatorKeys = [...Array(30)].map((_, i) => ganachePrivateKey(i))
-    extraKeys = [...Array(6)].map((_, i) => ganachePrivateKey(i + 30))
-    config.validators.attestationKeys = config.validators.validatorKeys
-  }
-
-  const valKeys: string[] = config.validators.validatorKeys
-  const attestationKeys: string[] = config.validators.attestationKeys
-
-  if (valKeys.length === 0) {
-    console.info(' No validators to register')
-    return
-  }
-
-  if (config.validators.votesRatioOfLastVsFirstGroup < 1) {
-    throw new Error(`votesRatioOfLastVsFirstGroup needs to be >= 1`)
-  }
-
-  // Assumptions about where funds are located:
-  // * Validator 0 holds funds for all groups' stakes
-  // * Validator 1-n holds funds needed for their own stake
-  const validator0Key = valKeys[0]
-
-  if (valKeys.length < parseInt(config.election.minElectableValidators, 10)) {
-    console.info(
-      ` Warning: Have ${valKeys.length} Validator keys but require a minimum of ${config.election.minElectableValidators} Validators in order for a new validator set to be elected.`
-    )
-  }
-
-  // Split the validator keys into groups that will fit within the max group size.
-  const valKeyGroups: string[][] = []
-  const maxGroupSize: number = Number(config.validators.maxGroupSize)
-  for (let i = 0; i < valKeys.length; i += maxGroupSize) {
-    valKeyGroups.push(valKeys.slice(i, Math.min(i + maxGroupSize, valKeys.length)))
-  }
-
-  // Calculate per validator locked gold for first group...
-  const lockedGoldPerValAtFirstGroup = new BigNumber(
-    config.validators.groupLockedGoldRequirements.value
-  )
-  // ...and the delta for each subsequent group
-  const lockedGoldPerValEachGroup = new BigNumber(
-    config.validators.votesRatioOfLastVsFirstGroup - 1
-  )
-    .times(lockedGoldPerValAtFirstGroup)
-    .div(Math.max(valKeyGroups.length - 1, 1))
-    .integerValue()
-
-  const groups = valKeyGroups.map((keys, i) => {
-    const lockedGoldAmount = lockedGoldPerValAtFirstGroup
-      .plus(lockedGoldPerValEachGroup.times(i))
-      .times(keys.length)
-    return {
-      valKeys: keys,
-      name: valKeyGroups.length
-        ? config.validators.groupName + `(${i + 1})`
-        : config.validators.groupName,
-      lockedGold: lockedGoldAmount,
-      voteAmount:
-        i === 0 || i === valKeyGroups.length - 1
-          ? lockedGoldAmount
-          : new BigNumber(config.validators.groupLockedGoldRequirements.value),
-      account: null,
-    }
-  })
-
-  for (const [idx, group] of groups.entries()) {
-    console.info(
-      ` Registering validator group: ${group.name} with: ${group.lockedGold} CG locked...`
-    )
-    group.account = await registerValidatorGroup(
-      group.name,
-      accounts,
-      lockedGold,
-      validators,
-      validator0Key,
-      group.lockedGold
-    )
-
-    console.info(` * Registering ${group.valKeys.length} validators ...`)
-    await Promise.all(
-      group.valKeys.map((key, i) => {
-        const index = idx * config.validators.maxGroupSize + i
-        return registerValidator(
-          accounts,
-          lockedGold,
-          validators,
-          key,
-          attestationKeys[index],
-          group.account.address,
-          index,
-          networkName
-        )
-      })
-    )
-
-    console.info(` * Adding Validators to ${group.name} ...`)
-    for (const [i, key] of group.valKeys.entries()) {
-      const address = privateKeyToAddress(key)
-      console.info(` - Adding ${address} ...`)
-      if (i === 0) {
-        const groupsWithVotes = groups.slice(0, idx)
-        groupsWithVotes.sort((a, b) => a.voteAmount.comparedTo(b.voteAmount))
-
-        // @ts-ignore
-        const addTx = validators.contract.methods.addFirstMember(
-          address,
-          NULL_ADDRESS,
-          groupsWithVotes.length ? groupsWithVotes[0].account.address : NULL_ADDRESS
-        )
-        await sendTransaction(web3, addTx, group.account.privateKey, {
-          to: validators.address,
-        })
-      } else {
-        // @ts-ignore
-        const addTx = validators.contract.methods.addMember(address)
-        await sendTransaction(web3, addTx, group.account.privateKey, {
-          to: validators.address,
-        })
-      }
-    }
-
-    // Determine the lesser and greater group addresses after voting.
-    const sortedGroups = groups.slice(0, idx + 1)
-    sortedGroups.sort((a, b) => a.voteAmount.comparedTo(b.voteAmount))
-    const groupSortedIndex = sortedGroups.indexOf(group)
-    const lesser =
-      groupSortedIndex > 0 ? sortedGroups[groupSortedIndex - 1].account.address : NULL_ADDRESS
-    const greater =
-      groupSortedIndex < idx ? sortedGroups[groupSortedIndex + 1].account.address : NULL_ADDRESS
-
-    // Note: Only the groups vote for themselves here. The validators do not vote.
-    console.info(' * Group voting for itself ...')
-
-    // Make first and last group high votes so we can maintain presence.
-    const voteAmount = '0x' + group.voteAmount.toString(16)
-
-    // @ts-ignore
-    const voteTx = election.contract.methods.vote(
-      group.account.address,
-      voteAmount,
-      lesser,
-      greater
-    )
-    await sendTransaction(web3, voteTx, group.account.privateKey, {
-      to: election.address,
-    })
-  }
-  console.info('Done with migrations')
-}
diff --git a/packages/protocol/package.json b/packages/protocol/package.json
index 11104743ba6..ef8b8d242a7 100644
--- a/packages/protocol/package.json
+++ b/packages/protocol/package.json
@@ -6,151 +6,115 @@
   "author": "Celo",
   "license": "LGPL-3.0",
   "scripts": {
-    "lint:ts": "yarn run --top-level eslint .",
-    "lint:sol": "solhint --version && solhint './contracts/**/*.sol' && solhint './contracts-0.8/**/*.sol'",
-    "lint": "yarn run lint:ts && yarn run lint:sol",
-    "clean": "rm -rf ./types/typechain && rm -rf build/* && rm -rf .0x-artifacts/* && rm -rf migrations/*.js* && rm -rf migrations_ts/*.js* && rm -rf test/**/*.js* && rm -f lib/*.js* && rm -f lib/**/*.js* && rm -f scripts/*.js* && yarn clean:foundry",
-    "clean:foundry": "forge clean && rm -rf cache out",
-    "test": "rm test/**/*.js ; node runTests.js",
-    "test:scripts": "yarn ts-node scripts/run-scripts-tests.ts --testPathPattern=scripts/",
-    "quicktest": "./scripts/bash/quicktest.sh",
-    "test:coverage": "yarn run test --coverage",
-    "ci:test-make-release": "./scripts/bash/release-on-devchain.sh",
-    "test:release-snapshots": "./scripts/bash/release-snapshots.sh",
-    "test:generate-old-devchain-and-build": "./scripts/bash/generate-old-devchain-and-build.sh",
-    "build:ts": "rm -f migrations/*.js* && yarn ts-node --preferTsExts ./scripts/build.ts --truffleTypes ./types/typechain && tsc -b && mv migrations_ts/*.js* migrations",
-    "gas": "yarn run test --gas",
-    "pull-submodules": "git submodule update --init --recursive",
-    "delete-submodules": "rm -rf $(git submodule | awk '{ print $2 }')",
-    "build:sol": "yarn pull-submodules && mkdir -p migrations && yarn ts-node --preferTsExts ./scripts/build.ts --solidity ${BUILD_DIR:-./build}",
-    "build": "yarn build:sol && yarn build:ts",
-    "prebuild": "rm -rf ./build",
-    "determine-release-version": "yarn --silent ts-node --preferTsExts ./scripts/determine-release-version.ts",
-    "prepare_contracts_and_abis_publishing": "yarn ts-node --preferTsExts ./scripts/prepare-contracts-and-abis-publishing.ts",
-    "prepare_devchain_anvil_publishing": "yarn ts-node --preferTsExts ./scripts/change-anvil-devchain-package-version.ts",
-    "validate_abis_exports": "yarn ts-node --preferTsExts ./scripts/validate-abis-package.ts",
-    "sourcify-publish": "yarn ts-node ./scripts/sourcify-publish.ts",
-    "migrate": "./scripts/bash/migrate.sh",
-    "set_block_gas_limit": "./scripts/bash/set_block_gas_limit.sh",
-    "download-artifacts": "./scripts/bash/download_artifacts.sh",
-    "init-network": "./scripts/bash/init_network.sh",
-    "upload-artifacts": "./scripts/bash/upload_artifacts.sh",
-    "upgrade": "./scripts/bash/upgrade.sh",
-    "revoke": "./scripts/bash/revoke.sh",
-    "govern": "./scripts/bash/govern.sh",
-    "console": "./scripts/bash/console.sh",
-    "check-versions": "./scripts/bash/check-versions.sh",
-    "check-opcodes": "yarn ts-node scripts/check-opcodes.ts",
-    "make-release": "./scripts/bash/make-release.sh",
-    "verify-deployed": "./scripts/bash/verify-deployed.sh",
-    "verify-release": "./scripts/bash/verify-release.sh",
-    "size:onchain": "./scripts/bash/get_smart_contract_size_from_onchain_address.sh",
-    "size:artifacts": "./scripts/bash/get_smart_contract_size_from_build_artifacts.sh",
-    "ganache-dev": "./scripts/bash/ganache.sh",
-    "ganache-devchain": "./scripts/bash/ganache_devchain.sh",
-    "truffle:migrate": "truffle migrate",
-    "devchain": "yarn ts-node
scripts/devchain.ts", - "devchain:reset": "yarn devchain generate-tar .tmp/devchain.tar.gz --upto 29", + "anvil-devchain:e2e-tests": "./scripts/foundry/run_e2e_tests_in_anvil.sh", + "anvil-devchain:e2e-tests:rpc-logs": "ANVIL_LOGGING=true ./scripts/foundry/run_e2e_tests_in_anvil.sh", + "anvil-devchain:integration-tests": "./scripts/foundry/run_integration_tests_in_anvil.sh", "anvil-devchain:start-L1": "./scripts/foundry/create_and_migrate_anvil_devchain.sh", "anvil-devchain:start-L2": "./scripts/foundry/create_and_migrate_anvil_l2_devchain.sh", "anvil-devchain:status": "if nc -z localhost 8546; then echo 'Devchain is serving at http://localhost:8546'; else echo 'Devchain is not running.'; fi", "anvil-devchain:stop": "./scripts/foundry/stop_anvil.sh", - "anvil-devchain:e2e-tests": "./scripts/foundry/run_e2e_tests_in_anvil.sh", - "anvil-devchain:integration-tests": "./scripts/foundry/run_integration_tests_in_anvil.sh", - "view-tags": "git for-each-ref 'refs/tags/core-contracts.*' --sort=-committerdate --format='%(color:magenta)%(committerdate:short) %(color:blue)%(tree) %(color:green)github.com/celo-org/celo-monorepo/releases/tag/%(color:yellow)%(refname:short)'", - "generate-stabletoken-files": "yarn ts-node ./scripts/generate-stabletoken-files.ts", - "truffle-verify": "yarn truffle run verify", - "compare-git-tags": "./scripts/bash/compare-git-tags.sh" + "build:foundry": "forge build", + "prebuild": "rm -rf ./build", + "build": "yarn build:truffle-sol && yarn build:truffle-ts", + "build:truffle-sol": "yarn submodules:pull && yarn ts-node --preferTsExts ./scripts/build.ts --solidity ${BUILD_DIR:-./build}", + "build:truffle-ts": "yarn ts-node --preferTsExts ./scripts/build.ts --truffleTypes ./types/typechain && tsc -b", + "cheatsheet": "cat CHEATSHEET.md", + "ci:test-make-release": "./scripts/bash/release-on-devchain.sh", + "clean": "rm -rf ./types/typechain && rm -rf build/* && rm -rf .0x-artifacts/* && rm -f lib/*.js* && rm -f lib/**/*.js* && rm -f scripts/*.js* && 
rm -f test-ts/**/*.js* && yarn clean:foundry", + "clean:foundry": "forge clean && rm -rf cache out out-truffle-compat out-truffle-compat-0.8", + "lint": "yarn run lint:ts && yarn run lint:sol", + "lint:sol": "solhint --version && solhint './contracts/**/*.sol' && solhint './contracts-0.8/**/*.sol'", + "lint:ts": "yarn run --top-level eslint .", + "release:check-opcodes": "yarn ts-node scripts/check-opcodes.ts", + "release:check-versions": "./scripts/bash/check-versions.sh", + "release:check-versions:foundry": "./scripts/bash/check-versions-foundry.sh", + "release:determine-release-version": "yarn --silent ts-node --preferTsExts ./scripts/determine-release-version.ts", + "release:make": "./scripts/bash/make-release.sh", + "release:make:foundry": "./scripts/foundry/make-release-foundry.sh", + "release:verify-deployed": "./scripts/bash/verify-deployed.sh", + "release:verify-release": "./scripts/bash/verify-release.sh", + "release:verify-deployed:foundry": "./scripts/bash/verify-deployed-forge.sh", + "size:onchain": "./scripts/bash/get_smart_contract_size_from_onchain_address.sh", + "size:artifacts": "./scripts/bash/get_smart_contract_size_from_build_artifacts.sh", + "submodules:pull": "git submodule update --init --recursive", + "submodules:delete": "rm -rf $(git submodule | awk '{ print $2 }')", + "tags:compare": "./scripts/bash/compare-git-tags.sh", + "tags:view": "git for-each-ref 'refs/tags/core-contracts.*' --sort=-committerdate --format='%(color:magenta)%(committerdate:short) %(color:blue)%(tree) %(color:green)github.com/celo-org/celo-monorepo/releases/tag/%(color:yellow)%(refname:short)'", + "test": "forge test", + "test:release-snapshots": "./scripts/bash/release-snapshots.sh", + "test:ts": "yarn build:foundry && yarn tsc && yarn mocha --recursive test-ts", + "truffle:console": "./scripts/bash/console.sh", + "truffle:verify": "yarn truffle run verify", + "utils:prepare-contracts-and-abis-publishing": "yarn ts-node --preferTsExts 
./scripts/prepare-contracts-and-abis-publishing.ts", + "utils:prepare-devchain-anvil-publishing": "yarn ts-node --preferTsExts ./scripts/change-anvil-devchain-package-version.ts", + "utils:sourcify-publish": "yarn ts-node ./scripts/sourcify-publish.ts", + "utils:validate-abis-exports": "yarn ts-node --preferTsExts ./scripts/validate-abis-package.ts" }, "dependencies": { - "@0x/sol-compiler": "^4.8.3", - "@0x/sol-coverage": "^4.0.47", - "@0x/sol-profiler": "^4.1.37", - "@0x/sol-trace": "^3.0.47", - "@0x/subproviders": "^7.0.1", "@celo/base": "^6.0.0", - "@celo/bls12377js": "0.1.1", "@celo/connect": "^5.1.2", "@celo/cryptographic-utils": "^5.0.7", "@celo/utils": "^5.0.6", "@celo/wallet-local": "^5.1.2", "@ethereumjs/util": "8.0.5", "@ethereumjs/vm": "npm:@celo/ethereumjs-vm@6.4.1-unofficial.0", - "@ganache/console.log": "0.3.0", "@openzeppelin/contracts8": "npm:@openzeppelin/contracts@^4.4.2", "@openzeppelin/upgrades": "^2.8.0", - "@summa-tx/memview.sol": "^1.1.0", - "@truffle/artifactor": "4.0.180", "@truffle/contract": "4.6.10", - "@truffle/resolver": "9.0.53", "bignumber.js": "9.1.0", - "bip39": "https://github.com/bitcoinjs/bip39#d8ea080a18b40f301d4e2219a2991cd2417e83c2", "bn.js": "^5.1.0", "chai": "^4.3.6", "chai-subset": "^1.6.0", "chalk": "^2.4.2", - "csv-parser": "^2.0.0", - "csv-stringify": "^4.3.1", - "elliptic": "^6.5.4", - "ethereum-cryptography": "1.2.0", - "ethereumjs-abi": "^0.6.8", - "ethereumjs-wallet": "^0.6.3", "form-data": "^3.0.0", "fs-extra": "^5.0.0", - "ganache": "npm:@celo/ganache@7.8.0-unofficial.0", "glob-fs": "^0.1.7", - "graphql": "^14.1.1", - "j6": "^1.0.2", "lodash": "^4.17.21", - "mathjs": "^5.0.4", "minimist": "^1.2.0", "node-fetch": "^2.6.9", "openzeppelin-solidity": "^2.5.0", "prompts": "^2.0.1", - "solhint": "^4.5.4", "semver": "^7.5.4", + "solhint": "^4.5.4", "solidity-bytes-utils": "0.0.7", "solidity-bytes-utils-8": "npm:solidity-bytes-utils@^0.8.2", "truffle": "5.9.0", + "truffle-plugin-verify": "^0.6.5", "truffle-security": 
"^1.7.3", - "weak-map": "^1.0.5", + "viem": "2.29.4", "web3": "1.10.0", "web3-core": "1.10.0", "web3-core-helpers": "1.10.0", "web3-provider-engine": "^16.0.5", - "web3-utils": "1.10.0", - "truffle-plugin-verify": "^0.6.5" + "web3-utils": "1.10.0" }, "devDependencies": { - "@tsconfig/recommended": "^1.0.3", - "@celo/dev-utils":"^0.0.3", "@celo/typechain-target-web3-v1-celo": "^1.0.0", "@celo/typescript": "0.0.2", "@jest/globals": "^29.5.0", + "@truffle/hdwallet-provider": "^2.1.15", + "@tsconfig/recommended": "^1.0.3", "@types/bn.js": "^5.1.0", "@types/chai": "^4.1.3", "@types/chai-subset": "1.3.5", "@types/jest": "^29.1.1", "@types/lodash": "^4.14.199", - "@types/mathjs": "^4.4.1", "@types/mocha": "^7.0.2", - "@types/targz": "^1.0.0", "@types/tmp": "^0.1.0", "@types/yargs": "^13.0.2", "@wagmi/cli": "^1.0.1", - "cross-env": "^5.1.6", + "dotenv": "^16.5.0", "eth-gas-reporter": "^0.2.16", "jest": "^29.0.2", "merkle-patricia-tree": "4.0.0", "rimraf": "^5.0.5", - "targz": "^1.0.1", "tmp": "^0.1.0", "truffle-typings": "^1.0.6", - "ts-node": "^10.9.2", "ts-generator": "^0.0.8", + "ts-node": "^10.9.2", "typechain": "^4.0.3", - "typechain-target-truffle": "^1.0.2", "typechain-target-ethers-v5": "^5.0.1", + "typechain-target-truffle": "^1.0.2", "yargs": "^14.0.0" } -} \ No newline at end of file +} diff --git a/packages/protocol/releaseData/README.md b/packages/protocol/releaseData/README.md index 8c1627ada56..524bc75a779 100644 --- a/packages/protocol/releaseData/README.md +++ b/packages/protocol/releaseData/README.md @@ -10,10 +10,10 @@ arguments to newly deployed contracts. ## `versionReports/` -This subdirectory contains the version reports output by the `check-versions` +This subdirectory contains the version reports output by the `release:check-versions` script between each successive major release. 
They are used by the `protocol-test-release-snapshots` CI job as a regression snapshot test for the -`check-versions` script, so the `oldArtifactsFolder` and `newArtifactsFolder` +`release:check-versions` script, so the `oldArtifactsFolder` and `newArtifactsFolder` should be set to paths that CircleCI jobs use (`/home/circleci/app/...`, see one of the files for an example). @@ -25,5 +25,5 @@ standard release process. ### `releaseBRL.json` This file is the initialization data used when deploying the `cREAL` -stable token. The release occured between Core Contracts releases 5 and 6 and +stable token. The release occurred between Core Contracts releases 5 and 6 and didn't exactly follow the release process. diff --git a/packages/protocol/releaseData/initializationData/release14.json b/packages/protocol/releaseData/initializationData/release14.json new file mode 100644 index 00000000000..9e26dfeeb6e --- /dev/null +++ b/packages/protocol/releaseData/initializationData/release14.json @@ -0,0 +1 @@ +{} \ No newline at end of file diff --git a/packages/protocol/releaseData/initializationData/release15.json b/packages/protocol/releaseData/initializationData/release15.json new file mode 100644 index 00000000000..0967ef424bc --- /dev/null +++ b/packages/protocol/releaseData/initializationData/release15.json @@ -0,0 +1 @@ +{} diff --git a/packages/protocol/releaseData/versionReports/release1-report.json b/packages/protocol/releaseData/versionReports/release1-report.json index 4b5a00a3284..a154695742f 100644 --- a/packages/protocol/releaseData/versionReports/release1-report.json +++ b/packages/protocol/releaseData/versionReports/release1-report.json @@ -1,10 +1,13 @@ { - "oldArtifactsFolder": "build/core-contracts.v0/contracts", + "oldArtifactsFolder": [ + "build/core-contracts.v0/contracts", + "build/core-contracts.v0/contracts-0.8" + ], "newArtifactsFolder": [ "build/core-contracts.v1/contracts", "build/core-contracts.v1/contracts-0.8" ], - "exclude": 
"/.*Test|Mock.*|I[A-Z].*|.*Proxy|MultiSig.*|ReleaseGold|SlasherUtil|UsingPrecompiles/", + "exclude": "/.*Test|Mock.*|I[A-Z].*|.*Proxy|MultiSig.*|ReleaseGold|SlasherUtil|UsingPrecompiles|CeloFeeCurrencyAdapterOwnable|FeeCurrencyAdapter|FeeCurrencyAdapterOwnable|IsL2Check|Blockable|PrecompilesOverride|CompileExchange|PrecompilesOverrideV2|UsingRegistryV2NoMento/", "report": { "contracts": { "DowntimeSlasher": { diff --git a/packages/protocol/releaseData/versionReports/release13-report.json b/packages/protocol/releaseData/versionReports/release13-report.json new file mode 100644 index 00000000000..2e12858a590 --- /dev/null +++ b/packages/protocol/releaseData/versionReports/release13-report.json @@ -0,0 +1,421 @@ +{ + "oldArtifactsFolder": [ + "build/core-contracts.v12-renamed/contracts", + "build/core-contracts.v12-renamed/contracts-0.8" + ], + "newArtifactsFolder": [ + "build/core-contracts.v13/contracts", + "build/core-contracts.v13/contracts-0.8" + ], + "exclude": "/.*Test|Mock.*|I[A-Z].*|.*Proxy|MultiSig.*|ReleaseGold|SlasherUtil|UsingPrecompiles|CeloFeeCurrencyAdapterOwnable|FeeCurrencyAdapter|FeeCurrencyAdapterOwnable|IsL2Check|Blockable|PrecompilesOverride|CompileExchange|PrecompilesOverrideV2|UsingRegistryV2NoMento|^UsingRegistry|^Ownable|Initializable|BLS12_377Passthrough|BLS12_381Passthrough]UniswapV2ERC20|ReentrancyGuard|MockElection|\\bFeeHandlerSeller\\b/", + "report": { + "contracts": { + "Accounts": { + "changes": { + "storage": [], + "major": [ + { + "contract": "Accounts", + "signature": "authorizeValidatorSignerWithKeys(address,uint8,bytes32,bytes32,bytes,bytes,bytes)", + "type": "MethodRemoved" + } + ], + "minor": [], + "patch": [ + { + "contract": "Accounts", + "type": "DeployedBytecode" + } + ] + }, + "versionDelta": { + "storage": "=", + "major": "+1", + "minor": "0", + "patch": "0" + } + }, + "Election": { + "changes": { + "storage": [], + "major": [ + { + "contract": "Election", + "signature": "getGroupEpochRewards(address,uint256,uint256[])", 
+ "type": "MethodRemoved" + } + ], + "minor": [], + "patch": [ + { + "contract": "Election", + "type": "DeployedBytecode" + } + ] + }, + "versionDelta": { + "storage": "=", + "major": "+1", + "minor": "0", + "patch": "0" + } + }, + "EpochRewards": { + "changes": { + "storage": [], + "major": [ + { + "contract": "EpochRewards", + "signature": "isReserveLow()", + "type": "MethodRemoved" + } + ], + "minor": [], + "patch": [ + { + "contract": "EpochRewards", + "type": "DeployedBytecode" + } + ] + }, + "versionDelta": { + "storage": "=", + "major": "+1", + "minor": "0", + "patch": "0" + } + }, + "GoldToken": { + "changes": { + "storage": [], + "major": [ + { + "contract": "GoldToken", + "signature": "mint(address,uint256)", + "type": "MethodRemoved" + }, + { + "contract": "GoldToken", + "signature": "increaseSupply(uint256)", + "type": "MethodRemoved" + }, + { + "contract": "GoldToken", + "signature": "circulatingSupply()", + "type": "MethodRemoved" + } + ], + "minor": [], + "patch": [ + { + "contract": "GoldToken", + "signature": "totalSupply()", + "oldValue": "public", + "newValue": "external", + "type": "MethodVisibility" + }, + { + "contract": "GoldToken", + "type": "DeployedBytecode" + } + ] + }, + "versionDelta": { + "storage": "=", + "major": "+1", + "minor": "0", + "patch": "0" + } + }, + "Governance": { + "changes": { + "storage": [], + "major": [ + { + "contract": "Governance", + "signature": "whitelistHotfix(bytes32)", + "type": "MethodRemoved" + }, + { + "contract": "Governance", + "signature": "hotfixWhitelistValidatorTally(bytes32)", + "type": "MethodRemoved" + }, + { + "contract": "Governance", + "signature": "isHotfixPassing(bytes32)", + "type": "MethodRemoved" + }, + { + "contract": "Governance", + "signature": "getL1HotfixRecord(bytes32)", + "type": "MethodRemoved" + }, + { + "contract": "Governance", + "signature": "getHotfixRecord(bytes32)", + "oldValue": "bool, bool, uint256", + "newValue": "bool, bool, bool, uint256", + "type": "MethodReturn" + }, 
+ { + "contract": "Governance", + "signature": "getL2HotfixRecord(bytes32)", + "type": "MethodRemoved" + }, + { + "contract": "Governance", + "signature": "isHotfixWhitelistedBy(bytes32,address)", + "type": "MethodRemoved" + } + ], + "minor": [], + "patch": [ + { + "contract": "Governance", + "type": "DeployedBytecode" + }, + { + "contract": "Governance", + "dependency": "IntegerSortedLinkedList", + "type": "LibraryLinkingChange" + } + ] + }, + "versionDelta": { + "storage": "=", + "major": "+1", + "minor": "0", + "patch": "0" + } + }, + "GovernanceSlasher": { + "changes": { + "storage": [], + "major": [ + { + "contract": "GovernanceSlasher", + "signature": "slash(address,address[],address[],uint256[])", + "type": "MethodRemoved" + } + ], + "minor": [ + { + "contract": "GovernanceSlasher", + "signature": "slash(address,address,address[],address[],uint256[])", + "type": "MethodAdded" + } + ], + "patch": [ + { + "contract": "GovernanceSlasher", + "signature": "slashL2(address,address,address[],address[],uint256[])", + "oldValue": "external", + "newValue": "public", + "type": "MethodVisibility" + }, + { + "contract": "GovernanceSlasher", + "type": "DeployedBytecode" + } + ] + }, + "versionDelta": { + "storage": "=", + "major": "+1", + "minor": "0", + "patch": "0" + } + }, + "Permissioned": { + "changes": { + "storage": [], + "major": [ + { + "contract": "Permissioned", + "type": "NewContract" + } + ], + "minor": [], + "patch": [] + }, + "versionDelta": { + "storage": "=", + "major": "+1", + "minor": "0", + "patch": "0" + } + }, + "SuperBridgeETHWrapper": { + "changes": { + "storage": [], + "major": [ + { + "contract": "SuperBridgeETHWrapper", + "type": "NewContract" + } + ], + "minor": [], + "patch": [] + }, + "versionDelta": { + "storage": "=", + "major": "+1", + "minor": "0", + "patch": "0" + } + }, + "Validators": { + "changes": { + "storage": [], + "major": [ + { + "contract": "Validators", + "signature": "registerValidator(bytes,bytes,bytes)", + "type": 
"MethodRemoved" + }, + { + "contract": "Validators", + "signature": "updateBlsPublicKey(bytes,bytes)", + "type": "MethodRemoved" + }, + { + "contract": "Validators", + "signature": "setValidatorScoreParameters(uint256,uint256)", + "type": "MethodRemoved" + }, + { + "contract": "Validators", + "signature": "setDowntimeGracePeriod(uint256)", + "type": "MethodRemoved" + }, + { + "contract": "Validators", + "signature": "updatePublicKeys(address,address,bytes,bytes,bytes)", + "type": "MethodRemoved" + }, + { + "contract": "Validators", + "signature": "updateValidatorScoreFromSigner(address,uint256)", + "type": "MethodRemoved" + }, + { + "contract": "Validators", + "signature": "distributeEpochPaymentsFromSigner(address,uint256)", + "type": "MethodRemoved" + }, + { + "contract": "Validators", + "signature": "downtimeGracePeriod()", + "type": "MethodRemoved" + }, + { + "contract": "Validators", + "signature": "getValidatorScoreParameters()", + "type": "MethodRemoved" + }, + { + "contract": "Validators", + "signature": "calculateEpochScore(uint256)", + "type": "MethodRemoved" + }, + { + "contract": "Validators", + "signature": "calculateGroupEpochScore(uint256[])", + "type": "MethodRemoved" + }, + { + "contract": "Validators", + "signature": "getValidatorBlsPublicKeyFromSigner(address)", + "type": "MethodRemoved" + }, + { + "contract": "Validators", + "signature": "initialize(address,uint256,uint256,uint256,uint256,uint256,uint256,uint256,uint256,uint256,struct Validators.InitParams)", + "type": "MethodRemoved" + } + ], + "minor": [ + { + "contract": "Validators", + "signature": "registerValidator(bytes)", + "type": "MethodAdded" + }, + { + "contract": "Validators", + "signature": "initialize(address,uint256,uint256,uint256,uint256,uint256,uint256,uint256,struct Validators.InitParams)", + "type": "MethodAdded" + } + ], + "patch": [ + { + "contract": "Validators", + "signature": "registerValidatorNoBls(bytes)", + "oldValue": "external", + "newValue": "public", + "type": 
"MethodVisibility" + }, + { + "contract": "Validators", + "type": "DeployedBytecode" + }, + { + "contract": "Validators", + "dependency": "AddressLinkedList", + "type": "LibraryLinkingChange" + } + ] + }, + "versionDelta": { + "storage": "=", + "major": "+1", + "minor": "0", + "patch": "0" + } + }, + "EpochManager": { + "changes": { + "storage": [], + "major": [], + "minor": [], + "patch": [ + { + "contract": "EpochManager", + "type": "DeployedBytecode" + } + ] + }, + "versionDelta": { + "storage": "=", + "major": "=", + "minor": "=", + "patch": "+1" + } + } + }, + "libraries": { + "AddressLinkedList": { + "storage": [], + "major": [], + "minor": [], + "patch": [ + { + "contract": "AddressLinkedList", + "type": "DeployedBytecode" + } + ] + }, + "IntegerSortedLinkedList": { + "storage": [], + "major": [], + "minor": [], + "patch": [ + { + "contract": "IntegerSortedLinkedList", + "type": "DeployedBytecode" + } + ] + } + } + } +} diff --git a/packages/protocol/releaseData/versionReports/release15-report.json b/packages/protocol/releaseData/versionReports/release15-report.json new file mode 100644 index 00000000000..af1f0e3d602 --- /dev/null +++ b/packages/protocol/releaseData/versionReports/release15-report.json @@ -0,0 +1,39 @@ +{ + "oldArtifactsFolder": [ + "out-core-contracts.v14" + ], + "newArtifactsFolder": [ + "out-core-contracts.v15" + ], + "exclude": "/.*Test|Mock.*|I[A-Z].*|.*Proxy|MultiSig.*|ReleaseGold|SlasherUtil|UsingPrecompiles|CeloFeeCurrencyAdapterOwnable|FeeCurrencyAdapter|FeeCurrencyAdapterOwnable|IsL2Check|Blockable|PrecompilesOverride|CompileExchange|PrecompilesOverrideV2|UsingRegistryV2NoMento|^UsingRegistry|^Ownable|Initializable|BLS12_377Passthrough|BLS12_381Passthrough]UniswapV2ERC20|ReentrancyGuard|MockElection|\\bFeeHandlerSeller\\b/", + "report": { + "contracts": { + "Governance": { + "changes": { + "storage": [], + "major": [], + "minor": [ + { + "contract": "Governance", + "signature": "proposalCount()", + "type": "MethodAdded" + } + ], + 
"patch": [ + { + "contract": "Governance", + "type": "DeployedBytecode" + } + ] + }, + "versionDelta": { + "storage": "=", + "major": "=", + "minor": "+1", + "patch": "0" + } + } + }, + "libraries": {} + } +} diff --git a/packages/protocol/releaseData/versionReports/release2-report.json b/packages/protocol/releaseData/versionReports/release2-report.json index 5030b9cfb63..932deb8bfdc 100644 --- a/packages/protocol/releaseData/versionReports/release2-report.json +++ b/packages/protocol/releaseData/versionReports/release2-report.json @@ -1,10 +1,13 @@ { - "oldArtifactsFolder": "build/core-contracts.v1/contracts", + "oldArtifactsFolder": [ + "build/core-contracts.v1/contracts", + "build/core-contracts.v1/contracts-0.8" + ], "newArtifactsFolder": [ "build/core-contracts.v2/contracts", "build/core-contracts.v2/contracts-0.8" ], - "exclude": "/.*Test|Mock.*|I[A-Z].*|.*Proxy|MultiSig.*|ReleaseGold|SlasherUtil|UsingPrecompiles/", + "exclude": "/.*Test|Mock.*|I[A-Z].*|.*Proxy|MultiSig.*|ReleaseGold|SlasherUtil|UsingPrecompiles|CeloFeeCurrencyAdapterOwnable|FeeCurrencyAdapter|FeeCurrencyAdapterOwnable|IsL2Check|Blockable|PrecompilesOverride|CompileExchange|PrecompilesOverrideV2|UsingRegistryV2NoMento/", "report": { "contracts": { "MetaTransactionWalletDeployer": { diff --git a/packages/protocol/releaseData/versionReports/release3-report.json b/packages/protocol/releaseData/versionReports/release3-report.json index cac02545a2a..b649cbfcabb 100644 --- a/packages/protocol/releaseData/versionReports/release3-report.json +++ b/packages/protocol/releaseData/versionReports/release3-report.json @@ -1,10 +1,13 @@ { - "oldArtifactsFolder": "build/core-contracts.v2/contracts", + "oldArtifactsFolder": [ + "build/core-contracts.v2/contracts", + "build/core-contracts.v2/contracts-0.8" + ], "newArtifactsFolder": [ "build/core-contracts.v3/contracts", "build/core-contracts.v3/contracts-0.8" ], - "exclude": 
"/.*Test|Mock.*|I[A-Z].*|.*Proxy|MultiSig.*|ReleaseGold|SlasherUtil|UsingPrecompiles/", + "exclude": "/.*Test|Mock.*|I[A-Z].*|.*Proxy|MultiSig.*|ReleaseGold|SlasherUtil|UsingPrecompiles|CeloFeeCurrencyAdapterOwnable|FeeCurrencyAdapter|FeeCurrencyAdapterOwnable|IsL2Check|Blockable|PrecompilesOverride|CompileExchange|PrecompilesOverrideV2|UsingRegistryV2NoMento/", "report": { "contracts": { "BlockchainParameters": { diff --git a/packages/protocol/releaseData/versionReports/release4-report.json b/packages/protocol/releaseData/versionReports/release4-report.json index 0797607a690..b78ce9f2f5a 100644 --- a/packages/protocol/releaseData/versionReports/release4-report.json +++ b/packages/protocol/releaseData/versionReports/release4-report.json @@ -1,10 +1,13 @@ { - "oldArtifactsFolder": "build/core-contracts.v3/contracts", + "oldArtifactsFolder": [ + "build/core-contracts.v3/contracts", + "build/core-contracts.v3/contracts-0.8" + ], "newArtifactsFolder": [ "build/core-contracts.v4/contracts", "build/core-contracts.v4/contracts-0.8" ], - "exclude": "/.*Test|Mock.*|I[A-Z].*|.*Proxy|MultiSig.*|ReleaseGold|SlasherUtil|UsingPrecompiles/", + "exclude": "/.*Test|Mock.*|I[A-Z].*|.*Proxy|MultiSig.*|ReleaseGold|SlasherUtil|UsingPrecompiles|CeloFeeCurrencyAdapterOwnable|FeeCurrencyAdapter|FeeCurrencyAdapterOwnable|IsL2Check|Blockable|PrecompilesOverride|CompileExchange|PrecompilesOverrideV2|UsingRegistryV2NoMento/", "report": { "contracts": { "Accounts": { diff --git a/packages/protocol/releaseData/versionReports/release5-report.json b/packages/protocol/releaseData/versionReports/release5-report.json index e62c0284916..c602ef249fc 100644 --- a/packages/protocol/releaseData/versionReports/release5-report.json +++ b/packages/protocol/releaseData/versionReports/release5-report.json @@ -1,10 +1,13 @@ { - "oldArtifactsFolder": "build/core-contracts.v4/contracts", + "oldArtifactsFolder": [ + "build/core-contracts.v4/contracts", + "build/core-contracts.v4/contracts-0.8" + ], 
"newArtifactsFolder": [ "build/core-contracts.v5/contracts", "build/core-contracts.v5/contracts-0.8" ], - "exclude": "/.*Test|Mock.*|I[A-Z].*|.*Proxy|MultiSig.*|ReleaseGold|SlasherUtil|UsingPrecompiles/", + "exclude": "/.*Test|Mock.*|I[A-Z].*|.*Proxy|MultiSig.*|ReleaseGold|SlasherUtil|UsingPrecompiles|CeloFeeCurrencyAdapterOwnable|FeeCurrencyAdapter|FeeCurrencyAdapterOwnable|IsL2Check|Blockable|PrecompilesOverride|CompileExchange|PrecompilesOverrideV2|UsingRegistryV2NoMento/", "report": { "contracts": { "GrandaMento": { diff --git a/packages/protocol/releaseData/versionReports/release6-report.json b/packages/protocol/releaseData/versionReports/release6-report.json index 0c2d8ef95be..43607781d86 100644 --- a/packages/protocol/releaseData/versionReports/release6-report.json +++ b/packages/protocol/releaseData/versionReports/release6-report.json @@ -1,10 +1,13 @@ { - "oldArtifactsFolder": "build/core-contracts.v5/contracts", + "oldArtifactsFolder": [ + "build/core-contracts.v5/contracts", + "build/core-contracts.v5/contracts-0.8" + ], "newArtifactsFolder": [ "build/core-contracts.v6/contracts", "build/core-contracts.v6/contracts-0.8" ], - "exclude": "/.*Test|Mock.*|I[A-Z].*|.*Proxy|MultiSig.*|ReleaseGold|SlasherUtil|UsingPrecompiles/", + "exclude": "/.*Test|Mock.*|I[A-Z].*|.*Proxy|MultiSig.*|ReleaseGold|SlasherUtil|UsingPrecompiles|CeloFeeCurrencyAdapterOwnable|FeeCurrencyAdapter|FeeCurrencyAdapterOwnable|IsL2Check|Blockable|PrecompilesOverride|CompileExchange|PrecompilesOverrideV2|UsingRegistryV2NoMento/", "report": { "contracts": { "ExchangeBRL": { diff --git a/packages/protocol/releaseData/versionReports/release7-report.json b/packages/protocol/releaseData/versionReports/release7-report.json index e9f9b13d938..499541b2488 100644 --- a/packages/protocol/releaseData/versionReports/release7-report.json +++ b/packages/protocol/releaseData/versionReports/release7-report.json @@ -1,10 +1,13 @@ { - "oldArtifactsFolder": "build/core-contracts.v6/contracts", + 
"oldArtifactsFolder": [ + "build/core-contracts.v6/contracts", + "build/core-contracts.v6/contracts-0.8" + ], "newArtifactsFolder": [ "build/core-contracts.v7/contracts", "build/core-contracts.v7/contracts-0.8" ], - "exclude": "/.*Test|Mock.*|I[A-Z].*|.*Proxy|MultiSig.*|ReleaseGold|SlasherUtil|UsingPrecompiles/", + "exclude": "/.*Test|Mock.*|I[A-Z].*|.*Proxy|MultiSig.*|ReleaseGold|SlasherUtil|UsingPrecompiles|CeloFeeCurrencyAdapterOwnable|FeeCurrencyAdapter|FeeCurrencyAdapterOwnable|IsL2Check|Blockable|PrecompilesOverride|CompileExchange|PrecompilesOverrideV2|UsingRegistryV2NoMento/", "report": { "contracts": { "Exchange": { diff --git a/packages/protocol/releaseData/versionReports/release8-report.json b/packages/protocol/releaseData/versionReports/release8-report.json index fb6e6664984..8eb5e1fb763 100644 --- a/packages/protocol/releaseData/versionReports/release8-report.json +++ b/packages/protocol/releaseData/versionReports/release8-report.json @@ -1,10 +1,13 @@ { - "oldArtifactsFolder": "build/core-contracts.v7/contracts", + "oldArtifactsFolder": [ + "build/core-contracts.v7/contracts", + "build/core-contracts.v7/contracts-0.8" + ], "newArtifactsFolder": [ "build/core-contracts.v8/contracts", "build/core-contracts.v8/contracts-0.8" ], - "exclude": "/.*Test|Mock.*|I[A-Z].*|.*Proxy|MultiSig.*|ReleaseGold|SlasherUtil|UsingPrecompiles|^UsingRegistry/", + "exclude": "/.*Test|Mock.*|I[A-Z].*|.*Proxy|MultiSig.*|ReleaseGold|SlasherUtil|UsingPrecompiles|CeloFeeCurrencyAdapterOwnable|FeeCurrencyAdapter|FeeCurrencyAdapterOwnable|IsL2Check|Blockable|PrecompilesOverride|CompileExchange|PrecompilesOverrideV2|UsingRegistryV2NoMento|^UsingRegistry/", "report": { "contracts": { "Escrow": { diff --git a/packages/protocol/releaseData/versionReports/release9-report.json b/packages/protocol/releaseData/versionReports/release9-report.json index 0423d48120b..31a472039e5 100644 --- a/packages/protocol/releaseData/versionReports/release9-report.json +++ 
b/packages/protocol/releaseData/versionReports/release9-report.json @@ -1,10 +1,13 @@ { - "oldArtifactsFolder": "build/core-contracts.v8/contracts", + "oldArtifactsFolder": [ + "build/core-contracts.v8/contracts", + "build/core-contracts.v8/contracts-0.8" + ], "newArtifactsFolder": [ "build/core-contracts.v9/contracts", "build/core-contracts.v9/contracts-0.8" ], - "exclude": "/.*Test|Mock.*|I[A-Z].*|.*Proxy|MultiSig.*|ReleaseGold|SlasherUtil|UsingPrecompiles|^UsingRegistry/", + "exclude": "/.*Test|Mock.*|I[A-Z].*|.*Proxy|MultiSig.*|ReleaseGold|SlasherUtil|UsingPrecompiles|CeloFeeCurrencyAdapterOwnable|FeeCurrencyAdapter|FeeCurrencyAdapterOwnable|IsL2Check|Blockable|PrecompilesOverride|CompileExchange|PrecompilesOverrideV2|UsingRegistryV2NoMento|^UsingRegistry/", "report": { "contracts": { "Attestations": { diff --git a/packages/protocol/runTests.js b/packages/protocol/runTests.js deleted file mode 100644 index 6537a9a5f59..00000000000 --- a/packages/protocol/runTests.js +++ /dev/null @@ -1,92 +0,0 @@ -const ganache = require('ganache') -const glob = require('glob-fs')({ - gitignore: false, -}) -const { exec, waitForPortOpen } = require('./lib/test-utils') -const minimist = require('minimist') -const networkName = 'development' -const network = require('./truffle-config.js').networks[networkName] - -const sleep = (seconds) => new Promise((resolve) => setTimeout(resolve, 1000 * seconds)) - -// As documented https://circleci.com/docs/2.0/env-vars/#built-in-environment-variables -const isCI = process.env.CI === 'true' - -async function startGanache() { - const server = ganache.server({ - logging: { quiet: true }, - wallet: { mnemonic: network.mnemonic, defaultBalance: network.defaultBalance }, - miner: { - blockGasLimit: 30000000, - defaultGasPrice: network.gasPrice, - }, - chain: { - networkId: network.network_id, - chainId: 1, - allowUnlimitedContractSize: true, - allowUnlimitedInitCodeSize: true, - }, - }) - - server.listen(8545, (err, blockchain) => { - if (err) 
throw err - blockchain - }) - - return async () => - await server.close((err) => { - if (err) throw err - }) -} - -async function test() { - const argv = minimist(process.argv.slice(2), { - boolean: ['gas', 'coverage', 'verbose-rpc'], - }) - - try { - console.info('Starting Ganache ...') - const closeGanache = await startGanache() - if (isCI) { - // If we are running on circle ci we need to wait for ganache to be up. - await waitForPortOpen('localhost', 8545, 120) - } - - // --reset is a hack to trick truffle into using 20M gas. - let testArgs = ['run', 'truffle', 'test', '--reset'] // Adding config doesn't seem to do much --config truffle-confisg0.8.js - if (argv['verbose-rpc']) { - testArgs.push('--verbose-rpc') - } - if (argv.coverage) { - testArgs = testArgs.concat(['--network', 'coverage']) - } else { - testArgs = testArgs.concat(['--network', networkName]) - } - if (argv.gas) { - testArgs = testArgs.concat(['--color', '--gas']) - } - - const testGlob = - argv._.length > 0 - ? argv._.map((testName) => - testName.endsWith('/') ? `test/${testName}\*\*/*.ts` : `test/\*\*/${testName}.ts` - ).join(' ') - : `test/\*\*/*.ts` - const testFiles = glob.readdirSync(testGlob) - if (testFiles.length === 0) { - // eslint-disable-next-line: no-console - console.error(`No test files matched with ${testGlob}`) - process.exit(1) - } - testArgs = testArgs.concat(testFiles) - - await exec('yarn', testArgs) - await closeGanache() - } catch (e) { - // eslint-disable-next-line: no-console - console.error(e.stdout ? 
e.stdout : e) - process.nextTick(() => process.exit(1)) - } -} - -test() diff --git a/packages/protocol/scripts/DeploySuperbridgeWETH.s.sol b/packages/protocol/scripts/DeploySuperbridgeWETH.s.sol new file mode 100644 index 00000000000..cb600648e34 --- /dev/null +++ b/packages/protocol/scripts/DeploySuperbridgeWETH.s.sol @@ -0,0 +1,43 @@ +// SPDX-License-Identifier: LGPL-3.0-only +pragma solidity ^0.8.15; + +import "forge-std/Script.sol"; +import "../contracts-0.8/common/SuperBridgeETHWrapper.sol"; + +contract DeploySuperBridgeWETH is Script { + function run() external { + // Fetch private key from environment variable. + uint256 deployerPrivateKey = vm.envUint("PRIVATE_KEY"); + require(deployerPrivateKey != 0, "PRIVATE_KEY environment variable not set"); + + // Addresses used in the constructor. Ensure these are correctly set via environment variables: + + // WETH address on the local chain (e.g., L1 or mainnet) + address wethAddressLocal = vm.envAddress("WETH_ADDRESS_LOCAL"); + require(wethAddressLocal != address(0), "WETH_ADDRESS_LOCAL environment variable not set"); + // WETH address on the remote chain (e.g., L2 or another network) + address wethAddressRemote = vm.envAddress("WETH_ADDRESS_REMOTE"); + require(wethAddressRemote != address(0), "WETH_ADDRESS_REMOTE environment variable not set"); + // L1StandardBridgeProxy address (or equivalent) on the local chain + address standardBridgeAddress = vm.envAddress("STANDARD_BRIDGE_ADDRESS"); + require( + standardBridgeAddress != address(0), + "STANDARD_BRIDGE_ADDRESS environment variable not set" + ); + + // Start broadcasting transactions to the network + vm.startBroadcast(deployerPrivateKey); + + // Deploy the contract + SuperBridgeETHWrapper superBridge = new SuperBridgeETHWrapper( + wethAddressLocal, + wethAddressRemote, + standardBridgeAddress + ); + + // Optionally, log the deployed contract's address + console.log("SuperBridgeETHWrapper deployed at:", address(superBridge)); + + vm.stopBroadcast(); + } +} diff 
--git a/packages/protocol/scripts/bash/backupmigrations.sh b/packages/protocol/scripts/bash/backupmigrations.sh deleted file mode 100755 index 1e324b3e0a7..00000000000 --- a/packages/protocol/scripts/bash/backupmigrations.sh +++ /dev/null @@ -1,53 +0,0 @@ -#!/bin/bash - -if [ -d migrations.bak ]; then - echo Replacing migrations - rm -rf migrations - mv migrations.bak migrations -else - echo Backing up migrations - mv migrations migrations.bak - mkdir migrations - - # Migration 0 always needs to be present - cp migrations.bak/00_initial_migration.* migrations/ - - # Uncomment lines for whichever migrations you actually do need. - # Note that some migrations depend on others (for example, many contracts - # require libraries to have been migrated, so you might need migration 1 to be - # uncommented). - cp migrations.bak/01_libraries.* migrations/ - # cp migrations.bak/02_registry.* migrations/ - # cp migrations.bak/03_freezer.* migrations/ - # cp migrations.bak/03_whitelist.* migrations/ - # cp migrations.bak/04_goldtoken.* migrations/ - # cp migrations.bak/05_sortedoracles.* migrations/ - # cp migrations.bak/06_gaspriceminimum.* migrations/ - # cp migrations.bak/07_reserve_spender_multisig.* migrations/ - # cp migrations.bak/08_reserve.* migrations/ - # cp migrations.bak/09_0_stabletoken_USD.* migrations/ - # cp migrations.bak/09_01_stableToken_EUR.* migrations/ - # cp migrations.bak/09_02_stableToken_BRL.* migrations/ - # cp migrations.bak/10_0_exchange_USD.* migrations/ - # cp migrations.bak/10_01_exchange_EUR.* migrations/ - # cp migrations.bak/10_02_exchange_BRL.* migrations/ - # cp migrations.bak/11_accounts.* migrations/ - # cp migrations.bak/12_lockedgold.* migrations/ - # cp migrations.bak/13_validators.* migrations/ - # cp migrations.bak/14_election.* migrations/ - # cp migrations.bak/15_epoch_rewards.* migrations/ - # cp migrations.bak/16_random.* migrations/ - # cp migrations.bak/17_attestations.* migrations/ - # cp migrations.bak/18_escrow.* migrations/ 
- # cp migrations.bak/19_blockchainparams.* migrations/ - # cp migrations.bak/20_governance_slasher.* migrations/ - # cp migrations.bak/21_double_signing_slasher.* migrations/ - # cp migrations.bak/22_downtime_slasher.* migrations/ - # cp migrations.bak/23_governance_approver_multisig.* migrations/ - # cp migrations.bak/24_grandamento.* migrations/ - # cp migrations.bak/25_stableToken_registry.* migrations/ - # cp migrations.bak/26_federated_attestations.* migrations/ - # cp migrations.bak/27_odispayments.* migrations/ - # cp migrations.bak/28_governance.* migrations/ - # cp migrations.bak/29_elect_validators.* migrations/ -fi diff --git a/packages/protocol/scripts/bash/check-versions-foundry.sh b/packages/protocol/scripts/bash/check-versions-foundry.sh new file mode 100755 index 00000000000..648b93700e6 --- /dev/null +++ b/packages/protocol/scripts/bash/check-versions-foundry.sh @@ -0,0 +1,63 @@ +#!/usr/bin/env bash +set -euo pipefail + +# Checks that the contract version numbers in a provided branch are as expected given +# a released branch. +# +# Flags: +# -a: Old branch containing smart contracts, which has likely been released. +# -b: New branch containing smart contracts, on which version numbers may be updated. +# -r: (Deprecated) No longer accepts a path. Report is always generated as report-$OLD_BRANCH-$NEW_BRANCH.json. +# -l: Path to a file to which logs should be appended + +BRANCH="" +NEW_BRANCH="" +REPORT="" +LOG_FILE="/tmp/celo-check-versions.log" + +while getopts 'a:b:r:l:i' flag; do + case "${flag}" in + a) BRANCH="${OPTARG}" ;; + b) NEW_BRANCH="${OPTARG}" ;; + r) REPORT="${OPTARG}" ;; + l) LOG_FILE="${OPTARG}" ;; + *) error "Unexpected option ${flag}" ;; + esac +done + +[ -z "$BRANCH" ] && echo "Need to set the old branch via the -a flag" && exit 1; +[ -z "$NEW_BRANCH" ] && echo "Need to set the new branch via the -b flag" && exit 1; + +if [ -n "$REPORT" ]; then + echo "Error: -r no longer accepts a path. 
Report name is now generated automatically from release names." >&2 + echo "See: https://github.com/celo-org/celo-monorepo/pull/11662" >&2 + exit 1 +fi +REPORT="report-$BRANCH-$NEW_BRANCH.json" + +# CONTRACT_EXCLUSION_REGEX imported from here +source scripts/bash/contract-exclusion-regex.sh + +REPORT_FLAG="--output_file $REPORT" + +source scripts/bash/release-lib.sh + +build_tag_foundry $BRANCH $LOG_FILE +BRANCH_BUILD_DIR=$BUILD_DIR +build_tag_foundry $NEW_BRANCH $LOG_FILE +NEW_BRANCH_BUILD_DIR=$BUILD_DIR + +# check-backward script uses migrationsConfig +echo " - Checkout migrationsConfig.js at $NEW_BRANCH" +CURRENT_HASH=`git log -n 1 --oneline | cut -c 1-9` +git checkout $NEW_BRANCH -- migrationsConfig.js + +yarn ts-node scripts/check-backward.ts sem_check \ + --old_contracts $BRANCH_BUILD_DIR \ + --new_contracts $NEW_BRANCH_BUILD_DIR \ + --exclude $CONTRACT_EXCLUSION_REGEX \ + --new_branch $NEW_BRANCH \ + --forge \ + $REPORT_FLAG + +git checkout $CURRENT_HASH -- migrationsConfig.js diff --git a/packages/protocol/scripts/bash/check-versions.sh b/packages/protocol/scripts/bash/check-versions.sh index 789c77af8f3..cf14dd66024 100755 --- a/packages/protocol/scripts/bash/check-versions.sh +++ b/packages/protocol/scripts/bash/check-versions.sh @@ -7,7 +7,7 @@ set -euo pipefail # Flags: # -a: Old branch containing smart contracts, which has likely been released. # -b: New branch containing smart contracts, on which version numbers may be updated. -# -r: Path that the contract compatibility report should be written to. +# -r: (Deprecated) No longer accepts a path. Report is always generated as report-$OLD_BRANCH-$NEW_BRANCH.json. # -l: Path to a file to which logs should be appended BRANCH="" @@ -28,13 +28,17 @@ done [ -z "$BRANCH" ] && echo "Need to set the old branch via the -a flag" && exit 1; [ -z "$NEW_BRANCH" ] && echo "Need to set the new branch via the -b flag" && exit 1; +if [ -n "$REPORT" ]; then + echo "Error: -r no longer accepts a path. 
Report name is now generated automatically from release names." >&2 + echo "See: https://github.com/celo-org/celo-monorepo/pull/11662" >&2 + exit 1 +fi +REPORT="report-$BRANCH-$NEW_BRANCH.json" + # CONTRACT_EXCLUSION_REGEX imported from here source scripts/bash/contract-exclusion-regex.sh -REPORT_FLAG="" -if [ ! -z "$REPORT" ]; then - REPORT_FLAG="--output_file "$REPORT -fi +REPORT_FLAG="--output_file $REPORT" source scripts/bash/release-lib.sh diff --git a/packages/protocol/scripts/bash/compare-git-tags.sh b/packages/protocol/scripts/bash/compare-git-tags.sh index 36aacb3b20d..0898a9129b1 100755 --- a/packages/protocol/scripts/bash/compare-git-tags.sh +++ b/packages/protocol/scripts/bash/compare-git-tags.sh @@ -2,10 +2,17 @@ # Function to print usage usage() { - echo "Usage: $0 <branch1> <branch2>" + echo "Usage: $0 [--expanded] <branch1> <branch2>" exit 1 } +# Parse optional flags +EXPANDED=false +if [ "$1" = "--expanded" ]; then + EXPANDED=true + shift +fi + # Check if the correct number of arguments are provided if [ "$#" -ne 2 ]; then usage @@ -26,8 +33,8 @@ CHANGED_FILES=$(git diff --name-only "$BRANCH1" "$BRANCH2" | grep -E '(.*/contra # Print the changed Solidity files echo "Changed Solidity files between $BRANCH1 and $BRANCH2 (excluding *.t.sol and files containing 'test'/'Test' and including only contracts or contracts-0.8 folders):" -CHANGED_FILES=$(echo "$CHANGED_FILES" | sed 's|^packages/protocol/||') -echo "$CHANGED_FILES" +CHANGED_FILES_DISPLAY=$(echo "$CHANGED_FILES" | sed 's|^packages/protocol/||') +echo "$CHANGED_FILES_DISPLAY" # Initialize an empty string for storing commits COMMITS="" @@ -37,6 +44,13 @@ for file in $CHANGED_FILES; do FILE_COMMITS=$(git log --pretty=format:"%h %s" "$BRANCH1..$BRANCH2" -- "$file") COMMITS+=$FILE_COMMITS COMMITS+="\n" + + if [ "$EXPANDED" = true ]; then + echo "" + echo "********************************************" + echo "Diff for $file" + git diff "$BRANCH1" "$BRANCH2" -- "$file" + fi done # Extract unique commits from the collected commit messages
diff --git a/packages/protocol/scripts/bash/deploy_release_contracts.sh b/packages/protocol/scripts/bash/deploy_release_contracts.sh index 2bbb7fa2b4f..fe755438cca 100755 --- a/packages/protocol/scripts/bash/deploy_release_contracts.sh +++ b/packages/protocol/scripts/bash/deploy_release_contracts.sh @@ -1,6 +1,14 @@ #!/usr/bin/env bash set -euo pipefail +# NOTE: Deprecation notice: This script became outdated during the migration from Truffle to Foundry. +# NOTE: Please check out branch 'release/core-contracts/12' to continue using this script. +# NOTE: If it breaks, please contact cLabs for porting to Foundry. +echo "Deprecation notice: This script became outdated during the migration from Truffle to Foundry. +Please check out branch 'release/core-contracts/12' to continue using this script. +If it breaks, please contact cLabs for porting to Foundry." +# TODO: exit 1 + # Deploys all grants detailed in `GRANTS_FILE` from the corresponding entity. # # Flags: @@ -45,15 +53,10 @@ done CONTRACT_ARTIFACTS_DIR="$PWD/build" -if [[ ! -d "$CONTRACT_ARTIFACTS_DIR" ]]; then - echo "Error: no contract artifacts found in $CONTRACT_ARTIFACTS_DIR. Use download-artifacts to obtain them, or build them locally." >&2 - exit 1 -fi - if !
nc -z 127.0.0.1 8545 ; then echo "Warning: port 8545 not open" >&2 fi -yarn run build:ts && \ +yarn run build:truffle-ts && \ yarn run truffle exec ./scripts/truffle/deploy_release_contracts.js \ --network $NETWORK --from $FROM --grants $GRANTS_FILE --start_gold $START_GOLD --deployed_grants $DEPLOYED_GRANTS --output_file $OUTPUT_FILE $REALLY --build_directory $CONTRACT_ARTIFACTS_DIR \ diff --git a/packages/protocol/scripts/bash/download_artifacts.sh b/packages/protocol/scripts/bash/download_artifacts.sh deleted file mode 100755 index 41f3b4d58b9..00000000000 --- a/packages/protocol/scripts/bash/download_artifacts.sh +++ /dev/null @@ -1,26 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -# Downloads contract build artifacts from GCS -# -# Flags: -# -b: Name of the bucket to download artifacts from -# -n: Name of the network to download artifacts for - -ARTIFACT_BUCKET="contract_artifacts" -NETWORK="" -while getopts 'b:n:' flag; do - case "${flag}" in - b) ARTIFACT_BUCKET="${OPTARG:-contract_artifacts}" ;; - n) NETWORK="${OPTARG}" ;; - *) error "Unexpected option ${flag}" ;; - esac -done -[ -z "$NETWORK" ] && echo "Need to set the NETWORK via the -n flag" && exit 1; - -TARBALL=.$NETWORK-$RANDOM.tar.gz - -# For some reason, unable to extract sometimes -curl https://www.googleapis.com/storage/v1/b/$ARTIFACT_BUCKET/o/$NETWORK?alt=media > $TARBALL \ - && tar -zxvf $TARBALL \ - && rm $TARBALL diff --git a/packages/protocol/scripts/bash/ganache.sh b/packages/protocol/scripts/bash/ganache.sh deleted file mode 100755 index 822adb1ff54..00000000000 --- a/packages/protocol/scripts/bash/ganache.sh +++ /dev/null @@ -1,13 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -# Runs ganache with the mnemonic used in our tests. 
- -yarn run ganache \ - --wallet.mnemonic='concert load couple harbor equip island argue ramp clarify fence smart topic' \ - --miner.defaultGasPrice=0 \ - --chain.networkId=1101 \ - --miner.blockGasLimit=20000000 \ - --wallet.defaultBalance=200000000 \ - --chain.allowUnlimitedContractSize=true \ - --chain.chainId=1 \ diff --git a/packages/protocol/scripts/bash/ganache_devchain.sh b/packages/protocol/scripts/bash/ganache_devchain.sh deleted file mode 100755 index 174ad8adf0a..00000000000 --- a/packages/protocol/scripts/bash/ganache_devchain.sh +++ /dev/null @@ -1,25 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -DATA_DIR="" - -while getopts 'd:' flag; do - case "${flag}" in - d) DATA_DIR="${OPTARG}" ;; - *) error "Unexpected option ${flag}" ;; - esac -done - -[ -z "$DATA_DIR" ] && echo "Need to set the datadir path via the -d flag" && exit 1; - -yarn run ganache \ - --detach \ - --wallet.mnemonic='concert load couple harbor equip island argue ramp clarify fence smart topic' \ - --miner.defaultGasPrice=0 \ - --miner.blockGasLimit=20000000 \ - --wallet.defaultBalance=200000000 \ - --chain.networkId=1101 \ - --chain.allowUnlimitedContractSize=true \ - --chain.chainId=1 \ - --chain.hardfork='istanbul' \ - --database.dbPath=$DATA_DIR \ diff --git a/packages/protocol/scripts/bash/generate-old-devchain-and-build.sh b/packages/protocol/scripts/bash/generate-old-devchain-and-build.sh deleted file mode 100755 index 373edb93ed5..00000000000 --- a/packages/protocol/scripts/bash/generate-old-devchain-and-build.sh +++ /dev/null @@ -1,77 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -source ./scripts/bash/utils.sh - -# Generates a local network of a target git ref -# -# Flags: -# -b: Branch containing smart contracts that currently comprise the Celo protocol -# -l: Path to a file to which logs should be appended - -BRANCH="" -NETWORK="" -FORNO="" -BUILD_DIR="" -LOG_FILE="/dev/null" -GRANTS_FILE="" - -while getopts ':b:rl:d:g:' flag; do - case "${flag}" in - b) 
BRANCH="${OPTARG}" ;; - l) LOG_FILE="${OPTARG}" ;; - d) BUILD_DIR="${OPTARG}" ;; - g) GRANTS_FILE="${OPTARG}";; - *) error "Unexpected option ${flag}" ;; - esac -done - -[ -z "$BRANCH" ] && echo "Need to set the branch via the -b flag" && exit 1; - - -REMOTE_URL=$(git remote get-url origin) - - -# Create temporary directory -TMP_DIR=$(mktemp -d) -echo "Using temporary directory $TMP_DIR" - -[ -z "$BUILD_DIR" ] && BUILD_DIR=$(echo "build/$(echo $BRANCH | sed -e 's/\//_/g')"); - -echo "Using build directory $BUILD_DIR" -rm -rf $BUILD_DIR && mkdir -p $BUILD_DIR -BUILD_DIR_ABOSLUTE=$(cd "$BUILD_DIR" && pwd || echo "Error: Failed to find directory") -echo "ABSOLUTE BUILD DIR: $BUILD_DIR_ABOSLUTE" - - -echo "- Checkout source code at $BRANCH" and remote url $REMOTE_URL -# Clone the repository into the temporary directory -git clone $REMOTE_URL "$TMP_DIR/repo" --branch "$BRANCH" --single-branch -cd "$TMP_DIR/repo" - -# Redirection of logs -exec 2>>$LOG_FILE >> $LOG_FILE - -echo "- Build monorepo (contract artifacts, migrations, + all dependencies)" - -# Here, replace the 'yarn' commands as necessary to work within the temp directory structure -yarn run reset -yarn install -yarn run clean -RELEASE_TAG="" yarn build -cd packages/protocol - -echo "- Create local network" -if [ -z "$GRANTS_FILE" ]; then - yarn devchain generate-tar "$PWD/devchain.tar.gz" -else - yarn devchain generate-tar "$PWD/devchain.tar.gz" --release_gold_contracts "$GRANTS_FILE" -fi - -echo "moving contracts from build/contracts to $BUILD_DIR_ABOSLUTE" -mv build/contracts $BUILD_DIR_ABOSLUTE -echo "moving $PWD/devchain.tar.gz to $BUILD_DIR_ABOSLUTE" -mv "$PWD/devchain.tar.gz" $BUILD_DIR_ABOSLUTE/. 
- -echo "removing tmp directory $TMP_DIR" -rm -rf "$TMP_DIR" diff --git a/packages/protocol/scripts/bash/govern.sh b/packages/protocol/scripts/bash/govern.sh deleted file mode 100755 index f257f085b02..00000000000 --- a/packages/protocol/scripts/bash/govern.sh +++ /dev/null @@ -1,30 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -# Allows users submit (and possibly execute) MultiSig transactions. -# -# Flags: -# -c: Function to run, e.g. stableToken.setMinter(0x1234) -# -n: Name of the network to govern - -CMD="" -NETWORK="" - -while getopts 'c:n:' flag; do - case "${flag}" in - c) CMD="${OPTARG}" ;; - n) NETWORK="$OPTARG" ;; - *) error "Unexpected option ${flag}" ;; - esac -done - -[ -z "$NETWORK" ] && echo "Need to set the NETWORK via the -n flag" && exit 1; - -if ! nc -z 127.0.0.1 8545 ; then - echo "Port 8545 not open" - exit 1 -fi - -yarn run build && \ -yarn run truffle exec ./scripts/truffle/govern.js \ - --network $NETWORK --build_directory $PWD/build/$NETWORK --command "$CMD" diff --git a/packages/protocol/scripts/bash/init_network.sh b/packages/protocol/scripts/bash/init_network.sh deleted file mode 100755 index 1461658b7df..00000000000 --- a/packages/protocol/scripts/bash/init_network.sh +++ /dev/null @@ -1,9 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -# Runs all truffle migrations in protocol/migrations/ -# -# Flags: -# -n: Name of the network to migrate to - -yarn run migrate -r "$@" && yarn run set_block_gas_limit "$@" diff --git a/packages/protocol/scripts/bash/make-release.sh b/packages/protocol/scripts/bash/make-release.sh index 73e688c9e04..4775fd22125 100755 --- a/packages/protocol/scripts/bash/make-release.sh +++ b/packages/protocol/scripts/bash/make-release.sh @@ -7,7 +7,7 @@ set -euo pipefail # Flags: # -n: The network to deploy to. # -b: Branch to build contracts from. -# -p: Path that the governance proposal should be written to. +# -p: Deprecated. Proposal path is auto-generated as proposal-$NETWORK-$BRANCH.json. 
# -i: Path to the data needed to initialize contracts. # -r: Path to the contract compatibility report. # -d: Whether to dry-run this deploy @@ -42,14 +42,27 @@ done [ -z "$NETWORK" ] && echo "Need to set the NETWORK via the -n flag" && exit 1; [ -z "$BRANCH" ] && echo "Need to set the build branch via the -b flag" && exit 1; -[ -z "$PROPOSAL" ] && echo "Need to set the proposal outfile via the -p flag" && exit 1; [ -z "$INITIALIZE_DATA" ] && echo "Need to set the initialization data via the -i flag" && exit 1; [ -z "$REPORT" ] && echo "Need to set the compatibility report input via the -r flag" && exit 1; [ -z "$LIBRARIES" ] && echo "Need to set the library mapping input via the -l flag" && exit 1; +if [ -n "$PROPOSAL" ]; then + echo "Error: -p no longer accepts a path. Proposal name is now generated automatically as proposal-\$NETWORK-\$BRANCH.json." >&2 + echo "See: https://github.com/celo-org/celo-monorepo/pull/11662" >&2 + exit 1 +fi +PROPOSAL="proposal-$NETWORK-$BRANCH.json" + +source scripts/bash/validate-libraries-filename.sh +validate_libraries_filename "$LIBRARIES" "$NETWORK" "$BRANCH" + +source scripts/bash/validate-libraries-bytecode.sh +validate_libraries_bytecode "$LIBRARIES" "$(get_forno_url "$NETWORK")" + source scripts/bash/release-lib.sh build_tag $BRANCH "/dev/stdout" + yarn run truffle exec ./scripts/truffle/make-release.js \ --network $NETWORK \ --build_directory $BUILD_DIR \ @@ -58,4 +71,4 @@ yarn run truffle exec ./scripts/truffle/make-release.js \ --proposal $PROPOSAL \ --from $FROM \ --branch $BRANCH \ - --initialize_data $INITIALIZE_DATA $DRYRUN $FORNO + --initialize_data $INITIALIZE_DATA $DRYRUN $FORNO \ No newline at end of file diff --git a/packages/protocol/scripts/bash/migrate.sh b/packages/protocol/scripts/bash/migrate.sh deleted file mode 100755 index 22c358bbafe..00000000000 --- a/packages/protocol/scripts/bash/migrate.sh +++ /dev/null @@ -1,42 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -# Runs unmigrated truffle migrations 
in protocol/migrations/ -# -# Flags: -# -n: Name of the network to migrate to -# -r: Reset network state by running all migrations - -TRUFFLE_OVERRIDE="" -MIGRATION_OVERRIDE="" -NETWORK="" -RESET="" -# https://github.com/trufflesuite/truffle-migrate/blob/develop/index.js#L161 -# Default to larger than the number of contracts we will ever have - -while getopts 'n:rt:f:c:m:' flag; do - case "${flag}" in - n) NETWORK="$OPTARG" ;; - r) RESET="--reset" ;; - t) TO="$OPTARG" ;; - f) FROM="$OPTARG" ;; - c) TRUFFLE_OVERRIDE="$OPTARG" ;; - m) MIGRATION_OVERRIDE="$OPTARG" ;; - *) error "Unexpected option ${flag}" ;; - esac -done - -[ -z "$NETWORK" ] && echo "Need to set the NETWORK via the -n flag" && exit 1; - -if ! nc -z 127.0.0.1 8545 ; then - echo "Port 8545 not open" - exit 1 -fi - -yarn run build && \ -echo "Migrating contracts migrations${FROM:+ from number $FROM}${TO:+ up to number $TO}" && \ -yarn run truffle migrate --compile-all --network $NETWORK --build_directory $PWD/build/$NETWORK $RESET \ - ${TO:+ --to $TO} \ - ${FROM:+ -f $FROM} \ - --truffle_override "$TRUFFLE_OVERRIDE" \ - --migration_override "$MIGRATION_OVERRIDE" diff --git a/packages/protocol/scripts/bash/quicktest.sh b/packages/protocol/scripts/bash/quicktest.sh deleted file mode 100755 index 35ad0785f07..00000000000 --- a/packages/protocol/scripts/bash/quicktest.sh +++ /dev/null @@ -1,14 +0,0 @@ -#!/usr/bin/env bash - -# Use this script if you want quick test iterations. -# Compared to the normal test command, this script will: -# 1. not run the pretest script of building solidity (will still be run as part of truffle test) -# and compiling typescript. This works because truffle can run typescript "natively". -# 2. 
only migrate selected migrations as set in `backupmigrations.sh` (you'll likely need at -# least one compilation step since truffle seems to only run compiled migrations) -# - -rm test/**/*.js -./scripts/bash/backupmigrations.sh -node runTests.js "$@" -./scripts/bash/backupmigrations.sh diff --git a/packages/protocol/scripts/bash/release-lib.sh b/packages/protocol/scripts/bash/release-lib.sh index af240a67f89..105c7992d1c 100644 --- a/packages/protocol/scripts/bash/release-lib.sh +++ b/packages/protocol/scripts/bash/release-lib.sh @@ -19,19 +19,82 @@ function build_tag() { echo " - Checkout contracts source code at $BRANCH" BUILD_DIR=$(echo build/$(echo $BRANCH | sed -e 's/\//_/g')) [ -d contracts ] && rm -r contracts + [ -d contracts-0.8 ] && rm -r contracts-0.8 + + git restore --source $BRANCH contracts* 2>>$LOG_FILE >> $LOG_FILE - # this remove is necesary because when bringing a contracts folder from previous commit - # if a folder didn't exist in the past, git will not remove the current one - # trying to compile it and leading to potental build errors - rm -rf contracts* - git checkout $BRANCH -- contracts* 2>>$LOG_FILE >> $LOG_FILE if [ ! -d $BUILD_DIR ]; then echo " - Build contract artifacts at $BUILD_DIR" - BUILD_DIR=$BUILD_DIR yarn build:sol >> $LOG_FILE + BUILD_DIR=$BUILD_DIR yarn build:truffle-sol >> $LOG_FILE else echo " - Contract artifacts already built at $BUILD_DIR" fi [ -d contracts ] && rm -r contracts - git checkout $CURRENT_HASH -- contracts 2>>$LOG_FILE >> $LOG_FILE -} \ No newline at end of file + [ -d contracts-0.8 ] && rm -r contracts-0.8 + git restore --source $CURRENT_HASH --staged --worktree contracts* 2>>$LOG_FILE >> $LOG_FILE +} + +function checkout_build_sources() { + local BUILD_SOURCES="contracts contracts-0.8 test-sol foundry.toml remappings.txt" + local FROM=$1 + local LOG_FILE=$2 + # The third argument is optional. We temporarily allow unset variables. 
+ set +u + local STAGE=$3 + set -u + local FLAGS= + + if [[ $STAGE == "-s" ]]; then + FLAGS="--staged --worktree" + fi + + rm -rf $BUILD_SOURCES + git restore --source $FROM $FLAGS $BUILD_SOURCES 2>>$LOG_FILE >> $LOG_FILE +} + +# USAGE: build_tag_foundry <branch> <log_file> [profile] [config] +# This function: +# 1. checks out the given branch +# 2. builds contracts with Foundry +# 3. returns to original branch +# piping output of any commands to the specified log file. +# Sets $BUILD_DIR to the directory where resulting build artifacts may be found. +function build_tag_foundry() { + local BRANCH="$1" + local LOG_FILE="$2" + # Temporarily allow unset variables to handle optional parameters. + set +u + local PROFILE="$3" + local CONFIG="$4" + set -u + + local RELEASE_NUMBER=$(echo "$BRANCH" | grep -o 'v[0-9]\+' | tr -dc '0-9') + + echo "Writing logs to $LOG_FILE" + + local CURRENT_HASH=`git log -n 1 --oneline | cut -c 1-9` + + git fetch origin +'refs/tags/core-contracts.v*:refs/tags/core-contracts.v*' >> $LOG_FILE + echo " - Checkout contracts source code at $BRANCH" + BUILD_DIR=$(echo out-$(echo $BRANCH | sed -e 's/\//_/g')) + if [[ -n "$PROFILE" ]]; then + BUILD_DIR=${BUILD_DIR}-$PROFILE + fi + + checkout_build_sources $BRANCH $LOG_FILE + + if [[ -n "$CONFIG" ]]; then + cp "$CONFIG" foundry.toml + fi + + if [ !
-d $BUILD_DIR ]; then + echo " - Build contract artifacts at $BUILD_DIR" + export FOUNDRY_PROFILE=$PROFILE + forge build --out $BUILD_DIR --ast >> $LOG_FILE + else + echo " - Contract artifacts already built at $BUILD_DIR" + fi + + checkout_build_sources $CURRENT_HASH $LOG_FILE -s +} diff --git a/packages/protocol/scripts/bash/release-on-devchain.sh b/packages/protocol/scripts/bash/release-on-devchain.sh index 56e465b2a03..708763d25c6 100755 --- a/packages/protocol/scripts/bash/release-on-devchain.sh +++ b/packages/protocol/scripts/bash/release-on-devchain.sh @@ -2,6 +2,7 @@ set -euo pipefail source ./scripts/bash/utils.sh +source ./scripts/foundry/constants.sh # Simulates a release of the current contracts against a target git ref on a local network # @@ -21,30 +22,26 @@ while getopts 'b:l:d:' flag; do esac done -[ -z "$BRANCH" ] && echo "Need to set the branch via the -b flag" && exit 1; - -# if BUILD_DIR was not set as a parameter, we generate the build and the chain for that specific branch -if [ -z "$BUILD_DIR" ] -then - RE_BUILD_REPO="yes" - BUILD_DIR=$(echo build/$(echo $BRANCH | sed -e 's/\//_/g')) -fi +[ -z "$BRANCH" ] && echo "Need to set the branch via the -b flag" && exit 1; echo "- Run local network" -yarn devchain run-tar-in-bg packages/protocol/$BUILD_DIR/devchain.tar.gz +./scripts/foundry/start_anvil.sh -p $ANVIL_PORT -l .tmp/devchain/l2-devchain.json -GANACHE_PID= if command -v lsof; then - GANACHE_PID=`lsof -i tcp:8545 | tail -n 1 | awk '{print $2}'` - echo "Network started with PID $GANACHE_PID, if exit 1, you will need to manually stop the process" + ANVIL_PID=`lsof -i tcp:$ANVIL_PORT | tail -n 1 | awk '{print $2}'` + echo "Network started with PID $ANVIL_PID, if exit 1, you will need to manually stop the process" fi echo "- Verify bytecode of the network" -yarn run truffle exec ./scripts/truffle/verify-bytecode.js --network development --build_artifacts $BUILD_DIR/contracts --build_artifacts08 $BUILD_DIR/contracts-0.8 --branch $BRANCH 
--librariesFile libraries.json + +# this command compiles the output +yarn --cwd packages/protocol release:verify-deployed -n anvil -b $BRANCH + echo "- Check versions of current branch" + # From check-versions.sh BASE_COMMIT=$(git rev-parse HEAD) @@ -53,7 +50,11 @@ echo " - Checkout migrationsConfig.js at $BRANCH" git checkout $BRANCH -- migrationsConfig.js source scripts/bash/contract-exclusion-regex.sh -yarn ts-node scripts/check-backward.ts sem_check --old_contracts $BUILD_DIR/contracts --new_contracts build/contracts --exclude $CONTRACT_EXCLUSION_REGEX --new_branch $BRANCH --output_file report.json + +CURRENT_BRANCH=$(git rev-parse --abbrev-ref HEAD) +REPORT="report-$BRANCH-$CURRENT_BRANCH.json" + +yarn ts-node scripts/check-backward.ts sem_check --old_contracts $BUILD_DIR/contracts --new_contracts build/contracts --exclude $CONTRACT_EXCLUSION_REGEX --new_branch $BRANCH --output_file $REPORT echo "- Clean git modified file" git restore migrationsConfig.js @@ -62,12 +63,15 @@ git restore migrationsConfig.js # From make-release.sh echo "- Deploy release of current branch" INITIALIZATION_FILE=`ls releaseData/initializationData/release*.json | sort -V | tail -n 1 | xargs realpath` -yarn truffle exec --network development ./scripts/truffle/make-release.js --build_directory build/ --branch $BRANCH --report report.json --proposal proposal.json --librariesFile libraries.json --initialize_data $INITIALIZATION_FILE +LIBRARIES_FILE="anvil-$BRANCH-libraries.json" +PROPOSAL="proposal-anvil-$BRANCH.json" +yarn truffle exec --network anvil ./scripts/truffle/make-release.js --build_directory build/ --branch $BRANCH --report $REPORT --proposal $PROPOSAL --librariesFile $LIBRARIES_FILE --initialize_data $INITIALIZATION_FILE # From verify-release.sh echo "- Verify release" -yarn truffle exec --network development ./scripts/truffle/verify-bytecode.js --build_artifacts build/contracts --proposal ../../proposal.json --branch $BRANCH --initialize_data $INITIALIZATION_FILE +yarn
truffle exec --network anvil ./scripts/truffle/verify-bytecode.js --build_artifacts build/contracts --proposal ../../$PROPOSAL --branch $BRANCH --initialize_data $INITIALIZATION_FILE + -if [[ -n $GANACHE_PID ]]; then - kill $GANACHE_PID +if [[ -n $ANVIL_PID ]]; then + kill $ANVIL_PID fi diff --git a/packages/protocol/scripts/bash/release-snapshots.sh b/packages/protocol/scripts/bash/release-snapshots.sh index c7b6a8039fe..bbaf4f4633d 100755 --- a/packages/protocol/scripts/bash/release-snapshots.sh +++ b/packages/protocol/scripts/bash/release-snapshots.sh @@ -1,10 +1,11 @@ #!/usr/bin/env bash +set -euo pipefail N=`echo -n $RELEASE_TAG | tail -c -1` for i in `eval echo {1..$N}` do - yarn check-versions \ + yarn release:check-versions \ -a "core-contracts.v$(($i - 1))" \ -b "core-contracts.v$i" \ -r "releaseData/versionReports/release$i-report.json" diff --git a/packages/protocol/scripts/bash/revoke_contracts.sh b/packages/protocol/scripts/bash/revoke_contracts.sh index 4f076545174..8f66de5cdad 100755 --- a/packages/protocol/scripts/bash/revoke_contracts.sh +++ b/packages/protocol/scripts/bash/revoke_contracts.sh @@ -1,4 +1,8 @@ #! 
/usr/bin/env bash +# Revokes a ReleaseGold contract + set -euo pipefail rejects=() diff --git a/packages/protocol/scripts/bash/set_block_gas_limit.sh b/packages/protocol/scripts/bash/set_block_gas_limit.sh deleted file mode 100755 index c260ed298f8..00000000000 --- a/packages/protocol/scripts/bash/set_block_gas_limit.sh +++ /dev/null @@ -1,34 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -# Sets block gas limit and turns ownership of contract over to governance -# -# Flags: -# -n: Name of the network to migrate to - -TRUFFLE_OVERRIDE="" -MIGRATION_OVERRIDE="" -NETWORK="" -# https://github.com/trufflesuite/truffle-migrate/blob/develop/index.js#L161 -# Default to larger than the number of contracts we will ever have - -while getopts 'n:rt:f:c:m:' flag; do - case "${flag}" in - n) NETWORK="$OPTARG" ;; - c) TRUFFLE_OVERRIDE="$OPTARG" ;; - m) MIGRATION_OVERRIDE="$OPTARG" ;; - t) ;; - *) error "Unexpected option ${flag}" ;; - esac -done - -[ -z "$NETWORK" ] && echo "Need to set the NETWORK via the -n flag" && exit 1; - -if ! nc -z 127.0.0.1 8545 ; then - echo "Port 8545 not open" - exit 1 -fi - -yarn run truffle exec ./scripts/truffle/set_block_gas_limit.js \ - --network $NETWORK --build_directory $PWD/build/$NETWORK --truffle_override "$TRUFFLE_OVERRIDE" \ - --migration_override "$MIGRATION_OVERRIDE" diff --git a/packages/protocol/scripts/bash/set_exchange_rate.sh b/packages/protocol/scripts/bash/set_exchange_rate.sh deleted file mode 100755 index ec54f5aac2a..00000000000 --- a/packages/protocol/scripts/bash/set_exchange_rate.sh +++ /dev/null @@ -1,39 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -# Updates the Gold/Stable token exchange rate to the provided value, or, if a CSV of -# (timestamp, stableValue, goldValue) tuples is provided, updates the exchange rate to the value -# associated with the most recent timestamp less than or equal to the current time stamp. 
-# -# Flags: -# -f: Filepath to csv of (timestamp, stableValue, goldValue) tuples -# -n: name of the network defined in truffle-config.js to set the exchange rate on -# -s: StableToken component of exchange rate -# -g: GoldToken component of exchange rate -# -c: Override for truffle config - -NETWORK="" -FILE="" -GOLD_VALUE="" -STABLE_VALUE="" -CONFIG_OVERRIDE="" - -while getopts 'g:f:n:s:c:' flag; do - case "${flag}" in - n) NETWORK="$OPTARG" ;; - f) FILE="$OPTARG" ;; - s) STABLE_VALUE="$OPTARG" ;; - g) GOLD_VALUE="$OPTARG" ;; - c) CONFIG_OVERRIDE="$OPTARG" ;; - *) error "Unexpected option ${flag}" ;; - esac -done - -[ -z "$NETWORK" ] && echo "Need to set the NETWORK via the -n flag" && exit 1; - -yarn run download-artifacts -n $NETWORK && \ -yarn run build && \ -yarn run truffle exec ./scripts/truffle/set_exchange_rate.js \ - --network $NETWORK --stableValue $STABLE_VALUE --goldValue $GOLD_VALUE \ - --build_directory $PWD/build/$NETWORK --csv $FILE \ - --config_override "$CONFIG_OVERRIDE" diff --git a/packages/protocol/scripts/bash/simulate_proposal.sh b/packages/protocol/scripts/bash/simulate_proposal.sh new file mode 100755 index 00000000000..8481f8ec7cb --- /dev/null +++ b/packages/protocol/scripts/bash/simulate_proposal.sh @@ -0,0 +1,108 @@ +#!/usr/bin/env bash +set -euo pipefail + +# Disable sync check for celocli commands (anvil fork won't pass sync checks) +export NO_SYNCCHECK=true + +# Simulates a governance proposal on a forked anvil instance. +# +# Flags: +# -p: Path to the proposal JSON file. +# -n: Network name (e.g., rc1, celo-sepolia). + +PROPOSAL="" +NETWORK="" + +while getopts 'p:n:' flag; do + case "${flag}" in + p) PROPOSAL="${OPTARG}" ;; + n) NETWORK="${OPTARG}" ;; + *) echo "Unexpected option ${flag}" >&2; exit 1 ;; + esac +done + +[ -z "$PROPOSAL" ] && echo "Need to set the proposal path via the -p flag" && exit 1; +[ -z "$NETWORK" ] && echo "Need to set the network via the -n flag" && exit 1; +[ ! 
-f "$PROPOSAL" ] && echo "Proposal file '$PROPOSAL' not found" && exit 1; + +# Fetch all network metadata as JSON +NETWORK_INFO=$(yarn --silent ts-node scripts/ts/network-info.ts "$NETWORK") +RPC_URL=$(echo "$NETWORK_INFO" | jq -r '.rpcUrl') +PROPOSER=$(echo "$NETWORK_INFO" | jq -r '.proposer') +APPROVER=$(echo "$NETWORK_INFO" | jq -r '.approver') +VOTER=$(echo "$NETWORK_INFO" | jq -r '.voter') + +echo "Network: $NETWORK" +echo "RPC URL: $RPC_URL" +echo "Proposer: $PROPOSER" +echo "Approver: $APPROVER" +echo "Voter: $VOTER" + +# Fork the network with anvil +source scripts/foundry/constants.sh +scripts/foundry/start_anvil.sh -f "$RPC_URL" -a +ANVIL_RPC_URL=$(get_anvil_rpc_url) + +# Impersonate governance accounts so celocli can send transactions from them +# this works around the CLI trying to simulate each tx through the node before sending it +cast rpc anvil_impersonateAccount "$PROPOSER" --rpc-url "$ANVIL_RPC_URL" +cast rpc anvil_impersonateAccount "$APPROVER" --rpc-url "$ANVIL_RPC_URL" +cast rpc anvil_impersonateAccount "$VOTER" --rpc-url "$ANVIL_RPC_URL" + +echo "Anvil forked $NETWORK at $ANVIL_RPC_URL" + +# Verify the forked network is healthy +echo "Verifying Registry proxy implementation..." +REGISTRY_IMPL=$(cast call "$REGISTRY_ADDRESS" "_getImplementation() (address)" --rpc-url "$ANVIL_RPC_URL") +echo "Registry implementation: $REGISTRY_IMPL" +if [ "$REGISTRY_IMPL" = "0x0000000000000000000000000000000000000000" ]; then + echo "Error: Registry has no implementation. Fork may not be working correctly." >&2 + kill $(lsof -t -i:$ANVIL_PORT) 2>/dev/null || true + exit 1 +fi + +echo "Verifying network contracts..." 
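The fork health check above reduces to a single comparison against the zero address. A minimal offline sketch of that check follows; the addresses are hard-coded stand-ins, not live `cast call` results:

```shell
# Sketch of the fork health check: a Registry proxy that reports the zero
# address as its implementation means the fork is not serving real state.
# Both sample addresses below are stand-ins, not live chain data.
ZERO_ADDRESS="0x0000000000000000000000000000000000000000"

check_fork_health() {
  impl="$1"
  if [ -z "$impl" ] || [ "$impl" = "$ZERO_ADDRESS" ]; then
    echo "unhealthy"
  else
    echo "healthy"
  fi
}

check_fork_health "$ZERO_ADDRESS"                              # prints "unhealthy"
check_fork_health "0x000000000000000000000000000000000000ce10" # prints "healthy"
```

In the script itself the argument comes from `cast call "$REGISTRY_ADDRESS" "_getImplementation() (address)"` against the anvil fork.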
+CONTRACTS_OUTPUT=$(celocli network:contracts -n "$ANVIL_RPC_URL") +echo "$CONTRACTS_OUTPUT" + +GOVERNANCE_ADDRESS=$(echo "$CONTRACTS_OUTPUT" | awk '/Governance [^S]/{print $2}') +echo "Governance contract: $GOVERNANCE_ADDRESS" + +MIN_DEPOSIT=$(cast call "$GOVERNANCE_ADDRESS" "minDeposit()(uint256)" --rpc-url "$ANVIL_RPC_URL" --json | jq -r '.[0]') +echo "Min deposit: $MIN_DEPOSIT" + +celocli governance:withdraw --from="$PROPOSER" -n "$ANVIL_RPC_URL" || echo "no existing deposit to withdraw" + +# Propose +echo "Simulating proposal $PROPOSAL on $ANVIL_RPC_URL..." +PROPOSE_OUTPUT=$(celocli governance:propose --jsonTransactions="$PROPOSAL" --from="$PROPOSER" --deposit="$MIN_DEPOSIT" --descriptionURL="https://github.com/celo-org/governance/blob/main/CGPs/TEST" -n "$ANVIL_RPC_URL" 2>&1 | tee /dev/stderr) +PROPOSAL_ID=$(echo "$PROPOSE_OUTPUT" | grep "proposalId:" | awk '{print $2}') +echo "Proposal ID: $PROPOSAL_ID" + +# Approve +celocli governance:approve --proposalID="$PROPOSAL_ID" --from="$APPROVER" -n "$ANVIL_RPC_URL" && \ +echo "Proposal approved" + +# Vote +echo "Voting yes on proposal $PROPOSAL_ID..." 
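The proposal ID is recovered by pattern-matching the CLI's log output rather than a structured result. A small offline sketch of that extraction; `SAMPLE_OUTPUT` is fabricated for illustration and the real celocli output format may differ between versions:

```shell
# Extract a proposal ID from captured CLI output by matching the
# "proposalId:" label, as simulate_proposal.sh does above.
# SAMPLE_OUTPUT is a fabricated stand-in for real celocli output.
SAMPLE_OUTPUT="Running Checks:
   Value of proposal
proposalId: 42
Deposit will be refunded after execution"

PROPOSAL_ID=$(echo "$SAMPLE_OUTPUT" | grep "proposalId:" | awk '{print $2}')
echo "Proposal ID: $PROPOSAL_ID"   # prints "Proposal ID: 42"
```

Matching on the label rather than a line position keeps the extraction tolerant of extra log lines before or after the proposal summary.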
+celocli governance:vote --value=Yes --from="$VOTER" --proposalID="$PROPOSAL_ID" -n "$ANVIL_RPC_URL" 2>&1 | tee /dev/stderr +echo "Proposal voted" + +# Fast-forward past the referendum period +REFERENDUM_DURATION=$(cast call "$GOVERNANCE_ADDRESS" "getReferendumStageDuration()(uint256)" --rpc-url "$ANVIL_RPC_URL" --json | jq -r '.[0]') +echo "Referendum stage duration: $REFERENDUM_DURATION seconds" +cast rpc evm_increaseTime $((REFERENDUM_DURATION + 1)) --rpc-url "$ANVIL_RPC_URL" +cast rpc evm_mine --rpc-url "$ANVIL_RPC_URL" + +# Execute +celocli governance:execute --from="$VOTER" --proposalID="$PROPOSAL_ID" -n "$ANVIL_RPC_URL" +echo "Proposal executed" + +# Sanity-check the Governance contract +celocli governance:withdraw --from="$PROPOSER" -n "$ANVIL_RPC_URL" +# propose just as a test +celocli governance:propose --jsonTransactions="$PROPOSAL" --from="$PROPOSER" --deposit="$MIN_DEPOSIT" --descriptionURL="https://github.com/celo-org/governance/blob/main/CGPs/TEST" -n "$ANVIL_RPC_URL" + +# Cleanup +kill $(lsof -t -i:$ANVIL_PORT) 2>/dev/null || true +echo "Done." diff --git a/packages/protocol/scripts/bash/transfer.sh b/packages/protocol/scripts/bash/transfer.sh deleted file mode 100755 index 9c8dfdbe378..00000000000 --- a/packages/protocol/scripts/bash/transfer.sh +++ /dev/null @@ -1,36 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -# Transfers StableToken and Gold balances to an account -# -# Flags: -# -a: Address of the account who will receive the transfer -# -n: Name of the network to increment balances on - -ACCOUNT="" -NETWORK="" -DOLLARS="0" -GOLD="0" - -while getopts 'a:n:d:g:' flag; do - case "${flag}" in - a) ACCOUNT="$OPTARG" ;; - n) NETWORK="$OPTARG" ;; - d) DOLLARS="$OPTARG" ;; - g) GOLD="$OPTARG" ;; - *) error "Unexpected option ${flag}" ;; - esac -done - -[ -z "$NETWORK" ] && echo "Need to set the NETWORK via the -n flag" && exit 1; -[ -z "$ACCOUNT" ] && echo "Need to set the ACCOUNT via the -a flag" && exit 1; - -if ! 
nc -z 127.0.0.1 8545 ; then - echo "Port 8545 not open" - exit 1 -fi - -yarn run build && \ -yarn run truffle exec ./scripts/truffle/transfer.js \ - --network $NETWORK --stableValue $DOLLARS --goldValue $GOLD \ - --build_directory $PWD/build/$NETWORK --to $ACCOUNT diff --git a/packages/protocol/scripts/bash/upload_artifacts.sh b/packages/protocol/scripts/bash/upload_artifacts.sh deleted file mode 100755 index 10e82c71bf9..00000000000 --- a/packages/protocol/scripts/bash/upload_artifacts.sh +++ /dev/null @@ -1,25 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -# Uploads contract build artifacts to GCS -# -# Flags: -# -b: Name of the bucket to upload artifacts to -# -n: Name of the network to upload artifacts for - -ARTIFACT_BUCKET="contract_artifacts" -NETWORK="" -while getopts 'b:n:' flag; do - case "${flag}" in - b) ARTIFACT_BUCKET="${OPTARG:-contract_artifacts}" ;; - n) NETWORK="${OPTARG}" ;; - *) error "Unexpected option ${flag}" ;; - esac -done -[ -z "$NETWORK" ] && echo "Need to set the NETWORK via the -n flag" && exit 1; - -TARBALL=$NETWORK.tar.gz - -tar -zcvf $TARBALL build/$NETWORK && \ - gsutil cp $TARBALL gs://$ARTIFACT_BUCKET/$NETWORK && \ - rm $TARBALL diff --git a/packages/protocol/scripts/bash/validate-libraries-bytecode.sh b/packages/protocol/scripts/bash/validate-libraries-bytecode.sh new file mode 100755 index 00000000000..3929cc637db --- /dev/null +++ b/packages/protocol/scripts/bash/validate-libraries-bytecode.sh @@ -0,0 +1,59 @@ +#!/usr/bin/env bash +# Validates that all addresses in a libraries JSON file have deployed bytecode. +# +# Usage: validate_libraries_bytecode <libraries-file> <rpc-url> + +# Resolves the forno RPC URL for a network via scripts/ts/network-info.ts +get_forno_url() { + local NETWORK="$1" + local URL + URL=$(yarn --silent ts-node scripts/ts/network-info.ts "$NETWORK" | jq -r '.rpcUrl') + + if [ $? 
-ne 0 ] || [ -z "$URL" ]; then + echo "Error: Could not resolve forno URL for network '$NETWORK'" >&2 + exit 1 + fi + + echo "$URL" +} + +validate_libraries_bytecode() { + local LIBRARIES="$1" + local RPC_URL="$2" + + if [ ! -f "$LIBRARIES" ]; then + echo "Error: Libraries file '$LIBRARIES' not found." >&2 + exit 1 + fi + + if ! command -v cast &> /dev/null; then + echo "Error: 'cast' command not found. Please install Foundry." >&2 + exit 1 + fi + + echo "Validating bytecode for libraries in $LIBRARIES using RPC $RPC_URL..." + + local HAS_ERROR=false + + for NAME in $(jq -r 'keys[]' "$LIBRARIES"); do + local ADDRESS + ADDRESS="0x$(jq -r --arg name "$NAME" '.[$name]' "$LIBRARIES")" + + local CODE + CODE=$(cast code "$ADDRESS" --rpc-url "$RPC_URL" 2>/dev/null) + + if [ "$CODE" = "0x" ] || [ -z "$CODE" ]; then + echo "Error: Library '$NAME' at $ADDRESS has no bytecode." >&2 + HAS_ERROR=true + else + echo " $NAME ($ADDRESS): OK" + fi + done + + if [ "$HAS_ERROR" = true ]; then + echo "Error: One or more libraries have no deployed bytecode. Aborting." >&2 + exit 1 + fi + + echo "All libraries have deployed bytecode." +} diff --git a/packages/protocol/scripts/bash/validate-libraries-filename.sh b/packages/protocol/scripts/bash/validate-libraries-filename.sh new file mode 100755 index 00000000000..27043f3996a --- /dev/null +++ b/packages/protocol/scripts/bash/validate-libraries-filename.sh @@ -0,0 +1,35 @@ +#!/usr/bin/env bash +# Validates that a libraries file is from the previous release. +# Expected naming convention: $NETWORK-$PREVIOUS_BRANCH-libraries.json +# where PREVIOUS_BRANCH has version number N-1 relative to the current branch. 
+# +# Usage: validate_libraries_filename <libraries-file> <network> <branch> + +validate_libraries_filename() { + local LIBRARIES="$1" + local NETWORK="$2" + local BRANCH="$3" + local VERSION_NUMBER + VERSION_NUMBER=$(echo "$BRANCH" | grep -o 'v[0-9]\+' | tr -dc '0-9') + + if [ -z "$VERSION_NUMBER" ] || [ "$VERSION_NUMBER" -lt 1 ]; then + echo "Error: Could not extract a valid version number from branch '$BRANCH'." >&2 + echo "Branch must match the pattern *vN (e.g., core-contracts.v15)." >&2 + exit 1 + fi + + local PREVIOUS_VERSION=$((VERSION_NUMBER - 1)) + local PREVIOUS_BRANCH + PREVIOUS_BRANCH=$(echo "$BRANCH" | sed "s/v${VERSION_NUMBER}/v${PREVIOUS_VERSION}/") + + local EXPECTED="$NETWORK-$PREVIOUS_BRANCH-libraries.json" + local ACTUAL + ACTUAL=$(basename "$LIBRARIES") + + if [ "$ACTUAL" != "$EXPECTED" ]; then + echo "Error: Libraries file name '$ACTUAL' does not match expected format '$EXPECTED'." >&2 + echo "The libraries file must be from the previous release (v$PREVIOUS_VERSION), not the current one (v$VERSION_NUMBER)." >&2 + exit 1 + fi + +} diff --git a/packages/protocol/scripts/bash/verify-deployed-forge.sh b/packages/protocol/scripts/bash/verify-deployed-forge.sh new file mode 100755 index 00000000000..d8969904a94 --- /dev/null +++ b/packages/protocol/scripts/bash/verify-deployed-forge.sh @@ -0,0 +1,43 @@ +#!/usr/bin/env bash +set -euo pipefail + +# Checks that Solidity sources on a given branch correspond to the bytecodes +# deployed on the given network. 
+# +# Flags: +# -b: Branch containing smart contracts that currently comprise the Celo protocol +# -n: The network to check +# -f: Boolean flag to indicate if the Forno service should be used to connect to +# the network +# -l: Path to a file to which logs should be appended + +BRANCH="" +NETWORK="" +FORNO="" +LOG_FILE="/dev/stdout" + +while getopts 'b:n:fl:' flag; do + case "${flag}" in + b) BRANCH="${OPTARG}" ;; + n) NETWORK="${OPTARG}" ;; + f) FORNO="--forno" ;; + l) LOG_FILE="${OPTARG}" ;; + *) echo "Unexpected option ${flag}" >&2; exit 1 ;; + esac +done + +[ -z "$BRANCH" ] && echo "Need to set the branch via the -b flag" && exit 1; +[ -z "$NETWORK" ] && echo "Need to set the NETWORK via the -n flag" && exit 1; + +source scripts/bash/release-lib.sh +source scripts/bash/warn-if-libraries-exist.sh +warn_if_libraries_exist "$NETWORK-$BRANCH-libraries.json" + +cp foundry.toml foundry.toml.bak + +build_tag_foundry $BRANCH $LOG_FILE truffle-compat foundry.toml.bak +build_tag_foundry $BRANCH $LOG_FILE truffle-compat8 foundry.toml.bak + +mv foundry.toml.bak foundry.toml + +TS_NODE_CACHE=false yarn ts-node --preferTsExts ./scripts/foundry/verify-bytecode-foundry.ts --network $NETWORK --branch $BRANCH --librariesFile "$NETWORK-$BRANCH-libraries.json" $FORNO diff --git a/packages/protocol/scripts/bash/verify-deployed.sh b/packages/protocol/scripts/bash/verify-deployed.sh index b358b0ff103..3e6e5224d9b 100755 --- a/packages/protocol/scripts/bash/verify-deployed.sh +++ b/packages/protocol/scripts/bash/verify-deployed.sh @@ -30,6 +30,9 @@ done [ -z "$NETWORK" ] && echo "Need to set the NETWORK via the -n flag" && exit 1; source scripts/bash/release-lib.sh +source scripts/bash/warn-if-libraries-exist.sh +warn_if_libraries_exist "$NETWORK-$BRANCH-libraries.json" + build_tag $BRANCH $LOG_FILE if [ "$BRANCH" = "core-contracts.v10" ]; then @@ -59,4 +62,4 @@ if [ "$BRANCH" = "core-contracts.v10" ]; then cd ../../../ fi -yarn run truffle exec ./scripts/truffle/verify-bytecode.js --network 
$NETWORK --build_artifacts $BUILD_DIR/contracts --branch $BRANCH --librariesFile "libraries.json" $FORNO +yarn run truffle exec ./scripts/truffle/verify-bytecode.js --network $NETWORK --build_artifacts $BUILD_DIR/contracts --branch $BRANCH --librariesFile "$NETWORK-$BRANCH-libraries.json" $FORNO diff --git a/packages/protocol/scripts/bash/warn-if-libraries-exist.sh b/packages/protocol/scripts/bash/warn-if-libraries-exist.sh new file mode 100755 index 00000000000..2657057185e --- /dev/null +++ b/packages/protocol/scripts/bash/warn-if-libraries-exist.sh @@ -0,0 +1,17 @@ +#!/usr/bin/env bash +# Warns the user if a libraries file already exists and will be overwritten. +# +# Usage: warn_if_libraries_exist <libraries-file> + +warn_if_libraries_exist() { + local LIBRARIES="$1" + + if [ -f "$LIBRARIES" ]; then + echo "Warning: Libraries file '$LIBRARIES' already exists and will be overwritten." >&2 + read -r -p "Are you sure you want to continue? [y/N] " response + if [[ ! "$response" =~ ^[Yy]$ ]]; then + echo "Aborted." 
>&2 + exit 1 + fi + fi +} diff --git a/packages/protocol/scripts/check-backward.ts b/packages/protocol/scripts/check-backward.ts index b671a25492b..a0c33e91a4f 100644 --- a/packages/protocol/scripts/check-backward.ts +++ b/packages/protocol/scripts/check-backward.ts @@ -2,7 +2,12 @@ import { ASTContractVersionsChecker } from '@celo/protocol/lib/compatibility/ast import { DefaultCategorizer } from '@celo/protocol/lib/compatibility/categorizer' import { getReleaseVersion } from '@celo/protocol/lib/compatibility/ignored-contracts-v9' import { CategorizedChanges } from '@celo/protocol/lib/compatibility/report' -import { ASTBackwardReport, instantiateArtifacts } from '@celo/protocol/lib/compatibility/utils' +import { + ASTBackwardReport, + instantiateArtifacts, + instantiateArtifactsFromForge, +} from '@celo/protocol/lib/compatibility/utils' +import { BuildArtifacts } from '@openzeppelin/upgrades' import { writeJsonSync } from 'fs-extra' import path from 'path' import tmp from 'tmp' @@ -34,6 +39,11 @@ const argv = yargs type: 'string', demandOption: true, }) + .option('forge', { + description: 'Specifies that Forge artifacts are provided, rather than Truffle artifacts', + type: 'boolean', + default: false, + }) .option('output_file', { alias: 'f', description: 'Destination file output for the compatibility report', @@ -56,34 +66,94 @@ const argv = yargs .demandCommand() .strict().argv -// old artifacts folder needs to be generalized https://github.com/celo-org/celo-monorepo/issues/10567 -const oldArtifactsFolder = path.relative(process.cwd(), argv.old_contracts) -const oldArtifactsFolder08 = path.relative(process.cwd(), argv.old_contracts + '-0.8') -const newArtifactsFolder = path.relative(process.cwd(), argv.new_contracts) -const newArtifactsFolder08 = path.relative(process.cwd(), argv.new_contracts + '-0.8') -const newArtifactsFolders = [newArtifactsFolder, newArtifactsFolder08] -const oldArtifactsFolders = [oldArtifactsFolder, oldArtifactsFolder08] - const out = 
(msg: string, force?: boolean): void => { if (force || !argv.quiet) { process.stdout.write(msg) } } +interface ArtifactsFolders { + old: string[] + new: string[] +} + +const getForgeArtifactsFolders = (): ArtifactsFolders => { + const oldArtifactsFolder = path.relative(process.cwd(), argv.old_contracts) + const newArtifactsFolder = path.relative(process.cwd(), argv.new_contracts) + + return { + old: [oldArtifactsFolder], + new: [newArtifactsFolder], + } +} + +const getTruffleArtifactsFolders = (): ArtifactsFolders => { + const oldArtifactsFolder = path.relative(process.cwd(), argv.old_contracts) + const oldArtifactsFolder08 = path.relative(process.cwd(), argv.old_contracts + '-0.8') + const newArtifactsFolder = path.relative(process.cwd(), argv.new_contracts) + const newArtifactsFolder08 = path.relative(process.cwd(), argv.new_contracts + '-0.8') + const newArtifactsFolders = [newArtifactsFolder, newArtifactsFolder08] + const oldArtifactsFolders = [oldArtifactsFolder, oldArtifactsFolder08] + + return { + old: oldArtifactsFolders, + new: newArtifactsFolders, + } +} + +const getArtifactFolders = (): ArtifactsFolders => { + if (argv.forge) { + return getForgeArtifactsFolders() + } else { + return getTruffleArtifactsFolders() + } +} + const outFile = argv.output_file ? argv.output_file : tmp.tmpNameSync({}) const exclude: RegExp = argv.exclude ? 
new RegExp(argv.exclude) : null -// old artifacts needs to be generalized https://github.com/celo-org/celo-monorepo/issues/10567 -const oldArtifacts = instantiateArtifacts(oldArtifactsFolder) -const oldArtifacts08 = instantiateArtifacts(oldArtifactsFolder08) -const newArtifacts = instantiateArtifacts(newArtifactsFolder) -const newArtifacts08 = instantiateArtifacts(newArtifactsFolder08) + +const artifactFolders = getArtifactFolders() + +interface BuildArtifactSets { + old: BuildArtifacts[] + new: BuildArtifacts[] +} + +const getForgeArtifactSets = (folders: ArtifactsFolders): BuildArtifactSets => { + return { + old: instantiateArtifactsFromForge(folders.old[0]), + new: instantiateArtifactsFromForge(folders.new[0]), + } +} + +const getTruffleArtifactSets = (folders: ArtifactsFolders): BuildArtifactSets => { + const oldArtifacts = instantiateArtifacts(folders.old[0]) + const oldArtifacts08 = instantiateArtifacts(folders.old[1]) + const newArtifacts = instantiateArtifacts(folders.new[0]) + const newArtifacts08 = instantiateArtifacts(folders.new[1]) + + return { + old: [oldArtifacts, oldArtifacts08], + new: [newArtifacts, newArtifacts08], + } +} + +const getArtifactSets = (folders: ArtifactsFolders): BuildArtifactSets => { + if (argv.forge) { + return getForgeArtifactSets(folders) + } else { + return getTruffleArtifactSets(folders) + } +} + +const buildArtifacts = getArtifactSets(artifactFolders) try { const backward = ASTBackwardReport.create( - oldArtifactsFolders, - newArtifactsFolders, - [oldArtifacts, oldArtifacts08], - [newArtifacts, newArtifacts08], + artifactFolders.old, + artifactFolders.new, + buildArtifacts.old, + buildArtifacts.new, exclude, new DefaultCategorizer(), out @@ -109,9 +179,10 @@ try { } else if (argv._.includes(COMMAND_SEM_CHECK)) { const doVersionCheck = async () => { const versionChecker = await ASTContractVersionsChecker.create( - [oldArtifacts, oldArtifacts08], - [newArtifacts, newArtifacts08], - backward.report.versionDeltas() + 
buildArtifacts.old, + buildArtifacts.new, + backward.report.versionDeltas(), + argv.forge ) const mismatches = versionChecker.excluding(exclude).mismatches() if (mismatches.isEmpty()) { diff --git a/packages/protocol/scripts/deploy_superbridge_weth.sh b/packages/protocol/scripts/deploy_superbridge_weth.sh new file mode 100755 index 00000000000..418c65b5a91 --- /dev/null +++ b/packages/protocol/scripts/deploy_superbridge_weth.sh @@ -0,0 +1,117 @@ +#!/bin/bash + + +# Create L2 WETH token if necessary: +# cast send 0x4200000000000000000000000000000000000012 "createOptimismMintableERC20(address,string,string)" $WETH_L1_ADDR "Wrapped ETH (Celo native bridge)" "WETH" --private-key $PRIVKEY + +if ! command -v forge &> /dev/null +then + echo "forge could not be found, please install forge." + exit 1 +fi + +NETWORK=${1:-alfajores} + +if [ -f ".env.$NETWORK" ]; then + export $(grep -v '^#' ".env.$NETWORK" | xargs) +fi + +if [ -z "$PRIVATE_KEY" ]; then + echo "Error: PRIVATE_KEY environment variable is not set." + echo "Please set it in your environment or in the .env.$NETWORK file." + exit 1 +fi + +deploy_alfajores() { + echo "Deploying to Alfajores..." + export WETH_ADDRESS_LOCAL="0x94373a4919B3240D86eA41593D5eBa789FEF3848" + export WETH_ADDRESS_REMOTE="0x4EE7Ea447197c6b7BE0ab1A068F55c74a3390F33" + # L1StandardBridgeProxy + export STANDARD_BRIDGE_ADDRESS="0xD1B0E0581973c9eB7f886967A606b9441A897037" + export RPC_URL="https://ethereum-holesky-rpc.publicnode.com" + + FORGE_ARGS="--rpc-url $RPC_URL --broadcast" + if [ -n "$ETHERSCAN_API_KEY" ]; then + FORGE_ARGS="$FORGE_ARGS --verify --etherscan-api-key $ETHERSCAN_API_KEY" + echo "Verification enabled." + else + echo "ETHERSCAN_API_KEY not set. Skipping verification." 
+ fi + + echo "Using WETH_ADDRESS_LOCAL: $WETH_ADDRESS_LOCAL" + echo "Using WETH_ADDRESS_REMOTE: $WETH_ADDRESS_REMOTE" + echo "Using STANDARD_BRIDGE_ADDRESS: $STANDARD_BRIDGE_ADDRESS" + + forge script scripts/DeploySuperbridgeWETH.s.sol:DeploySuperBridgeWETH $FORGE_ARGS + + # SuperBridgeETHWrapper deployed to holesky at: 0x78fb67119c4a055d6eb497b1aa5d09f7124225e5 +} + +deploy_baklava() { + echo "Deploying to Baklava..." + export WETH_ADDRESS_LOCAL="0x94373a4919B3240D86eA41593D5eBa789FEF3848" + export WETH_ADDRESS_REMOTE="0xBEcfCB91527166382187D5EE80ac07433D01549e" + # L1StandardBridgeProxy + export STANDARD_BRIDGE_ADDRESS="0x6fd3fF186975aD8B66Ab40b705EC016b36da0486" + + export RPC_URL="https://ethereum-holesky-rpc.publicnode.com" + + FORGE_ARGS="--rpc-url $RPC_URL --broadcast" + if [ -n "$ETHERSCAN_API_KEY" ]; then + FORGE_ARGS="$FORGE_ARGS --verify --etherscan-api-key $ETHERSCAN_API_KEY" + echo "Verification enabled." + else + echo "ETHERSCAN_API_KEY not set. Skipping verification." + fi + + echo "Using WETH_ADDRESS_LOCAL: $WETH_ADDRESS_LOCAL" + echo "Using WETH_ADDRESS_REMOTE: $WETH_ADDRESS_REMOTE" + echo "Using STANDARD_BRIDGE_ADDRESS: $STANDARD_BRIDGE_ADDRESS" + + forge script scripts/DeploySuperbridgeWETH.s.sol:DeploySuperBridgeWETH $FORGE_ARGS + # SuperBridgeETHWrapper deployed to holesky at: 0x6b7FAa7cC86DCd14e78F6a78F2dCfC76f8042e58 +} + + +deploy_mainnet() { + echo "Deploying to Mainnet..." + export WETH_ADDRESS_LOCAL="0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2" + export WETH_ADDRESS_REMOTE="0xD221812de1BD094f35587EE8E174B07B6167D9Af" + export STANDARD_BRIDGE_ADDRESS="0x9C4955b92F34148dbcfDCD82e9c9eCe5CF2badfe" + + export RPC_URL="https://eth.llamarpc.com" + + FORGE_ARGS="--rpc-url $RPC_URL --broadcast" + if [ -n "$ETHERSCAN_API_KEY" ]; then + FORGE_ARGS="$FORGE_ARGS --verify --etherscan-api-key $ETHERSCAN_API_KEY" + echo "Verification enabled." + else + echo "ETHERSCAN_API_KEY not set. Skipping verification." 
+ fi + + echo "Using WETH_ADDRESS_LOCAL: $WETH_ADDRESS_LOCAL" + echo "Using WETH_ADDRESS_REMOTE: $WETH_ADDRESS_REMOTE" + # L1StandardBridgeProxy + echo "Using STANDARD_BRIDGE_ADDRESS: $STANDARD_BRIDGE_ADDRESS" + + forge script scripts/DeploySuperbridgeWETH.s.sol:DeploySuperBridgeWETH $FORGE_ARGS + # SuperBridgeETHWrapper deployed to mainnet at: 0x3bC7C4f8Afe7C8d514c9d4a3A42fb8176BE33c1e +} + +case $NETWORK in + alfajores) + deploy_alfajores + ;; + baklava) + deploy_baklava + ;; + mainnet) + deploy_mainnet + ;; + *) + echo "Usage: $0 [alfajores|baklava|mainnet]" + exit 1 + ;; +esac + +echo "Deployment script finished for $NETWORK." diff --git a/packages/protocol/scripts/devchain.ts b/packages/protocol/scripts/devchain.ts deleted file mode 100644 index f2a9f9a2d67..00000000000 --- a/packages/protocol/scripts/devchain.ts +++ /dev/null @@ -1,402 +0,0 @@ -import chalk from 'chalk' -import { spawn, SpawnOptions } from 'child_process' -import fs from 'fs-extra' -import ganache from 'ganache' -import path from 'path' -import targz from 'targz' -import tmp from 'tmp' -import yargs from 'yargs' - -tmp.setGracefulCleanup() - -const MNEMONIC = 'concert load couple harbor equip island argue ramp clarify fence smart topic' - -const gasLimit = 20000000 - -const ProtocolRoot = path.normalize(path.join(__dirname, '../')) - -// As documented https://circleci.com/docs/2.0/env-vars/#built-in-environment-variables -const isCI = process.env.CI === 'true' - -// Move to where the caller made the call So to have relative paths -const CallerCWD = process.env.INIT_CWD ? 
process.env.INIT_CWD : process.cwd() -process.chdir(CallerCWD) - -// eslint-disable-next-line @typescript-eslint/no-unused-expressions -yargs - .scriptName('devchain') - .recommendCommands() - .demandCommand(1) - .strict(true) - .showHelpOnFail(true) - .command( - 'run <datadir>', - "Run celo's devchain using given datadir without copying it", - (args) => - args - .positional('datadir', { type: 'string', description: 'Data Dir' }) - .option('reset', { - type: 'boolean', - description: 'Start fresh if enabled', - }) - .option('upto', { - type: 'number', - description: 'When reset, run upto given migration', - }), - (args) => - exitOnError(runDevChain(args.datadir, { reset: args.reset, upto: args.upto, targz: false })) - ) - .command( - 'run-tar <filename>', - "Run celo's devchain using given tar filename. Generates a copy and then deletes it", - (args) => args.positional('filename', { type: 'string', description: 'Chain tar filename' }), - (args) => exitOnError(runDevChainFromTar(args.filename)) - ) - .command( - 'run-tar-in-bg <filename>', - "Run celo's devchain using given tar filename. 
Generates a copy and then deletes it", - (args) => args.positional('filename', { type: 'string', description: 'Chain tar filename' }), - (args) => exitOnError(runDevChainFromTarInBackGround(args.filename)) - ) - .command( - 'generate <datadir>', - 'Create a new devchain directory from scratch', - (args) => - args - .positional('datadir', { type: 'string', description: 'Data Dir' }) - .option('upto', { - type: 'number', - description: 'When reset, run upto given migration', - }) - .option('migration_override', { - type: 'string', - description: 'Path to JSON containing config values to use in migrations', - }), - (args) => - exitOnError( - generateDevChain(args.datadir, { - upto: args.upto, - migrationOverride: args.migration_override, - targz: false, - }) - ) - ) - .command( - 'generate-tar <filename>', - 'Create a new devchain.tar.gz from scratch', - (args) => - args - .positional('filename', { type: 'string', description: 'chain tar filename' }) - .option('upto', { - type: 'number', - description: 'When reset, run upto given migration', - }) - .option('migration_override', { - type: 'string', - description: 'Path to JSON containing config values to use in migrations', - }) - .option('release_gold_contracts', { - type: 'string', - description: 'Path to JSON containing list of release gold contracts', - }), - (args) => - exitOnError( - generateDevChain(args.filename, { - upto: args.upto, - migrationOverride: args.migration_override, - releaseGoldContracts: args.release_gold_contracts, - targz: true, - }) - ) - ) - .command( - 'compress-chain <datadir> <filename>', - 'Create a devchain.tar.gz from specified datadir', - (args) => - args - .positional('datadir', { type: 'string', description: 'datadir path' }) - .positional('filename', { type: 'string', description: 'chain tar filename' }), - (args) => exitOnError(compressChain(args.datadir, args.filename)) - ).argv -function startGanache(datadir: string, opts: { verbose?: boolean }, chainCopy?: tmp.DirResult) { - const logFn = opts.verbose - ? 
// eslint-disable-next-line @typescript-eslint/no-unsafe-argument - (...args: any[]) => console.info(...args) - : () => { - /* nothing */ - } - - const server = ganache.server({ - logging: { logger: { log: logFn } }, - database: { dbPath: datadir }, - wallet: { mnemonic: MNEMONIC, defaultBalance: 200000000 }, - miner: { blockGasLimit: gasLimit }, - chain: { networkId: 1101, chainId: 1, allowUnlimitedContractSize: true }, - allowUnlimitedInitCodeSize: true, - }) - - server.listen(8545, (err) => { - if (err) { - throw err - } - // eslint-disable-next-line: no-console - console.info(chalk.red('Ganache STARTED')) - }) - - return async () => { - await server.close() - if (chainCopy) { - chainCopy.removeCallback() - } - console.info(chalk.red('Ganache server CLOSED')) - } -} - -export function execCmd( - cmd: string, - args: string[], - options?: SpawnOptions & { silent?: boolean } -) { - return new Promise((resolve, reject) => { - const { silent, ...spawnOptions } = options || { silent: false } - if (!silent) { - console.debug('$ ' + [cmd].concat(args).join(' ')) - } - const process = spawn(cmd, args, { - ...spawnOptions, - stdio: silent ? 
'ignore' : 'inherit', - }) - process.on('close', (code) => { - try { - resolve(code) - } catch (error) { - reject(error) - } - }) - }) -} - -function exitOnError(p: Promise) { - p.catch((err) => { - console.error(`Command Failed`) - console.error(err) - process.exit(1) - }) -} - -async function resetDir(dir: string, silent?: boolean) { - if (fs.existsSync(dir)) { - await execCmd('rm', ['-rf', dir], { silent }) - } -} -function createDirIfMissing(dir: string) { - if (!fs.existsSync(dir)) { - fs.mkdirSync(dir) - } -} - -function runMigrations(opts: { upto?: number; migrationOverride?: string } = {}) { - const cmdArgs = ['truffle', 'migrate', '--reset', '--network', 'development'] - - if (opts.upto) { - cmdArgs.push('--to') - cmdArgs.push(opts.upto.toString()) - } - - if (opts.migrationOverride) { - cmdArgs.push('--migration_override') - const file: string = fs.readFileSync(opts.migrationOverride).toString() - cmdArgs.push(file) - } - return execCmd(`yarn`, cmdArgs, { cwd: ProtocolRoot }) -} - -function deployReleaseGold(releaseGoldContracts: string) { - const cmdArgs = ['truffle', 'exec', 'scripts/truffle/deploy_release_contracts.js'] - cmdArgs.push('--network') - // TODO(lucas): investigate if this can be found dynamically - cmdArgs.push('development') - cmdArgs.push('--from') - cmdArgs.push('0x5409ED021D9299bf6814279A6A1411A7e866A631') - cmdArgs.push('--grants') - cmdArgs.push(releaseGoldContracts) - cmdArgs.push('--start_gold') - cmdArgs.push('1') - cmdArgs.push('--deployed_grants') - // Random file name to prevent rewriting to it - cmdArgs.push('/tmp/deployedGrants' + Math.floor(1000 * Math.random()) + '.json') - cmdArgs.push('--output_file') - cmdArgs.push('/tmp/releaseGoldOutput.txt') - // --yesreally command to bypass prompts - cmdArgs.push('--yesreally') - cmdArgs.push('--build_directory') - cmdArgs.push(ProtocolRoot + 'build') - - return execCmd(`yarn`, cmdArgs, { cwd: ProtocolRoot }) -} - -async function runDevChainFromTar(filename: string) { - const 
chainCopy: tmp.DirResult = tmp.dirSync({ keep: false, unsafeCleanup: true }) - // eslint-disable-next-line: no-console - console.info(`Creating tmp folder: ${chainCopy.name}`) - - await decompressChain(filename, chainCopy.name) - - console.info('Starting Ganache ...') - const stopGanache = startGanache(chainCopy.name, { verbose: true }, chainCopy) - if (isCI) { - // If we are running on circle ci we need to wait for ganache to be up. - await waitForPortOpen('localhost', 8545, 120) - } - - return stopGanache -} - -/// This function was created to replace `startInBgAndWaitForString` in `release-on-devchain.sh` -/// and intended to be run on a hosted instances that shutdown after execution. -/// Note: If you run this locally, you will need to properly cleanup tmp.DirResult and -/// manually close the detached ganache instance. -/// see https://trufflesuite.com/docs/ganache/reference/cli-options/#manage-detached-instances for more details -async function runDevChainFromTarInBackGround(filename: string) { - const cmdArgs = ['ganache-devchain', '-d'] - - // keep is set to true, because `release-on-devchain` fails when set to false. 
- const chainCopy: tmp.DirResult = tmp.dirSync({ keep: true, unsafeCleanup: true }) - - // eslint-disable-next-line: no-console - console.info(`Creating tmp folder: ${chainCopy.name}`) - - await decompressChain(filename, chainCopy.name) - - cmdArgs.push(chainCopy.name) - - return execCmd(`yarn`, cmdArgs, { cwd: ProtocolRoot }) -} - -function decompressChain(tarPath: string, copyChainPath: string): Promise { - // eslint-disable-next-line: no-console - console.info('Decompressing chain') - return new Promise((resolve, reject) => { - targz.decompress({ src: tarPath, dest: copyChainPath }, (err) => { - if (err) { - console.error(err) - reject(err) - } else { - // eslint-disable-next-line: no-console - console.info('Chain decompressed') - resolve() - } - }) - }) -} - -async function runDevChain( - datadir: string, - opts: { - reset?: boolean - upto?: number - migrationOverride?: string - targz?: boolean - runMigrations?: boolean - releaseGoldContracts?: string - } = {} -) { - if (opts.reset) { - await resetDir(datadir) - } - createDirIfMissing(datadir) - console.info('Starting Ganache ...') - const stopGanache = startGanache(datadir, { verbose: true }) - if (isCI) { - // If we are running on circle ci we need to wait for ganache to be up. 
- await waitForPortOpen('localhost', 8545, 120) - } - if (opts.reset || opts.runMigrations) { - const code = await runMigrations({ upto: opts.upto, migrationOverride: opts.migrationOverride }) - if (code !== 0) { - throw Error('Migrations failed') - } - console.info('Migrations successfully applied') - } - if (opts.releaseGoldContracts) { - const code = await deployReleaseGold(opts.releaseGoldContracts) - if (code !== 0) { - throw Error('ReleaseGold deployment failed') - } - console.info('ReleaseGold successfully deployed') - } - return stopGanache -} - -async function generateDevChain( - filePath: string, - opts: { - upto?: number - migrationOverride?: string - releaseGoldContracts?: string - targz?: boolean - } = {} -) { - let chainPath = filePath - let chainTmp: tmp.DirResult - if (opts.targz) { - chainTmp = tmp.dirSync({ keep: false, unsafeCleanup: true }) - chainPath = chainTmp.name - } else { - fs.ensureDirSync(chainPath) - } - const stopGanache = await runDevChain(chainPath, { - reset: !opts.targz, - runMigrations: true, - upto: opts.upto, - migrationOverride: opts.migrationOverride, - releaseGoldContracts: opts.releaseGoldContracts, - }) - await stopGanache() - if (opts.targz && chainTmp) { - await compressChain(chainPath, filePath) - chainTmp.removeCallback() - } -} - -async function compressChain(chainPath: string, filename: string): Promise { - // eslint-disable-next-line: no-console - console.info('Compressing chain') - return new Promise((resolve, reject) => { - // ensures the path to the file - fs.ensureFileSync(filename) - targz.compress({ src: chainPath, dest: filename }, (err: Error) => { - if (err) { - console.error(err) - reject(err) - } else { - // eslint-disable-next-line: no-console - console.info('Chain compressed') - resolve() - } - }) - }) -} - -export async function waitForPortOpen(host: string, port: number, seconds: number) { - console.info(`Waiting for ${host}:${port} to open for ${seconds}s`) - const deadline = Date.now() + seconds * 
1000 - do { - if (await isPortOpen(host, port)) { - await delay(10000) // extra 10s just to give ganache extra time to startup - console.info(`Port ${host}:${port} opened`) - return true - } - } while (Date.now() < deadline) - console.info('Port was not opened in time') - return false -} - -async function isPortOpen(host: string, port: number) { - return (await execCmd('nc', ['-z', host, port.toString()], { silent: true })) === 0 -} - -function delay(time: number) { - return new Promise((resolve) => setTimeout(resolve, time)) -} diff --git a/packages/protocol/scripts/foundry/ForgeArtifact.ts b/packages/protocol/scripts/foundry/ForgeArtifact.ts new file mode 100644 index 00000000000..9684f3fd21b --- /dev/null +++ b/packages/protocol/scripts/foundry/ForgeArtifact.ts @@ -0,0 +1,59 @@ +import { Hex } from 'viem' + +export interface ForgeArtifact { + abi: any + bytecode: { + object: Hex + } + metadata: { + sources: { + [sourcePath: string]: { + [key: string]: any + } + } + compiler?: { + version?: string + } + settings?: { + optimizer?: { + enabled?: boolean + runs?: number + } + evmVersion?: string + compilationTarget?: { + [path: string]: string + } + } + } + contractName?: string + deployedBytecode?: { + object?: string + } + sourceMap?: string + deployedSourceMap?: string + sourcePath?: string + ast?: any + legacyAST?: any + compiler?: { + name: string + version: string + } + networks?: { + [networkId: string]: { + events: any + links: any + address: string + transactionHash: string + } + } + schemaVersion?: string + updatedAt?: string + devdoc?: any + userdoc?: any + methodIdentifiers?: object + gasEstimates?: object + storageLayout?: { + storage?: any + types?: any + } +} diff --git a/packages/protocol/scripts/foundry/build_constitution_selectors_map.sh b/packages/protocol/scripts/foundry/build_constitution_selectors_map.sh new file mode 100755 index 00000000000..d54c8cb5145 --- /dev/null +++ b/packages/protocol/scripts/foundry/build_constitution_selectors_map.sh 
@@ -0,0 +1,13 @@ +#!/usr/bin/env bash +set -euo pipefail + +### This script builds a map of selectors to avoid reliance on ffi through Foundry scripts and tests + +# Create selectors dir +mkdir -p .tmp/selectors + +# Iterate over contracts defined in constitution and build json map +for contractName in $(jq -r 'keys[]' governanceConstitution.json); do + echo "Building selectors map for contract: $contractName" + forge inspect $contractName methods --json > .tmp/selectors/$contractName.json +done diff --git a/packages/protocol/scripts/foundry/constants.sh b/packages/protocol/scripts/foundry/constants.sh index d44bdfce4ac..443cdbcc25f 100755 --- a/packages/protocol/scripts/foundry/constants.sh +++ b/packages/protocol/scripts/foundry/constants.sh @@ -4,7 +4,16 @@ export FROM_ACCOUNT="0x$FROM_ACCOUNT_NO_ZERO" # Anvil default account (1) # Anvil configurations (Source: https://book.getfoundry.sh/reference/anvil/) export ANVIL_PORT=8546 -export ANVIL_RPC_URL="http://127.0.0.1:$ANVIL_PORT" +export ANVIL_RPC_URL_BASE="http://127.0.0.1" + +export ANVIL_RPC_URL="$ANVIL_RPC_URL_BASE:$ANVIL_PORT" # TODO deprecate this variable in favor of the function below + +get_anvil_rpc_url() { + echo "$ANVIL_RPC_URL_BASE:$ANVIL_PORT" +} + +# Anvil logging +export ANVIL_LOGGING_ENABLED=${ANVIL_LOGGING:=false} # Flag to enable or disable logging. Useful for local development export GAS_LIMIT=50000000 export CODE_SIZE_LIMIT=245760 # EIP-170: Contract code size limit in bytes. Useful to increase for tests. [default: 0x6000 (~25kb)] export BALANCE=60000 # Set the balance of the accounts. [default: 10000] @@ -19,6 +28,7 @@ export MIGRATION_L2_TARGET_CONTRACT="MigrationL2" # The name of the contract yo export BROADCAST="--broadcast" # Broadcasts the transactions. Enable: "--broadcast" / Disable: "" export SKIP_SIMULATION="" # Skips on-chain simulation. 
Enable: "--skip-simulation" / Disable: "" export NON_INTERACTIVE="--non-interactive" # Remove interactive prompts which appear if the contract is near the EIP-170 size limit. +export TIMEOUT=30000 # Set a timeout for the anvil node export VERBOSITY_LEVEL="-vvv" # Pass multiple times to increase the verbosity (e.g. -v, -vv, -vvv). export REGISTRY_OWNER_ADDRESS=$FROM_ACCOUNT_NO_ZERO @@ -42,26 +52,29 @@ export REGISTRY_STORAGE_LOCATION="0xb53127684a568b3173ae13b9f8a6016e243e63b6e8ee GOLD_TOKEN_CELO_SUPPLY_CAP=1000000000 # `GoldToken.CELO_SUPPLY_CAP()` GOLD_TOKEN_TOTAL_SUPPLY=700000000 # Arbitrary amount chosen to be approximately equal to `GoldToken.totalSupply()` on the L1 Mainnet (695,313,643 CELO as of this commit). export CELO_UNRELEASED_TREASURY_INITIAL_BALANCE="$(($GOLD_TOKEN_CELO_SUPPLY_CAP - $GOLD_TOKEN_TOTAL_SUPPLY))" # During the real L2 genesis, the VM will calculate and set an appropriate balance. +export RESERVE_INITIAL_BALANCE="5000000" # Setting this here because it gets overwritten in the L2 migration script # Contract libraries export LIBRARIES_PATH=("contracts/common/linkedlists/AddressSortedLinkedListWithMedian.sol:AddressSortedLinkedListWithMedian" "contracts/common/Signatures.sol:Signatures" - "contracts-0.8/common/linkedlists/AddressLinkedList.sol:AddressLinkedList" "contracts/common/linkedlists/AddressSortedLinkedList.sol:AddressSortedLinkedList" "contracts/common/linkedlists/IntegerSortedLinkedList.sol:IntegerSortedLinkedList" "contracts/governance/Proposals.sol:Proposals" ) + +export LIBRARIES_PATH_08=("contracts-0.8/common/linkedlists/AddressLinkedList.sol:AddressLinkedList") + export LIBRARY_DEPENDENCIES_PATH=( - "contracts/common/FixidityLib.sol" - "contracts/common/linkedlists/LinkedList.sol" - "contracts-0.8/common/linkedlists/LinkedList.sol" - "contracts/common/linkedlists/SortedLinkedList.sol" - "contracts/common/linkedlists/SortedLinkedListWithMedian.sol" - "lib/openzeppelin-contracts/contracts/math/SafeMath.sol" - 
"lib/openzeppelin-contracts8/contracts/utils/math/SafeMath.sol" - "lib/openzeppelin-contracts/contracts/math/Math.sol" - "lib/openzeppelin-contracts/contracts/cryptography/ECDSA.sol" - "lib/openzeppelin-contracts/contracts/utils/Address.sol" - "lib/solidity-bytes-utils/contracts/BytesLib.sol" - "lib/celo-foundry/lib/forge-std/src/console.sol" + "contracts/common/FixidityLib.sol" + "contracts/common/linkedlists/LinkedList.sol" + "contracts-0.8/common/linkedlists/LinkedList.sol" + "contracts/common/linkedlists/SortedLinkedList.sol" + "contracts/common/linkedlists/SortedLinkedListWithMedian.sol" + "lib/openzeppelin-contracts/contracts/math/SafeMath.sol" + "lib/openzeppelin-contracts8/contracts/utils/math/SafeMath.sol" + "lib/openzeppelin-contracts/contracts/math/Math.sol" + "lib/openzeppelin-contracts/contracts/cryptography/ECDSA.sol" + "lib/openzeppelin-contracts/contracts/utils/Address.sol" + "lib/solidity-bytes-utils/contracts/BytesLib.sol" + "lib/celo-foundry/lib/forge-std/src/console.sol" ) diff --git a/packages/protocol/scripts/foundry/create_and_migrate_anvil_devchain.sh b/packages/protocol/scripts/foundry/create_and_migrate_anvil_devchain.sh index 1310cc63307..0a4eb3ee693 100755 --- a/packages/protocol/scripts/foundry/create_and_migrate_anvil_devchain.sh +++ b/packages/protocol/scripts/foundry/create_and_migrate_anvil_devchain.sh @@ -22,21 +22,31 @@ mkdir -p $TMP_FOLDER # Start a local anvil instance source $PWD/scripts/foundry/start_anvil.sh +# build standard forge artifacts, needed to deploy precompiles +forge build + # Deploy libraries to the anvil instance source $PWD/scripts/foundry/deploy_libraries.sh -echo "Library flags are: $LIBRARY_FLAGS" +echo "Library flags 0.5 are: $LIBRARY_FLAGS" +echo "Library flags 0.8 are: $LIBRARY_FLAGS_08" + +# Build map of selectors from governanceConstitution.json +source $PWD/scripts/foundry/build_constitution_selectors_map.sh # Build all contracts with deployed libraries # Including contracts that depend on libraries. 
This step replaces the library placeholder # in the bytecode with the address of the actually deployed library. -echo "Compiling with libraries..." -time FOUNDRY_PROFILE=devchain forge build $LIBRARY_FLAGS +echo "Compiling 0.5 with libraries..." +time FOUNDRY_PROFILE=truffle-compat forge build $LIBRARY_FLAGS +echo "Compiling 0.8 with libraries..." + +time FOUNDRY_PROFILE=truffle-compat8 forge build $LIBRARY_FLAGS_08 # Deploy precompile contracts source $PWD/scripts/foundry/deploy_precompiles.sh echo "Setting Registry Proxy" -PROXY_DEPLOYED_BYTECODE=$(jq -r '.deployedBytecode.object' ./out/Proxy.sol/Proxy.json) +PROXY_DEPLOYED_BYTECODE=$(jq -r '.deployedBytecode.object' ./out-truffle-compat/Proxy.sol/Proxy.json) cast rpc anvil_setCode $REGISTRY_ADDRESS $PROXY_DEPLOYED_BYTECODE --rpc-url $ANVIL_RPC_URL # Sets the storage of the registry so that it has an owner we control @@ -53,11 +63,11 @@ forge script \ --target-contract $MIGRATION_TARGET_CONTRACT \ --sender $FROM_ACCOUNT \ --unlocked \ - $VERBOSITY_LEVEL \ $BROADCAST \ $SKIP_SIMULATION \ $NON_INTERACTIVE \ $LIBRARY_FLAGS \ + $LIBRARY_FLAGS_08 \ --rpc-url $ANVIL_RPC_URL || { echo "Migration script failed"; exit 1; } CELO_EPOCH_REWARDS_ADDRESS=$( diff --git a/packages/protocol/scripts/foundry/create_and_migrate_anvil_l2_devchain.sh b/packages/protocol/scripts/foundry/create_and_migrate_anvil_l2_devchain.sh index 25f2ee31a37..0dd4ec1549b 100755 --- a/packages/protocol/scripts/foundry/create_and_migrate_anvil_l2_devchain.sh +++ b/packages/protocol/scripts/foundry/create_and_migrate_anvil_l2_devchain.sh @@ -4,20 +4,19 @@ set -euo pipefail # Read environment variables and constants source $PWD/scripts/foundry/constants.sh +# Read the JSON file +CONFIG_FILE=$PWD/migrations_sol/migrationsConfig.json + export KEEP_DEVCHAIN_FOLDER=true +# this temp file is deleted at the end +cp test-sol/devchain/Import05Dependencies.sol contracts + # Generate and run L1 devchain echo "Generating and running L1 devchain before activating 
L2..." source $PWD/scripts/foundry/create_and_migrate_anvil_devchain.sh -# Activate L2 by deploying arbitrary bytecode to the proxy admin address. -# Note: This can't be done from the migration script -ARBITRARY_BYTECODE=$(cast format-bytes32-string "L2 is activated") -cast rpc anvil_setCode \ - $PROXY_ADMIN_ADDRESS $ARBITRARY_BYTECODE \ - --rpc-url $ANVIL_RPC_URL - -# Fetch address of Celo distribution +# Fetch address of the CeloUnreleasedTreasury CELO_UNRELEASED_TREASURY_ADDRESS=$( cast call \ $REGISTRY_ADDRESS \ @@ -26,9 +25,119 @@ CELO_UNRELEASED_TREASURY_ADDRESS=$( --rpc-url $ANVIL_RPC_URL ) +RESERVE_ADDRESS=$( + cast call \ + $REGISTRY_ADDRESS \ + "getAddressForStringOrDie(string calldata identifier)(address)" \ + "Reserve" \ + --rpc-url $ANVIL_RPC_URL +) + +VALIDATORS_ADDRESS=$( + cast call \ + $REGISTRY_ADDRESS \ + "getAddressForStringOrDie(string calldata identifier)(address)" \ + "Validators" \ + --rpc-url $ANVIL_RPC_URL +) + +ELECTIONS_ADDRESS=$( + cast call \ + $REGISTRY_ADDRESS \ + "getAddressForStringOrDie(string calldata identifier)(address)" \ + "Election" \ + --rpc-url $ANVIL_RPC_URL +) + +# Activate Validators +echo "Activating Validators..." +registered_validators=$(cast call \ + $VALIDATORS_ADDRESS \ + "getRegisteredValidators()(address[])" \ + --rpc-url $ANVIL_RPC_URL) + +echo "### registered_validators: $registered_validators" + +# Increase the block number using anvil cast rpc +BLOCKS_TO_ADVANCE=101 +cast rpc anvil_mine $BLOCKS_TO_ADVANCE --rpc-url $ANVIL_RPC_URL --rpc-timeout 30000 + +# Check if registered_validators is empty or invalid +if [ -z "$registered_validators" ] || [ "$registered_validators" == "[]" ]; then + echo "Error: registered_validators is empty or invalid JSON." 
+ exit 1 +fi + +VAL_KEYS=$(jq -r '.validators.valKeys[]' "$CONFIG_FILE") + +# Extract the first 3 keys from VAL_KEYS +FIRST_THREE_KEYS=($(echo "$VAL_KEYS" | head -n 3)) + +# Print the extracted keys +echo "First 3 keys from VAL_KEYS:" +for key in "${FIRST_THREE_KEYS[@]}"; do + echo "$key" +done +key_index=0 +for validator in $registered_validators; do + validator=$(echo $validator | tr -d '[],') + echo "### registered val $validator" + + validator_info=$(cast call \ + $VALIDATORS_ADDRESS \ + "getValidator(address)(bytes,bytes,address,uint256,address)" \ + $validator \ + --rpc-url $ANVIL_RPC_URL) + + echo "### validator info: $validator_info" + # Extract the validator group from the validator info + validator_group=$(echo "$validator_info" | sed -n '3p' | tr -d '[:space:]') + + if [ -z "$validator_group" ]; then + echo "Error: Failed to retrieve validator group for validator $validator" + exit 1 + fi + + echo "### validator group: $validator_group" + + pending_votes=$(cast call \ + $ELECTIONS_ADDRESS \ + "getPendingVotesForGroup(address)(uint256)" \ + $validator_group \ + --rpc-url $ANVIL_RPC_URL) + + echo "### pending votes for group $validator_group: $pending_votes" + + pending_votes_cleaned=$(echo $pending_votes | sed 's/\[[^]]*\]//g' | tr -d '[:space:]') + echo "### cleaned pending votes:$pending_votes_cleaned" + if [ "$pending_votes_cleaned" = "0" ]; then + continue + fi + + cast send $ELECTIONS_ADDRESS "activate(address)" $validator_group --private-key ${FIRST_THREE_KEYS[$key_index]} --rpc-url $ANVIL_RPC_URL + key_index=$((key_index + 1)) + + echo "### activated validator group: $validator_group with pending votes: $pending_votes" + +done + +active_votes=$(cast call \ + $ELECTIONS_ADDRESS \ + "getActiveVotes()(uint256)" \ + --rpc-url $ANVIL_RPC_URL) +echo "### active votes: $active_votes" + +# Activate L2 by deploying arbitrary bytecode to the proxy admin address. 
+# Note: This can't be done from the migration script +echo "Activating L2 by deploying arbitrary bytecode to the proxy admin address..." +ARBITRARY_BYTECODE=$(cast format-bytes32-string "L2 is activated") +cast rpc anvil_setCode \ + $PROXY_ADMIN_ADDRESS $ARBITRARY_BYTECODE \ + --rpc-url $ANVIL_RPC_URL + # Set the balance of the CeloUnreleasedTreasury (like the Celo client would do during L2 genesis) # Note: This can't be done from the migration script, because CeloUnreleasedTreasury.sol does not -# implement the receive function nor does it allow ERC20 transfers. This is the only way I +# implement the receive function nor does it allow ERC20 transfers. This is the only way I # managed to give the CeloUnreleasedTreasury a balance. echo "Setting CeloUnreleasedTreasury balance..." HEX_CELO_UNRELEASED_TREASURY_INITIAL_BALANCE=$(cast to-hex $CELO_UNRELEASED_TREASURY_INITIAL_BALANCE"000000000000000000") @@ -37,6 +146,15 @@ cast rpc \ $CELO_UNRELEASED_TREASURY_ADDRESS $HEX_CELO_UNRELEASED_TREASURY_INITIAL_BALANCE \ --rpc-url $ANVIL_RPC_URL +# Set the balance of the Reserve. For some reason, the balance that is set in the anvil_L1_devchain +# migration script does not get carried over to anvil_L2_devchain. +echo "Setting reserve balance..." +HEX_RESERVE_INITIAL_BALANCE=$(cast to-hex $RESERVE_INITIAL_BALANCE"000000000000000000") +cast rpc \ + anvil_setBalance \ + $RESERVE_ADDRESS $HEX_RESERVE_INITIAL_BALANCE \ + --rpc-url $ANVIL_RPC_URL + # Run L2 migrations echo "Running L2 migration script... 
" forge script \ @@ -44,17 +162,22 @@ forge script \ --target-contract $MIGRATION_L2_TARGET_CONTRACT \ --sender $FROM_ACCOUNT \ --unlocked \ - $VERBOSITY_LEVEL \ $BROADCAST \ $SKIP_SIMULATION \ $NON_INTERACTIVE \ - --rpc-url $ANVIL_RPC_URL || { echo "Migration script failed"; exit 1; } + $VERBOSITY_LEVEL \ + --timeout $TIMEOUT \ + --rpc-url $ANVIL_RPC_URL || { + echo "Migration script failed" + exit 1 +} # Give anvil enough time to save the state sleep $SLEEP_DURATION -# # Save L2 state so it can published to NPM +# Save L2 state so it can be published to NPM mv $ANVIL_FOLDER/state.json $TMP_FOLDER/$L2_DEVCHAIN_FILE_NAME echo "Saved anvil L2 state to $TMP_FOLDER/$L2_DEVCHAIN_FILE_NAME" +rm contracts/Import05Dependencies.sol rm -rf $ANVIL_FOLDER diff --git a/packages/protocol/scripts/foundry/deploy_libraries.sh b/packages/protocol/scripts/foundry/deploy_libraries.sh index 366f3944991..b31da299e6e 100644 --- a/packages/protocol/scripts/foundry/deploy_libraries.sh +++ b/packages/protocol/scripts/foundry/deploy_libraries.sh @@ -4,22 +4,50 @@ set -euo pipefail # Read environment variables and constants source $PWD/scripts/foundry/constants.sh +# Function to copy libraries to temporary directory +copy_libraries() { + local -n lib_array=$1 + for LIB_PATH in "${lib_array[@]}"; do + IFS=":" read -r SOURCE DEST <<< "$LIB_PATH" + echo "SOURCE: $SOURCE" + echo "DEST: $DEST" + DEST_FILE="$TEMP_DIR/$SOURCE" + DEST_DIR=$(dirname "$DEST_FILE") + mkdir -p "$DEST_DIR" + echo "Copying file $SOURCE to $DEST_FILE" + cp "$SOURCE" "$DEST_FILE" + done +} + +# Function to deploy libraries +deploy_libraries() { + local -n lib_array=$1 + local profile=$2 + local flags_var=$3 + local version=$4 + + echo "Deploying libraries $version..." 
+ for LIB_PATH in "${lib_array[@]}"; do + LIB_NAME="${LIB_PATH#*:}" + echo "Deploying library: $LIB_NAME" + create_library_out=`FOUNDRY_PROFILE=$profile forge create $LIB_PATH --from $FROM_ACCOUNT --rpc-url $ANVIL_RPC_URL --unlocked --broadcast --json` + LIB_ADDRESS=`echo $create_library_out | jq -r '.deployedTo'` + # Constructing library flag so the remaining contracts can be built and linked to these libraries + eval "$flags_var=\"\$$flags_var --libraries $LIB_PATH:$LIB_ADDRESS\"" + done +} + # Create a temporary directory or remove it first it if exists if [ -d "$TEMP_DIR" ]; then echo "Removing existing temporary folder..." rm -rf $TEMP_DIR fi + mkdir $TEMP_DIR # Copy libraries to the directory -for LIB_PATH in "${LIBRARIES_PATH[@]}"; do - IFS=":" read -r SOURCE DEST <<< "$LIB_PATH" - DEST_FILE="$TEMP_DIR/$SOURCE" - DEST_DIR=$(dirname "$DEST_FILE") - mkdir -p "$DEST_DIR" - echo "Copying file $SOURCE to $DEST_FILE" - cp "$SOURCE" "$DEST_FILE" -done +copy_libraries LIBRARIES_PATH +copy_libraries LIBRARIES_PATH_08 # Creating two variables for better readability SOURCE_DIR=$PWD @@ -41,23 +69,16 @@ cp $SOURCE_DIR/remappings.txt $DEST_DIR/remappings.txt pushd $TEMP_DIR # Build libraries -echo "Building libraries..." -forge build +echo "Building with 0.5 libraries..." +time FOUNDRY_PROFILE=truffle-compat forge build # Deploy libraries and building library flag -echo "Deploying libraries..." 
export LIBRARY_FLAGS="" -for LIB_PATH in "${LIBRARIES_PATH[@]}"; do - LIB_NAME="${LIB_PATH#*:}" - # For example: - # LIB_PATH = "contracts/common/linkedlists/AddressSortedLinkedListWithMedian.sol:AddressSortedLinkedListWithMedian" - # LIB_NAME = AddressSortedLinkedListWithMedian - echo "Deploying library: $LIB_NAME" - create_library_out=`forge create $LIB_PATH --from $FROM_ACCOUNT --rpc-url $ANVIL_RPC_URL --unlocked --json` - LIB_ADDRESS=`echo $create_library_out | jq -r '.deployedTo'` - # Constructing library flag so the remaining contracts can be built and linkeded to these libraries - LIBRARY_FLAGS="$LIBRARY_FLAGS --libraries $LIB_PATH:$LIB_ADDRESS" -done +deploy_libraries LIBRARIES_PATH "truffle-compat" "LIBRARY_FLAGS" "0.5" + +export LIBRARY_FLAGS_08="" +deploy_libraries LIBRARIES_PATH_08 "truffle-compat8" "LIBRARY_FLAGS_08" "0.8" + # Move out of the temporary directory popd diff --git a/packages/protocol/scripts/foundry/make-release-foundry.sh b/packages/protocol/scripts/foundry/make-release-foundry.sh new file mode 100755 index 00000000000..8ec6576255f --- /dev/null +++ b/packages/protocol/scripts/foundry/make-release-foundry.sh @@ -0,0 +1,92 @@ +#!/usr/bin/env bash +set -euo pipefail + +# Deploys new contract implementations and generates governance proposal. +# +# Flags: +# -b: Branch to build contracts from (must be core-contracts.vX or release/core-contracts/X format). +# -k: Private key to sign transactions from. +# -i: Path to the data needed to initialize contracts. +# -l: Path to the canonical library mapping. +# -n: The network to deploy to. +# -p: (Deprecated) No longer accepts a path. Proposal is always generated as proposal-$NETWORK-$BRANCH.json. +# -r: Path to the contract compatibility report. +# -u: Custom RPC URL (optional, overrides network default). +# -s: Skip contract verification (optional). +# -a: Celoscan API key for verification (optional, can also use CELOSCAN_API_KEY env var). 
+ +BRANCH="" +PRIVATE_KEY="" +INITIALIZE_DATA="" +LIBRARIES="" +NETWORK="" +PROPOSAL="" +REPORT="" +RPC_URL="" +SKIP_VERIFICATION="" +CELOSCAN_API_KEY_ARG="" + +while getopts 'b:k:i:l:n:p:r:u:sa:' flag; do + case "${flag}" in + b) BRANCH="${OPTARG}" ;; + k) PRIVATE_KEY="${OPTARG}" ;; + i) INITIALIZE_DATA="${OPTARG}" ;; + l) LIBRARIES="${OPTARG}" ;; + n) NETWORK="${OPTARG}" ;; + p) PROPOSAL="${OPTARG}" ;; + r) REPORT="${OPTARG}" ;; + u) RPC_URL="${OPTARG}" ;; + s) SKIP_VERIFICATION="true" ;; + a) CELOSCAN_API_KEY_ARG="${OPTARG}" ;; + *) + echo "Unexpected option ${flag}" >&2 + exit 1 + ;; + esac +done + +[ -z "$BRANCH" ] && echo "Need to set the build branch via the -b flag" && exit 1; +[ -z "$PRIVATE_KEY" ] && echo "Need to set the private key for signing via the -k flag" && exit 1; +[ -z "$INITIALIZE_DATA" ] && echo "Need to set the initialization data via the -i flag" && exit 1; +[ -z "$LIBRARIES" ] && echo "Need to set the library mapping input via the -l flag" && exit 1; +[ -z "$NETWORK" ] && echo "Need to set the NETWORK via the -n flag" && exit 1; +[ -z "$REPORT" ] && echo "Need to set the compatibility report input via the -r flag" && exit 1; + +if [ -n "$PROPOSAL" ]; then + echo "Error: -p no longer accepts a path. Proposal name is now generated automatically as proposal-\$NETWORK-\$BRANCH.json." 
>&2 + echo "See: https://github.com/celo-org/celo-monorepo/pull/11662" >&2 + exit 1 +fi +PROPOSAL="proposal-$NETWORK-$BRANCH.json" + +source scripts/bash/validate-libraries-filename.sh +validate_libraries_filename "$LIBRARIES" "$NETWORK" "$BRANCH" + +source scripts/bash/validate-libraries-bytecode.sh +VALIDATION_RPC_URL="${RPC_URL:-$(get_forno_url "$NETWORK")}" +validate_libraries_bytecode "$LIBRARIES" "$VALIDATION_RPC_URL" + +BUILD_DIR="./out-${BRANCH}" + +# Build the command with optional flags +OPTIONAL_FLAGS="" +if [ -n "$RPC_URL" ]; then + OPTIONAL_FLAGS="$OPTIONAL_FLAGS --rpcUrl $RPC_URL" +fi +if [ -n "$SKIP_VERIFICATION" ]; then + OPTIONAL_FLAGS="$OPTIONAL_FLAGS --skipVerification" +fi +if [ -n "$CELOSCAN_API_KEY_ARG" ]; then + OPTIONAL_FLAGS="$OPTIONAL_FLAGS --celoscanApiKey $CELOSCAN_API_KEY_ARG" +fi + +yarn ts-node --transpile-only ./scripts/foundry/make-release.ts \ + --branch "$BRANCH" \ + --privateKey "$PRIVATE_KEY" \ + --initializeData "$INITIALIZE_DATA" \ + --librariesFile "$LIBRARIES" \ + --network "$NETWORK" \ + --proposal "$PROPOSAL" \ + --report "$REPORT" \ + --buildDirectory "$BUILD_DIR" \ + $OPTIONAL_FLAGS diff --git a/packages/protocol/scripts/foundry/make-release.ts b/packages/protocol/scripts/foundry/make-release.ts new file mode 100644 index 00000000000..75c75aa8ea4 --- /dev/null +++ b/packages/protocol/scripts/foundry/make-release.ts @@ -0,0 +1,1411 @@ +/* eslint-disable no-console */ +import { LibraryAddresses } from '@celo/protocol/lib/bytecode' +import { ASTDetailedVersionedReport } from '@celo/protocol/lib/compatibility/report' +import { getCeloContractDependencies } from '@celo/protocol/lib/contract-dependencies' +import { CeloContractName, celoRegistryAddress } from '@celo/protocol/lib/registry-utils' +import { SOLIDITY_08_PACKAGE } from '@celo/protocol/contractPackages' +import { ForgeArtifact } from '@celo/protocol/scripts/foundry/ForgeArtifact' +import { NULL_ADDRESS, eqAddress } from '@celo/utils/lib/address' +import { exec } 
from 'child_process' +import { createInterface } from 'readline' +import { existsSync, readJsonSync, readdirSync, writeJsonSync } from 'fs-extra' +import { basename, join } from 'path' +import { TextEncoder, promisify } from 'util' +import { + Abi, + Account, + Chain, + Hex, + PublicClient, + Transport, + Address as ViemAddress, + WalletClient, + createPublicClient, + createWalletClient, + decodeFunctionResult, + defineChain, + encodeFunctionData, + http, + keccak256, + toHex, +} from 'viem' +import { mnemonicToAccount, privateKeyToAccount } from 'viem/accounts' +import * as viemChains from 'viem/chains' +import yargs from 'yargs' +import { hideBin } from 'yargs/helpers' +import { getReleaseVersion, ignoredContractsV9 } from '../../lib/compatibility/ignored-contracts-v9' + +const execAsync = promisify(exec) + +// Use Pick to extract only the methods we need from viem's client types +// This maintains compatibility with viem's complex generics +type PublicClientMethods = Pick< + PublicClient, + 'call' | 'waitForTransactionReceipt' +> + +type WalletClientMethods = Pick< + WalletClient, + 'account' | 'chain' | 'deployContract' | 'writeContract' +> + +// Registry ABI for getAddressForString - used for type-safe contract reads +const registryGetAddressAbi = [ + { + type: 'function', + name: 'getAddressForString', + inputs: [{ name: 'identifier', type: 'string' }], + outputs: [{ name: '', type: 'address' }], + stateMutability: 'view', + }, +] as const +// AbiParameter type is inferred from Abi entries +type AbiParameter = { + name?: string + type: string + internalType?: string + components?: AbiParameter[] +} + +interface MakeReleaseArgv { + report: string + proposal: string + librariesFile: string + initializeData: string + buildDirectory: string + branch: string + network: string + privateKey?: string + mnemonic?: string + rpcUrl?: string + skipVerification?: boolean + celoscanApiKey?: string +} + +// Track linked library for verification +interface LinkedLibrary { + 
sourceFile: string + name: string + address: string +} + +// Track deployed contracts for verification +interface DeployedContract { + name: string + address: string + sourceFile: string + constructorArgs: any[] + isLibrary: boolean + compilerVersion: string + optimizerEnabled: boolean + optimizerRuns: number + evmVersion: string + linkedLibraries: LinkedLibrary[] + foundryProfile?: string // Foundry compilation profile for verification +} + +// Network verification configuration +interface VerificationConfig { + celoscanApiUrl: string + celoscanApiKey?: string + blockscoutApiUrl: string + chainId: number +} + +const getVerificationConfig = (networkName: string): VerificationConfig | null => { + // Etherscan V2 API uses unified endpoint: https://api.etherscan.io/v2/api?chainid=CHAINID + // Works with a single API key for all supported chains + switch (networkName.toLowerCase()) { + case 'celo': + case 'mainnet': + case 'rc1': + return { + celoscanApiUrl: 'https://api.etherscan.io/v2/api?chainid=42220', + blockscoutApiUrl: 'https://celo.blockscout.com/api/', + chainId: 42220, + } + case 'celo-sepolia': + return { + celoscanApiUrl: 'https://api.etherscan.io/v2/api?chainid=11142220', + blockscoutApiUrl: 'https://celo-sepolia.blockscout.com/api/', + chainId: 11142220, + } + default: + // Local forks don't need verification + return null + } +} + +// Store deployed contracts for verification +const deployedContracts: DeployedContract[] = [] + +const verifyContractOnBlockscout = async ( + contract: DeployedContract, + config: VerificationConfig, + rpcUrl: string +): Promise<boolean> => { + // Build forge verify-contract command for Blockscout + const cmd = [ + 'forge verify-contract', + `--rpc-url "${rpcUrl}"`, + contract.address, + `"${contract.sourceFile}:${contract.name}"`, + '--verifier blockscout', + `--verifier-url "${config.blockscoutApiUrl}"`, + `--compiler-version ${contract.compilerVersion}`, + `--evm-version ${contract.evmVersion}`, + '--watch', + '--retries 5', + ] + 
+ // Add optimizer settings + if (contract.optimizerEnabled) { + cmd.push(`--num-of-optimizations ${contract.optimizerRuns}`) + } + + // Add constructor args if present + if (contract.constructorArgs && contract.constructorArgs.length > 0) { + const encodedArgs = encodeConstructorArgs(contract.constructorArgs) + if (encodedArgs) { + cmd.push(`--constructor-args ${encodedArgs}`) + } + } + + // Add linked libraries if present (critical for contracts that use libraries) + if (contract.linkedLibraries && contract.linkedLibraries.length > 0) { + for (const lib of contract.linkedLibraries) { + cmd.push(`--libraries "${lib.sourceFile}:${lib.name}:${lib.address}"`) + } + } + + const fullCmd = cmd.join(' ') + + // Set FOUNDRY_PROFILE environment variable for proper compilation settings + const env = { ...process.env } + if (contract.foundryProfile) { + env.FOUNDRY_PROFILE = contract.foundryProfile + } + + // Retry loop for handling "Address is not a smart-contract" errors + for (let attempt = 0; attempt <= VERIFICATION_MAX_RETRIES; attempt++) { + try { + await execAsync(fullCmd, { + cwd: process.cwd(), + timeout: 180000, // 3 minute timeout + env, + }) + + process.stdout.write(' Blockscout ✓') + return true + } catch (error: any) { + const isRetryable = isRetryableVerificationError(error) + const hasRetriesLeft = attempt < VERIFICATION_MAX_RETRIES + + if (isRetryable && hasRetriesLeft) { + const delay = getRetryDelay(attempt) + process.stdout.write(` (retry ${attempt + 1}...)`) + await sleep(delay) + continue + } + + process.stdout.write(' Blockscout ✗') + return false + } + } + + return false +} + +const verifyContractOnCeloscan = async ( + contract: DeployedContract, + config: VerificationConfig, + rpcUrl: string +): Promise<boolean> => { + if (!config.celoscanApiKey) { + process.stdout.write(' Celoscan (skipped)') + return false + } + + // Build forge verify-contract command for Celoscan (Etherscan-compatible) + const cmd = [ + 'forge verify-contract', + `--rpc-url "${rpcUrl}"`, 
+ contract.address, + `"${contract.sourceFile}:${contract.name}"`, + '--verifier etherscan', + `--verifier-url "${config.celoscanApiUrl}"`, + `--etherscan-api-key "${config.celoscanApiKey}"`, + `--chain-id ${config.chainId}`, + `--compiler-version ${contract.compilerVersion}`, + `--evm-version ${contract.evmVersion}`, + '--watch', + '--retries 5', + ] + + // Add optimizer settings + if (contract.optimizerEnabled) { + cmd.push(`--num-of-optimizations ${contract.optimizerRuns}`) + } + + // Add constructor args if present + if (contract.constructorArgs && contract.constructorArgs.length > 0) { + const encodedArgs = encodeConstructorArgs(contract.constructorArgs) + if (encodedArgs) { + cmd.push(`--constructor-args ${encodedArgs}`) + } + } + + // Add linked libraries if present (critical for contracts that use libraries) + if (contract.linkedLibraries && contract.linkedLibraries.length > 0) { + for (const lib of contract.linkedLibraries) { + cmd.push(`--libraries "${lib.sourceFile}:${lib.name}:${lib.address}"`) + } + } + + const fullCmd = cmd.join(' ') + + // Set FOUNDRY_PROFILE environment variable for proper compilation settings + const env = { ...process.env } + if (contract.foundryProfile) { + env.FOUNDRY_PROFILE = contract.foundryProfile + } + + // Retry loop for handling "Address is not a smart-contract" errors + for (let attempt = 0; attempt <= VERIFICATION_MAX_RETRIES; attempt++) { + try { + await execAsync(fullCmd, { + cwd: process.cwd(), + timeout: 180000, // 3 minute timeout + env, + }) + + process.stdout.write(' Celoscan ✓') + return true + } catch (error: any) { + const isRetryable = isRetryableVerificationError(error) + const hasRetriesLeft = attempt < VERIFICATION_MAX_RETRIES + + if (isRetryable && hasRetriesLeft) { + const delay = getRetryDelay(attempt) + process.stdout.write(` (retry ${attempt + 1}...)`) + await sleep(delay) + continue + } + + process.stdout.write(' Celoscan ✗') + return false + } + } + + return false +} + +// Helper to sleep for a 
given number of milliseconds
+const sleep = (ms: number): Promise<void> => new Promise((resolve) => setTimeout(resolve, ms))
+
+const promptUserConfirmation = (message: string): Promise<boolean> => {
+  const rl = createInterface({ input: process.stdin, output: process.stdout })
+  return new Promise((resolve) => {
+    rl.question(`${message} (y/N): `, (answer) => {
+      rl.close()
+      resolve(answer.toLowerCase() === 'y')
+    })
+  })
+}
+
+// Retry configuration for verification
+const VERIFICATION_MAX_RETRIES = 6
+const VERIFICATION_INITIAL_DELAY_MS = 5000 // 5 seconds
+const VERIFICATION_MAX_DELAY_MS = 60000 // 1 minute
+
+// Calculate exponential backoff delay: 5s, 10s, 20s, 40s... capped at 60s
+const getRetryDelay = (attempt: number): number => {
+  const delay = VERIFICATION_INITIAL_DELAY_MS * Math.pow(2, attempt)
+  return Math.min(delay, VERIFICATION_MAX_DELAY_MS)
+}
+
+// Check if error is retryable (block explorer hasn't indexed the contract yet)
+const isRetryableVerificationError = (error: any): boolean => {
+  const errorMessage = error?.message || ''
+  const stdout = error?.stdout || ''
+  const stderr = error?.stderr || ''
+  const fullMessage = `${errorMessage} ${stdout} ${stderr}`.toLowerCase()
+
+  return (
+    fullMessage.includes('address is not a smart-contract') ||
+    fullMessage.includes('contract not found') ||
+    fullMessage.includes('not yet indexed')
+  )
+}
+
+// Helper to encode constructor args for verification
+const encodeConstructorArgs = (args: any[]): string | null => {
+  if (!args || args.length === 0) return null
+
+  try {
+    // For simple boolean args (our common case: [false])
+    if (args.length === 1 && typeof args[0] === 'boolean') {
+      // false encodes to 0x0000...0000 (32 bytes of zeros)
+      // true encodes to 0x0000...0001 (31 bytes of zeros + 1)
+      return args[0]
+        ?
'0x0000000000000000000000000000000000000000000000000000000000000001'
+        : '0x0000000000000000000000000000000000000000000000000000000000000000'
+    }
+
+    // For other cases, we'd need more sophisticated encoding
+    // This could be extended as needed
+    console.warn('Complex constructor args may need manual encoding')
+    return null
+  } catch (e) {
+    console.warn('Failed to encode constructor args:', e)
+    return null
+  }
+}
+
+const verifyAllContracts = async (
+  networkName: string,
+  rpcUrl: string,
+  celoscanApiKey?: string
+): Promise<void> => {
+  const config = getVerificationConfig(networkName)
+
+  if (!config) {
+    console.log('\nSkipping verification (not supported for this network/fork)')
+    return
+  }
+
+  if (deployedContracts.length === 0) {
+    console.log('\nNo contracts to verify')
+    return
+  }
+
+  // Set API key if provided
+  if (celoscanApiKey) {
+    config.celoscanApiKey = celoscanApiKey
+  }
+
+  console.log(`\nVerifying ${deployedContracts.length} contract(s) on ${networkName}...`)
+
+  // Wait for block explorers to index the contracts
+  process.stdout.write(`Waiting for indexing...`)
+  await new Promise((resolve) => setTimeout(resolve, 30000))
+  console.log(` done\n`)
+
+  let blockscoutSuccess = 0
+  let celoscanSuccess = 0
+
+  for (const contract of deployedContracts) {
+    // Single line per contract: Name (address) -> verification results
+    process.stdout.write(`  ${contract.name} (${contract.address.slice(0, 10)}...)`)
+
+    // Verify on Blockscout first (no API key needed)
+    const blockscoutResult = await verifyContractOnBlockscout(contract, config, rpcUrl)
+    if (blockscoutResult) blockscoutSuccess++
+
+    // Then verify on Celoscan (needs API key)
+    const celoscanResult = await verifyContractOnCeloscan(contract, config, rpcUrl)
+    if (celoscanResult) celoscanSuccess++
+
+    console.log() // newline after each contract
+  }
+
+  console.log(
+    `\nVerified: Blockscout ${blockscoutSuccess}/${deployedContracts.length}, Celoscan ${celoscanSuccess}/${deployedContracts.length}`
)
+}
+
+function bigIntReplacer(_key: string, value: any): unknown {
+  if (typeof value === 'bigint') {
+    return value.toString()
+  }
+  return value
+}
+
+let ignoredContractsSet = new Set<string>()
+
+class ContractAddresses {
+  static async create(
+    contracts: string[],
+    publicClient: PublicClientMethods,
+    _registryAbi: Abi, // Kept for API compatibility, uses registryGetAddressAbi internally
+    registryAddress: ViemAddress,
+    libraryAddresses: LibraryAddresses['addresses']
+  ) {
+    const addresses = new Map<string, string>()
+    await Promise.all(
+      contracts.map(async (contract: string) => {
+        try {
+          // Use low-level call to avoid viem's strict readContract typing
+          const callData = encodeFunctionData({
+            abi: registryGetAddressAbi,
+            functionName: 'getAddressForString',
+            args: [contract],
+          })
+          const result = await publicClient.call({
+            to: registryAddress,
+            data: callData,
+          })
+          const registeredAddress = result.data
+            ? decodeFunctionResult({
+                abi: registryGetAddressAbi,
+                functionName: 'getAddressForString',
+                data: result.data,
+              })
+            : NULL_ADDRESS
+          if (registeredAddress && !eqAddress(registeredAddress, NULL_ADDRESS)) {
+            addresses.set(contract, registeredAddress)
+          }
+        } catch (error) {
+          /* Ignore error if contract not in registry */
+        }
+      })
+    )
+    Object.entries(libraryAddresses).forEach(([library, address]) =>
+      addresses.set(library, address as string)
+    )
+    return new ContractAddresses(addresses)
+  }
+
+  constructor(public addresses: Map<string, string>) {}
+
+  public get = (contract: string): string => {
+    if (this.addresses.has(contract)) {
+      return this.addresses.get(contract)!
+ } else { + throw new Error(`Unable to find address for ${contract}`) + } + } + + public set = (contract: string, address: string) => { + this.addresses.set(contract, address) + } +} + +interface ViemContract { + contractName: string + address: ViemAddress + abi: Abi + bytecode: Hex + sourceFiles: string[] + compilerVersion: string + optimizerEnabled: boolean + optimizerRuns: number + evmVersion: string + foundryProfile?: string // Foundry compilation profile for verification +} + +const proxiedCoreContracts = new Set([ + CeloContractName.Accounts, + CeloContractName.Attestations, + CeloContractName.BlockchainParameters, + CeloContractName.DoubleSigningSlasher, + CeloContractName.DowntimeSlasher, + CeloContractName.Election, + CeloContractName.EpochRewards, + CeloContractName.Escrow, + CeloContractName.Exchange, + CeloContractName.ExchangeEUR, + CeloContractName.ExchangeBRL, + CeloContractName.FeeCurrencyWhitelist, + CeloContractName.Freezer, + CeloContractName.GoldToken, + CeloContractName.Governance, + CeloContractName.LockedGold, + CeloContractName.Random, + CeloContractName.Reserve, + CeloContractName.SortedOracles, + CeloContractName.StableToken, + CeloContractName.StableTokenEUR, + CeloContractName.StableTokenBRL, + CeloContractName.Validators, + CeloContractName.GrandaMento, + CeloContractName.FeeHandler, + CeloContractName.FederatedAttestations, + CeloContractName.EpochManager, + CeloContractName.EpochManagerEnabler, + CeloContractName.ScoreManager, + CeloContractName.FeeCurrencyDirectory, + CeloContractName.CeloUnreleasedTreasury, + CeloContractName.OdisPayments, +]) + +const isProxiedContract = ( + contractName: string, + buildDir05: string, + buildDir08: string +): boolean => { + // eslint-disable-next-line @typescript-eslint/no-unsafe-return + return ( + proxiedCoreContracts.has(contractName) || + existsSync(getContractArtifactPath(`${contractName}Proxy`, buildDir05, buildDir08)) + ) +} + +const isCoreContract = (contractName: string) => + 
[...Object.keys(CeloContractName)].includes(contractName)
+
+type ViemAbiConstructor = Extract<Abi[number], { type: 'constructor' }>
+
+type SolidityDefaultValue =
+  | bigint
+  | boolean
+  | string
+  | SolidityDefaultValue[]
+  | { [key: string]: SolidityDefaultValue }
+  | undefined
+
+function getDefaultValueForSolidityType(
+  solidityType: string,
+  components?: readonly AbiParameter[]
+): SolidityDefaultValue {
+  if (solidityType.endsWith('[]')) {
+    return []
+  }
+
+  const fixedArrayMatch = solidityType.match(/^(.*)\[(\d+)\]$/)
+  if (fixedArrayMatch) {
+    const baseType = fixedArrayMatch[1]
+    const size = parseInt(fixedArrayMatch[2], 10)
+    const elementComponents = baseType === 'tuple' ? components : undefined
+    return Array(size)
+      .fill(null)
+      .map(() => getDefaultValueForSolidityType(baseType, elementComponents))
+  }
+
+  if (solidityType.startsWith('uint') || solidityType.startsWith('int')) {
+    return BigInt(0)
+  }
+  if (solidityType === 'bool') {
+    return false
+  }
+  if (solidityType === 'address') {
+    return NULL_ADDRESS
+  }
+  if (solidityType === 'string') {
+    return ''
+  }
+  if (solidityType.startsWith('bytes')) {
+    const sizeMatch = solidityType.match(/^bytes(\d+)$/)
+    if (sizeMatch) {
+      const size = parseInt(sizeMatch[1], 10)
+      return `0x${'00'.repeat(size)}`
+    }
+    return '0x'
+  }
+
+  if (solidityType.startsWith('tuple')) {
+    if (!components || components.length === 0) {
+      console.warn(
+        `Tuple type ${solidityType} has no components defined in ABI input, returning {}.`
+      )
+      return {}
+    }
+    const tupleValue: { [key: string]: SolidityDefaultValue } = {}
+    for (const component of components) {
+      const subComponents =
+        'components' in component ? (component.components as readonly AbiParameter[]) : undefined
+      tupleValue[component.name!]
= getDefaultValueForSolidityType(component.type, subComponents)
+    }
+    return tupleValue
+  }
+
+  console.warn(`Unknown Solidity type for default value: ${solidityType}, returning undefined.`)
+  return undefined
+}
+
+const deployImplementation = async (
+  contractName: string,
+  contractArtifact: ViemContract,
+  walletClient: WalletClientMethods,
+  publicClient: PublicClientMethods,
+  requireVersion = true,
+  gas?: bigint,
+  isLibrary = false,
+  linkedLibraries: LinkedLibrary[] = []
+): Promise<ViemContract> => {
+  let finalConstructorArgs: any[] = []
+  const constructorAbiEntry = contractArtifact.abi.find((item) => item.type === 'constructor') as
+    | ViemAbiConstructor
+    | undefined
+
+  if (constructorAbiEntry && constructorAbiEntry.inputs && constructorAbiEntry.inputs.length > 0) {
+    finalConstructorArgs = constructorAbiEntry.inputs.map((input: Readonly<AbiParameter>) => {
+      const components =
+        'components' in input ? (input.components as readonly AbiParameter[]) : undefined
+      return getDefaultValueForSolidityType(input.type, components)
+    })
+  }
+
+  console.log('Deploying', contractName, 'with derived constructor args:', finalConstructorArgs)
+
+  if (!contractArtifact.bytecode) {
+    throw new Error(`Bytecode for ${contractName} is missing.`)
+  }
+
+  const hash = await walletClient.deployContract({
+    abi: contractArtifact.abi,
+    bytecode: contractArtifact.bytecode,
+    account: walletClient.account!,
+    chain: walletClient.chain!,
+    gas: gas ?? BigInt(20_000_000),
+    args: finalConstructorArgs,
+  })
+  const receipt = await publicClient.waitForTransactionReceipt({ hash })
+  if (receipt.status !== 'success' || !receipt.contractAddress) {
+    throw new Error(
+      `Deployment of ${contractName} failed.
Receipt: ${JSON.stringify(receipt, bigIntReplacer)}`
+    )
+  }
+  const deployedAddress = receipt.contractAddress
+  console.info(`${contractName} deployed at ${deployedAddress}`)
+
+  if (requireVersion) {
+    const getVersionNumberAbiEntry = contractArtifact.abi.find(
+      (item: any) => item.type === 'function' && item.name === 'getVersionNumber'
+    )
+    if (!getVersionNumberAbiEntry) {
+      throw new Error(
+        `Contract ${contractName} has changes but does not specify a version number in its ABI`
+      )
+    }
+  }
+
+  // Track deployed contract for verification
+  // Determine source file from artifact's sourceFiles - find the one containing the contract
+  const sourceFile =
+    contractArtifact.sourceFiles.find((f) => f.includes(`${contractName}.sol`)) ||
+    contractArtifact.sourceFiles[0] ||
+    `contracts/${contractName}.sol`
+
+  deployedContracts.push({
+    name: contractName,
+    address: deployedAddress,
+    sourceFile,
+    constructorArgs: finalConstructorArgs,
+    isLibrary,
+    compilerVersion: contractArtifact.compilerVersion,
+    optimizerEnabled: contractArtifact.optimizerEnabled,
+    optimizerRuns: contractArtifact.optimizerRuns,
+    evmVersion: contractArtifact.evmVersion,
+    linkedLibraries,
+    foundryProfile: contractArtifact.foundryProfile,
+  })
+
+  return { ...contractArtifact, address: deployedAddress }
+}
+
+const deployProxy = async (
+  contractName: string,
+  proxyArtifact: ViemContract,
+  addresses: ContractAddresses,
+  walletClient: WalletClientMethods,
+  publicClient: PublicClientMethods,
+  gas?: bigint
+): Promise<ViemContract> => {
+  if (contractName === 'Governance') {
+    throw new Error(`Storage incompatible changes to Governance are not yet supported`)
+  }
+  const proxyContractName = `${contractName}Proxy`
+  console.info(`Deploying ${proxyContractName}`)
+
+  if (!proxyArtifact.bytecode) {
+    throw new Error(`Bytecode for ${proxyContractName} is missing.`)
+  }
+
+  const hash = await walletClient.deployContract({
+    abi: proxyArtifact.abi,
+    bytecode: proxyArtifact.bytecode,
+    account:
walletClient.account!,
+    chain: walletClient.chain!,
+    gas: gas ?? BigInt(20_000_000),
+  })
+  const receipt = await publicClient.waitForTransactionReceipt({ hash })
+  if (receipt.status !== 'success' || !receipt.contractAddress) {
+    throw new Error(
+      `Deployment of ${proxyContractName} failed. Receipt: ${JSON.stringify(
+        receipt,
+        bigIntReplacer
+      )}`
+    )
+  }
+  const proxyAddress = receipt.contractAddress
+  console.info(`${proxyContractName} deployed at ${proxyAddress}`)
+
+  const deployedProxyContract = { ...proxyArtifact, address: proxyAddress }
+
+  const governanceAddress = addresses.get('Governance')
+  const transferHash = await walletClient.writeContract({
+    address: proxyAddress,
+    abi: deployedProxyContract.abi,
+    functionName: '_transferOwnership',
+    args: [governanceAddress],
+    account: walletClient.account!,
+    chain: walletClient.chain!,
+  })
+  await publicClient.waitForTransactionReceipt({ hash: transferHash })
+  return deployedProxyContract
+}
+
+const shouldDeployProxy = (report: ASTDetailedVersionedReport, contractName: string) => {
+  const hasStorageChanges = report.contracts[contractName].changes.storage.length > 0
+  const isNewContract = report.contracts[contractName].changes.major.find(
+    (change: any) => change.type === 'NewContract'
+  )
+  return hasStorageChanges || isNewContract
+}
+
+const deployCoreContract = async (
+  contractName: string,
+  implementationArtifact: ViemContract,
+  proposal: ProposalTx[],
+  addresses: ContractAddresses,
+  report: ASTDetailedVersionedReport,
+  initializationData: Record<string, any[]>,
+  walletClient: WalletClientMethods,
+  publicClient: PublicClientMethods,
+  buildDir05: string,
+  buildDir08: string,
+  linkedLibraries: LinkedLibrary[] = []
+) => {
+  const deployedImplementation = await deployImplementation(
+    contractName,
+    implementationArtifact,
+    walletClient,
+    publicClient,
+    true,
+    undefined,
+    false, // isLibrary
+    linkedLibraries
+  )
+
+  const setImplementationTx: ProposalTx = {
+    contract:
`${contractName}Proxy`, + function: '_setImplementation', + args: [deployedImplementation.address], + value: '0', + } + + if (!shouldDeployProxy(report, contractName)) { + proposal.push(setImplementationTx) + } else { + const proxyArtifactName = `${contractName}Proxy` + const proxyArtifactPath = getContractArtifactPath(proxyArtifactName, buildDir05, buildDir08) + if (!existsSync(proxyArtifactPath)) { + throw new Error(`Proxy artifact ${proxyArtifactName} not found at ${proxyArtifactPath}.`) + } + let proxyArtifact: ViemContract + try { + proxyArtifact = loadContractArtifact(proxyArtifactName, proxyArtifactPath) + } catch (e) { + throw new Error( + `Failed to load proxy artifact ${proxyArtifactName} from ${proxyArtifactPath}. Error: ${e}` + ) + } + const deployedProxy = await deployProxy( + contractName, + proxyArtifact, + addresses, + walletClient, + publicClient + ) + + addresses.set(contractName, deployedProxy.address) + proposal.push({ + contract: 'Registry', + function: 'setAddressFor', + args: [contractName, deployedProxy.address], + value: '0', + description: `Registry: ${contractName} -> ${deployedProxy.address}`, + }) + + const initializeAbiEntry = implementationArtifact.abi.find( + (item: any) => item.type === 'function' && item.name === 'initialize' + ) + + if (initializeAbiEntry) { + const initArgs = initializationData[contractName] + if (initArgs) { + let callData: Hex + try { + callData = encodeFunctionData({ + abi: implementationArtifact.abi, + functionName: 'initialize', + args: initArgs, + }) + } catch (error) { + throw new Error( + `Tried to encode initialize for ${contractName} with args: ${JSON.stringify( + initArgs + )}. Error: ${error}. ABI: ${JSON.stringify(initializeAbiEntry)}` + ) + } + setImplementationTx.function = '_setAndInitializeImplementation' + setImplementationTx.args.push(callData) + } else { + console.warn(`No initialization data found for ${contractName}. 
Skipping initialization.`)
+      }
+    }
+    console.info(
+      `Add '${contractName}Proxy.${setImplementationTx.function}' with args ${JSON.stringify(
+        setImplementationTx.args
+      )} to proposal`
+    )
+    console.log('Deployed', contractName)
+    proposal.push(setImplementationTx)
+  }
+}
+
+const deployLibrary = async (
+  libraryName: string,
+  libraryArtifact: ViemContract,
+  addresses: ContractAddresses,
+  walletClient: WalletClientMethods,
+  publicClient: PublicClientMethods
+): Promise<void> => {
+  const deployedLibrary = await deployImplementation(
+    libraryName,
+    libraryArtifact,
+    walletClient,
+    publicClient,
+    false,
+    undefined,
+    true // isLibrary = true
+  )
+  addresses.set(libraryName, deployedLibrary.address.substring(2))
+}
+
+export interface ProposalTx {
+  contract: string
+  function: string
+  args: string[]
+  value: string
+  description?: string
+}
+
+const getViemChain = (networkName: string): Chain => {
+  switch (networkName.toLowerCase()) {
+    case 'celo':
+    case 'mainnet':
+    case 'rc1':
+      return viemChains.celo
+    case 'celo-sepolia':
+      return defineChain({
+        id: 11142220,
+        name: 'Celo Sepolia',
+        nativeCurrency: { name: 'Celo', symbol: 'CELO', decimals: 18 },
+        rpcUrls: {
+          default: { http: ['https://forno.celo-sepolia.celo-testnet.org'] },
+        },
+        blockExplorers: {
+          default: { name: 'CeloScan', url: 'https://celo-sepolia.blockscout.com' },
+        },
+        testnet: true,
+      })
+    default:
+      console.warn(`Unknown network ${networkName}, defaulting to Hardhat chain config`)
+      return { ...viemChains.hardhat, id: 31337 }
+  }
+}
+
+const loadContractArtifact = (contractName: string, artifactPath: string): ViemContract => {
+  console.log('loadContractArtifact', contractName, artifactPath)
+  const artifact = readJsonSync(artifactPath) as ForgeArtifact
+  const sourceFiles = Object.keys(artifact.metadata.sources)
+
+  // Extract compiler settings from metadata
+  const compiler = artifact.metadata?.compiler || {}
+  const settings = artifact.metadata?.settings || {}
+  const optimizer =
settings.optimizer || { enabled: true, runs: 200 } + + // Use full compiler version (e.g., "0.5.14+commit.01f1aaa4") for verification + // Etherscan may require the full version to properly verify + const fullVersion = compiler.version || '0.8.19' + + // Determine foundry profile based on source file paths + // contracts/ = truffle-compat (Solidity 0.5.x) + // contracts-0.8/ = truffle-compat8 (Solidity 0.8.x) + let foundryProfile: string | undefined + const mainSourceFile = + sourceFiles.find((f) => f.includes(`${contractName}.sol`)) || sourceFiles[0] + if (mainSourceFile) { + if (mainSourceFile.startsWith('contracts-0.8/')) { + foundryProfile = 'truffle-compat8' + } else if (mainSourceFile.startsWith('contracts/')) { + foundryProfile = 'truffle-compat' + } + } + + return { + contractName, + abi: artifact.abi as Abi, + bytecode: artifact.bytecode.object, + address: '0x0' as ViemAddress, + sourceFiles, + compilerVersion: fullVersion, + optimizerEnabled: optimizer.enabled ?? true, + optimizerRuns: optimizer.runs ?? 
200, + evmVersion: settings.evmVersion || 'paris', + foundryProfile, + } +} + +const contracts08Set = new Set(SOLIDITY_08_PACKAGE.contracts) + +const getContractBuildDir = ( + contractName: string, + buildDir05: string, + buildDir08: string +): string => { + if (contracts08Set.has(contractName)) { + return buildDir08 + } + return buildDir05 +} + +const getContractArtifactPath = ( + contractName: string, + buildDir05: string, + buildDir08: string +): string => { + const buildDir = getContractBuildDir(contractName, buildDir05, buildDir08) + return join(buildDir, `${contractName}.sol`, `${contractName}.json`) +} + +const listContractNames = (baseDir: string): string[] => { + const names: string[] = [] + const entries = readdirSync(baseDir, { withFileTypes: true }) + for (const entry of entries) { + if (!entry.isDirectory() || !entry.name.endsWith('.sol')) { + continue + } + const contractSolDirPath = join(baseDir, entry.name as string) + const filesInSolDir = readdirSync(contractSolDirPath, { withFileTypes: true }) + + for (const fileEntry of filesInSolDir) { + if (!fileEntry.isFile() || !fileEntry.name.endsWith('.json')) { + continue + } + names.push(basename(fileEntry.name as string, '.json')) + } + } + return names +} + +const linkLibraries = ( + contractViemArtifact: ViemContract, + contractDependencies: string[], + addresses: ContractAddresses +): LinkedLibrary[] => { + const linkedLibraries: LinkedLibrary[] = [] + + if (!contractViemArtifact.bytecode.includes('__')) { + return linkedLibraries + } + + if (contractDependencies.length === 0) { + console.error( + `No dependencies found for ${contractViemArtifact.contractName}. 
Skipping library linking.` + ) + return linkedLibraries + } + + for (const dep of contractDependencies) { + if (addresses.addresses.has(dep)) { + const libAddressWithPrefix = addresses.get(dep) + const libAddress = libAddressWithPrefix.replace('0x', '') + const libSourceFilePath = contractViemArtifact.sourceFiles.find((file) => + file.includes(`${dep}.sol`) + ) + + if (!libSourceFilePath) { + throw new Error( + `Could not determine sourceFilePath for library ${dep} in ${contractViemArtifact.contractName}.` + ) + } + + const stringToHash = `${libSourceFilePath}:${dep}` + const hashed = keccak256(toHex(new TextEncoder().encode(stringToHash))) + const placeholderHash = hashed.substring(2, 2 + 34) + + const placeholderRegexDollar = new RegExp(`__\\$${placeholderHash}\\$__`, 'g') + if (contractViemArtifact.bytecode!.match(placeholderRegexDollar)) { + contractViemArtifact.bytecode = contractViemArtifact.bytecode!.replace( + placeholderRegexDollar, + libAddress + ) as Hex + + // Track the linked library for verification + // Ensure the address has 0x prefix for verification + const fullAddress = libAddressWithPrefix.startsWith('0x') + ? libAddressWithPrefix + : `0x${libAddressWithPrefix}` + linkedLibraries.push({ + sourceFile: libSourceFilePath, + name: dep, + address: fullAddress, + }) + } else { + console.log(`No placeholder match for ${dep} in ${contractViemArtifact.contractName}.`) + } + } + } + + if (contractViemArtifact.bytecode.includes('__')) { + const missingLibs = contractDependencies.filter((dep) => !addresses.addresses.has(dep)) + throw new Error( + `Bytecode for ${contractViemArtifact.contractName} still contains unlinked library placeholders. ` + + `Missing library addresses: ${ + missingLibs.length > 0 ? 
missingLibs.join(', ') : '(unknown - check libraries file)'
+        }`
+    )
+  }
+
+  return linkedLibraries
+}
+
+const performRelease = async (
+  contractName: string,
+  report: ASTDetailedVersionedReport,
+  released: Set<string>,
+  buildDir05: string,
+  buildDir08: string,
+  dependencies: { get(key: string): string[] | undefined },
+  addresses: ContractAddresses,
+  proposal: ProposalTx[],
+  initializationData: Record<string, any[]>,
+  walletClient: WalletClientMethods,
+  publicClient: PublicClientMethods
+): Promise<void> => {
+  if (released.has(contractName)) return
+
+  const shouldDeployContract = Object.keys(report.contracts).includes(contractName)
+  const shouldDeployLibrary = Object.keys(report.libraries).includes(contractName)
+
+  if (!shouldDeployContract && !shouldDeployLibrary) return
+
+  const artifactPath = getContractArtifactPath(contractName, buildDir05, buildDir08)
+  if (!existsSync(artifactPath)) {
+    throw new Error(`Artifact for ${contractName} not found at ${artifactPath}.`)
+  }
+
+  let contractViemArtifact: ViemContract
+  try {
+    contractViemArtifact = loadContractArtifact(contractName, artifactPath)
+  } catch (e) {
+    throw new Error(
+      `Failed to load artifact for ${contractName} from ${artifactPath}.
Error: ${e}`
+    )
+  }
+
+  if (shouldDeployContract) {
+    const contractDependencies = dependencies.get(contractName) || []
+    for (const dependency of contractDependencies) {
+      if (!released.has(dependency)) {
+        await performRelease(
+          dependency,
+          report,
+          released,
+          buildDir05,
+          buildDir08,
+          dependencies,
+          addresses,
+          proposal,
+          initializationData,
+          walletClient,
+          publicClient
+        )
+      }
+    }
+
+    // Check for missing library addresses (not in libraries file and not yet deployed)
+    const missingLibraries = contractDependencies.filter((dep) => !addresses.addresses.has(dep))
+    if (missingLibraries.length > 0) {
+      console.warn(
+        `\nWARNING: ${contractName} requires libraries not found in the libraries file: ${missingLibraries.join(
+          ', '
+        )}`
+      )
+      console.warn(
+        `Missing libraries are often due to an error in the libraries file. Deploying the wrong version can lead to errors.`
+      )
+      const confirmed = await promptUserConfirmation(
+        `Deploy missing libraries (${missingLibraries.join(', ')})?`
+      )
+      if (!confirmed) {
+        throw new Error(
+          `Aborting: missing libraries for ${contractName}: ${missingLibraries.join(', ')}`
+        )
+      }
+      for (const lib of missingLibraries) {
+        const libArtifactPath = getContractArtifactPath(lib, buildDir05, buildDir08)
+        if (!existsSync(libArtifactPath)) {
+          throw new Error(`Artifact for library ${lib} not found at ${libArtifactPath}.`)
+        }
+        const libArtifact = loadContractArtifact(lib, libArtifactPath)
+        await deployLibrary(lib, libArtifact, addresses, walletClient, publicClient)
+        released.add(lib)
+      }
+    }
+
+    const linkedLibraries = linkLibraries(contractViemArtifact, contractDependencies, addresses)
+
+    await deployCoreContract(
+      contractName,
+      contractViemArtifact,
+      proposal,
+      addresses,
+      report,
+      initializationData,
+      walletClient,
+      publicClient,
+      buildDir05,
+      buildDir08,
+      linkedLibraries
+    )
+  } else if (shouldDeployLibrary) {
+    await deployLibrary(contractName, contractViemArtifact, addresses, walletClient,
publicClient) + } + released.add(contractName) +} + +async function main() { + try { + const argv: MakeReleaseArgv = yargs(hideBin(process.argv) as string[]) + .option('report', { + type: 'string', + demandOption: true, + description: 'Path to the compatibility report JSON file.', + }) + .option('proposal', { + type: 'string', + demandOption: true, + description: 'Path to output the proposal JSON file.', + }) + .option('librariesFile', { + type: 'string', + demandOption: true, + description: 'Path to the libraries.json file.', + }) + .option('initializeData', { + type: 'string', + demandOption: true, + description: 'Path to the JSON file with initialization data for contracts.', + }) + .option('buildDirectory', { + type: 'string', + demandOption: true, + description: 'Path to the Foundry build output directory (e.g., out/).', + }) + .option('branch', { + type: 'string', + demandOption: true, + description: 'Git branch name (used for versioning).', + }) + .option('network', { + type: 'string', + demandOption: true, + description: 'Network name (e.g., celo-sepolia, celo, mainnet).', + }) + .option('privateKey', { type: 'string', description: 'Private key for deployment.' }) + .option('mnemonic', { type: 'string', description: 'Mnemonic for deployment.' }) + .option('rpcUrl', { + type: 'string', + description: 'Custom RPC URL (overrides network default, useful for local anvil forks).', + }) + .option('skipVerification', { + type: 'boolean', + default: false, + description: 'Skip contract verification on block explorers.', + }) + .option('celoscanApiKey', { + type: 'string', + description: + 'Celoscan API key for contract verification (can also be set via CELOSCAN_API_KEY env var or .env.json).', + }) + .check((currentArgs) => { + if (!currentArgs.privateKey && !currentArgs.mnemonic) { + throw new Error('Either --privateKey or --mnemonic must be provided.') + } + return true + }).argv + + const networkName = argv.network! 
+ const buildDirBase = argv.buildDirectory + const buildDir05 = `${buildDirBase}-truffle-compat` + const buildDir08 = `${buildDirBase}-truffle-compat8` + if (!existsSync(buildDir05)) { + throw new Error(`${buildDir05} directory not found. Make sure to run foundry build first`) + } + if (!existsSync(buildDir08)) { + throw new Error(`${buildDir08} directory not found. Make sure to run foundry build first`) + } + + // Check for Celoscan API key early (before deployment) for production networks + const isProductionNetwork = ['celo', 'mainnet', 'rc1', 'celo-sepolia'].includes( + networkName.toLowerCase() + ) + const isLocalFork = !!argv.rpcUrl + + if (isProductionNetwork && !isLocalFork && !argv.skipVerification) { + // Load Celoscan API key from various sources + let celoscanApiKey = argv.celoscanApiKey || process.env.CELOSCAN_API_KEY + + // Try loading from .env.json if not set + if (!celoscanApiKey) { + const envJsonPath = join(process.cwd(), '.env.json') + if (existsSync(envJsonPath)) { + try { + const envJson = readJsonSync(envJsonPath) + celoscanApiKey = envJson.celoScanApiKey || envJson.celoscanApiKey + } catch (e) { + // Failed to parse .env.json - fall through to validation check below + // which will throw a descriptive error if API key is still missing + console.warn(`Warning: Could not read Celoscan API key from ${envJsonPath}: ${e}`) + } + } + } + + if (!celoscanApiKey) { + throw new Error( + `Celoscan API key is required for ${networkName}. 
` + + `Provide it via:\n` + + ` - CLI flag: -a YOUR_API_KEY\n` + + ` - Environment variable: CELOSCAN_API_KEY\n` + + ` - Config file: packages/protocol/.env.json (celoScanApiKey)\n` + + `Or use -s to skip verification.` + ) + } + } + + const viemChain = getViemChain(networkName) + + // Use custom rpcUrl if provided, otherwise use chain default + let transportUrl: string + if (argv.rpcUrl) { + transportUrl = argv.rpcUrl + console.log(`Using custom RPC URL: ${transportUrl}`) + } else if (viemChain.rpcUrls.default?.http?.[0]) { + transportUrl = viemChain.rpcUrls.default.http[0] + } else { + throw new Error( + `RPC URL for network ${networkName} could not be determined. Provide --rpcUrl parameter.` + ) + } + const publicClient = createPublicClient({ + chain: viemChain, + transport: http(transportUrl), + }) + + let account: Account + + if (argv.privateKey) { + const privateKey = argv.privateKey.startsWith('0x') + ? (argv.privateKey as Hex) + : (`0x${argv.privateKey}` as Hex) + account = privateKeyToAccount(privateKey) + } else { + account = mnemonicToAccount(argv.mnemonic as string) + } + + const walletClient = createWalletClient({ + account, + chain: viemChain, + transport: http(transportUrl), + }) + const fullReport = readJsonSync(argv.report) + const libraryMapping: LibraryAddresses['addresses'] = readJsonSync(argv.librariesFile) + const report: ASTDetailedVersionedReport = fullReport.report + const branch = (argv.branch ?? 
'') as string + const initializationData: Record = readJsonSync(String(argv.initializeData)) + const dependencies = getCeloContractDependencies() + const version = getReleaseVersion(branch) + + if (version >= 9) { + ignoredContractsSet = new Set(ignoredContractsV9) + } + + const names05 = listContractNames(buildDir05) + const names08 = listContractNames(buildDir08) + const allContractNamesFromDirs = [...new Set([...names05, ...names08])].sort() + + const registryArtifactPath = getContractArtifactPath('Registry', buildDir05, buildDir08) + if (!existsSync(registryArtifactPath)) { + throw new Error( + `Registry.json artifact not found at ${registryArtifactPath}. ` + + `Please ensure it is compiled and present in the Foundry output format.` + ) + } + const registryArtifact = loadContractArtifact('Registry', registryArtifactPath) + + const allContractNames = allContractNamesFromDirs.filter( + (contractName) => + !ignoredContractsSet.has(contractName) && + !ignoredContractsSet.has(contractName.replace('Proxy', '')) + ) + + const addresses = await ContractAddresses.create( + allContractNames, + publicClient, + registryArtifact.abi, + celoRegistryAddress as ViemAddress, + libraryMapping + ) + + const released: Set = new Set([]) + const proposal: ProposalTx[] = [] + + for (const contractName of allContractNames) { + if (isCoreContract(contractName) && isProxiedContract(contractName, buildDir05, buildDir08)) { + await performRelease( + contractName, + report, + released, + buildDir05, + buildDir08, + dependencies, + addresses, + proposal, + initializationData, + walletClient, + publicClient + ) + } + } + + writeJsonSync(argv.proposal, proposal, { spaces: 2 }) + console.log(`Proposal successfully written to ${argv.proposal}`) + + // Contract verification + if (isLocalFork) { + console.log('\nContract verification skipped (custom RPC URL indicates local fork)') + } else if (argv.skipVerification) { + console.log('\nContract verification skipped (--skipVerification flag)') + } 
else { + // Load Celoscan API key (already validated at start for production networks) + let celoscanApiKey = argv.celoscanApiKey || process.env.CELOSCAN_API_KEY + + if (!celoscanApiKey) { + const envJsonPath = join(process.cwd(), '.env.json') + if (existsSync(envJsonPath)) { + try { + const envJson = readJsonSync(envJsonPath) + celoscanApiKey = envJson.celoScanApiKey || envJson.celoscanApiKey + } catch (e) { + // Failed to parse .env.json - verifyAllContracts handles undefined + // gracefully by skipping Celoscan verification (Blockscout still works) + console.warn(`Warning: Could not read Celoscan API key from ${envJsonPath}: ${e}`) + } + } + } + + await verifyAllContracts(networkName, transportUrl, celoscanApiKey) + } + } catch (error) { + console.error('Error during script execution:', error) + } +} + +main().catch((error) => { + console.error('Unhandled error in main execution:', error) + process.exit(1) +}) diff --git a/packages/protocol/scripts/foundry/run_e2e_tests_in_anvil.sh b/packages/protocol/scripts/foundry/run_e2e_tests_in_anvil.sh index 781543a9b61..55ab316de18 100755 --- a/packages/protocol/scripts/foundry/run_e2e_tests_in_anvil.sh +++ b/packages/protocol/scripts/foundry/run_e2e_tests_in_anvil.sh @@ -6,7 +6,7 @@ source $PWD/scripts/foundry/constants.sh # Generate and run devchain echo "Generating and running devchain before running e2e tests..." 
-source $PWD/scripts/foundry/create_and_migrate_anvil_devchain.sh +source $PWD/scripts/foundry/create_and_migrate_anvil_l2_devchain.sh # Run e2e tests diff --git a/packages/protocol/scripts/foundry/run_integration_tests_in_anvil.sh b/packages/protocol/scripts/foundry/run_integration_tests_in_anvil.sh index bdcd06f4f09..061451fb98b 100755 --- a/packages/protocol/scripts/foundry/run_integration_tests_in_anvil.sh +++ b/packages/protocol/scripts/foundry/run_integration_tests_in_anvil.sh @@ -6,7 +6,7 @@ source $PWD/scripts/foundry/constants.sh # Generate and run devchain echo "Generating and running devchain before running integration tests..." -source $PWD/scripts/foundry/create_and_migrate_anvil_devchain.sh +source $PWD/scripts/foundry/create_and_migrate_anvil_l2_devchain.sh # Run integration tests echo "Running integration tests..." diff --git a/packages/protocol/scripts/foundry/signature/Recover.sol b/packages/protocol/scripts/foundry/signature/Recover.sol new file mode 100644 index 00000000000..5a8fc206816 --- /dev/null +++ b/packages/protocol/scripts/foundry/signature/Recover.sol @@ -0,0 +1,21 @@ +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.15; + +import { Script } from "forge-std/Script.sol"; +import { console2 as console } from "forge-std/console2.sol"; + +import { ECDSA } from "@openzeppelin/contracts8/utils/cryptography/ECDSA.sol"; + +/// @notice Recover the address of the signer of a hash +/// @dev Useful when working with multiple external signatures from the Security Council or cLabs +/// @dev Particularly when performing OpStack upgrades or interacting with Core Contracts governance +contract Recover is Script { + function run() external { + bytes32 hash_ = vm.envBytes32("HASH"); + bytes memory sig_ = vm.envBytes("SIG"); + address signer_ = ECDSA.recover(hash_, sig_); + console.logBytes32(hash_); + console.logBytes(sig_); + console.logAddress(signer_); + } +} diff --git a/packages/protocol/scripts/foundry/start_anvil.sh
b/packages/protocol/scripts/foundry/start_anvil.sh index ec802d5d980..4080d66a066 100755 --- a/packages/protocol/scripts/foundry/start_anvil.sh +++ b/packages/protocol/scripts/foundry/start_anvil.sh @@ -4,10 +4,31 @@ set -euo pipefail # Read environment variables and constants source $PWD/scripts/foundry/constants.sh +# Parse command line options: +# -p: Custom port number for Anvil to listen on (overrides default ANVIL_PORT) +# -l: Path to load existing Anvil state from (instead of creating new state) +# -f: Fork URL to fork from a live network +# -a: Enable auto-impersonate mode + +while getopts 'p:l:f:a' flag; do + case "${flag}" in + p) CUSTOM_PORT="${OPTARG}" ;; + l) LOAD_STATE="${OPTARG}" ;; + f) FORK_URL="${OPTARG}" ;; + a) AUTO_IMPERSONATE=true ;; + *) error "Unexpected option ${flag}" ;; + esac +done + +if [ -n "${CUSTOM_PORT:-}" ]; then + ANVIL_PORT=$CUSTOM_PORT +fi + +ANVIL_RPC_URL=$(get_anvil_rpc_url) + timestamp=`date -Iseconds` mkdir -p $ANVIL_FOLDER -echo "Anvil state will be saved to $ANVIL_FOLDER" # create package.json echo "{\"name\": \"@celo/devchain-anvil\",\"version\": \"1.0.0\",\"repository\": { \"url\": \"https://github.com/celo-org/celo-monorepo\", \"directory\": \"packages/protocol/migrations_sol\" },\"homepage\": \"https://github.com/celo-org/celo-monorepo/blob/master/packages/protocol/migrations_sol/README.md\",\"description\": \"Anvil based devchain that contains core smart contracts of celo\",\"author\":\"Celo\",\"license\": \"LGPL-3.0\"}" > $TMP_FOLDER/package.json @@ -22,14 +43,35 @@ if nc -z localhost $ANVIL_PORT; then fi # Start anvil +if [ -n "${LOAD_STATE:-}" ]; then + echo "Loading Anvil state from $LOAD_STATE" + STATE_FLAGS="--load-state $LOAD_STATE" +else + echo "Anvil state will be saved to $ANVIL_FOLDER" + STATE_FLAGS="--dump-state $ANVIL_FOLDER --state-interval $STATE_INTERVAL" +fi + +FORK_FLAGS="" +if [ -n "${FORK_URL:-}" ]; then + echo "Forking from $FORK_URL" + FORK_FLAGS="--fork-url $FORK_URL" +fi + +IMPERSONATE_FLAGS="" 
+if [ "${AUTO_IMPERSONATE:-}" = true ]; then + IMPERSONATE_FLAGS="--auto-impersonate --accounts 0" +fi + anvil \ +$STATE_FLAGS \ +$FORK_FLAGS \ +$IMPERSONATE_FLAGS \ --port $ANVIL_PORT \ ---dump-state $ANVIL_FOLDER \ ---state-interval $STATE_INTERVAL \ --gas-limit $GAS_LIMIT \ --code-size-limit $CODE_SIZE_LIMIT \ --balance $BALANCE \ --steps-tracing & + # For context "&" tells the shell to start a command as a background process. # This allows you to continue executing other commands without waiting for the background command to finish. @@ -44,8 +86,8 @@ while ! nc -z localhost $ANVIL_PORT; do sleep 0.1 # wait for 1/10 of the second before check again done -# enabled logging -cast rpc anvil_setLoggingEnabled true --rpc-url $ANVIL_RPC_URL - echo "Anvil launched" + sleep 1 +# enabled logging +cast rpc anvil_setLoggingEnabled $ANVIL_LOGGING_ENABLED --rpc-url $ANVIL_RPC_URL diff --git a/packages/protocol/scripts/foundry/stop_anvil.sh b/packages/protocol/scripts/foundry/stop_anvil.sh index f492cf26ee0..d4da088bc5d 100755 --- a/packages/protocol/scripts/foundry/stop_anvil.sh +++ b/packages/protocol/scripts/foundry/stop_anvil.sh @@ -3,10 +3,47 @@ set -euo pipefail # A small script to terminate any instance of anvil currently serving at localhost. +# Default behavior: delete tmp state unless overridden +DELETE_STATE=true + +# Parse command-line arguments +while [[ $# -gt 0 ]]; do + key="$1" + case $key in + --keep-state) + DELETE_STATE=false + shift # past argument + ;; + *) # unknown option + echo "Unknown option: $1" + # Optionally exit here if only known flags should be allowed + shift # past argument + ;; + esac +done + +if [ "$DELETE_STATE" = false ]; then + echo "Anvil temporary state will NOT be deleted (--keep-state specified)." 
+fi + # Read environment variables and constants source $PWD/scripts/foundry/constants.sh if nc -z localhost $ANVIL_PORT; then kill $(lsof -t -i:$ANVIL_PORT) echo "Killed Anvil" -fi \ No newline at end of file +fi + +# Conditionally remove anvil tmp state +ANVIL_TMP_DIR="$HOME/.foundry/anvil/tmp" +if [ "$DELETE_STATE" = true ]; then + if [ -d "$ANVIL_TMP_DIR" ]; then + echo "Removing anvil temporary state directory: $ANVIL_TMP_DIR" + rm -rf "$ANVIL_TMP_DIR"/* + echo "Anvil temporary state removed." + else + echo "Anvil temporary state directory not found: $ANVIL_TMP_DIR" + fi +else + echo "Skipping removal of anvil temporary state." +fi diff --git a/packages/protocol/scripts/foundry/verify-bytecode-foundry.ts b/packages/protocol/scripts/foundry/verify-bytecode-foundry.ts new file mode 100644 index 00000000000..aabd4fd1213 --- /dev/null +++ b/packages/protocol/scripts/foundry/verify-bytecode-foundry.ts @@ -0,0 +1,162 @@ +import { + InitializationData, + verifyBytecodes, +} from '@celo/protocol/lib/compatibility/verify-bytecode-foundry' +import { getReleaseVersion } from '../../lib/compatibility/ignored-contracts-v9' + +import { CeloContractName } from '@celo/protocol/lib/registry-utils' +import { ProposalTx } from '@celo/protocol/scripts/truffle/make-release' + +import { instantiateArtifactsFromForge } from '@celo/protocol/lib/compatibility/utils' +import { existsSync, readJsonSync, writeJsonSync } from 'fs-extra' +import { Chain, createPublicClient, defineChain, encodeFunctionData, http } from 'viem' +import * as viemChains from 'viem/chains' + +/* + * This script verifies that a given set of smart contract bytecodes corresponds + * to a Celo system deployed to a given network. It uses the Registry contract + * as its source of truth, potentially modified by an optional contract upgrade + * proposal description. 
+ * + * Expects the following flags: + * --branch: The release branch whose Foundry build output directories + * (./out-<branch>-truffle-compat and ./out-<branch>-truffle-compat8) are read + * --proposal: The JSON file containing a Governance proposal that + * repoints the Registry to newly deployed Proxies and/or repoints existing + * Proxies to new implementation addresses. + * --initialize_data: The JSON file containing, for each newly deployed Proxy, + * the calldata to its logic contract's `initialize` function. + * --network: The name of the network to verify (default: "development"). + * --librariesFile: The file to which linked library addresses will be + * written (default: "libraries.json"). + */ + +const argv = require('minimist')(process.argv.slice(2), { + string: ['build_artifacts', 'proposal', 'initialize_data', 'network', 'librariesFile', 'branch'], +}) + +const branch = (argv.branch ? argv.branch : '') as string +const buildDir05 = `./out-${branch}-truffle-compat` +const buildDir08 = `./out-${branch}-truffle-compat8` +const network: string = argv.network ?? 'development' +const proposal: ProposalTx[] = argv.proposal ? readJsonSync(argv.proposal) : [] +const initializationData: InitializationData = argv.initialize_data + ? readJsonSync(argv.initialize_data) + : {} +const librariesFile = argv.librariesFile ?? 'libraries.json' + +if (!existsSync(buildDir05)) { + throw new Error(`${buildDir05} directory not found. Make sure to run foundry build first`) +} + +if (!existsSync(buildDir08)) { + throw new Error(`${buildDir08} directory not found.
Make sure to run foundry build first`) +} + +// TODO deduplicate with make-release +const getViemChain = (networkName: string): Chain => { + switch (networkName.toLowerCase()) { + case 'celo': + case 'mainnet': + case 'rc1': + return viemChains.celo + case 'celo-sepolia': + return defineChain({ + id: 11142220, + name: 'Celo Sepolia', + nativeCurrency: { name: 'Celo', symbol: 'CELO', decimals: 18 }, + rpcUrls: { + default: { http: ['https://forno.celo-sepolia.celo-testnet.org'] }, + }, + blockExplorers: { + default: { name: 'CeloScan', url: 'https://celo-sepolia.blockscout.com' }, + }, + testnet: true, + }) + default: + return { ...viemChains.hardhat, id: 31337 } + } +} +const viemChain = getViemChain(network) +const transportUrl = viemChain.rpcUrls.default.http[0] +const publicClient = createPublicClient({ + chain: viemChain, + transport: http(transportUrl), +}) + +const version = getReleaseVersion(branch) + +const registryAddress = '0x000000000000000000000000000000000000ce10' +const registryAbi = readJsonSync(`${buildDir05}/Registry.sol/Registry.json`).abi +const proxyAbi = readJsonSync(`${buildDir05}/Proxy.sol/Proxy.json`).abi + +const getAddressForString = async (contract: string): Promise<string> => { + const result = await publicClient.readContract({ + address: registryAddress, + abi: registryAbi, + functionName: 'getAddressForString', + args: [contract], + }) + return result as string +} + +const getImplementation = async (address: string): Promise<string> => { + const result = await publicClient.readContract({ + address: address as `0x${string}`, + abi: proxyAbi, + functionName: '_getImplementation', + args: [], + }) + return result as string +} + +const registryLookup = { getAddressForString } +const proxyLookup = { getImplementation } +const chainLookup = { + getCode: (address: `0x${string}`) => { + return publicClient.getBytecode({ address }) + }, + encodeFunctionCall: (abi: any, args: any[]) => { + return encodeFunctionData({ + abi: [abi], + functionName: abi.name, +
args, + }) + }, + getProof: (address: `0x${string}`, slots: `0x${string}`[]) => { + return publicClient.getProof({ + address, + storageKeys: slots, + }) + }, +} + +const [artifacts05] = instantiateArtifactsFromForge(buildDir05) +const [artifacts08] = instantiateArtifactsFromForge(buildDir08) +verifyBytecodes( + Object.keys(CeloContractName), + [artifacts05, artifacts08], + registryLookup, + proposal, + proxyLookup, + chainLookup, + initializationData, + version, + network +) + .then(({ libraryLinkingInfo, verifiedLibraries }) => { + const allMapping = libraryLinkingInfo.getAddressMapping() + const verifiedMapping = {} + for (const library of verifiedLibraries) { + verifiedMapping[library] = allMapping[library] + } + + /* eslint-disable no-console */ + console.log(`\n✅ All contracts and libraries verified successfully!`) + console.info(`Writing linked library addresses to ${librariesFile}`) + writeJsonSync(librariesFile, verifiedMapping, { spaces: 2 }) + }) + .catch((error) => { + console.error('Script errored!', error) + process.exit(1) + }) diff --git a/packages/protocol/scripts/foundry/verify-new-chain.sh b/packages/protocol/scripts/foundry/verify-new-chain.sh new file mode 100755 index 00000000000..2a38e47da9d --- /dev/null +++ b/packages/protocol/scripts/foundry/verify-new-chain.sh @@ -0,0 +1,149 @@ +#!/usr/bin/env bash + +# Require env vars +[ -z "${BLOCKSCOUT_API_KEY:-}" ] && echo "Need to set the BLOCKSCOUT_API_KEY via env" && exit 1; +[ -z "${BLOCKSCOUT_URL:-}" ] && echo "Need to set the BLOCKSCOUT_URL via env (example value: https://celo-sepolia.blockscout.com/api)" && exit 1; +[ -z "${CHAIN_ID:-}" ] && echo "Need to set the CHAIN_ID via env (example value: 11142220)" && exit 1; +[ -z "${RPC_URL:-}" ] && echo "Need to set the RPC_URL via env (example value: https://forno.celo-sepolia.celo-testnet.org)" && exit 1; + +verify() { + CONSTRUCTOR_SIG=${3:-} + echo ">>> [Blockscout] $2" + if [ -z "${CONSTRUCTOR_SIG:-}" ]; then + forge verify-contract $1 $2 \ + 
--chain-id $CHAIN_ID \ + --etherscan-api-key=$BLOCKSCOUT_API_KEY \ + --verifier-url=$BLOCKSCOUT_URL \ + --verifier=blockscout \ + --watch + else + forge verify-contract $1 $2 \ + --chain-id $CHAIN_ID \ + --etherscan-api-key=$BLOCKSCOUT_API_KEY \ + --verifier-url=$BLOCKSCOUT_URL \ + --verifier=blockscout \ + --constructor-args $(cast abi-encode $CONSTRUCTOR_SIG ${@:4}) \ + --watch + fi + echo "----------------------------------------" +} + +verify_proxy() { + IMPL_SLOT="0x360894a13ba1a3210667c828492db98dca3e2076cc3735a920a3ca505d382bbc" # keccak256("eip1967.proxy.implementation") + IMPL_ADDRESS_B32=$(cast storage $1 $IMPL_SLOT -r $RPC_URL) + IMPL_ADDRESS=$(cast parse-bytes32-address $IMPL_ADDRESS_B32) + echo "Proxy: $1 Impl: $IMPL_ADDRESS" + verify $IMPL_ADDRESS ${@:2} +} + +echo ">>> Verifying core contracts on Celo Sepolia" +# registry +REGISTRY_ADDRESS="0x000000000000000000000000000000000000ce10" +GET_ADDR="getAddressForStringOrDie(string)(address)" +verify_proxy $REGISTRY_ADDRESS "Registry" + +# freezer +FREEZER_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "Freezer" -r $RPC_URL) +verify_proxy $FREEZER_ADDRESS "Freezer" + +# fee currency directory +FCD_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "FeeCurrencyDirectory" -r $RPC_URL) +verify_proxy $FCD_ADDRESS "FeeCurrencyDirectory" + +# celo token +CT_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "CeloToken" -r $RPC_URL) +verify_proxy $CT_ADDRESS "GoldToken" + +# sorted oracles +SO_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "SortedOracles" -r $RPC_URL) +verify_proxy $SO_ADDRESS "SortedOracles" + +# reserve spender multisig +RSM_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "ReserveSpenderMultiSig" -r $RPC_URL) +verify_proxy $RSM_ADDRESS "ReserveSpenderMultiSig" + +# reserve +RESERVE_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "Reserve" -r $RPC_URL) +verify_proxy $RESERVE_ADDRESS "Reserve" + +# stable tokens +STABLE_TOKENS=$(cast call $RESERVE_ADDRESS "getTokens()(address[])" -r 
$RPC_URL) +STABLE_TOKENS=$(echo "$STABLE_TOKENS" | sed -e 's/^\[//' -e 's/\]$//' -e 's/, /\n/g') +echo "$STABLE_TOKENS" | while IFS= read -r TOKEN_ADDRESS; do + echo "Token address: $TOKEN_ADDRESS" + verify_proxy $TOKEN_ADDRESS "StableToken" +done + +# exchange +EXCHANGE_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "Exchange" -r $RPC_URL) +verify_proxy $EXCHANGE_ADDRESS "Exchange" + +# accounts +ACCOUNTS_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "Accounts" -r $RPC_URL) +verify_proxy $ACCOUNTS_ADDRESS "Accounts" + +# locked celo +LC_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "LockedCelo" -r $RPC_URL) +verify_proxy $LC_ADDRESS "LockedGold" + +# validators +VALIDATORS_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "Validators" -r $RPC_URL) +verify_proxy $VALIDATORS_ADDRESS "Validators" + +# election +ELECTION_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "Election" -r $RPC_URL) +verify_proxy $ELECTION_ADDRESS "Election" + +# epoch rewards +ER_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "EpochRewards" -r $RPC_URL) +verify_proxy $ER_ADDRESS "EpochRewards" + +# escrow +ESCROW_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "Escrow" -r $RPC_URL) +verify_proxy $ESCROW_ADDRESS "Escrow" + +# governance slasher +GS_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "GovernanceSlasher" -r $RPC_URL) +verify_proxy $GS_ADDRESS "GovernanceSlasher" + +# federated attestations +FA_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "FederatedAttestations" -r $RPC_URL) +verify_proxy $FA_ADDRESS "FederatedAttestations" + +# mento fee handler seller +MFAS_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "MentoFeeHandlerSeller" -r $RPC_URL) +verify_proxy $MFAS_ADDRESS "MentoFeeHandlerSeller" + +# uniswap fee handler seller +UFHS_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "UniswapFeeHandlerSeller" -r $RPC_URL) +verify_proxy $UFHS_ADDRESS "UniswapFeeHandlerSeller" + +# fee handler +FH_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "FeeHandler" -r 
$RPC_URL) +verify_proxy $FH_ADDRESS "FeeHandler" + +# odis payments +OP_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "OdisPayments" -r $RPC_URL) +verify_proxy $OP_ADDRESS "OdisPayments" + +# celo unreleased treasury +CUT_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "CeloUnreleasedTreasury" -r $RPC_URL) +verify_proxy $CUT_ADDRESS "CeloUnreleasedTreasury" + +# epoch manager enabler +EME_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "EpochManagerEnabler" -r $RPC_URL) +verify_proxy $EME_ADDRESS "EpochManagerEnabler" + +# epoch manager +EM_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "EpochManager" -r $RPC_URL) +verify_proxy $EM_ADDRESS "EpochManager" + +# score manager +SM_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "ScoreManager" -r $RPC_URL) +verify_proxy $SM_ADDRESS "ScoreManager" + +# governance +GOV_ADDRESS=$(cast call $REGISTRY_ADDRESS "$GET_ADDR" "Governance" -r $RPC_URL) +verify_proxy $GOV_ADDRESS "Governance" + +echo ">>> Finished verifying contracts on Celo Sepolia" diff --git a/packages/protocol/scripts/make-release-3-changes.ts b/packages/protocol/scripts/make-release-3-changes.ts index 8991940cc4b..be495506bec 100644 --- a/packages/protocol/scripts/make-release-3-changes.ts +++ b/packages/protocol/scripts/make-release-3-changes.ts @@ -30,7 +30,7 @@ try { ) const releaseProposal: ProposalTx[] = readJsonSync(argv.input_proposal) writeJsonSync(argv.output_proposal, makeRelease3Changes(releaseProposal), { spaces: 2 }) - console.info(`Modifications made sucessfully; written to ${argv.output_proposal}`) + console.info(`Modifications made successfully; written to ${argv.output_proposal}`) } catch (e) { console.error(`Something went wrong: ${e?.message || e?.toString()}`) } diff --git a/packages/protocol/scripts/make-release-6-changes.ts b/packages/protocol/scripts/make-release-6-changes.ts index ca9a2658eb0..3be7879c258 100644 --- a/packages/protocol/scripts/make-release-6-changes.ts +++ 
b/packages/protocol/scripts/make-release-6-changes.ts @@ -30,7 +30,7 @@ try { ) const releaseProposal: ProposalTx[] = readJsonSync(argv.input_proposal) writeJsonSync(argv.output_proposal, makeRelease6Changes(releaseProposal), { spaces: 2 }) - console.info(`Modifications made sucessfully; written to ${argv.output_proposal}`) + console.info(`Modifications made successfully; written to ${argv.output_proposal}`) } catch (e) { console.error(`Something went wrong: ${e instanceof Error ? JSON.stringify(e) : e?.toString()}`) } diff --git a/packages/protocol/scripts/run-scripts-tests.ts b/packages/protocol/scripts/run-scripts-tests.ts deleted file mode 100644 index 30ee84d6120..00000000000 --- a/packages/protocol/scripts/run-scripts-tests.ts +++ /dev/null @@ -1,3 +0,0 @@ -import * as jest from 'jest' - -jest.run().catch(console.error) diff --git a/packages/protocol/scripts/sourcify-publish.ts b/packages/protocol/scripts/sourcify-publish.ts index 4ca1e80f041..026e80aad25 100644 --- a/packages/protocol/scripts/sourcify-publish.ts +++ b/packages/protocol/scripts/sourcify-publish.ts @@ -16,8 +16,8 @@ import Web3 from 'web3' * build_artifacts_path: Path containing the artifacts to publish * proposal_path: Path to the proposal file * - * Run using yarn run sourcify-publish, e.g.: - * yarn run sourcify-publish \ + * Run using yarn run utils:sourcify-publish, e.g.: + * yarn run utils:sourcify-publish \ * --network alfajores --build_artifacts_path ./build/contracts --proposal_path ./proposal.json * * report.json is generated by the build script with this format: @@ -43,6 +43,7 @@ type Artifact = { contractName: string } +// TODO: Check if this script works during next deployment async function main(buildTargets: BuildOptions) { const artifactBasePath = buildTargets.buildArtifactsPath || './build/contracts' const artifactPaths = fs.readdirSync(artifactBasePath) @@ -94,8 +95,10 @@ async function main(buildTargets: BuildOptions) { method: 'POST', body: formData, }) + // 
eslint-disable-next-line @typescript-eslint/no-unsafe-return .then((res) => res.json()) .then((json) => + // eslint-disable-next-line @typescript-eslint/no-unsafe-return fetch( `https://${network}-blockscout.celo-testnet.org/address/${json.result[0].address}/contracts` ) diff --git a/packages/protocol/scripts/truffle/deploy_release_contracts.ts b/packages/protocol/scripts/truffle/deploy_release_contracts.ts index 08ab653b3f4..8f25afb3b2f 100644 --- a/packages/protocol/scripts/truffle/deploy_release_contracts.ts +++ b/packages/protocol/scripts/truffle/deploy_release_contracts.ts @@ -337,7 +337,7 @@ async function compile(template: ReleaseGoldTemplate): Promise (x.identifier === grant.identifier ? 1 : 0)) if (identifierCounts.reduce((a, b) => a + b, 0) > 1) { console.info( diff --git a/packages/protocol/scripts/truffle/govern.ts b/packages/protocol/scripts/truffle/govern.ts deleted file mode 100644 index edfa717bf1d..00000000000 --- a/packages/protocol/scripts/truffle/govern.ts +++ /dev/null @@ -1,54 +0,0 @@ -import assert = require('assert') - -import { - getDeployedProxiedContract, - submitMultiSigTransaction, -} from '@celo/protocol/lib/web3-utils' -import { MultiSigInstance } from 'types' - -/* - * A simple script to process transactions via a MultiSig - * - * Expects the following flags: - * command: the command to run - * - * Run using truffle exec, e.g.: - * truffle exec scripts/truffle/govern.js --command 'stableToken.setMinter(0xdeadbeef)' - * - */ -module.exports = async (callback: (error?: any) => number) => { - try { - const argv = require('minimist')(process.argv.slice(2), { - string: ['command'], - }) - - const multiSig = await getDeployedProxiedContract('MultiSig', artifacts) - - // TODO(asa): Validate function arguments - assert(RegExp('^[A-z]+.[A-z]+(.*)$').test(argv.command)) - - let contract - const [contractName, functionCall] = argv.command.split('.') - if (contractName.includes('Proxy')) { - const Proxy: Truffle.Contract = 
artifacts.require(contractName) - contract = await Proxy.deployed() - } else { - contract = await getDeployedProxiedContract(contractName, artifacts) - } - - const functionName = functionCall.split('(')[0] - const functionArgs = functionCall.split('(')[1].split(')')[0].split(', ') - - console.info(contractName, contract.address, functionName, functionArgs) - console.info('Calling', '"' + argv.command + '"', 'via MultiSig') - await submitMultiSigTransaction( - multiSig, - contract.address, - // @ts-ignore There is a property 'contract' on the variable contract - contract.contract[functionName].getData(...functionArgs) - ) - callback() - } catch (error) { - callback(error) - } -} diff --git a/packages/protocol/scripts/truffle/make-release.ts b/packages/protocol/scripts/truffle/make-release.ts index 4a8b30add34..48f5dcbd322 100644 --- a/packages/protocol/scripts/truffle/make-release.ts +++ b/packages/protocol/scripts/truffle/make-release.ts @@ -51,7 +51,7 @@ class ContractAddresses { await Promise.all( contracts.map(async (contract: string) => { // without this delay it sometimes fails with ProviderError - await delay(getRandomNumber(1, 1000)) + await delay(getRandomNumber(999, 1000)) try { const registeredAddress = await registry.getAddressForString(contract) @@ -115,17 +115,29 @@ const deployImplementation = async ( console.info(`Deploying ${contractName}`) // Hack to trick truffle, which checks that the provided address has code - // without this delay it sometimes fails with ProviderError - await delay(getRandomNumber(1, 1000)) - const bytecodeSize = (Contract.bytecode.length - 2) / 2 console.log('Bytecode size in bytes:', bytecodeSize) - const contract = await (dryRun - ? 
Contract.at(celoRegistryAddress) - : Contract.new({ - gas: 5000000, // Setting the gas limit - })) + let contract + + while (true) { + // without this delay it sometimes fails with ProviderError + // the provider error is due to two main reasons: RPC rate limits and the gas price being too low + await delay(getRandomNumber(99, 100)) + try { + contract = await (dryRun + ? Contract.at(celoRegistryAddress) + : Contract.new({ + gas: 5000000, // Setting the gas limit + })) + + break + } catch (error) { + console.error(`Error deploying ${contractName}:`, error) + console.log('retrying...') + // loop again and retry the deployment + } + } // Sanity check that any contracts that are being changed set a version number. const getVersionNumberAbi = contract.abi.find( diff --git a/packages/protocol/scripts/truffle/set_block_gas_limit.ts b/packages/protocol/scripts/truffle/set_block_gas_limit.ts deleted file mode 100644 index 6f643884753..00000000000 --- a/packages/protocol/scripts/truffle/set_block_gas_limit.ts +++ /dev/null @@ -1,34 +0,0 @@ -import { - getDeployedProxiedContract, - transferOwnershipOfProxyAndImplementation, -} from '@celo/protocol/lib/web3-utils' -import { config } from '@celo/protocol/migrationsConfig' -import { BlockchainParametersInstance, GovernanceInstance } from 'types' - -/* - * A simple script to set the block gas limit after migrations - */ -module.exports = async (callback: (error?: any) => number) => { - try { - const bcp = await getDeployedProxiedContract( - 'BlockchainParameters', - artifacts - ) - console.info('Setting block gas limit to', config.blockchainParameters.blockGasLimit) - await bcp.setBlockGasLimit(config.blockchainParameters.blockGasLimit) - if (!config.governance.skipTransferOwnership) { - const governance = await getDeployedProxiedContract( - 'Governance', - artifacts - ) - await transferOwnershipOfProxyAndImplementation( - 'BlockchainParameters', - governance.address, - artifacts - ) - } - callback() - } catch (error) { - callback(error) - } -}
diff --git a/packages/protocol/scripts/truffle/verify-bytecode.ts b/packages/protocol/scripts/truffle/verify-bytecode.ts index b161d4ee6a9..453b17eb172 100644 --- a/packages/protocol/scripts/truffle/verify-bytecode.ts +++ b/packages/protocol/scripts/truffle/verify-bytecode.ts @@ -1,3 +1,8 @@ +import { + MENTO_PACKAGE, + SOLIDITY_05_PACKAGE, + SOLIDITY_08_PACKAGE, +} from '@celo/protocol/contractPackages' import { verifyBytecodes } from '@celo/protocol/lib/compatibility/verify-bytecode' import { CeloContractName, celoRegistryAddress } from '@celo/protocol/lib/registry-utils' import { getBuildArtifacts } from '@openzeppelin/upgrades' @@ -37,8 +42,11 @@ const argv = require('minimist')(process.argv.slice(2), { const artifactsDirectory = argv.build_artifacts ? argv.build_artifacts : './build/contracts' const artifacts08Directory = argv.build_artifacts - ? `${argv.build_artifacts}-0.8` - : './build/contracts-0.8' + ? `${argv.build_artifacts}-${SOLIDITY_08_PACKAGE.name}` + : `./build/contracts-${SOLIDITY_08_PACKAGE.name}` +const mentoArtifactsDirectory = argv.build_artifacts + ? `${argv.build_artifacts}-${MENTO_PACKAGE.name}` + : `./build/contracts-${MENTO_PACKAGE.name}` const branch = (argv.branch ? argv.branch : '') as string const network = argv.network ?? 'development' const proposal = argv.proposal ? 
readJsonSync(argv.proposal) : [] @@ -50,11 +58,14 @@ module.exports = async (callback: (error?: any) => number) => { const version = getReleaseVersion(branch) const registry = await Registry.at(celoRegistryAddress) - const buildArtifacts = getBuildArtifacts(artifactsDirectory) - const artifacts08 = getBuildArtifacts(artifacts08Directory) + const artifactsMap: Record<string, BuildArtifacts> = { + [SOLIDITY_05_PACKAGE.name]: getBuildArtifacts(artifactsDirectory), + [SOLIDITY_08_PACKAGE.name]: getBuildArtifacts(artifacts08Directory), + [MENTO_PACKAGE.name]: getBuildArtifacts(mentoArtifactsDirectory), + } const libraryAddresses = await verifyBytecodes( Object.keys(CeloContractName), - [buildArtifacts, artifacts08], + artifactsMap, registry, proposal, Proxy, @@ -70,6 +81,7 @@ module.exports = async (callback: (error?: any) => number) => { // eslint-disable-next-line: no-console console.info(`Writing linked library addresses to ${librariesFile}`) writeJsonSync(librariesFile, libraryAddresses.addresses, { spaces: 2 }) + callback() } catch (error) { callback(error) } diff --git a/packages/protocol/scripts/ts/network-info.ts b/packages/protocol/scripts/ts/network-info.ts new file mode 100644 index 00000000000..83d3f8df7e2 --- /dev/null +++ b/packages/protocol/scripts/ts/network-info.ts @@ -0,0 +1,31 @@ +import * as path from 'path' + +const network = process.argv[2] +if (!network) { + process.stderr.write('Usage: network-info.ts <network>\n') + process.exit(1) +} + +const configPath = path.resolve(__dirname, '../../truffle-config-parent.js') +const { networks, fornoUrls } = require(configPath) +const networkConfig = networks[network] +if (!networkConfig) { + process.stderr.write(`No network config for '${network}'\n`) + process.exit(1) +} + +const rpcUrl = fornoUrls[network] +if (!rpcUrl) { + process.stderr.write(`No forno URL for network '${network}'\n`) + process.exit(1) +} + +// Output all serializable metadata as JSON +const output: Record<string, unknown> = { rpcUrl } +for (const [key, value] of
Object.entries(networkConfig as Record<string, unknown>)) { + if (typeof value !== 'function') { + output[key] = value + } +} + +process.stdout.write(JSON.stringify(output)) diff --git a/packages/protocol/specs/accounts.spec b/packages/protocol/specs/accounts.spec index 9572c29f645..af7a81be39d 100644 --- a/packages/protocol/specs/accounts.spec +++ b/packages/protocol/specs/accounts.spec @@ -155,7 +155,7 @@ rule authorizedBy_can_not_be_removed(method f, address signer) filtered { f -> ! address _account = _getAuthorizedBy(signer); callArbitrary(f); address account_ = _getAuthorizedBy(signer); - // Whatever transacation occurs, if `_account` was authorized before, it remains authorized. + // Whatever transaction occurs, if `_account` was authorized before, it remains authorized. // If `_account` was null, then we could authorize a new account `account_`. assert _account == account_ || _account == 0, "Account delegating to $signer cannot change from one non-zero value to another non-zero value"; @@ -258,7 +258,7 @@ rule viewFunctionsDoNotRevert(method f) filtered { f -> } /** - * the authorizedBy should never be reflexive, i.e. an account cannot be its own signer, or a signer cannot be authroize itself. + * the authorizedBy should never be reflexive, i.e. an account cannot be its own signer, nor can a signer authorize itself.
*/ invariant authorizedByIsNeverReflexive(address a) a != 0 => _getAuthorizedBy(a) != a @@ -314,7 +314,7 @@ rule cantMakeASignerForNonLegacyRoleWithoutApprovalOfSigner(method f) filtered { address account; bytes32 role; - // Leagcy roles can be updated using a signature + // Legacy roles can be updated using a signature require !isLegacyRole(role); address signer; diff --git a/packages/protocol/specs/goverance_with_dequeue.spec b/packages/protocol/specs/goverance_with_dequeue.spec index ec0b54f2b0b..23033d9f534 100644 --- a/packages/protocol/specs/goverance_with_dequeue.spec +++ b/packages/protocol/specs/goverance_with_dequeue.spec @@ -43,7 +43,7 @@ rule promote_proposal(method f, uint256 p, uint256 index) filtered { f -> !f.isV require !f.isView; sinvoke f(eF,arg); - // deqeued index should contain p now only if eF.block.timestamp is not past p's timestamp+queueExpiry + // dequeued index should contain p now only if eF.block.timestamp is not past p's timestamp+queueExpiry assert getFromDequeued(index) == p => eF.block.timestamp <= _proposalTimestamp + _queueExpiry, "Managed to promote $p in ${eF.block.timestamp} after proposal timestamp ${_proposalTimestamp} + ${_queueExpiry}"; } @@ -71,4 +71,4 @@ rule no_promoting_without_upvotes(uint256 p, uint256 index) { dequeueProposalsIfReady(eF); assert getFromDequeued(index) == p => _upvotes > 0, "Cannot dequeue (promote) proposal $p to index $index unless had some upvotes"; -} \ No newline at end of file +} diff --git a/packages/protocol/test-sol/README.md b/packages/protocol/test-sol/README.md index e180efd1a64..8e008a69cee 100644 --- a/packages/protocol/test-sol/README.md +++ b/packages/protocol/test-sol/README.md @@ -6,13 +6,7 @@ You can build this project by simply running forge build ``` - -**Note**: Due to a regression in Foundry, you might need to use an older -version. 
You can install the most recent version verified to work by running - -```bash -foundryup --version nightly-f625d0fa7c51e65b4bf1e8f7931cd1c6e2e285e9 -``` +These test suites are verified to work with Foundry version 1.0.0-stable. ### Testing @@ -54,11 +48,6 @@ You can read more about the `forge test` command [here](https://book.getfoundry. To skip a specific test, you can add `vm.skip(true);` as the first line of the test. -If a test name begins with `testFail` rather than `test`, foundry will expect the test to fail / revert. - Please follow the naming convention `test_NameOfTest` / `testFail_NameOfTest`. If you're new to Forge / Foundry, we recommend looking through the [Cheatcode Reference](https://book.getfoundry.sh/cheatcodes/) for a list of useful commands that make writing tests easier. - - - diff --git a/packages/protocol/test-sol/TestWithUtils.sol b/packages/protocol/test-sol/TestWithUtils.sol index 6fc2f8e24d3..a0b91237a76 100644 --- a/packages/protocol/test-sol/TestWithUtils.sol +++ b/packages/protocol/test-sol/TestWithUtils.sol @@ -134,15 +134,10 @@ contract TestWithUtils is Test, TestConstants, IsL2Check, PrecompilesOverrideV2 (max - min + 1)) + min; } - // This function can be also found in OpenZeppelin's library, but in a newer version than the one - function compareStrings(string memory a, string memory b) public pure returns (bool) { - return (keccak256(abi.encodePacked((a))) == keccak256(abi.encodePacked((b)))); - } - function containsLog( Vm.Log[] memory logs, string memory signatureString - ) private view returns (bool) { + ) private pure returns (bool) { bytes32 signature = keccak256(abi.encodePacked(signatureString)); for (uint256 i = 0; i < logs.length; i++) { bytes32 logSignature = logs[i].topics[0]; diff --git a/packages/protocol/test-sol/TestWithUtils08.sol b/packages/protocol/test-sol/TestWithUtils08.sol index 4c5984056bb..36fa5d5ec72 100644 --- a/packages/protocol/test-sol/TestWithUtils08.sol +++ 
b/packages/protocol/test-sol/TestWithUtils08.sol @@ -1,22 +1,22 @@ pragma solidity >=0.5.13 <0.9.0; -import { Test as ForgeTest } from "@lib/celo-foundry-8/lib/forge-std/src/Test.sol"; +import { Test as ForgeTest } from "forge-std-8/Test.sol"; import { TestConstants } from "@test-sol/constants.sol"; import { PrecompileHandler } from "@test-sol/utils/PrecompileHandler.sol"; import { IEpochManagerEnablerMock } from "@test-sol/unit/common/interfaces/IEpochManagerEnablerMock.sol"; import { EpochManagerEnablerMock } from "@test-sol/mocks/EpochManagerEnablerMock.sol"; import { MockCeloUnreleasedTreasury } from "@celo-contracts-8/common/test/MockCeloUnreleasedTreasury.sol"; -import "@celo-contracts-8/common/test/MockCeloToken.sol"; +import { MockCeloToken08 } from "@celo-contracts-8/common/test/MockCeloToken.sol"; -import "@celo-contracts/common/interfaces/IRegistry.sol"; +import { IRegistry } from "@celo-contracts/common/interfaces/IRegistry.sol"; import { IAccounts } from "@celo-contracts/common/interfaces/IAccounts.sol"; -import "@celo-contracts-8/common/mocks/EpochManager_WithMocks.sol"; -import "@celo-contracts-8/common/IsL2Check.sol"; -import "@celo-contracts-8/common/PrecompilesOverrideV2.sol"; +import { EpochManager_WithMocks } from "@celo-contracts-8/common/mocks/EpochManager_WithMocks.sol"; +import { IsL2Check } from "@celo-contracts-8/common/IsL2Check.sol"; +import { PrecompilesOverrideV2 } from "@celo-contracts-8/common/PrecompilesOverrideV2.sol"; contract TestWithUtils08 is ForgeTest, TestConstants, IsL2Check, PrecompilesOverrideV2 { - IRegistry registry; + IRegistry registry; // TODO: Remove since registryContract already present in UsingRegistry PrecompileHandler ph; EpochManager_WithMocks public epochManager; EpochManagerEnablerMock epochManagerEnabler; @@ -139,11 +139,6 @@ contract TestWithUtils08 is ForgeTest, TestConstants, IsL2Check, PrecompilesOver return (addr, pk); } - // This function can be also found in OpenZeppelin's library, but in a newer 
version than the one we use. - function compareStrings(string memory a, string memory b) public pure returns (bool) { - return (keccak256(abi.encodePacked((a))) == keccak256(abi.encodePacked((b)))); - } - function _registerAndElectValidatorsForL2() internal { address enablerAddr = registry.getAddressFor(EPOCH_MANAGER_ENABLER_REGISTRY_ID); epochManagerEnablerMockInterface = IEpochManagerEnablerMock(enablerAddr); diff --git a/packages/protocol/test-sol/devchain/Import05Dependencies.sol b/packages/protocol/test-sol/devchain/Import05Dependencies.sol index 30c1a4fa5f8..a5f83076ca1 100644 --- a/packages/protocol/test-sol/devchain/Import05Dependencies.sol +++ b/packages/protocol/test-sol/devchain/Import05Dependencies.sol @@ -19,8 +19,4 @@ import { StableTokenEUR } from "@mento-core/contracts/StableTokenEUR.sol"; import { StableTokenBRL } from "@mento-core/contracts/StableTokenBRL.sol"; import { Exchange } from "@mento-core/contracts/Exchange.sol"; -import { IEpochManager } from "@celo-contracts/common/interfaces/IEpochManager.sol"; -import { IValidators } from "@celo-contracts/governance/interfaces/IValidators.sol"; -import "@celo-contracts/common/interfaces/ICeloUnreleasedTreasury.sol"; - contract Import05 {} diff --git a/packages/protocol/test-sol/devchain/e2e/common/EpochManager.t.sol b/packages/protocol/test-sol/devchain/e2e/common/EpochManager.t.sol index 3a708931121..b03330fb648 100644 --- a/packages/protocol/test-sol/devchain/e2e/common/EpochManager.t.sol +++ b/packages/protocol/test-sol/devchain/e2e/common/EpochManager.t.sol @@ -1,5 +1,5 @@ // SPDX-License-Identifier: UNLICENSED -pragma solidity >=0.8.7 <0.8.20; +pragma solidity >=0.8.7 <0.9.0; import { Devchain } from "@test-sol/devchain/e2e/utils.sol"; @@ -8,6 +8,8 @@ import "@test-sol/utils/ECDSAHelper08.sol"; import "@openzeppelin/contracts8/utils/structs/EnumerableSet.sol"; import { console } from "forge-std-8/console.sol"; +import { EpochManagerEnabler } from 
"@celo-contracts-8/common/EpochManagerEnabler.sol"; + contract E2E_EpochManager is ECDSAHelper08, Devchain { using EnumerableSet for EnumerableSet.AddressSet; @@ -47,40 +49,7 @@ contract E2E_EpochManager is ECDSAHelper08, Devchain { epochDuration = epochManagerContract.epochDuration(); - vm.deal(address(celoUnreleasedTreasuryContract), L2_INITIAL_STASH_BALANCE); // 80% of the total supply to the treasury - whis will be yet distributed - vm.prank(address(0)); - celoTokenContract.mint(address(celoUnreleasedTreasuryContract), L2_INITIAL_STASH_BALANCE); - } - - function activateValidators() public { - uint256[] memory valKeys = new uint256[](9); - valKeys[0] = 0x59c6995e998f97a5a0044966f0945389dc9e86dae88c7a8412f4603b6b78690d; - valKeys[1] = 0x5de4111afa1a4b94908f83103eb1f1706367c2e68ca870fc3fb9a804cdab365a; - valKeys[2] = 0x7c852118294e51e653712a81e05800f419141751be58f605c371e15141b007a6; - valKeys[3] = 0x47e179ec197488593b187f80a00eb0da91f1b9d0b13f8733639f19c30a34926a; - valKeys[4] = 0x8b3a350cf5c34c9194ca85829a2df0ec3153be0318b5e2d3348e872092edffba; - valKeys[5] = 0x92db14e403b83dfe3df233f83dfa3a0d7096f21ca9b0d6d6b8d88b2b4ec1564e; - valKeys[6] = 0x4bbbf85ce3377467afe5d46f804f221813b2bb87f24d81f60f1fcdbf7cbf4356; - valKeys[7] = 0xdbda1821b80551c9d65939329250298aa3472ba22feea921c0cf5d620ea67b97; - valKeys[8] = 0x2a871d0798f97d79848a013d4936a73bf4cc922c825d33c1cf7073dff6d409c6; - - for (uint256 i = 0; i < valKeys.length; i++) { - address account = vm.addr(valKeys[i]); - addressToPrivateKeys[account] = valKeys[i]; - } - - address[] memory registeredValidators = getValidators().getRegisteredValidators(); - travelNEpochL1(4); - - for (uint256 i = 0; i < registeredValidators.length; i++) { - (, , address validatorGroup, , ) = getValidators().getValidator(registeredValidators[i]); - if (getElection().getPendingVotesForGroup(validatorGroup) == 0) { - continue; - } - vm.startPrank(validatorGroup); - election.activate(validatorGroup); - vm.stopPrank(); - } + 
vm.deal(address(celoUnreleasedTreasuryContract), L2_INITIAL_STASH_BALANCE); // 80% of the total supply to the treasury - this is yet to be distributed } function authorizeVoteSigner(uint256 signerPk, address account) internal { @@ -103,7 +72,7 @@ contract E2E_EpochManager is ECDSAHelper08, Devchain { ) { (, , uint256 maxTotalRewards, , ) = epochManagerContract.getEpochProcessingState(); - (, groupWithVotes) = getGroupsWithVotes(); + groupWithVotes = getGroupsWithVotes(); lessers = new address[](_groups.length); greaters = new address[](_groups.length); @@ -111,6 +80,7 @@ contract E2E_EpochManager is ECDSAHelper08, Devchain { uint256[] memory rewards = new uint256[](_groups.length); for (uint256 i = 0; i < _groups.length; i++) { + if (_groups[i] == address(0)) continue; uint256 _groupScore = scoreManager.getGroupScore(_groups[i]); rewards[i] = election.getGroupEpochRewardsBasedOnScore( _groups[i], @@ -119,6 +89,7 @@ ); } for (uint256 i = 0; i < _groups.length; i++) { + if (_groups[i] == address(0)) continue; for (uint256 j = 0; j < groupWithVotes.length; j++) { if (groupWithVotes[j].group == _groups[i]) { groupWithVotes[j].votes += rewards[i]; @@ -143,13 +114,9 @@ contract E2E_EpochManager is ECDSAHelper08, Devchain { } } - function getGroupsWithVotes() - internal - view - returns (address[] memory groupsInOrder, GroupWithVotes[] memory groupWithVotes) - { - uint256[] memory votesTotal; - (groupsInOrder, votesTotal) = election.getTotalVotesForEligibleValidatorGroups(); + function getGroupsWithVotes() internal view returns (GroupWithVotes[] memory groupWithVotes) { + (address[] memory groupsInOrder, uint256[] memory votesTotal) = election + .getTotalVotesForEligibleValidatorGroups(); groupWithVotes = new GroupWithVotes[](groupsInOrder.length); for (uint256 i = 0; i < groupsInOrder.length; i++) { @@ -174,7 +141,11 @@ contract E2E_EpochManager is ECDSAHelper08, Devchain { function
assertGroupWithVotes(GroupWithVotes[] memory groupWithVotes) internal { for (uint256 i = 0; i < groupWithVotes.length; i++) { - assertEq(election.getTotalVotesForGroup(groupWithVotes[i].group), groupWithVotes[i].votes); + assertEq( + election.getTotalVotesForGroup(groupWithVotes[i].group), + groupWithVotes[i].votes, + "assertGroupWithVotes" + ); } } @@ -183,7 +154,7 @@ contract E2E_EpochManager is ECDSAHelper08, Devchain { uint256 validatorCount ) internal returns (address newValidatorGroup, address newValidator) { require(validatorCount > 0, "validatorCount must be at least 1"); - (, GroupWithVotes[] memory groupWithVotes) = getGroupsWithVotes(); + GroupWithVotes[] memory groupWithVotes = getGroupsWithVotes(); uint256 newGroupPK = uint256(keccak256(abi.encodePacked("newGroup", index + 1))); address[] memory validatorAddresses = new address[](validatorCount); @@ -301,7 +272,7 @@ contract E2E_EpochManager is ECDSAHelper08, Devchain { address[] memory currentlyElected = epochManagerContract.getElectedAccounts(); for (uint256 i = 0; i < currentlyElected.length; i++) { - (, , address group, , ) = validators.getValidator(currentlyElected[i]); + address group = validators.getMembershipInLastEpoch(currentlyElected[i]); electedGroupsHelper.add(group); } return electedGroupsHelper.values(); @@ -309,43 +280,21 @@ contract E2E_EpochManager is ECDSAHelper08, Devchain { } contract E2E_EpochManager_InitializeSystem is E2E_EpochManager { - function setUp() public override { - super.setUp(); - whenL2(); - } - function test_shouldRevert_WhenCalledByNonEnabler() public { vm.expectRevert("msg.sender is not Enabler"); epochManagerContract.initializeSystem(1, 1, firstElected); } - - function test_ShouldInitializeSystem() public { + function test_shouldRevert_WhenAlreadyInitialized() public { vm.prank(epochManagerEnablerAddress); - epochManagerContract.initializeSystem(42, 43, firstElected); - - assertEq(epochManagerContract.firstKnownEpoch(), 42); - 
assertEq(epochManagerContract.getCurrentEpochNumber(), 42); - - assertTrue(epochManagerContract.systemAlreadyInitialized()); + vm.expectRevert("Epoch system already initialized"); + epochManagerContract.initializeSystem(1, 1, firstElected); } } -contract E2E_EpochManager_GetCurrentEpoch is E2E_EpochManager { - function setUp() public override { - super.setUp(); - whenL2(); - } - - function test_Revert_WhenSystemNotInitialized() public { - vm.expectRevert("Epoch system not initialized"); - epochManagerContract.getCurrentEpoch(); - } +contract E2E_EpochManager_GetCurrentEpoch is E2E_EpochManager { function test_ReturnExpectedValues() public { - vm.prank(epochManagerEnablerAddress); - epochManagerContract.initializeSystem(42, 43, firstElected); - - assertEq(epochManagerContract.firstKnownEpoch(), 42); - assertEq(epochManagerContract.getCurrentEpochNumber(), 42); + assertEq(epochManagerContract.firstKnownEpoch(), 5); + assertEq(epochManagerContract.getCurrentEpochNumber(), 5); ( uint256 firstBlock, @@ -353,7 +302,7 @@ contract E2E_EpochManager_GetCurrentEpoch is E2E_EpochManager { uint256 startTimestamp, uint256 rewardsBlock ) = epochManagerContract.getCurrentEpoch(); - assertEq(firstBlock, 43); + assertEq(firstBlock, 400); assertEq(lastBlock, 0); assertEq(startTimestamp, block.timestamp); assertEq(rewardsBlock, 0); @@ -363,8 +312,6 @@ contract E2E_EpochManager_GetCurrentEpoch is E2E_EpochManager { contract E2E_EpochManager_StartNextEpochProcess is E2E_EpochManager { function setUp() public override { super.setUp(); - activateValidators(); - whenL2(); validatorsArray = getValidators().getRegisteredValidators(); groups = getValidators().getRegisteredValidatorGroups(); @@ -384,14 +331,11 @@ contract E2E_EpochManager_StartNextEpochProcess is E2E_EpochManager { scoreManager.setValidatorScore(validatorsArray[5], validatorScore[5]); vm.stopPrank(); - - vm.prank(epochManagerEnablerAddress); - epochManagerContract.initializeSystem(1, 1, firstElected); } function 
test_shouldHaveInitialValues() public { - assertEq(epochManagerContract.firstKnownEpoch(), 1); - assertEq(epochManagerContract.getCurrentEpochNumber(), 1); + assertEq(epochManagerContract.firstKnownEpoch(), 5); + assertEq(epochManagerContract.getCurrentEpochNumber(), 5); // get getEpochProcessingState ( @@ -433,13 +377,14 @@ contract E2E_EpochManager_FinishNextEpochProcess is E2E_EpochManager { EnumerableSet.AddressSet internal originalyElected; + struct AliceContext { + address alice; + uint256 lockedAmount; + address targetGroup; + } + function setUp() public override { super.setUp(); - activateValidators(); - whenL2(); - - vm.prank(epochManagerEnablerAddress); - epochManagerContract.initializeSystem(1, 1, firstElected); validatorsArray = getValidators().getRegisteredValidators(); groups = getValidators().getRegisteredValidatorGroups(); @@ -567,6 +512,509 @@ contract E2E_EpochManager_FinishNextEpochProcess is E2E_EpochManager { assertEq(epochManagerContract.getElectedAccounts().length, validatorsArray.length - 1); } + function test_shouldFinishNextEpochProcessing_WhenValidatorDeaffiliatesBeforeStart() public { + address[] memory lessers; + address[] memory greaters; + GroupWithVotes[] memory groupWithVotes; + (lessers, greaters, groupWithVotes) = getLessersAndGreaters(groups); + + uint256 currentEpoch = epochManagerContract.getCurrentEpochNumber(); + address[] memory currentlyElected = epochManagerContract.getElectedAccounts(); + for (uint256 i = 0; i < currentlyElected.length; i++) { + originalyElected.add(currentlyElected[i]); + } + + // wait some time before finishing + timeTravel(epochDuration / 2); + blockTravel(100); + + epochManagerContract.finishNextEpochProcess(groups, lessers, greaters); + + assertEq(currentEpoch + 1, epochManagerContract.getCurrentEpochNumber()); + + for (uint256 i = 0; i < currentlyElected.length; i++) { + assertEq(originalyElected.contains(currentlyElected[i]), true); + } + + timeTravel(epochDuration + 1); + 
epochManagerContract.startNextEpochProcess(); + + // wait some time before finishing + timeTravel(epochDuration / 2); + blockTravel(100); + + (lessers, greaters, groupWithVotes) = getLessersAndGreaters(groups); + epochManagerContract.finishNextEpochProcess(groups, lessers, greaters); + assertGroupWithVotes(groupWithVotes); + + assertEq(currentEpoch + 2, epochManagerContract.getCurrentEpochNumber()); + + address[] memory newlyElected2 = epochManagerContract.getElectedAccounts(); + + for (uint256 i = 0; i < currentlyElected.length; i++) { + assertEq(originalyElected.contains(newlyElected2[i]), true); + } + + // add new validator group and validator + (address newValidatorGroup, address newValidator) = registerNewValidatorGroupWithValidator( + 0, + 1 + ); + + vm.prank(currentlyElected[0]); + validators.deaffiliate(); + + timeTravel(epochDuration + 1); + epochManagerContract.startNextEpochProcess(); + + timeTravel(epochDuration / 2); + blockTravel(100); + + (lessers, greaters, groupWithVotes) = getLessersAndGreaters(groups); + epochManagerContract.finishNextEpochProcess(groups, lessers, greaters); + assertGroupWithVotes(groupWithVotes); + + groups.push(newValidatorGroup); + validatorsArray.push(newValidator); + + assertEq( + epochManagerContract.getElectedAccounts().length, + validators.getRegisteredValidators().length - 1 // -1 because the validator deaffiliated + ); + assertEq(groups.length, validators.getRegisteredValidatorGroups().length); + + timeTravel(epochDuration + 1); + epochManagerContract.startNextEpochProcess(); + (lessers, greaters, groupWithVotes) = getLessersAndGreaters(groups); + epochManagerContract.finishNextEpochProcess(groups, lessers, greaters); + assertGroupWithVotes(groupWithVotes); + + assertEq(epochManagerContract.getElectedAccounts().length, validatorsArray.length - 1); // -1 because the validator deaffiliated + } + + function test_shouldFinishNextEpochProcessing_WhenValidatorDeaffiliatesAndDeregistersBeforeStart() + public + { + address[] 
memory lessers; + address[] memory greaters; + GroupWithVotes[] memory groupWithVotes; + (lessers, greaters, groupWithVotes) = getLessersAndGreaters(groups); + + uint256 currentEpoch = epochManagerContract.getCurrentEpochNumber(); + address[] memory currentlyElected = epochManagerContract.getElectedAccounts(); + for (uint256 i = 0; i < currentlyElected.length; i++) { + originalyElected.add(currentlyElected[i]); + } + + // wait some time before finishing + timeTravel(epochDuration / 2); + blockTravel(100); + + epochManagerContract.finishNextEpochProcess(groups, lessers, greaters); + + assertEq(currentEpoch + 1, epochManagerContract.getCurrentEpochNumber(), "wrong epoch 1"); + + for (uint256 i = 0; i < currentlyElected.length; i++) { + assertEq(originalyElected.contains(currentlyElected[i]), true); + } + + timeTravel(epochDuration + 1); + epochManagerContract.startNextEpochProcess(); + + // wait some time before finishing + timeTravel(epochDuration / 2); + blockTravel(100); + + (lessers, greaters, groupWithVotes) = getLessersAndGreaters(groups); + epochManagerContract.finishNextEpochProcess(groups, lessers, greaters); + assertGroupWithVotes(groupWithVotes); + + assertEq(currentEpoch + 2, epochManagerContract.getCurrentEpochNumber(), "wrong epoch 2"); + + address[] memory newlyElected2 = epochManagerContract.getElectedAccounts(); + + for (uint256 i = 0; i < currentlyElected.length; i++) { + assertEq(originalyElected.contains(newlyElected2[i]), true, "Wrong number of elected"); + } + + // add new validator group and validator + (address newValidatorGroup, address newValidator) = registerNewValidatorGroupWithValidator( + 0, + 1 + ); + + uint256 registeredValidatorsBeforeDeregistering = validators.getRegisteredValidators().length; + + // the asserts below are commented out + // because console.log doesn't work here for some reason + // assertEq( + // registeredValidatorsBeforeDeregistering, 0, + // "Registered validators before deregistering:" + // ); + //
assertEq(epochManagerContract.getElectedAccounts().length, 0, "Elected accounts length before deregistering:"); + // // console.log("Elected accounts length before deregistering:", epochManagerContract.getElectedAccounts().length); + + vm.prank(currentlyElected[0]); + validators.deaffiliate(); + (, uint256 duration) = validators.getValidatorLockedGoldRequirements(); + vm.warp(block.timestamp + duration + 1); + vm.prank(currentlyElected[0]); + validators.deregisterValidator(0); + + timeTravel(epochDuration + 1); + + // assertEq( + // validators.getRegisteredValidators().length, 0, + // "Registered validators after deregistering:" + // ); + // assertEq(epochManagerContract.getElectedAccounts().length, 0, "Elected accounts length after deregistering:"); + + assertEq( + registeredValidatorsBeforeDeregistering - 1, + validators.getRegisteredValidators().length, + "Registered validators length should decrease by 1 after deregistering" + ); + + assertEq( + newlyElected2.length, + epochManagerContract.getElectedAccounts().length, + "Elected accounts length should not change" + ); + + assertEq( // TODO: double-check this assertion + epochManagerContract.getElectedAccounts().length, + validators.getRegisteredValidators().length, // not all validators are elected + "Wrong number of validators before passing the epoch" + ); + + epochManagerContract.startNextEpochProcess(); + + timeTravel(epochDuration / 2); + blockTravel(100); + + (lessers, greaters, groupWithVotes) = getLessersAndGreaters(groups); + epochManagerContract.finishNextEpochProcess(groups, lessers, greaters); + assertGroupWithVotes(groupWithVotes); + + groups.push(newValidatorGroup); + validatorsArray.push(newValidator); + + assertEq( + epochManagerContract.getElectedAccounts().length, + validators.getRegisteredValidators().length, + "Wrong number of validators" + ); + assertEq( + groups.length, + validators.getRegisteredValidatorGroups().length, + "wrong number of groups" + ); + + timeTravel(epochDuration + 1); +
epochManagerContract.startNextEpochProcess(); + (lessers, greaters, groupWithVotes) = getLessersAndGreaters(groups); + epochManagerContract.finishNextEpochProcess(groups, lessers, greaters); + assertGroupWithVotes(groupWithVotes); + + assertEq( + epochManagerContract.getElectedAccounts().length, + validatorsArray.length - 1, + "Wrong number of elected accounts" + ); // -1 because the validator deaffiliated + } + + function test_shouldFinishNextEpochProcessing_WhenValidatorDeaffiliatesBeforeFinish() public { + address[] memory lessers; + address[] memory greaters; + GroupWithVotes[] memory groupWithVotes; + (lessers, greaters, groupWithVotes) = getLessersAndGreaters(groups); + + uint256 currentEpoch = epochManagerContract.getCurrentEpochNumber(); + address[] memory currentlyElected = epochManagerContract.getElectedAccounts(); + for (uint256 i = 0; i < currentlyElected.length; i++) { + originalyElected.add(currentlyElected[i]); + } + + // wait some time before finishing + timeTravel(epochDuration / 2); + blockTravel(100); + + epochManagerContract.finishNextEpochProcess(groups, lessers, greaters); + + assertEq(currentEpoch + 1, epochManagerContract.getCurrentEpochNumber()); + + for (uint256 i = 0; i < currentlyElected.length; i++) { + assertEq(originalyElected.contains(currentlyElected[i]), true); + } + + timeTravel(epochDuration + 1); + epochManagerContract.startNextEpochProcess(); + + // wait some time before finishing + timeTravel(epochDuration / 2); + blockTravel(100); + + (lessers, greaters, groupWithVotes) = getLessersAndGreaters(groups); + epochManagerContract.finishNextEpochProcess(groups, lessers, greaters); + assertGroupWithVotes(groupWithVotes); + + assertEq(currentEpoch + 2, epochManagerContract.getCurrentEpochNumber()); + + address[] memory newlyElected2 = epochManagerContract.getElectedAccounts(); + + for (uint256 i = 0; i < currentlyElected.length; i++) { + assertEq(originalyElected.contains(newlyElected2[i]), true); + } + + // add new validator group 
and validator + (address newValidatorGroup, address newValidator) = registerNewValidatorGroupWithValidator( + 0, + 1 + ); + + timeTravel(epochDuration + 1); + epochManagerContract.startNextEpochProcess(); + + vm.prank(currentlyElected[0]); + validators.deaffiliate(); + + timeTravel(epochDuration / 2); + blockTravel(100); + + (lessers, greaters, groupWithVotes) = getLessersAndGreaters(groups); + epochManagerContract.finishNextEpochProcess(groups, lessers, greaters); + assertGroupWithVotes(groupWithVotes); + + groups.push(newValidatorGroup); + validatorsArray.push(newValidator); + + assertEq( + epochManagerContract.getElectedAccounts().length, + validators.getRegisteredValidators().length - 1, // -1 because the validator deaffiliated + "getElectedAccounts != getRegisteredValidators" + ); + assertEq( + groups.length, + validators.getRegisteredValidatorGroups().length, + "groups != registeredValidatorGroups" + ); + + timeTravel(epochDuration + 1); + epochManagerContract.startNextEpochProcess(); + (lessers, greaters, groupWithVotes) = getLessersAndGreaters(groups); + epochManagerContract.finishNextEpochProcess(groups, lessers, greaters); + assertGroupWithVotes(groupWithVotes); + + assertEq(epochManagerContract.getElectedAccounts().length, validatorsArray.length - 1); // -1 because the validator deaffiliated + } + + /** + * @notice Calculates the lesser and greater neighbors for a group after a potential vote. + * @param targetGroup The group receiving the vote. + * @param voteAmount The amount of the vote. + * @return lesser The address of the group with the next lower vote count after the vote. + * @return greater The address of the group with the next higher vote count after the vote. 
+ */ + function _calculateVoteNeighbors( + address targetGroup, + uint256 voteAmount + ) internal view returns (address lesser, address greater) { + GroupWithVotes[] memory groupWithVotesSimulated = getGroupsWithVotes(); + + uint targetGroupIndex = groupWithVotesSimulated.length; // Sentinel value + for (uint i = 0; i < groupWithVotesSimulated.length; i++) { + if (groupWithVotesSimulated[i].group == targetGroup) { + groupWithVotesSimulated[i].votes += voteAmount; + targetGroupIndex = i; + break; + } + } + require( + targetGroupIndex < groupWithVotesSimulated.length, + "Target group not found for voting simulation" + ); + + sort(groupWithVotesSimulated); + + lesser = address(0); + greater = address(0); + for (uint i = 0; i < groupWithVotesSimulated.length; i++) { + if (groupWithVotesSimulated[i].group == targetGroup) { + lesser = (i == groupWithVotesSimulated.length - 1) + ? address(0) + : groupWithVotesSimulated[i + 1].group; + greater = (i == 0) ? address(0) : groupWithVotesSimulated[i - 1].group; + break; + } + } + } + + /** + * @notice Calculates the lesser, greater neighbors, and original index for a group after a potential vote revocation. + * @param targetGroup The group whose votes are being revoked. + * @param revokeAmount The amount of votes being revoked. + * @return lesser The address of the group with the next lower vote count after revocation. + * @return greater The address of the group with the next higher vote count after revocation. + * @return index The original index of the targetGroup in the eligible list before simulation. 
+ */ + function _calculateRevokeNeighbors( + address targetGroup, + uint256 revokeAmount + ) internal view returns (address lesser, address greater, uint index) { + GroupWithVotes[] memory groupWithVotesSimulated = getGroupsWithVotes(); + + uint targetGroupIndexSimulated = groupWithVotesSimulated.length; + for (uint i = 0; i < groupWithVotesSimulated.length; i++) { + if (groupWithVotesSimulated[i].group == targetGroup) { + if (groupWithVotesSimulated[i].votes >= revokeAmount) { + groupWithVotesSimulated[i].votes -= revokeAmount; + } else { + groupWithVotesSimulated[i].votes = 0; + } + targetGroupIndexSimulated = i; + break; + } + } + require( + targetGroupIndexSimulated < groupWithVotesSimulated.length, + "Target group not found for revoke simulation" + ); + index = targetGroupIndexSimulated; + + sort(groupWithVotesSimulated); + + lesser = address(0); + greater = address(0); + for (uint i = 0; i < groupWithVotesSimulated.length; i++) { + if (groupWithVotesSimulated[i].group == targetGroup) { + lesser = (i == groupWithVotesSimulated.length - 1) + ? address(0) + : groupWithVotesSimulated[i + 1].group; + greater = (i == 0) ? 
address(0) : groupWithVotesSimulated[i - 1].group; + break; + } + } + } + + function _setupAlice() internal returns (AliceContext memory ctx) { + ctx.alice = vm.addr(uint256(keccak256(abi.encodePacked("alice")))); + vm.deal(ctx.alice, 100_000 ether); + ctx.lockedAmount = 1 ether; + + vm.startPrank(ctx.alice); + accounts.createAccount(); + lockedCelo.lock{ value: ctx.lockedAmount }(); + vm.stopPrank(); + + uint256 actualLocked = lockedCelo.getAccountTotalLockedGold(ctx.alice); + require(actualLocked == ctx.lockedAmount, "Alice lock failed"); + + GroupWithVotes[] memory initialGroups = getGroupsWithVotes(); + require(initialGroups.length >= 3, "Not enough groups for test setup"); + ctx.targetGroup = initialGroups[2].group; + } + + function _aliceVote(AliceContext memory ctx) internal { + (address lesser, address greater) = _calculateVoteNeighbors(ctx.targetGroup, ctx.lockedAmount); + vm.prank(ctx.alice); + election.vote(ctx.targetGroup, ctx.lockedAmount, lesser, greater); + vm.stopPrank(); + } + + function _aliceActivate(AliceContext memory ctx) internal { + vm.startPrank(ctx.alice); + election.activate(ctx.targetGroup); + vm.stopPrank(); + // assertApproxEqAbs is used because usually locked celo != active votes since votes are adjusted by Celo inflation in function unitsToVotes + assertApproxEqAbs( + election.getActiveVotesForGroupByAccount(ctx.targetGroup, ctx.alice), + ctx.lockedAmount, + 1, + "Alice activation failed" + ); + } + + function _aliceRevoke(AliceContext memory ctx) internal { + (address lesser, address greater, uint index) = _calculateRevokeNeighbors( + ctx.targetGroup, + ctx.lockedAmount + ); + + vm.startPrank(ctx.alice); + uint256 activeVotes = election.getTotalVotesForGroupByAccount(ctx.targetGroup, ctx.alice); + require(activeVotes >= ctx.lockedAmount, "Insufficient active votes to revoke"); + election.revokeActive(ctx.targetGroup, ctx.lockedAmount, lesser, greater, index); + vm.stopPrank(); + } + + function _aliceUnlock(AliceContext memory ctx) 
internal { uint256 unlockingPeriod = lockedCelo.unlockingPeriod(); timeTravel(unlockingPeriod + 1); vm.startPrank(ctx.alice); lockedCelo.unlock(ctx.lockedAmount); vm.stopPrank(); } function _advanceAndFinishNextEpochProcessing( uint256 expectedStartEpoch ) internal returns (uint256 newEpoch) { assertEq( epochManagerContract.getCurrentEpochNumber(), expectedStartEpoch, "Epoch mismatch before finish" ); timeTravel(epochDuration / 2); blockTravel(100); address[] memory lessers; address[] memory greaters; (lessers, greaters, ) = getLessersAndGreaters(groups); epochManagerContract.finishNextEpochProcess(groups, lessers, greaters); newEpoch = epochManagerContract.getCurrentEpochNumber(); assertEq(newEpoch, expectedStartEpoch + 1, "Epoch did not increment after finish"); } function _advanceAndStartNextEpochProcessing(uint256 expectedStartEpoch) internal { assertEq( epochManagerContract.getCurrentEpochNumber(), expectedStartEpoch, "Epoch mismatch before start" ); timeTravel(epochDuration + 1); epochManagerContract.startNextEpochProcess(); uint256 currentEpoch = epochManagerContract.getCurrentEpochNumber(); assertEq( currentEpoch, expectedStartEpoch, "Epoch number changed unexpectedly on startNextEpochProcess" ); } function test_shouldFinishNextEpochProcessing_WithAlice_Votes() public { AliceContext memory aliceCtx = _setupAlice(); _aliceVote(aliceCtx); uint256 epochAfterVote = _advanceAndFinishNextEpochProcessing( epochManagerContract.getCurrentEpochNumber() ); _advanceAndStartNextEpochProcessing(epochAfterVote); _aliceActivate(aliceCtx); uint256 epochAfterActivate = _advanceAndFinishNextEpochProcessing(epochAfterVote); _advanceAndStartNextEpochProcessing(epochAfterActivate); _aliceRevoke(aliceCtx); _aliceUnlock(aliceCtx); _advanceAndFinishNextEpochProcessing(epochAfterActivate); } function clearElectedGroupsHelper() internal { address[] memory
values = electedGroupsHelper.values(); @@ -581,12 +1029,6 @@ contract E2E_GasTest_Setup is E2E_EpochManager { EnumerableSet.AddressSet internal originalyElected; function setUpHelper(uint256 validatorGroupCount, uint256 validatorPerGroupCount) internal { - activateValidators(); - whenL2(); - - vm.prank(epochManagerEnablerAddress); - epochManagerContract.initializeSystem(1, 1, firstElected); - validatorsArray = getValidators().getRegisteredValidators(); groups = getValidators().getRegisteredValidatorGroups(); @@ -662,8 +1104,6 @@ contract E2E_GasTest_Setup is E2E_EpochManager { (lessers, greaters, groupWithVotes) = getLessersAndGreaters(groups); epochManagerContract.finishNextEpochProcess(groups, lessers, greaters); - activateValidators(); - timeTravel(epochDuration + 1); epochManagerContract.startNextEpochProcess(); @@ -694,10 +1134,9 @@ contract E2E_GasTest1_FinishNextEpochProcess is E2E_GasTest_Setup { uint256 gasLeftBefore1 = gasleft(); epochManagerContract.finishNextEpochProcess(groups, lessers, greaters); uint256 gasLeftAfter1 = gasleft(); - console.log("validator groups: 120"); - console.log("validators per group: 2"); - console.log("finishNextEpochProcess gas used 2: ", gasLeftBefore1 - gasLeftAfter1); - console.log("elected count2: ", epochManagerContract.getElectedAccounts().length); + + uint256 gasUsed = gasLeftBefore1 - gasLeftAfter1; + assertGt(gasUsed, 0); } } @@ -721,10 +1160,9 @@ contract E2E_GasTest2_FinishNextEpochProcess is E2E_GasTest_Setup { uint256 gasLeftBefore1 = gasleft(); epochManagerContract.finishNextEpochProcess(groups, lessers, greaters); uint256 gasLeftAfter1 = gasleft(); - console.log("validator groups: 60"); - console.log("validators per group: 2"); - console.log("finishNextEpochProcess gas used 2: ", gasLeftBefore1 - gasLeftAfter1); - console.log("elected count2: ", epochManagerContract.getElectedAccounts().length); + + uint256 gasUsed = gasLeftBefore1 - gasLeftAfter1; + assertGt(gasUsed, 0); } } @@ -734,12 +1172,6 @@ contract 
E2E_FinishNextEpochProcess_Split is E2E_GasTest_Setup { function setUp() public override { super.setUp(); - activateValidators(); - whenL2(); - - vm.prank(epochManagerEnablerAddress); - epochManagerContract.initializeSystem(1, 1, firstElected); - validatorsArray = getValidators().getRegisteredValidators(); groups = getValidators().getRegisteredValidatorGroups(); @@ -826,25 +1258,85 @@ contract E2E_FinishNextEpochProcess_Split is E2E_GasTest_Setup { for (uint256 i = 0; i < groups.length; i++) { epochManagerContract.processGroup(groups[i], lessers[i], greaters[i]); } + } + + /** + * @notice Test the gas used by finishNextEpochProcess + This test measures the gas used by finishNextEpochProcess in a real-life worst case: 126 validators and 123 groups. + There are two main loops in the function, one for calculating rewards and the other for updating the elected validators. + finishNextEpochProcess is called twice: the first call goes from 6 -> 110 validators and consumes approx. 6M gas; the second goes from 110 -> 110 validators and consumes approx. 19M gas.
+ */ + function test_shouldFinishNextEpochProcessing_GasTest_Split() public { + timeTravel(epochDuration + 1); + epochManagerContract.startNextEpochProcess(); + + groups = getCurrentlyElectedGroups(); - activateValidators(); + timeTravel(epochDuration / 2); + blockTravel(100); + + address[] memory lessers; + address[] memory greaters; + GroupWithVotes[] memory groupWithVotes; + (lessers, greaters, groupWithVotes) = getLessersAndGreaters(groups); + epochManagerContract.setToProcessGroups(); + for (uint256 i = 0; i < groups.length; i++) { + uint256 gasLeftBefore1 = gasleft(); + epochManagerContract.processGroup(groups[i], lessers[i], greaters[i]); + uint256 gasLeftAfter1 = gasleft(); + + uint256 gasUsed = gasLeftBefore1 - gasLeftAfter1; + assertGt(gasUsed, 0); + } + } + + function test_shouldFinishNextEpochProcessing_GasTest_Split_DeaffiliateBeforeStart() public { timeTravel(epochDuration + 1); + + address[] memory currentlyElected = epochManagerContract.getElectedAccounts(); + + vm.prank(currentlyElected[0]); + validators.deaffiliate(); + epochManagerContract.startNextEpochProcess(); groups = getCurrentlyElectedGroups(); timeTravel(epochDuration / 2); blockTravel(100); + + address[] memory lessers; + address[] memory greaters; + GroupWithVotes[] memory groupWithVotes; + (lessers, greaters, groupWithVotes) = getLessersAndGreaters(groups); + epochManagerContract.setToProcessGroups(); + + for (uint256 i = 0; i < groups.length; i++) { + uint256 gasLeftBefore1 = gasleft(); + epochManagerContract.processGroup(groups[i], lessers[i], greaters[i]); + uint256 gasLeftAfter1 = gasleft(); + + uint256 gasUsed = gasLeftBefore1 - gasLeftAfter1; + assertGt(gasUsed, 0); + } } - /** - * @notice Test the gas used by finishNextEpochProcess - This test is trying to measure gas used by finishNextEpochProcess in a real life worst case. We have 126 validators and 123 groups. 
- There are two main loops in the function, one for calculating rewards and the other for updating the elected validators. - FinishNextEpochProcess is called twice, first time with going from 6 -> 110 validators which consumes approx. 6M gas and the second time with going from 110 -> 110 validators which consumes approx. 19M gas. - */ - function test_shouldFinishNextEpochProcessing_GasTest_Split() public { + function test_shouldFinishNextEpochProcessing_GasTest_Split_DeaffiliateBeforeFinish() public { + timeTravel(epochDuration + 1); + + address[] memory currentlyElected = epochManagerContract.getElectedAccounts(); + + epochManagerContract.startNextEpochProcess(); + + vm.prank(currentlyElected[0]); + validators.deaffiliate(); + + groups = getCurrentlyElectedGroups(); + + timeTravel(epochDuration / 2); + blockTravel(100); + address[] memory lessers; address[] memory greaters; GroupWithVotes[] memory groupWithVotes; @@ -855,7 +1347,9 @@ contract E2E_FinishNextEpochProcess_Split is E2E_GasTest_Setup { uint256 gasLeftBefore1 = gasleft(); epochManagerContract.processGroup(groups[i], lessers[i], greaters[i]); uint256 gasLeftAfter1 = gasleft(); - console.log("processGroup gas used: ", gasLeftBefore1 - gasLeftAfter1); + + uint256 gasUsed = gasLeftBefore1 - gasLeftAfter1; + assertGt(gasUsed, 0); } } } diff --git a/packages/protocol/test-sol/devchain/e2e/common/Governance.t.sol b/packages/protocol/test-sol/devchain/e2e/common/Governance.t.sol new file mode 100644 index 00000000000..bba7bab2cf8 --- /dev/null +++ b/packages/protocol/test-sol/devchain/e2e/common/Governance.t.sol @@ -0,0 +1,365 @@ +// SPDX-License-Identifier: UNLICENSED +pragma solidity >=0.8.7 <0.8.20; + +// Foundry imports +import { console } from "forge-std-8/console.sol"; +import { stdJson } from "forge-std-8/StdJson.sol"; + +// OpenZeppelin imports +import { Ownable } from "@openzeppelin/contracts8/access/Ownable.sol"; + +// Governance +import { IGovernance } from 
"@celo-contracts/governance/interfaces/IGovernance.sol"; +import { IGovernanceVote } from "@celo-contracts/governance/interfaces/IGovernanceVote.sol"; +import { IGovernanceSlasher } from "@celo-contracts/governance/interfaces/IGovernanceSlasher.sol"; + +// Common imports +import { IMultiSig } from "@celo-contracts/common/interfaces/IMultiSig.sol"; +import { IRegistry } from "@celo-contracts/common/interfaces/IRegistry.sol"; + +// Test imports +import { Devchain } from "@test-sol/devchain/e2e/utils.sol"; +import { ConstitutionHelper } from "@test-sol/utils/ConstitutionHelper.sol"; + +contract E2E_Election is Devchain { + function test_shouldElectAllValidators() public { + // elect all validators + address[] memory allValidators_ = election.electValidatorSigners(); + + // assert there are 6 validators + assertEq(allValidators_.length, 6); + } + + function test_shouldElectSpecifiedValidators() public { + // elect between 1 and 4 validators (out of 6 total) + address[] memory selectedValidators_ = election.electNValidatorSigners(1, 4); + + // assert there are 4 validators + assertEq(selectedValidators_.length, 4); + } +} + +contract E2E_Constitution is Devchain { + // test cases + ConstitutionHelper.ConstitutionEntry[] internal constitutionCases; + ConstitutionHelper.ConstitutionEntry internal currentCase; + + // event for transparency + event LogConstitutionCase(ConstitutionHelper.ConstitutionEntry); + + // snapshot + uint256 internal constitutionSnapshot; + + // parametrization + modifier parametrized__constitutionCase() { + for (uint256 i = 0; i < constitutionCases.length; i++) { + currentCase = constitutionCases[i]; + if (constitutionSnapshot == 0) constitutionSnapshot = vm.snapshot(); + _; + vm.revertTo(constitutionSnapshot); + } + } + + function setUp() public virtual override { + // read constitution + ConstitutionHelper.readConstitution(constitutionCases, registryContract, vm); + } + + function test_shouldHaveCorrectThreshold() public 
parametrized__constitutionCase { + emit LogConstitutionCase(currentCase); + assertEq( + governance.getConstitution(currentCase.contractAddress, currentCase.functionSelector), + currentCase.threshold + ); + } +} + +contract E2E_Governance is Devchain { + using stdJson for string; + + // config + address internal ownerAddress = 0xf39Fd6e51aad88F6F4ce6aB8827279cffFb92266; + uint256 internal minDeposit; + uint256 internal dequeueFrequency; + uint256 internal approvalDuration; + uint256 internal referendumDuration; + + // test vars + address internal tester = actor("e2e"); + uint256 internal locked; + uint256 internal proposalId = 1; + uint256 internal dequeueIndex = 0; + + function setUp() public virtual override { + // setup config + string memory config_ = vm.readFile("./migrations_sol/migrationsConfig.json"); + minDeposit = config_.readUint(".governance.minDeposit"); + dequeueFrequency = config_.readUint(".governance.dequeueFrequency"); + approvalDuration = config_.readUint(".governance.approvalStageDuration"); + referendumDuration = config_.readUint(".governance.referendumStageDuration"); + + // transfer out ownership to governance + vm.prank(ownerAddress); + Ownable(address(registryContract)).transferOwnership(address(governance)); + + // setup tester account + vm.deal(tester, 10_000_001 ether + minDeposit); + vm.startPrank(tester); + accounts.createAccount(); + lockedCelo.lock{ value: 10_000_000 ether }(); + vm.stopPrank(); + + // retrieve locked celo + locked = lockedCelo.getAccountTotalLockedGold(tester); + } + + function beforeTestSetup( + bytes4 _testSelector + ) public pure virtual returns (bytes[] memory beforeCalldata_) { + // ensure tests inherit state + if (_testSelector == this.test_shouldUpvoteProposal.selector) { + beforeCalldata_ = new bytes[](1); + beforeCalldata_[0] = abi.encodePacked(this.test_shouldIncrementProposalCount.selector); + } else if (_testSelector == this.test_shouldApproveProposal.selector) { + beforeCalldata_ = new bytes[](2); + 
beforeCalldata_[0] = abi.encodePacked(this.test_shouldIncrementProposalCount.selector); + beforeCalldata_[1] = abi.encodePacked(this.test_shouldUpvoteProposal.selector); + } else if (_testSelector == this.test_shouldIncrementVoteTotals.selector) { + beforeCalldata_ = new bytes[](3); + beforeCalldata_[0] = abi.encodePacked(this.test_shouldIncrementProposalCount.selector); + beforeCalldata_[1] = abi.encodePacked(this.test_shouldUpvoteProposal.selector); + beforeCalldata_[2] = abi.encodePacked(this.test_shouldApproveProposal.selector); + } else if (_testSelector == this.test_shouldExecuteProposal.selector) { + beforeCalldata_ = new bytes[](4); + beforeCalldata_[0] = abi.encodePacked(this.test_shouldIncrementProposalCount.selector); + beforeCalldata_[1] = abi.encodePacked(this.test_shouldUpvoteProposal.selector); + beforeCalldata_[2] = abi.encodePacked(this.test_shouldApproveProposal.selector); + beforeCalldata_[3] = abi.encodePacked(this.test_shouldIncrementVoteTotals.selector); + } + } + + function test_shouldIncrementProposalCount() public virtual { + // setup values + uint256[] memory values_ = new uint256[](2); + values_[0] = 0; + values_[1] = 0; + + // setup destinations + address[] memory destinations_ = new address[](2); + destinations_[0] = registryAddress; + destinations_[1] = registryAddress; + + // setup data + bytes[] memory data_ = new bytes[](2); + data_[0] = abi.encodeWithSelector(IRegistry.setAddressFor.selector, "test1", address(11)); + data_[1] = abi.encodeWithSelector(IRegistry.setAddressFor.selector, "test2", address(12)); + + // setup data lengths + uint256[] memory dataLengths_ = new uint256[](2); + dataLengths_[0] = data_[0].length; + dataLengths_[1] = data_[1].length; + + // propose + vm.prank(tester); + governance.propose{ value: minDeposit }( + values_, + destinations_, + abi.encodePacked(data_[0], data_[1]), + dataLengths_, + "url" + ); + + // assert + assertEq(governance.proposalCount(), proposalId); + } + + function 
test_shouldUpvoteProposal() public { + // upvote + vm.prank(tester); + governance.upvote( + proposalId, + 0, // lesser + 0 // greater + ); + + // assert + assertEq(governance.getUpvotes(proposalId), locked); + assertGt(locked, 0); + } + + function test_shouldApproveProposal() public { + // increase time and mine 1 block + timeTravel(dequeueFrequency); + blockTravel(1); + + // submit tx from multisig + address multisig_ = registryContract.getAddressForString("GovernanceApproverMultiSig"); + vm.prank(ownerAddress); + IMultiSig(multisig_).submitTransaction( + address(governance), + 0, // value + abi.encodeWithSelector(IGovernance.approve.selector, proposalId, dequeueIndex) + ); + + // assert + assertTrue(governance.isApproved(proposalId)); + } + + function test_shouldIncrementVoteTotals() public { + // increase time and mine 1 block + timeTravel(approvalDuration); + blockTravel(1); + + // vote + vm.prank(tester); + IGovernanceVote(address(governance)).vote( + proposalId, + dequeueIndex, + IGovernanceVote.VoteValue.Yes + ); + + // assert + (uint256 yesVotes, , ) = governance.getVoteTotals(proposalId); + assertEq(yesVotes, locked); + } + + function test_shouldExecuteProposal() public virtual { + // increase time and mine 1 block + timeTravel(referendumDuration); + blockTravel(1); + + // execute + vm.prank(tester); + governance.execute(proposalId, dequeueIndex); + + // assert + assertEq(registryContract.getAddressForStringOrDie("test1"), address(11)); + assertEq(registryContract.getAddressForStringOrDie("test2"), address(12)); + } +} + +// TODO: Although not present in the Truffle integration tests, to fully test GovernanceSlasher it would be great to add: +// TODO: - a test case for an account that has already voted some of its locked CELO (slashing requires revoking votes) +// TODO: - a test case for a validator account that is a member of a validator group (the group additionally gets slashed) +contract E2E_GovernanceSlashing is E2E_Governance { + // test vars + IGovernanceSlasher
internal governanceSlasher; + address internal slashed = actor("slashed"); + uint256 internal penalty = 10_000_000 ether; + + function setUp() public virtual override { + super.setUp(); + + // get slasher + governanceSlasher = IGovernanceSlasher( + registryContract.getAddressForOrDie(GOVERNANCE_SLASHER_REGISTRY_ID) + ); + + // transfer out ownership to governance + vm.prank(ownerAddress); + Ownable(address(governanceSlasher)).transferOwnership(address(governance)); + + // setup slashed account + vm.deal(slashed, penalty + 1 ether); + vm.startPrank(slashed); + accounts.createAccount(); + lockedCelo.lock{ value: penalty }(); + vm.stopPrank(); + } + + function beforeTestSetup( + bytes4 _testSelector + ) public pure virtual override returns (bytes[] memory beforeCalldata_) { + if ( + _testSelector == this.test_shouldSetApprovedSlashingZero.selector || + _testSelector == this.test_shouldSlashAccount.selector + ) { + beforeCalldata_ = new bytes[](5); + beforeCalldata_[0] = abi.encodePacked(this.test_shouldIncrementProposalCount.selector); + beforeCalldata_[1] = abi.encodePacked(this.test_shouldUpvoteProposal.selector); + beforeCalldata_[2] = abi.encodePacked(this.test_shouldApproveProposal.selector); + beforeCalldata_[3] = abi.encodePacked(this.test_shouldIncrementVoteTotals.selector); + beforeCalldata_[4] = abi.encodePacked(this.test_shouldExecuteProposal.selector); + } else return super.beforeTestSetup(_testSelector); + } + + function test_shouldIncrementProposalCount() public virtual override { + // setup values + uint256[] memory values_ = new uint256[](2); + values_[0] = 0; + values_[1] = 0; + + // setup destinations + address[] memory destinations_ = new address[](2); + destinations_[0] = address(governanceSlasher); + destinations_[1] = address(governanceSlasher); + + // setup data + bytes[] memory data_ = new bytes[](2); + data_[0] = abi.encodeWithSelector( + IGovernanceSlasher.approveSlashing.selector, + slashed, + penalty + ); + data_[1] = 
abi.encodeWithSelector(IGovernanceSlasher.setSlasherExecuter.selector, tester); + + // setup data lengths + uint256[] memory dataLengths_ = new uint256[](2); + dataLengths_[0] = data_[0].length; + dataLengths_[1] = data_[1].length; + + // propose + vm.prank(tester); + governance.propose{ value: minDeposit }( + values_, + destinations_, + abi.encodePacked(data_[0], data_[1]), + dataLengths_, + "url" + ); + + // assert + assertEq(governance.proposalCount(), proposalId); + } + + function test_shouldExecuteProposal() public virtual override { + // increase time and mine 1 block + timeTravel(referendumDuration); + blockTravel(1); + + // execute + vm.prank(tester); + governance.execute(proposalId, dequeueIndex); + + // assert + assertEq(governanceSlasher.getApprovedSlashing(slashed), penalty); + } + + function _slash() internal { + // increase time and mine 1 block + timeTravel(referendumDuration); + blockTravel(1); + + // slash + address group_ = address(0); + address[] memory lessers_; + address[] memory greaters_; + uint256[] memory indices_; + vm.prank(tester); + governanceSlasher.slash(slashed, group_, lessers_, greaters_, indices_); + } + + function test_shouldSetApprovedSlashingZero() public { + _slash(); + + // should set approved slashing value to 0 + assertEq(governanceSlasher.getApprovedSlashing(slashed), 0); + } + + function test_shouldSlashAccount() public { + _slash(); + + // whole locked celo should be slashed + assertEq(lockedCelo.getAccountTotalLockedGold(slashed), 0); + } +} diff --git a/packages/protocol/test-sol/devchain/e2e/utils.sol b/packages/protocol/test-sol/devchain/e2e/utils.sol index 804e14aef89..e76a8935bc0 100644 --- a/packages/protocol/test-sol/devchain/e2e/utils.sol +++ b/packages/protocol/test-sol/devchain/e2e/utils.sol @@ -1,37 +1,55 @@ // SPDX-License-Identifier: UNLICENSED pragma solidity >=0.8.7 <0.8.20; -import "@celo-contracts/common/interfaces/IRegistry.sol"; -import { IEpochManager } from 
"@celo-contracts/common/interfaces/IEpochManager.sol"; +// Test imports +import { TestWithUtils08 } from "@test-sol/TestWithUtils08.sol"; + +// Celo contracts imports import { IAccounts } from "@celo-contracts/common/interfaces/IAccounts.sol"; +import { ICeloUnreleasedTreasury } from "@celo-contracts/common/interfaces/ICeloUnreleasedTreasury.sol"; +import { ICeloToken } from "@celo-contracts/common/interfaces/ICeloToken.sol"; +import { IElection } from "@celo-contracts/governance/interfaces/IElection.sol"; +import { IEpochRewards } from "@celo-contracts/governance/interfaces/IEpochRewards.sol"; +import { IEpochManagerEnabler } from "@celo-contracts/common/interfaces/IEpochManagerEnabler.sol"; +import { IEpochManager } from "@celo-contracts/common/interfaces/IEpochManager.sol"; +import { IEscrow } from "@celo-contracts/identity/interfaces/IEscrow.sol"; +import { IFederatedAttestations } from "@celo-contracts/identity/interfaces/IFederatedAttestations.sol"; +import { FeeCurrencyDirectory } from "@celo-contracts-8/common/FeeCurrencyDirectory.sol"; +import { IFeeHandler } from "@celo-contracts/common/interfaces/IFeeHandler.sol"; +import { IFreezer } from "@celo-contracts/common/interfaces/IFreezer.sol"; +import { IGovernance } from "@celo-contracts/governance/interfaces/IGovernance.sol"; +import { ILockedGold } from "@celo-contracts/governance/interfaces/ILockedGold.sol"; +import { IOdisPayments } from "@celo-contracts/identity/interfaces/IOdisPayments.sol"; +import { IRandom } from "@celo-contracts/identity/interfaces/IRandom.sol"; import { IScoreManager } from "@celo-contracts-8/common/interfaces/IScoreManager.sol"; +import { ISortedOracles } from "@celo-contracts/stability/interfaces/ISortedOracles.sol"; import { IValidators } from "@celo-contracts/governance/interfaces/IValidators.sol"; -import { IElection } from "@celo-contracts/governance/interfaces/IElection.sol"; -import { ILockedCelo } from "@celo-contracts/governance/interfaces/ILockedCelo.sol"; -import { 
ICeloToken } from "@celo-contracts/common/interfaces/ICeloToken.sol"; - -// All core contracts that are expected to be in the Registry on the devchain -import "@celo-contracts-8/common/FeeCurrencyDirectory.sol"; -import "@celo-contracts/stability/interfaces/ISortedOracles.sol"; -import "@celo-contracts/common/interfaces/ICeloUnreleasedTreasury.sol"; - -import "@test-sol/TestWithUtils08.sol"; contract Devchain is TestWithUtils08 { // All core contracts that are expected to be in the Registry on the devchain - ISortedOracles sortedOracles; - FeeCurrencyDirectory feeCurrencyDirectory; - IEpochManager epochManagerContract; - ICeloUnreleasedTreasury celoUnreleasedTreasuryContract; - IValidators validators; + // TODO: Change all contracts to be imported as interfaces IAccounts accounts; - IScoreManager scoreManager; - IElection election; - ILockedCelo lockedCelo; + ICeloUnreleasedTreasury celoUnreleasedTreasuryContract; ICeloToken celoTokenContract; + IElection election; + IEpochRewards epochRewards; + IEpochManagerEnabler epochManagerEnablerContract; + IEpochManager epochManagerContract; + IEscrow escrow; + IFederatedAttestations federatedAttestations; + FeeCurrencyDirectory feeCurrencyDirectory; + IFeeHandler feeHandler; + IFreezer freezer; + IGovernance governance; + ILockedGold lockedCelo; + IOdisPayments odisPayments; + IRandom randomContract; + IScoreManager scoreManager; + ISortedOracles sortedOracles; + IValidators validators; constructor() { - // Fetch all core contracts that are expeceted to be in the Registry on the devchain + // Fetch all core contracts that are expected to be in the Registry on the devchain sortedOracles = getSortedOracles(); feeCurrencyDirectory = FeeCurrencyDirectory( registryContract.getAddressForStringOrDie("FeeCurrencyDirectory") @@ -41,12 +59,27 @@ contract Devchain is TestWithUtils08 { celoUnreleasedTreasuryContract = getCeloUnreleasedTreasury(); validators = getValidators(); accounts = getAccounts(); - scoreManager = 
IScoreManager(address(getScoreReader())); - election = getElection(); - lockedCelo = getLockedCelo(); + celoUnreleasedTreasuryContract = getCeloUnreleasedTreasury(); celoTokenContract = ICeloToken(registryContract.getAddressForOrDie(GOLD_TOKEN_REGISTRY_ID)); + election = getElection(); + epochRewards = getEpochRewards(); + epochManagerEnablerContract = getEpochManagerEnabler(); + epochManagerContract = getEpochManager(); + escrow = getEscrow(); + federatedAttestations = getFederatedAttestations(); + feeCurrencyDirectory = FeeCurrencyDirectory( + registryContract.getAddressForOrDie(FEE_CURRENCY_DIRECTORY_REGISTRY_ID) + ); // FIXME: FeeCurrencyDirectory is not in UsingRegistry.sol + feeHandler = getFeeHandler(); + freezer = getFreezer(); + governance = getGovernance(); + lockedCelo = getLockedGold(); + odisPayments = getOdisPayments(); + randomContract = getRandom(); + scoreManager = IScoreManager(address(getScoreReader())); + sortedOracles = getSortedOracles(); + validators = getValidators(); - // TODO: Add missing core contracts below (see list in migrations_sol/constants.sol) // TODO: Consider asserting that all contracts we expect are available in the Devchain class // (see list in migrations_sol/constants.sol) } diff --git a/packages/protocol/test-sol/devchain/migration/Migration.t.sol b/packages/protocol/test-sol/devchain/migration/Migration.t.sol index 91fd7ff2601..928d043ad71 100644 --- a/packages/protocol/test-sol/devchain/migration/Migration.t.sol +++ b/packages/protocol/test-sol/devchain/migration/Migration.t.sol @@ -78,7 +78,7 @@ contract RegistryIntegrationTest is IntegrationTest, MigrationsConstants { bytes32 hashLockedCelo = keccak256(abi.encodePacked("LockedCelo")); bytes32 hashEpochManager = keccak256(abi.encodePacked("EpochManager")); - for (uint256 i = 0; i < contractsInRegistry.length; i++) { + for (uint256 i = 0; i < contractsInRegistryPath.length; i++) { // Read name from list of core contracts string memory contractName = 
contractsInRegistry[i]; console2.log("Checking bytecode of:", contractName); @@ -114,8 +114,9 @@ contract RegistryIntegrationTest is IntegrationTest, MigrationsConstants { string memory contractFileName = string(abi.encodePacked(contractName, ".sol")); // Get bytecode from build artifacts + // this has to be built twice like we do when migrating bytes memory expectedBytecodeWithMetadataFromArtifacts = vm.getDeployedCode( - contractFileName + contractsInRegistryPath[i] ); bytes memory expectedBytecodeFromArtifacts = removeMetadataFromBytecode( expectedBytecodeWithMetadataFromArtifacts @@ -181,11 +182,6 @@ contract EpochManagerIntegrationTest is IntegrationTest, MigrationsConstants { celoTokenContract = ICeloToken(registry.getAddressForStringOrDie("GoldToken")); vm.deal(address(0), CELO_SUPPLY_CAP); - vm.prank(address(0)); - celoTokenContract.mint(reserveAddress, RESERVE_BALANCE); - - vm.prank(address(0)); - celoTokenContract.mint(randomAddress, L1_MINTED_CELO_SUPPLY - RESERVE_BALANCE); // mint outstanding l1 supply before L2. 
epochManagerContract = IEpochManager(registry.getAddressForStringOrDie("EpochManager")); epochManagerEnablerContract = IEpochManagerEnabler( @@ -193,51 +189,19 @@ contract EpochManagerIntegrationTest is IntegrationTest, MigrationsConstants { ); } - function activateValidators() public { - address[] memory registeredValidators = validatorsContract.getRegisteredValidators(); - travelNEpochL1(4); - - for (uint256 i = 0; i < registeredValidators.length; i++) { - (, , address validatorGroup, , ) = validatorsContract.getValidator(registeredValidators[i]); - if (election.getPendingVotesForGroup(validatorGroup) == 0) { - continue; - } - vm.startPrank(validatorGroup); - election.activate(validatorGroup); - vm.stopPrank(); - } - } - - function test_Reverts_whenSystemNotInitialized() public { - vm.expectRevert("Epoch system not initialized"); - epochManagerContract.startNextEpochProcess(); - } - function test_Reverts_WhenEndOfEpochHasNotBeenReached() public { - // fund treasury - vm.prank(address(0)); - celoTokenContract.mint(unreleasedTreasury, L2_INITIAL_STASH_BALANCE); - vm.deal(unreleasedTreasury, L2_INITIAL_STASH_BALANCE); - - uint256 l1EpochNumber = IPrecompiles(address(validatorsContract)).getEpochNumber(); - - vm.prank(address(epochManagerEnablerContract)); - epochManagerContract.initializeSystem(l1EpochNumber, block.number, validatorsList); - vm.expectRevert("Epoch is not ready to start"); epochManagerContract.startNextEpochProcess(); } function test_Reverts_whenAlreadyInitialized() public { - _MockL2Migration(validatorsList); - vm.prank(address(epochManagerEnablerContract)); vm.expectRevert("Epoch system already initialized"); epochManagerContract.initializeSystem(100, block.number, firstElected); } function test_Reverts_whenTransferingCeloToUnreleasedTreasury() public { - _MockL2Migration(validatorsList); + _setValidatorScore(); blockTravel(43200); timeTravel(DAY); @@ -249,7 +213,7 @@ contract EpochManagerIntegrationTest is IntegrationTest, MigrationsConstants { } 
function test_SetsCurrentRewardBlock() public { - _MockL2Migration(validatorsList); + _setValidatorScore(); blockTravel(L2_BLOCK_IN_EPOCH); timeTravel(DAY); @@ -262,28 +226,7 @@ contract EpochManagerIntegrationTest is IntegrationTest, MigrationsConstants { assertEq(status, 1); } - function _MockL2Migration(address[] memory _validatorsList) internal { - for (uint256 i = 0; i < _validatorsList.length; i++) { - firstElected.push(_validatorsList[i]); - } - - uint256 l1EpochNumber = IPrecompiles(address(validatorsContract)).getEpochNumber(); - - activateValidators(); - vm.deal(unreleasedTreasury, L2_INITIAL_STASH_BALANCE); - - vm.prank(address(0)); - celoTokenContract.mint(unreleasedTreasury, L2_INITIAL_STASH_BALANCE); - - whenL2(); - _setValidatorL2Score(); - - vm.prank(address(epochManagerEnablerContract)); - - epochManagerContract.initializeSystem(l1EpochNumber, block.number, firstElected); - } - - function _setValidatorL2Score() internal { + function _setValidatorScore() internal { address scoreManagerOwner = scoreManager.owner(); vm.startPrank(scoreManagerOwner); scoreManager.setGroupScore(groupList[0], groupScore[0]); diff --git a/packages/protocol/test-sol/integration/CompileValidatorMock.t.sol b/packages/protocol/test-sol/integration/CompileValidatorMock.t.sol index 7fbc16ca0c1..05ff5b8cded 100644 --- a/packages/protocol/test-sol/integration/CompileValidatorMock.t.sol +++ b/packages/protocol/test-sol/integration/CompileValidatorMock.t.sol @@ -5,9 +5,9 @@ import "celo-foundry-8/Test.sol"; import "forge-std/console.sol"; // here only to forge compile of ValidatorsMock -import "@test-sol/unit/governance/validators/mocks/ValidatorsMock.sol"; +import "@test-sol/unit/governance/validators/mocks/ValidatorsCompile.sol"; -contract CompileValidatorMock is Test { +contract CompileValidators is Test { function test_nop() public view { console.log("nop"); } diff --git a/packages/protocol/test-sol/integration/RevokeCeloAfterL2Transition.sol
b/packages/protocol/test-sol/integration/RevokeCeloAfterL2Transition.sol index d16c6001b30..c410b7382c3 100644 --- a/packages/protocol/test-sol/integration/RevokeCeloAfterL2Transition.sol +++ b/packages/protocol/test-sol/integration/RevokeCeloAfterL2Transition.sol @@ -59,19 +59,6 @@ contract RevokeCeloAfterL2Transition is TestWithUtils, ECDSAHelper { address authorizedVoteSigner2; uint256 authorizedVoteSignerPK2; - bytes public constant blsPublicKey = - abi.encodePacked( - bytes32(0x0101010101010101010101010101010101010101010101010101010101010101), - bytes32(0x0202020202020202020202020202020202020202020202020202020202020202), - bytes32(0x0303030303030303030303030303030303030303030303030303030303030303) - ); - bytes public constant blsPop = - abi.encodePacked( - bytes16(0x04040404040404040404040404040404), - bytes16(0x05050505050505050505050505050505), - bytes16(0x06060606060606060606060606060606) - ); - struct ValidatorLockedGoldRequirements { uint256 value; uint256 duration; @@ -118,7 +105,6 @@ contract RevokeCeloAfterL2Transition is TestWithUtils, ECDSAHelper { uint256 validatorRegistrationEpochNumber; function setUp() public { - ph.setEpochSize(DAY / 5); owner = address(this); accApprover = actor("approver"); group = actor("group"); @@ -172,7 +158,7 @@ contract RevokeCeloAfterL2Transition is TestWithUtils, ECDSAHelper { election = new Election(true); lockedGold = new LockedGold(true); address validatorsAddress = actor("Validators"); - deployCodeTo("ValidatorsMock.sol", validatorsAddress); + deployCodeTo("ValidatorsCompile", validatorsAddress); validators = IValidators(validatorsAddress); // TODO move to create2 validatorsMockTunnel = new ValidatorsMockTunnel(address(validators)); @@ -238,26 +224,18 @@ contract RevokeCeloAfterL2Transition is TestWithUtils, ECDSAHelper { duration: 100 * DAY }); - originalValidatorScoreParameters = ValidatorScoreParameters({ - exponent: 5, - adjustmentSpeed: FixidityLib.newFixedFraction(5, 20) - }); - initParams = 
ValidatorsMockTunnel.InitParams({ registryAddress: REGISTRY_ADDRESS, groupRequirementValue: originalGroupLockedGoldRequirements.value, groupRequirementDuration: originalGroupLockedGoldRequirements.duration, validatorRequirementValue: originalValidatorLockedGoldRequirements.value, - validatorRequirementDuration: originalValidatorLockedGoldRequirements.duration, - validatorScoreExponent: originalValidatorScoreParameters.exponent, - validatorScoreAdjustmentSpeed: originalValidatorScoreParameters.adjustmentSpeed.unwrap() + validatorRequirementDuration: originalValidatorLockedGoldRequirements.duration }); initParams2 = ValidatorsMockTunnel.InitParams2({ _membershipHistoryLength: membershipHistoryLength, _slashingMultiplierResetPeriod: slashingMultiplierResetPeriod, _maxGroupSize: maxGroupSize, - _commissionUpdateDelay: commissionUpdateDelay, - _downtimeGracePeriod: downtimeGracePeriod + _commissionUpdateDelay: commissionUpdateDelay }); validatorsMockTunnel.MockInitialize(owner, initParams, initParams2); @@ -283,7 +261,7 @@ contract RevokeCeloAfterL2Transition is TestWithUtils, ECDSAHelper { } function _whenL2() public { - uint256 l1EpochNumber = IPrecompiles(address(validators)).getEpochNumber(); + uint256 l1EpochNumber = 100; deployCodeTo("Registry.sol", abi.encode(false), PROXY_ADMIN_ADDRESS); @@ -347,10 +325,8 @@ contract RevokeCeloAfterL2Transition is TestWithUtils, ECDSAHelper { bytes memory _ecdsaPubKey = _generateEcdsaPubKey(_validator, _validatorPk); - ph.mockSuccess(ph.PROOF_OF_POSSESSION(), abi.encodePacked(_validator, blsPublicKey, blsPop)); - vm.prank(_validator); - validators.registerValidator(_ecdsaPubKey, blsPublicKey, blsPop); + validators.registerValidatorNoBls(_ecdsaPubKey); validatorRegistrationEpochNumber = IPrecompiles(address(validators)).getEpochNumber(); return _ecdsaPubKey; } @@ -478,10 +454,8 @@ contract RevokeCeloAfterL2TransitionTest is RevokeCeloAfterL2Transition { ) internal returns (bytes memory) { (bytes memory _ecdsaPubKey, , , ) = 
_generateEcdsaPubKeyWithSigner(_validator, signerPk); - ph.mockSuccess(ph.PROOF_OF_POSSESSION(), abi.encodePacked(_validator, blsPublicKey, blsPop)); - vm.prank(_validator); - validators.registerValidator(_ecdsaPubKey, blsPublicKey, blsPop); + validators.registerValidatorNoBls(_ecdsaPubKey); validatorRegistrationEpochNumber = IPrecompiles(address(validators)).getEpochNumber(); return _ecdsaPubKey; } diff --git a/packages/protocol/test-sol/unit/common/Accounts.t.sol b/packages/protocol/test-sol/unit/common/Accounts.t.sol index d28bc244889..54d08637eba 100644 --- a/packages/protocol/test-sol/unit/common/Accounts.t.sol +++ b/packages/protocol/test-sol/unit/common/Accounts.t.sol @@ -7,7 +7,6 @@ import "@celo-contracts/common/Accounts.sol"; import "@celo-contracts/governance/test/MockValidators.sol"; import { TestWithUtils } from "@test-sol/TestWithUtils.sol"; -import "@test-sol/utils/WhenL2.sol"; contract AccountsTest is TestWithUtils { using FixidityLib for FixidityLib.Fraction; @@ -91,6 +90,7 @@ contract AccountsTest is TestWithUtils { (caller, callerPK) = actorWithPK("caller"); (caller2, caller2PK) = actorWithPK("caller2"); + whenL2WithEpochManagerInitialization(); } function getParsedSignatureOfAddress( @@ -193,13 +193,7 @@ contract AccountsTest is TestWithUtils { } } -contract AccountsTest_L2 is AccountsTest, WhenL2 {} - contract AccountsTest_createAccount is AccountsTest { - function setUp() public { - super.setUp(); - } - function test_ShouldCreateTheAccount() public { assertEq(accounts.isAccount(address(this)), false); accounts.createAccount(); @@ -213,13 +207,7 @@ contract AccountsTest_createAccount is AccountsTest { } } -contract AccountsTest_createAccount_L2 is AccountsTest_L2, AccountsTest_createAccount {} - contract AccountsTest_setAccountDataEncryptionKey is AccountsTest { - function setUp() public { - super.setUp(); - } - function test_ShouldSetDataEncryptionKey() public { accounts.setAccountDataEncryptionKey(dataEncryptionKey); 
assertEq(accounts.getDataEncryptionKey(address(this)), dataEncryptionKey); @@ -250,16 +238,7 @@ contract AccountsTest_setAccountDataEncryptionKey is AccountsTest { } } -contract AccountsTest_setAccountDataEncryptionKey_L2 is - AccountsTest_L2, - AccountsTest_setAccountDataEncryptionKey -{} - contract AccountsTest_setAccount is AccountsTest { - function setUp() public { - super.setUp(); - } - function test_ShouldSetTheNameDataEncryptionKeyAndWalletAddress_WhenTheAccountHasBeenCreated() public { @@ -341,13 +320,7 @@ contract AccountsTest_setAccount is AccountsTest { } } -contract AccountsTest_setAccount_L2 is AccountsTest_L2, AccountsTest_setAccount {} - contract AccountsTest_setWalletAddress is AccountsTest { - function setUp() public { - super.setUp(); - } - function test_ShouldRevert_WhenAccountHasNotBeenCreated() public { vm.expectRevert("Unknown account"); accounts.setWalletAddress(address(this), 0, 0x0, 0x0); @@ -393,13 +366,7 @@ contract AccountsTest_setWalletAddress is AccountsTest { } } -contract AccountsTest_setWalletAddress_L2 is AccountsTest_L2, AccountsTest_setWalletAddress {} - contract AccountsTest_setMetadataURL is AccountsTest { - function setUp() public { - super.setUp(); - } - function test_ShouldRevert_WhenAccountHasNotBeenCreated() public { vm.expectRevert("Unknown account"); accounts.setMetadataURL(metadataURL); @@ -419,13 +386,7 @@ contract AccountsTest_setMetadataURL is AccountsTest { } } -contract AccountsTest_setMetadataURL_L2 is AccountsTest_L2, AccountsTest_setMetadataURL {} - contract AccountsTest_batchGetMetadataURL is AccountsTest { - function setUp() public { - super.setUp(); - } - function parseSolidityStringArray( uint256[] memory stringLengths, bytes memory data @@ -471,13 +432,7 @@ contract AccountsTest_batchGetMetadataURL is AccountsTest { } } -contract AccountsTest_batchGetMetadataURL_L2 is AccountsTest_L2, AccountsTest_batchGetMetadataURL {} - contract AccountsTest_addStorageRoot is AccountsTest { - function setUp() public { - 
super.setUp(); - } - function test_ShouldRevert_WhenAccountHasNotBeenCreated() public { vm.expectRevert("Unknown account"); accounts.addStorageRoot(storageRoot); @@ -519,13 +474,7 @@ contract AccountsTest_addStorageRoot is AccountsTest { } } -contract AccountsTest_addStorageRoot_L2 is AccountsTest_L2, AccountsTest_addStorageRoot {} - contract AccountsTest_removeStorageRoot is AccountsTest { - function setUp() public { - super.setUp(); - } - function test_ShouldRevert_WhenAccountHasNotBeenCreated() public { vm.expectRevert("Unknown account"); accounts.removeStorageRoot(0); @@ -597,17 +546,11 @@ contract AccountsTest_removeStorageRoot is AccountsTest { } } -contract AccountsTest_removeStorageRoot_L2 is AccountsTest_L2, AccountsTest_removeStorageRoot {} - contract AccountsTest_setPaymentDelegation is AccountsTest { address beneficiary = actor("beneficiary"); uint256 fraction = FixidityLib.newFixedFraction(2, 10).unwrap(); uint256 badFraction = FixidityLib.newFixedFraction(12, 10).unwrap(); - function setUp() public { - super.setUp(); - } - function test_ShouldNotBeCallableByNonAccount() public { vm.expectRevert("Must first register address with Account.createAccount"); accounts.setPaymentDelegation((beneficiary), fraction); @@ -641,11 +584,6 @@ contract AccountsTest_setPaymentDelegation is AccountsTest { } } -contract AccountsTest_setPaymentDelegation_L2 is - AccountsTest_L2, - AccountsTest_setPaymentDelegation -{} - contract AccountsTest_deletePaymentDelegation is AccountsTest { address beneficiary = actor("beneficiary"); uint256 fraction = FixidityLib.newFixedFraction(2, 10).unwrap(); @@ -676,16 +614,7 @@ contract AccountsTest_deletePaymentDelegation is AccountsTest { } } -contract AccountsTest_deletePaymentDelegation_L2 is - AccountsTest_L2, - AccountsTest_deletePaymentDelegation -{} - contract AccountsTest_setName is AccountsTest { - function setUp() public { - super.setUp(); - } - function test_ShouldNotBeCallableByNonAccount() public { vm.expectRevert("Register 
with createAccount to set account name"); accounts.setName(name); @@ -705,8 +634,6 @@ contract AccountsTest_setName is AccountsTest { } } -contract AccountsTest_setName_L2 is AccountsTest_L2, AccountsTest_setName {} - contract AccountsTest_GenericAuthorization is AccountsTest { address account2 = actor("account2"); address signer; @@ -891,11 +818,6 @@ contract AccountsTest_GenericAuthorization is AccountsTest { } } -contract AccountsTest_GenericAuthorization_L2 is - AccountsTest_L2, - AccountsTest_GenericAuthorization -{} - contract AccountsTest_BackwardCompatibility is AccountsTest { address account = address(this); address otherAccount = actor("otherAccount"); @@ -1563,8 +1485,3 @@ contract AccountsTest_BackwardCompatibility is AccountsTest { helper_ShouldRemoveSigner(Role.Validator, false, true); } } - -contract AccountsTest_BackwardCompatibility_L2 is - AccountsTest_L2, - AccountsTest_BackwardCompatibility -{} diff --git a/packages/protocol/test-sol/unit/common/CeloTokenMock.sol b/packages/protocol/test-sol/unit/common/CeloTokenMock.sol index dd0d1db7b19..4d907e9d46a 100644 --- a/packages/protocol/test-sol/unit/common/CeloTokenMock.sol +++ b/packages/protocol/test-sol/unit/common/CeloTokenMock.sol @@ -10,10 +10,6 @@ contract CeloTokenMock is GoldToken(true) { uint8 public constant decimals = 18; mapping(address => uint256) balances; - function setTotalSupply(uint256 value) external { - totalSupply_ = value; - } - function transfer(address to, uint256 amount) external returns (bool) { return _transfer(msg.sender, to, amount); } diff --git a/packages/protocol/test-sol/unit/common/CeloUnreleasedTreasury.t.sol b/packages/protocol/test-sol/unit/common/CeloUnreleasedTreasury.t.sol index 9c066a34ad7..b64cbad153c 100644 --- a/packages/protocol/test-sol/unit/common/CeloUnreleasedTreasury.t.sol +++ b/packages/protocol/test-sol/unit/common/CeloUnreleasedTreasury.t.sol @@ -9,7 +9,6 @@ import "@celo-contracts/common/interfaces/ICeloToken.sol"; import { 
CeloUnreleasedTreasury } from "@celo-contracts-8/common/CeloUnreleasedTreasury.sol"; import { TestWithUtils08 } from "@test-sol/TestWithUtils08.sol"; -import { WhenL2, WhenL2NoInitialization } from "@test-sol/utils/WhenL2-08.sol"; contract CeloUnreleasedTreasuryTest is TestWithUtils08 { using FixidityLib for FixidityLib.Fraction; @@ -34,6 +33,7 @@ contract CeloUnreleasedTreasuryTest is TestWithUtils08 { registry.setAddressFor(CeloTokenContract, address(celoTokenContract)); newCeloUnreleasedTreasury(); + whenL2WithEpochManagerInitialization(); } function newCeloUnreleasedTreasury() internal { @@ -47,12 +47,6 @@ contract CeloUnreleasedTreasuryTest is TestWithUtils08 { } } -contract CeloUnreleasedTreasuryTest_L2 is CeloUnreleasedTreasuryTest, WhenL2 { - function setUp() public virtual override(CeloUnreleasedTreasuryTest, WhenL2) { - super.setUp(); - } -} - contract CeloUnreleasedTreasuryTest_initialize is CeloUnreleasedTreasuryTest { function test_ShouldSetAnOwnerToCeloUnreleasedTreasuryInstance() public { assertEq(celoUnreleasedTreasuryContract.owner(), celoDistributionOwner); @@ -84,15 +78,7 @@ contract CeloUnreleasedTreasuryTest_initialize is CeloUnreleasedTreasuryTest { } contract CeloUnreleasedTreasuryTest_release is CeloUnreleasedTreasuryTest { - function test_Reverts_WhenL1() public { - vm.expectRevert("Insufficient balance."); - vm.prank(address(epochManager)); - - celoUnreleasedTreasuryContract.release(randomAddress, 4); - } -} -contract CeloUnreleasedTreasuryTest_release_L2 is CeloUnreleasedTreasuryTest_L2 { - function setUp() public override(CeloUnreleasedTreasuryTest_L2) { + function setUp() public override(CeloUnreleasedTreasuryTest) { super.setUp(); } @@ -115,26 +101,7 @@ contract CeloUnreleasedTreasuryTest_release_L2 is CeloUnreleasedTreasuryTest_L2 contract CeloUnreleasedTreasuryTest_getRemainingBalanceToRelease is CeloUnreleasedTreasuryTest { uint256 _startingBalance; - function setUp() public virtual override { - super.setUp(); - _startingBalance = 
address(celoUnreleasedTreasuryContract).balance; - } - - function test_ShouldReturnContractBalanceBeforeFirstRelease() public { - uint256 _remainingBalance = celoUnreleasedTreasuryContract.getRemainingBalanceToRelease(); - - assertEq(_startingBalance, _remainingBalance); - } -} - -contract CeloUnreleasedTreasuryTest_getRemainingBalanceToRelease_L2 is - CeloUnreleasedTreasuryTest_L2, - CeloUnreleasedTreasuryTest_getRemainingBalanceToRelease -{ - function setUp() - public - override(CeloUnreleasedTreasuryTest_L2, CeloUnreleasedTreasuryTest_getRemainingBalanceToRelease) - { + function setUp() public override(CeloUnreleasedTreasuryTest) { super.setUp(); _startingBalance = address(celoUnreleasedTreasuryContract).balance; diff --git a/packages/protocol/test-sol/unit/common/EpochManager.t.sol b/packages/protocol/test-sol/unit/common/EpochManager.t.sol index a8946ad9e0b..314db130182 100644 --- a/packages/protocol/test-sol/unit/common/EpochManager.t.sol +++ b/packages/protocol/test-sol/unit/common/EpochManager.t.sol @@ -15,8 +15,6 @@ import { EpochRewardsMock08 } from "@celo-contracts-8/governance/test/EpochRewar import "@celo-contracts-8/stability/test/MockStableToken.sol"; import { TestWithUtils08 } from "@test-sol/TestWithUtils08.sol"; -import { ValidatorsMock } from "@test-sol/unit/governance/validators/mocks/ValidatorsMock.sol"; -import { WhenL2, WhenL2NoInitialization } from "@test-sol/utils/WhenL2-08.sol"; contract EpochManagerTest is TestWithUtils08 { EpochManager_WithMocks epochManagerContract; @@ -67,6 +65,12 @@ contract EpochManagerTest is TestWithUtils08 { event OracleAddressSet(address indexed newOracleAddress); event GroupMarkedForProcessing(address indexed group, uint256 indexed epochNumber); event GroupProcessed(address indexed group, uint256 indexed epochNumber); + event ValidatorEpochRewardAllocated( + address indexed validator, + uint256 validatorReward, + address indexed group, + uint256 indexed epochNumber + ); function setUp() public virtual override { 
super.setUp(); @@ -117,6 +121,7 @@ contract EpochManagerTest is TestWithUtils08 { validators.setEpochRewards(validator1, validator1Reward); validators.setEpochRewards(validator2, validator2Reward); + whenL2WithEpochManagerInitialization(); } function setupAndElectValidators() public { @@ -181,18 +186,6 @@ contract EpochManagerTest is TestWithUtils08 { } } -contract EpochManagerTest_L2_NoInit is EpochManagerTest, WhenL2NoInitialization { - function setUp() public virtual override(EpochManagerTest, WhenL2NoInitialization) { - super.setUp(); - } -} - -contract EpochManagerTest_L2 is EpochManagerTest, WhenL2 { - function setUp() public virtual override(EpochManagerTest, WhenL2) { - super.setUp(); - } -} - contract EpochManagerTest_initialize is EpochManagerTest { function test_initialize() public virtual { assertEq(address(epochManagerContract.registry()), REGISTRY_ADDRESS); @@ -215,22 +208,12 @@ contract EpochManagerTest_initializeSystem is EpochManagerTest { function setUp() public override { super.setUp(); - _registerAndElectValidatorsForL2(); - epochManagerEnabler.captureEpochAndValidators(); - setCeloUnreleasedTreasuryBalance(); lastKnownEpochNumber = epochManagerEnabler.lastKnownEpochNumber(); lastKnownFirstBlockOfEpoch = epochManagerEnabler.lastKnownFirstBlockOfEpoch(); lastKnownElectedAccounts = epochManagerEnabler.getlastKnownElectedAccounts(); } function test_processCanBeStarted() public virtual { - vm.prank(address(epochManagerEnabler)); - - epochManagerContract.initializeSystem( - lastKnownEpochNumber, - lastKnownFirstBlockOfEpoch, - lastKnownElectedAccounts - ); ( uint256 _firstEpochBlock, uint256 _lastEpochBlock, @@ -247,12 +230,6 @@ contract EpochManagerTest_initializeSystem is EpochManagerTest { } function test_Reverts_processCannotBeStartedAgain() public virtual { - vm.prank(address(epochManagerEnabler)); - epochManagerContract.initializeSystem( - lastKnownEpochNumber, - lastKnownFirstBlockOfEpoch, - lastKnownElectedAccounts - ); 
vm.prank(address(epochManagerEnabler)); vm.expectRevert("Epoch system already initialized"); epochManagerContract.initializeSystem( @@ -273,20 +250,6 @@ contract EpochManagerTest_initializeSystem is EpochManagerTest { } contract EpochManagerTest_startNextEpochProcess is EpochManagerTest { - function test_Reverts_onL1() public { - vm.expectRevert("Epoch system not initialized"); - epochManagerContract.startNextEpochProcess(); - } -} - -contract EpochManagerTest_startNextEpochProcess_L2_NoInit is EpochManagerTest_L2_NoInit { - function test_Reverts_whenSystemNotInitialized() public { - vm.expectRevert("Epoch system not initialized"); - epochManagerContract.startNextEpochProcess(); - } -} - -contract EpochManagerTest_startNextEpochProcess_L2 is EpochManagerTest_L2 { function test_Reverts_WhenEndOfEpochHasNotBeenReached() public { vm.expectRevert("Epoch is not ready to start"); epochManagerContract.startNextEpochProcess(); @@ -344,11 +307,11 @@ contract EpochManagerTest_startNextEpochProcess_L2 is EpochManagerTest_L2 { function test_ShouldReleaseCorrectAmountToReserve() public { setupAndElectValidators(); epochManagerContract.startNextEpochProcess(); + (uint256 numerator, uint256 denominator) = sortedOracles.getExchangeRate(address(stableToken)); uint256 reserveBalanceAfter = celoToken.balanceOf(reserveAddress); - assertEq( - reserveBalanceAfter, - (stableAmountForRate * (validator1Reward + validator2Reward)) / 1e24 - ); + uint256 CELOequivalent = (denominator * (validator1Reward + validator2Reward)) / numerator; + + assertEq(reserveBalanceAfter, CELOequivalent); } } @@ -375,15 +338,6 @@ contract EpochManagerTest_setEpochDuration is EpochManagerTest { vm.expectRevert("New epoch duration must be greater than zero."); epochManagerContract.setEpochDuration(0); } -} - -contract EpochManagerTest_setEpochDuration_L2 is - EpochManagerTest_L2, - EpochManagerTest_setEpochDuration -{ - function setUp() public override(EpochManagerTest_L2, EpochManagerTest) { - super.setUp(); - } 
function test_Reverts_WhenIsOnEpochProcess() public { setupAndElectValidators(); @@ -423,15 +377,6 @@ contract EpochManagerTest_setOracleAddress is EpochManagerTest { vm.expectRevert("Oracle address cannot be the same."); epochManagerContract.setOracleAddress(address(sortedOracles)); } -} - -contract EpochManagerTest_setOracleAddress_L2 is - EpochManagerTest_L2, - EpochManagerTest_setOracleAddress -{ - function setUp() public override(EpochManagerTest_L2, EpochManagerTest) { - super.setUp(); - } function test_Reverts_WhenIsOnEpochProcess() public { setupAndElectValidators(); @@ -453,47 +398,7 @@ contract EpochManagerTest_sendValidatorPayment is EpochManagerTest { uint256 epochManagerBalanceBefore; - function setUp() public override { - super.setUp(); - _registerAndElectValidatorsForL2(); - validators.setValidatorGroup(group); - validators.setValidator(validator1); - validators.setValidator(validator2); - - address[] memory members = new address[](2); - members[0] = validator1; - members[1] = validator2; - validators.setMembers(group, members); - - stableToken.mint(address(epochManagerContract), paymentAmount * 2); - epochManagerBalanceBefore = stableToken.balanceOf(address(epochManagerContract)); - epochManagerContract._setPaymentAllocation(validator1, paymentAmount); - } - - function test_Reverts_onL1() public { - validators.setCommission(group, fiftyPercent); - vm.prank(validator1); - accountsContract.setPaymentDelegation(beneficiary, fiftyPercent); - - vm.expectRevert("Epoch system not initialized"); - - epochManagerContract.sendValidatorPayment(validator1); - } -} - -contract EpochManagerTest_sendValidatorPayment_L2 is EpochManagerTest_L2 { - address beneficiary = actor("beneficiary"); - - uint256 paymentAmount = 4 ether; - uint256 quarterOfPayment = paymentAmount / 4; - uint256 halfOfPayment = paymentAmount / 2; - uint256 threeQuartersOfPayment = (paymentAmount / 4) * 3; - uint256 twentyFivePercent = 250000000000000000000000; - uint256 fiftyPercent = 
500000000000000000000000; - - uint256 epochManagerBalanceBefore; - - function setUp() public override(EpochManagerTest_L2) { + function setUp() public override(EpochManagerTest) { super.setUp(); validators.setValidatorGroup(group); @@ -615,18 +520,9 @@ contract EpochManagerTest_sendValidatorPayment_L2 is EpochManagerTest_L2 { } contract EpochManagerTest_finishNextEpochProcess is EpochManagerTest { - function test_Reverts_onL1() public { - address[] memory groups = new address[](0); - - vm.expectRevert("Epoch process is not started"); - epochManagerContract.finishNextEpochProcess(groups, groups, groups); - } -} - -contract EpochManagerTest_finishNextEpochProcess_L2 is EpochManagerTest_L2 { uint256 groupEpochRewards = 44e18; - function setUp() public override(EpochManagerTest_L2) { + function setUp() public override(EpochManagerTest) { super.setUp(); setupAndElectValidators(); @@ -725,16 +621,9 @@ contract EpochManagerTest_finishNextEpochProcess_L2 is EpochManagerTest_L2 { } contract EpochManagerTest_setToProcessGroups is EpochManagerTest { - function test_Reverts_onL1() public { - vm.expectRevert("Epoch process is not started"); - epochManagerContract.setToProcessGroups(); - } -} - -contract EpochManagerTest_setToProcessGroups_L2 is EpochManagerTest_L2 { uint256 groupEpochRewards = 44e18; - function setUp() public override(EpochManagerTest_L2) { + function setUp() public override(EpochManagerTest) { super.setUp(); setupAndElectValidators(); @@ -802,16 +691,9 @@ contract EpochManagerTest_setToProcessGroups_L2 is EpochManagerTest_L2 { } contract EpochManagerTest_processGroup is EpochManagerTest { - function test_Reverts_onL1() public { - vm.expectRevert("Indivudual epoch process is not started"); - epochManagerContract.processGroup(group, address(0), address(0)); - } -} - -contract EpochManagerTest_processGroup_L2 is EpochManagerTest_L2 { uint256 groupEpochRewards = 44e18; - function setUp() public override(EpochManagerTest_L2) { + function setUp() public 
override(EpochManagerTest) { super.setUp(); setupAndElectValidators(); @@ -910,15 +792,7 @@ contract EpochManagerTest_processGroup_L2 is EpochManagerTest_L2 { } contract EpochManagerTest_getEpochByNumber is EpochManagerTest { - function test_Reverts_onL1() public { - vm.expectRevert("Epoch system not initialized"); - - epochManagerContract.getEpochByNumber(9); - } -} - -contract EpochManagerTest_getEpochByNumber_L2 is EpochManagerTest_L2 { - function setUp() public override(EpochManagerTest_L2) { + function setUp() public override(EpochManagerTest) { super.setUp(); setupAndElectValidators(); @@ -1001,14 +875,7 @@ contract EpochManagerTest_getEpochByNumber_L2 is EpochManagerTest_L2 { } contract EpochManagerTest_getEpochNumberOfBlock is EpochManagerTest { - function test_Reverts_WhenL1() public { - vm.expectRevert("Epoch system not initialized"); - epochManagerContract.getEpochNumberOfBlock(75); - } -} - -contract EpochManagerTest_getEpochNumberOfBlock_L2 is EpochManagerTest_L2 { - function setUp() public override(EpochManagerTest_L2) { + function setUp() public override(EpochManagerTest) { super.setUp(); } @@ -1021,14 +888,7 @@ contract EpochManagerTest_getEpochNumberOfBlock_L2 is EpochManagerTest_L2 { } contract EpochManagerTest_getEpochByBlockNumber is EpochManagerTest { - function test_Reverts_WhenL1() public { - vm.expectRevert("Epoch system not initialized"); - epochManagerContract.getEpochNumberOfBlock(1000); - } -} - -contract EpochManagerTest_getEpochByBlockNumber_L2 is EpochManagerTest_L2 { - function setUp() public override(EpochManagerTest_L2) { + function setUp() public override(EpochManagerTest) { super.setUp(); setupAndElectValidators(); _travelAndProcess_N_L2Epoch(2); @@ -1049,14 +909,7 @@ contract EpochManagerTest_getEpochByBlockNumber_L2 is EpochManagerTest_L2 { } contract EpochManagerTest_numberOfElectedInCurrentSet is EpochManagerTest { - function test_Reverts_WhenL1() public { - vm.expectRevert("Epoch system not initialized"); - 
epochManagerContract.numberOfElectedInCurrentSet(); - } -} - -contract EpochManagerTest_numberOfElectedInCurrentSet_L2 is EpochManagerTest_L2 { - function setUp() public override(EpochManagerTest_L2) { + function setUp() public override(EpochManagerTest) { super.setUp(); setupAndElectValidators(); } @@ -1069,14 +922,7 @@ contract EpochManagerTest_numberOfElectedInCurrentSet_L2 is EpochManagerTest_L2 } contract EpochManagerTest_getElectedAccounts is EpochManagerTest { - function test_Reverts_WhenL1() public { - vm.expectRevert("Epoch system not initialized"); - epochManagerContract.getElectedAccounts(); - } -} - -contract EpochManagerTest_getElectedAccounts_L2 is EpochManagerTest_L2 { - function setUp() public override(EpochManagerTest_L2) { + function setUp() public override(EpochManagerTest) { super.setUp(); setupAndElectValidators(); } @@ -1089,14 +935,7 @@ contract EpochManagerTest_getElectedAccounts_L2 is EpochManagerTest_L2 { } contract EpochManagerTest_getElectedAccountByIndex is EpochManagerTest { - function test_Reverts_WhenL1() public { - vm.expectRevert("Epoch system not initialized"); - epochManagerContract.getElectedAccountByIndex(0); - } -} - -contract EpochManagerTest_getElectedAccountByIndex_L2 is EpochManagerTest_L2 { - function setUp() public override(EpochManagerTest_L2) { + function setUp() public override(EpochManagerTest) { super.setUp(); setupAndElectValidators(); } @@ -1106,14 +945,7 @@ contract EpochManagerTest_getElectedAccountByIndex_L2 is EpochManagerTest_L2 { } contract EpochManagerTest_getElectedSigners is EpochManagerTest { - function test_Reverts_WhenL1() public { - vm.expectRevert("Epoch system not initialized"); - epochManagerContract.getElectedSigners(); - } -} - -contract EpochManagerTest_getElectedSigners_L2 is EpochManagerTest_L2 { - function setUp() public override(EpochManagerTest_L2) { + function setUp() public override(EpochManagerTest) { super.setUp(); setupAndElectValidators(); } @@ -1129,14 +961,7 @@ contract 
EpochManagerTest_getElectedSigners_L2 is EpochManagerTest_L2 { } contract EpochManagerTest_getElectedSignerByIndex is EpochManagerTest { - function test_Reverts_WhenL1() public { - vm.expectRevert("Epoch system not initialized"); - epochManagerContract.getElectedSignerByIndex(1); - } -} - -contract EpochManagerTest_getElectedSignerByIndex_L2 is EpochManagerTest_L2 { - function setUp() public override(EpochManagerTest_L2) { + function setUp() public override(EpochManagerTest) { super.setUp(); setupAndElectValidators(); } diff --git a/packages/protocol/test-sol/unit/common/EpochManagerEnabler.t.sol b/packages/protocol/test-sol/unit/common/EpochManagerEnabler.t.sol index 30df64068c4..0d53d1cd1c1 100644 --- a/packages/protocol/test-sol/unit/common/EpochManagerEnabler.t.sol +++ b/packages/protocol/test-sol/unit/common/EpochManagerEnabler.t.sol @@ -9,7 +9,6 @@ import { EpochRewardsMock08 } from "@celo-contracts-8/governance/test/EpochRewar import { TestWithUtils08 } from "@test-sol/TestWithUtils08.sol"; import { EpochManagerEnablerMock } from "@test-sol/mocks/EpochManagerEnablerMock.sol"; import "@celo-contracts-8/common/mocks/EpochManager_WithMocks.sol"; -import { ValidatorsMock } from "@test-sol/unit/governance/validators/mocks/ValidatorsMock.sol"; import "@test-sol/utils/WhenL2-08.sol"; contract EpochManagerEnablerTest is TestWithUtils08 { diff --git a/packages/protocol/test-sol/unit/common/FeeCurrencyDirectory.t.sol b/packages/protocol/test-sol/unit/common/FeeCurrencyDirectory.t.sol index 51da01075f3..938ace7b91b 100644 --- a/packages/protocol/test-sol/unit/common/FeeCurrencyDirectory.t.sol +++ b/packages/protocol/test-sol/unit/common/FeeCurrencyDirectory.t.sol @@ -2,7 +2,6 @@ pragma solidity >=0.8.7 <0.8.20; import { TestWithUtils08 } from "@test-sol/TestWithUtils08.sol"; -import "@test-sol/utils/WhenL2-08.sol"; import "@celo-contracts-8/common/FeeCurrencyDirectory.sol"; import "@celo-contracts-8/common/mocks/MockOracle.sol"; @@ -23,12 +22,7 @@ contract 
FeeCurrencyDirectoryTest is TestWithUtils08 { directory = new FeeCurrencyDirectory(true); directory.initialize(); - } -} - -contract FeeCurrencyDirectoryTest_L2 is FeeCurrencyDirectoryTest, WhenL2 { - function setUp() public virtual override(FeeCurrencyDirectoryTest, WhenL2) { - super.setUp(); + whenL2WithEpochManagerInitialization(); } } @@ -84,12 +78,6 @@ contract TestSetCurrencyConfig is FeeCurrencyDirectoryTest { } } -contract TestSetCurrencyConfig_L2 is FeeCurrencyDirectoryTest_L2, TestSetCurrencyConfig { - function setUp() public override(FeeCurrencyDirectoryTest, FeeCurrencyDirectoryTest_L2) { - super.setUp(); - } -} - contract TestRemoveCurrencies is FeeCurrencyDirectoryTest { function setUp() public virtual override { super.setUp(); @@ -133,12 +121,6 @@ contract TestRemoveCurrencies is FeeCurrencyDirectoryTest { } } -contract TestRemoveCurrencies_L2 is FeeCurrencyDirectoryTest_L2, TestRemoveCurrencies { - function setUp() public override(TestRemoveCurrencies, FeeCurrencyDirectoryTest_L2) { - super.setUp(); - } -} - contract TestGetExchangeRate is FeeCurrencyDirectoryTest { address token; @@ -160,9 +142,3 @@ contract TestGetExchangeRate is FeeCurrencyDirectoryTest { directory.getExchangeRate(address(4)); } } - -contract TestGetExchangeRate_L2 is FeeCurrencyDirectoryTest_L2, TestGetExchangeRate { - function setUp() public override(TestGetExchangeRate, FeeCurrencyDirectoryTest_L2) { - super.setUp(); - } -} diff --git a/packages/protocol/test-sol/unit/common/FeeCurrencyWhitelist.t.sol b/packages/protocol/test-sol/unit/common/FeeCurrencyWhitelist.t.sol index 6dae11cb09f..c66e8121a51 100644 --- a/packages/protocol/test-sol/unit/common/FeeCurrencyWhitelist.t.sol +++ b/packages/protocol/test-sol/unit/common/FeeCurrencyWhitelist.t.sol @@ -1,27 +1,23 @@ // SPDX-License-Identifier: UNLICENSED pragma solidity ^0.5.13; -import "celo-foundry/Test.sol"; +import { TestWithUtils } from "@test-sol/TestWithUtils.sol"; import "@celo-contracts/common/FeeCurrencyWhitelist.sol"; 
-import { TestConstants } from "@test-sol/constants.sol"; - -contract FeeCurrencyWhitelistTest is Test, TestConstants { +contract FeeCurrencyWhitelistTest is TestWithUtils { FeeCurrencyWhitelist feeCurrencyWhitelist; address nonOwner; address owner; function setUp() public { + super.setUp(); + whenL2WithEpochManagerInitialization(); owner = address(this); nonOwner = actor("nonOwner"); feeCurrencyWhitelist = new FeeCurrencyWhitelist(true); feeCurrencyWhitelist.initialize(); } - - function _whenL2() public { - deployCodeTo("Registry.sol", abi.encode(false), PROXY_ADMIN_ADDRESS); - } } contract FeeCurrencyWhitelistInitialize is FeeCurrencyWhitelistTest { @@ -37,94 +33,28 @@ contract FeeCurrencyWhitelistInitialize is FeeCurrencyWhitelistTest { } contract FeeCurrencyWhitelistAddToken is FeeCurrencyWhitelistTest { - function test_ShouldAllowTheOwnerToAddAToken() public { - feeCurrencyWhitelist.addToken(address(1)); - address[] memory whitelist = feeCurrencyWhitelist.getWhitelist(); - assertEq(whitelist.length, 1); - assertEq(whitelist[0], address(1)); - } - - function test_ShouldRevert_WhenNonOwnerAddsAToken() public { - vm.expectRevert("Ownable: caller is not the owner"); - vm.prank(nonOwner); - feeCurrencyWhitelist.addToken(address(1)); - } - - function test_Reverts_WhenCalledOnL2() public { - _whenL2(); + function test_Reverts_WhenCalled() public { vm.expectRevert("This method is no longer supported in L2."); feeCurrencyWhitelist.addToken(address(1)); } } contract FeeCurrencyWhitelistRemoveToken is FeeCurrencyWhitelistTest { - function setUp() public { - super.setUp(); - feeCurrencyWhitelist.addToken(address(1)); - feeCurrencyWhitelist.addToken(address(2)); - feeCurrencyWhitelist.addToken(address(3)); - } - - function test_ShouldRemoveToken() public { - feeCurrencyWhitelist.removeToken(address(2), 1); - address[] memory whitelist = feeCurrencyWhitelist.getWhitelist(); - assertEq(whitelist.length, 2); - assertEq(whitelist[0], address(1)); - assertEq(whitelist[1], 
address(3)); - } - - function test_ShouldRevert_WhenIndexIsWrong() public { - vm.expectRevert("Index does not match"); - feeCurrencyWhitelist.removeToken(address(2), 2); - } - - function test_ShouldRevert_WhenNonOwnerRemovesToken() public { - vm.expectRevert("Ownable: caller is not the owner"); - vm.prank(nonOwner); - feeCurrencyWhitelist.removeToken(address(2), 1); - } - - function test_Reverts_WhenCalledOnL2() public { - _whenL2(); + function test_Reverts_WhenCalled() public { vm.expectRevert("This method is no longer supported in L2."); feeCurrencyWhitelist.removeToken(address(2), 1); } } contract FeeCurrencyWhitelist_whitelist is FeeCurrencyWhitelistTest { - function setUp() public { - super.setUp(); - feeCurrencyWhitelist.addToken(address(1)); - } - - function test_ShouldRetrieveAToken() public { - address token = feeCurrencyWhitelist.whitelist(0); - assertEq(token, address(1)); - } - - function test_Reverts_WhenCalledOnL2() public { - _whenL2(); + function test_Reverts_WhenCalled() public { vm.expectRevert("This method is no longer supported in L2."); feeCurrencyWhitelist.whitelist(0); } } contract FeeCurrencyWhitelist_getWhitelist is FeeCurrencyWhitelistTest { - function setUp() public { - super.setUp(); - feeCurrencyWhitelist.addToken(address(1)); - feeCurrencyWhitelist.addToken(address(2)); - } - - function test_ShouldRetrieveAToken() public { - address[] memory tokens = feeCurrencyWhitelist.getWhitelist(); - assertEq(tokens.length, 2); - assertEq(tokens[0], address(1)); - assertEq(tokens[1], address(2)); - } - - function test_Reverts_WhenCalledOnL2() public { - _whenL2(); + function test_Reverts_WhenCalled() public { vm.expectRevert("This method is no longer supported in L2."); feeCurrencyWhitelist.getWhitelist(); } diff --git a/packages/protocol/test-sol/unit/common/FeeHandler.t.sol b/packages/protocol/test-sol/unit/common/FeeHandler.t.sol index e3757e488d7..78882e4a195 100644 --- a/packages/protocol/test-sol/unit/common/FeeHandler.t.sol +++ 
b/packages/protocol/test-sol/unit/common/FeeHandler.t.sol @@ -7,8 +7,7 @@ pragma experimental ABIEncoderV2; import "@celo-contracts/common/FeeHandler.sol"; -import { TestWithUtils } from "@test-sol/TestWithUtils.sol"; -import "@test-sol/utils/WhenL2.sol"; +import "@test-sol/TestWithUtils.sol"; import { Exchange } from "@mento-core/contracts/Exchange.sol"; import { StableToken } from "@mento-core/contracts/StableToken.sol"; @@ -121,9 +120,9 @@ contract FeeHandlerTest is TestWithUtils { mockReserve.addToken(address(stableTokenEUR)); address[] memory tokenAddresses; - uint256[] memory newMininumReports; + uint256[] memory newMinimumReports; - mentoSeller.initialize(address(registry), tokenAddresses, newMininumReports); + mentoSeller.initialize(address(registry), tokenAddresses, newMinimumReports); celoToken.initialize(address(registry)); stableToken.initialize( "Celo Dollar", @@ -199,6 +198,7 @@ contract FeeHandlerTest is TestWithUtils { new uint256[](0), new uint256[](0) ); + whenL2WithEpochManagerInitialization(); } function fundReserve() public { @@ -206,8 +206,6 @@ contract FeeHandlerTest is TestWithUtils { } } -contract FeeHandlerTest_L2 is WhenL2, FeeHandlerTest {} - contract FeeHandlerTest_Initialize is FeeHandlerTest { function test_Reverts_WhenAlreadyInitialized() public { vm.expectRevert("contract already initialized"); @@ -271,11 +269,6 @@ contract FeeHandlerTest_SetCarbonFraction is FeeHandlerTest { } } -contract FeeHandlerTest_SetCarbonFraction_L2 is - FeeHandlerTest_L2, - FeeHandlerTest_SetCarbonFraction -{} - // TODO change beneficiary allocation contract FeeHandlerTest_changeOtherBeneficiaryAllocation is FeeHandlerTest { function setUp() public { @@ -311,11 +304,6 @@ contract FeeHandlerTest_changeOtherBeneficiaryAllocation is FeeHandlerTest { } } -contract FeeHandlerTest_changeOtherBeneficiaryAllocation_L2 is - FeeHandlerTest_L2, - FeeHandlerTest_changeOtherBeneficiaryAllocation -{} - contract FeeHandlerTest_SetHandler is FeeHandlerTest { function 
test_Reverts_WhenCallerNotOwner() public { vm.prank(user); @@ -333,8 +321,6 @@ contract FeeHandlerTest_SetHandler is FeeHandlerTest { } } -contract FeeHandlerTest_SetHandler_L2 is FeeHandlerTest_L2, FeeHandlerTest_SetHandler {} - contract FeeHandlerTest_AddToken is FeeHandlerTest { function test_Reverts_WhenCallerNotOwner() public { vm.prank(user); @@ -358,8 +344,6 @@ contract FeeHandlerTest_AddToken is FeeHandlerTest { } } -contract FeeHandlerTest_AddToken_L2 is FeeHandlerTest_L2, FeeHandlerTest_AddToken {} - contract FeeHandlerTest_RemoveToken is FeeHandlerTest { function test_Reverts_WhenCallerNotOwner() public { vm.prank(user); @@ -383,8 +367,6 @@ contract FeeHandlerTest_RemoveToken is FeeHandlerTest { } } -contract FeeHandlerTest_RemoveToken_L2 is FeeHandlerTest_L2, FeeHandlerTest_RemoveToken {} - contract FeeHandlerTest_DeactivateAndActivateToken is FeeHandlerTest { function test_Reverts_WhenActivateCallerNotOwner() public { vm.prank(user); @@ -412,11 +394,6 @@ contract FeeHandlerTest_DeactivateAndActivateToken is FeeHandlerTest { } } -contract FeeHandlerTest_DeactivateAndActivateToken_L2 is - FeeHandlerTest_L2, - FeeHandlerTest_DeactivateAndActivateToken -{} - contract FeeHandlerTest_SetFeeBeneficiary is FeeHandlerTest { function test_Reverts_WhenCallerNotOwner() public { vm.prank(user); @@ -436,11 +413,6 @@ contract FeeHandlerTest_SetFeeBeneficiary is FeeHandlerTest { } } -contract FeeHandlerTest_SetFeeBeneficiary_L2 is - FeeHandlerTest_L2, - FeeHandlerTest_SetFeeBeneficiary -{} - contract FeeHandlerTestAbstract is FeeHandlerTest { function addAndActivateToken(address token, address handler) public { feeHandler.addToken(token, handler); @@ -469,8 +441,6 @@ contract FeeHandlerTestAbstract is FeeHandlerTest { } } -contract FeeHandlerTestAbstract_L2 is FeeHandlerTest_L2, FeeHandlerTestAbstract {} - contract FeeHandlerTest_AddOtherBeneficiary is FeeHandlerTestAbstract { // TODO only owner function test_addsSucsesfully() public { @@ -536,11 +506,6 @@ contract 
FeeHandlerTest_AddOtherBeneficiary is FeeHandlerTestAbstract { } } -contract FeeHandlerTest_AddOtherBeneficiary_L2 is - FeeHandlerTestAbstract_L2, - FeeHandlerTest_AddOtherBeneficiary -{} - contract FeeHandlerTest_Distribute is FeeHandlerTestAbstract { function setUp() public { super.setUp(); @@ -610,8 +575,6 @@ contract FeeHandlerTest_Distribute is FeeHandlerTestAbstract { } } -contract FeeHandlerTest_Distribute_L2 is FeeHandlerTestAbstract_L2, FeeHandlerTest_Distribute {} - contract FeeHandlerTest_Distribute_WhenOtherBeneficiaries is FeeHandlerTestAbstract { function setUp() public { super.setUp(); @@ -660,11 +623,6 @@ contract FeeHandlerTest_Distribute_WhenOtherBeneficiaries is FeeHandlerTestAbstr } } -contract FeeHandlerTest_Distribute_WhenOtherBeneficiaries_L2 is - FeeHandlerTestAbstract_L2, - FeeHandlerTest_Distribute_WhenOtherBeneficiaries -{} - contract FeeHandlerTest_BurnCelo is FeeHandlerTestAbstract { function setUp() public { super.setUp(); @@ -701,8 +659,6 @@ contract FeeHandlerTest_BurnCelo is FeeHandlerTestAbstract { } } -contract FeeHandlerTest_BurnCelo_L2 is FeeHandlerTestAbstract_L2, FeeHandlerTest_BurnCelo {} - contract FeeHandlerTest_SellMentoTokensAbstract is FeeHandlerTestAbstract { function setUp() public { super.setUp(); @@ -711,11 +667,6 @@ contract FeeHandlerTest_SellMentoTokensAbstract is FeeHandlerTestAbstract { } } -contract FeeHandlerTest_SellMentoTokensAbstract_L2 is - FeeHandlerTestAbstract_L2, - FeeHandlerTest_SellMentoTokensAbstract -{} - contract FeeHandlerTest_SellMentoTokens_WhenTokenEnabled is FeeHandlerTest_SellMentoTokensAbstract { function setUp() public { super.setUp(); @@ -853,11 +804,6 @@ contract FeeHandlerTest_SellMentoTokens_WhenTokenEnabled is FeeHandlerTest_SellM } } -contract FeeHandlerTest_SellMentoTokens_WhenTokenEnabled_L2 is - FeeHandlerTest_SellMentoTokensAbstract_L2, - FeeHandlerTest_SellMentoTokens_WhenTokenEnabled -{} - contract FeeHandlerTest_SellMentoTokens_WhenTokenNotEnabled is 
FeeHandlerTest_SellMentoTokensAbstract { @@ -868,11 +814,6 @@ contract FeeHandlerTest_SellMentoTokens_WhenTokenNotEnabled is } } -contract FeeHandlerTest_SellMentoTokens_WhenTokenNotEnabled_L2 is - FeeHandlerTest_SellMentoTokensAbstract_L2, - FeeHandlerTest_SellMentoTokens_WhenTokenNotEnabled -{} - contract FeeHandlerTest_SellNonMentoTokens is FeeHandlerTestAbstract { uint256 deadline; @@ -1026,11 +967,6 @@ contract FeeHandlerTest_SellNonMentoTokens is FeeHandlerTestAbstract { } } -contract FeeHandlerTest_SellNonMentoTokens_L2 is - FeeHandlerTestAbstract_L2, - FeeHandlerTest_SellNonMentoTokens -{} - contract FeeHandlerTest_HandleCelo is FeeHandlerTestAbstract { function setUp() public { super.setUp(); @@ -1081,8 +1017,6 @@ contract FeeHandlerTest_HandleCelo is FeeHandlerTestAbstract { } } -contract FeeHandlerTest_HandleCelo_L2 is FeeHandlerTestAbstract_L2, FeeHandlerTest_HandleCelo {} - contract FeeHandlerTest_HandleMentoTokens is FeeHandlerTestAbstract { function setUp() public { super.setUp(); @@ -1111,11 +1045,6 @@ contract FeeHandlerTest_HandleMentoTokens is FeeHandlerTestAbstract { } } -contract FeeHandlerTest_HandleMentoTokens_L2 is - FeeHandlerTestAbstract_L2, - FeeHandlerTest_HandleMentoTokens -{} - contract FeeHandlerTest_HandleAll is FeeHandlerTestAbstract { function setUp() public { super.setUp(); @@ -1149,8 +1078,6 @@ contract FeeHandlerTest_HandleAll is FeeHandlerTestAbstract { } } -contract FeeHandlerTest_HandleAll_L2 is FeeHandlerTestAbstract_L2, FeeHandlerTest_HandleAll {} - contract FeeHandlerTest_Transfer is FeeHandlerTest { modifier mintToken(uint256 amount) { tokenA.mint(address(feeHandler), amount); @@ -1169,8 +1096,6 @@ contract FeeHandlerTest_Transfer is FeeHandlerTest { } } -contract FeeHandlerTest_Transfer_L2 is FeeHandlerTest_L2, FeeHandlerTest_Transfer {} - contract FeeHandlerTest_SetDailySellLimit is FeeHandlerTest { uint256 newCeloAmountForRate; @@ -1197,11 +1122,6 @@ contract FeeHandlerTest_SetDailySellLimit is FeeHandlerTest { } } 
-contract FeeHandlerTest_SetDailySellLimit_L2 is - FeeHandlerTest_L2, - FeeHandlerTest_SetDailySellLimit -{} - contract FeeHandlerTest_SetMaxSlippage is FeeHandlerTest { uint256 newMaxSlipapge; @@ -1228,8 +1148,6 @@ contract FeeHandlerTest_SetMaxSlippage is FeeHandlerTest { } } -contract FeeHandlerTest_SetMaxSlippage_L2 is FeeHandlerTest_L2, FeeHandlerTest_SetMaxSlippage {} - contract FeeHandlerTest_RemoveOtherBeneficiary is FeeHandlerTestAbstract { event BeneficiaryRemoved(address beneficiary); function setUp() public { @@ -1268,11 +1186,6 @@ contract FeeHandlerTest_RemoveOtherBeneficiary is FeeHandlerTestAbstract { } } -contract FeeHandlerTest_RemoveOtherBeneficiary_L2 is - FeeHandlerTestAbstract_L2, - FeeHandlerTest_RemoveOtherBeneficiary -{} - contract FeeHandlerTest_SetBeneficiaryFraction is FeeHandlerTestAbstract { function setUp() public { super.setUp(); @@ -1306,11 +1219,6 @@ contract FeeHandlerTest_SetBeneficiaryFraction is FeeHandlerTestAbstract { } } -contract FeeHandlerTest_SetBeneficiaryFraction_L2 is - FeeHandlerTestAbstract_L2, - FeeHandlerTest_SetBeneficiaryFraction -{} - contract FeeHandlerTest_SetBeneficiaryName is FeeHandlerTestAbstract { function setUp() public { super.setUp(); @@ -1344,8 +1252,3 @@ contract FeeHandlerTest_SetBeneficiaryName is FeeHandlerTestAbstract { feeHandler.setBeneficiaryName(op, "OP revenue share updated"); } } - -contract FeeHandlerTest_SetBeneficiaryName_L2 is - FeeHandlerTestAbstract_L2, - FeeHandlerTest_SetBeneficiaryName -{} diff --git a/packages/protocol/test-sol/unit/common/FeeHandlerSeller.t.sol b/packages/protocol/test-sol/unit/common/FeeHandlerSeller.t.sol index 235fae18f01..de89c34249c 100644 --- a/packages/protocol/test-sol/unit/common/FeeHandlerSeller.t.sol +++ b/packages/protocol/test-sol/unit/common/FeeHandlerSeller.t.sol @@ -10,7 +10,6 @@ import { MentoFeeHandlerSeller } from "@celo-contracts/common/MentoFeeHandlerSel import { UniswapFeeHandlerSeller } from 
"@celo-contracts/common/UniswapFeeHandlerSeller.sol"; import { TestWithUtils } from "@test-sol/TestWithUtils.sol"; -import "@test-sol/utils/WhenL2.sol"; contract FeeHandlerSellerTest is TestWithUtils { // Actors @@ -44,11 +43,10 @@ contract FeeHandlerSellerTest is TestWithUtils { feeHandlerSellerInstances.push(mentoFeeHandlerSeller); feeHandlerSellerInstances.push(uniswapFeeHandlerSeller); + whenL2WithEpochManagerInitialization(); } } -contract FeeHandlerSellerTest_L2 is WhenL2, FeeHandlerSellerTest {} - contract FeeHandlerSellerTest_Transfer is FeeHandlerSellerTest { uint256 constant ZERO_CELOTOKEN = 0; uint256 constant ONE_CELOTOKEN = 1e18; @@ -94,11 +92,6 @@ contract FeeHandlerSellerTest_Transfer is FeeHandlerSellerTest { } } -contract FeeHandlerSellerTest_Transfer_L2 is - FeeHandlerSellerTest_L2, - FeeHandlerSellerTest_Transfer -{} - contract FeeHandlerSellerTest_SetMinimumReports is FeeHandlerSellerTest { address ARBITRARY_TOKEN_ADDRESS = actor("Arbitrary Token Address"); uint256 constant ARBITRARY_NR_OF_MINIMUM_REPORTS = 15; @@ -133,11 +126,6 @@ contract FeeHandlerSellerTest_SetMinimumReports is FeeHandlerSellerTest { } } -contract FeeHandlerSellerTest_SetMinimumReports_L2 is - FeeHandlerSellerTest_L2, - FeeHandlerSellerTest_SetMinimumReports -{} - contract FeeHandlerSellerTest_setOracleAddress is FeeHandlerSellerTest { function test_Reverts_WhenCalledByNonOwner() public { vm.prank(NON_OWNER_ADDRESS); @@ -161,8 +149,3 @@ contract FeeHandlerSellerTest_setOracleAddress is FeeHandlerSellerTest { uniswapFeeHandlerSeller.setOracleAddress(address(celoToken), oracle); } } - -contract FeeHandlerSellerTest_setOracleAddress_L2 is - FeeHandlerSellerTest_L2, - FeeHandlerSellerTest_setOracleAddress -{} diff --git a/packages/protocol/test-sol/unit/common/GasPriceMinimum.t.sol b/packages/protocol/test-sol/unit/common/GasPriceMinimum.t.sol index d00d81ed952..c73289859e9 100644 --- a/packages/protocol/test-sol/unit/common/GasPriceMinimum.t.sol +++ 
b/packages/protocol/test-sol/unit/common/GasPriceMinimum.t.sol @@ -53,12 +53,7 @@ contract GasPriceMinimumTest is TestWithUtils08 { adjustmentSpeed, 0 ); - } -} - -contract GasPriceMinimumTest_L2 is GasPriceMinimumTest, WhenL2 { - function setUp() public override(GasPriceMinimumTest, WhenL2) { - super.setUp(); + whenL2WithEpochManagerInitialization(); } } @@ -67,18 +62,6 @@ contract GasPriceMinimumTest_initialize is GasPriceMinimumTest { assertEq(gasPriceMinimum.owner(), owner); } - function test_shouldHaveTargetDensity() public { - assertEq(gasPriceMinimum.targetDensity(), targetDensity); - } - - function test_shouldHaveAdjustmentSpeed() public { - assertEq(gasPriceMinimum.adjustmentSpeed(), adjustmentSpeed); - } - - function test_shouldHaveGasPriceMinimumFloor() public { - assertEq(gasPriceMinimum.gasPriceMinimumFloor(), gasPriceMinimumFloor); - } - function test_shouldRevertWhenCalledAgain() public { vm.expectRevert("contract already initialized"); gasPriceMinimum.initialize( @@ -92,35 +75,6 @@ contract GasPriceMinimumTest_initialize is GasPriceMinimumTest { } contract GasPriceMinimumTest_setAdjustmentSpeed is GasPriceMinimumTest { - using FixidityLib for FixidityLib.Fraction; - - uint256 newAdjustmentSpeed = FixidityLib.newFixedFraction(1, 3).unwrap(); - - function test_shouldSetTheAdjustmentSpeed() public { - gasPriceMinimum.setAdjustmentSpeed(newAdjustmentSpeed); - - assertEq(gasPriceMinimum.adjustmentSpeed(), newAdjustmentSpeed); - } - - function test_Emits_AdjustmentSpeedSetEvent() public { - vm.expectEmit(true, false, false, false); - emit AdjustmentSpeedSet(newAdjustmentSpeed); - gasPriceMinimum.setAdjustmentSpeed(newAdjustmentSpeed); - } - - function test_shouldRevertWhenTheProvidedFractionIsGreaterThanOne() public { - vm.expectRevert("adjustment speed must be smaller than 1"); - gasPriceMinimum.setAdjustmentSpeed(FixidityLib.newFixedFraction(3, 2).unwrap()); - } - - function test_shouldRevertWhenCalledByNonOwner() public { - vm.prank(nonOwner); - 
vm.expectRevert("Ownable: caller is not the owner"); - gasPriceMinimum.setAdjustmentSpeed(newAdjustmentSpeed); - } -} - -contract GasPriceMinimumTest_setAdjustmentSpeed_L2 is GasPriceMinimumTest_L2 { uint256 newAdjustmentSpeed = 5; function test_Reverts_WhenL2() public { @@ -130,34 +84,6 @@ contract GasPriceMinimumTest_setAdjustmentSpeed_L2 is GasPriceMinimumTest_L2 { } contract GasPriceMinimumTest_setTargetDensity is GasPriceMinimumTest { - using FixidityLib for FixidityLib.Fraction; - - uint256 newTargetDensity = FixidityLib.newFixedFraction(1, 3).unwrap(); - - function test_shouldSetTargetDensity() public { - gasPriceMinimum.setTargetDensity(newTargetDensity); - assertEq(gasPriceMinimum.targetDensity(), newTargetDensity); - } - - function test_Emits_TargetDensitySetEvent() public { - vm.expectEmit(true, true, true, true); - emit TargetDensitySet(newTargetDensity); - gasPriceMinimum.setTargetDensity(newTargetDensity); - } - - function test_ShouldRevertWhenProvidedFractionIsGreaterThanOne() public { - vm.expectRevert("target density must be smaller than 1"); - gasPriceMinimum.setTargetDensity(FixidityLib.newFixedFraction(3, 2).unwrap()); - } - - function test_ShouldRevertWhenCalledByNonOwner() public { - vm.prank(nonOwner); - vm.expectRevert("Ownable: caller is not the owner"); - gasPriceMinimum.setTargetDensity(newTargetDensity); - } -} - -contract GasPriceMinimumTest_setTargetDensity_L2 is GasPriceMinimumTest_L2 { function test_Reverts_WhenL2() public { vm.expectRevert("This method is no longer supported in L2."); gasPriceMinimum.setTargetDensity(5); @@ -165,33 +91,6 @@ contract GasPriceMinimumTest_setTargetDensity_L2 is GasPriceMinimumTest_L2 { } contract GasPriceMinimumTest_setGasPriceMinimumFloor is GasPriceMinimumTest { - uint256 newGasPriceMinimumFloor = 150; - - function test_ShouldSetGasPriceMinimumFloor() public { - gasPriceMinimum.setGasPriceMinimumFloor(newGasPriceMinimumFloor); - - assertEq(gasPriceMinimum.gasPriceMinimumFloor(), 
newGasPriceMinimumFloor); - } - - function test_Emits_GasPriceMinimumFloorSet() public { - vm.expectEmit(true, true, true, true); - emit GasPriceMinimumFloorSet(newGasPriceMinimumFloor); - gasPriceMinimum.setGasPriceMinimumFloor(newGasPriceMinimumFloor); - } - - function test_shouldRevertWhenProvidedFloorIsZero() public { - vm.expectRevert("gas price minimum floor must be greater than zero"); - gasPriceMinimum.setGasPriceMinimumFloor(0); - } - - function test_shouldRevertWhenCalledByNonOwner() public { - vm.prank(nonOwner); - vm.expectRevert("Ownable: caller is not the owner"); - gasPriceMinimum.setGasPriceMinimumFloor(newGasPriceMinimumFloor); - } -} - -contract GasPriceMinimumTest_setGasPriceMinimumFloor_L2 is GasPriceMinimumTest_L2 { function test_Reverts_WhenL2() public { vm.expectRevert("This method is no longer supported in L2."); gasPriceMinimum.setGasPriceMinimumFloor(5); @@ -199,106 +98,6 @@ contract GasPriceMinimumTest_setGasPriceMinimumFloor_L2 is GasPriceMinimumTest_L } contract GasPriceMinimumTest_getUpdatedGasPriceMinimum is GasPriceMinimumTest { - using FixidityLib for FixidityLib.Fraction; - uint256 nonce = 0; - - function getExpectedUpdatedGasPriceMinimum( - uint256 gasPriceMinFloor, - uint256 previousGasPriceMinimum, - FixidityLib.Fraction memory density, - FixidityLib.Fraction memory _targetDensity, - FixidityLib.Fraction memory _adjustmentSpeed - ) public pure returns (uint256) { - uint256 one = 1; - uint256 newGasPriceMin = previousGasPriceMinimum * - one + - FixidityLib.fromFixed(_adjustmentSpeed) * - FixidityLib.fromFixed(density) - - FixidityLib.fromFixed(_targetDensity); - - return newGasPriceMin < gasPriceMinFloor ? 
gasPriceMinFloor : newGasPriceMin; - } - - function random(uint256 minNumber, uint256 maxNumber) public returns (uint256) { - nonce += 1; - if (minNumber > 0) { - return - (uint256(keccak256(abi.encodePacked(nonce, msg.sender, blockhash(block.number - 1)))) % - (maxNumber - 1)) + 1; - } - return (uint256(keccak256(abi.encodePacked(nonce, msg.sender, blockhash(block.number - 1)))) % - maxNumber); - } - - function test_shouldReturn25PercentMoreThanInitialMinimumAndShouldNotBeLimitedByGasPriceMinimumFloorAsAWhole_WhenTheBlockIsFull() - public - { - uint256 currentGasPriceMinimum = gasPriceMinimum.gasPriceMinimum(); - - gasPriceMinimum.setGasPriceMinimumFloor(currentGasPriceMinimum); - - uint256 expectedUpdatedGasPriceMinimum = (currentGasPriceMinimum * 5) / 4 + 1; - - assertEq(gasPriceMinimum.getUpdatedGasPriceMinimum(1, 1), expectedUpdatedGasPriceMinimum); - } - - function test_shouldReturn25PercentLessThanInitialMinimumButShouldBeLimitedByGasPriceMinimumFloorIfNewGasLiesBelowMinimum_WhenTheBlockIsEmtpy() - public - { - uint256 currentGasPriceMinimum = gasPriceMinimum.gasPriceMinimum(); - - gasPriceMinimum.setGasPriceMinimumFloor(currentGasPriceMinimum); - - uint256 expectedCappedUpdatedGasPriceMinimum = gasPriceMinimum.gasPriceMinimumFloor(); - - assertEq(gasPriceMinimum.getUpdatedGasPriceMinimum(0, 1), expectedCappedUpdatedGasPriceMinimum); - } - - function test_shouldReturn25PercentLessThanInitialMinimumAndShouldNotBeLimitedByGasPriceMinimumFloorIfNewGasPriceLiesAboveMinimum_WhenTheBlockIsEmtpy() - public - { - uint256 currentGasPriceMinimum = gasPriceMinimum.gasPriceMinimum(); - - gasPriceMinimum.setGasPriceMinimumFloor(1); - - uint256 expectedUpdatedGasPriceMinimum = (currentGasPriceMinimum * 3) / 4 + 1; - - assertEq(gasPriceMinimum.getUpdatedGasPriceMinimum(0, 1), expectedUpdatedGasPriceMinimum); - } - - function test_shouldReturnAnUpdatedGasPriceMinimumThatMatchesARandomNumber_WhenTheFullnessOfTheBlockIsRandom() - public - { - uint256 numIterations = 100; - 
uint256 currentGasPriceMinimum = gasPriceMinimum.gasPriceMinimum(); - uint256 gasPriceMinFloor = currentGasPriceMinimum; - gasPriceMinimum.setGasPriceMinimumFloor(gasPriceMinFloor); - - for (uint256 i = 0; i < numIterations; i++) { - uint256 currGas = gasPriceMinimum.gasPriceMinimum(); - - uint256 blockGasLimit = random(1, 105); - uint256 gasUsed = random(0, 1) * blockGasLimit; - - uint256 actualUpdatedGasPriceMinimum = gasPriceMinimum.getUpdatedGasPriceMinimum( - gasUsed, - blockGasLimit - ); - - uint256 expectedUpdatedGasPriceMinimum = getExpectedUpdatedGasPriceMinimum( - gasPriceMinFloor, - currGas, - FixidityLib.newFixedFraction(gasUsed, blockGasLimit), - targetDensityFraction, - adjustmentSpeedFraction - ); - - assertEq(actualUpdatedGasPriceMinimum, expectedUpdatedGasPriceMinimum); - } - } -} - -contract GasPriceMinimumTest_getUpdatedGasPriceMinimum_L2 is GasPriceMinimumTest_L2 { function test_shouldRevert_WhenCalledOnL2() public { vm.expectRevert("This method is no longer supported in L2."); gasPriceMinimum.getUpdatedGasPriceMinimum(0, 1); @@ -306,13 +105,6 @@ contract GasPriceMinimumTest_getUpdatedGasPriceMinimum_L2 is GasPriceMinimumTest } contract GasPriceMinimumTest_gasPriceMinimumFloor is GasPriceMinimumTest { - function test_shouldReturnTheGasPriceMinimumFloor() public { - uint256 gasPriceMinFloor = gasPriceMinimum.gasPriceMinimumFloor(); - assertEq(gasPriceMinFloor, gasPriceMinimumFloor); - } -} - -contract GasPriceMinimumTest_gasPriceMinimumFloor_L2 is GasPriceMinimumTest_L2 { function test_shouldRevert_WhenCalledOnL2() public { vm.expectRevert("This method is no longer supported in L2."); gasPriceMinimum.gasPriceMinimumFloor(); @@ -320,13 +112,6 @@ contract GasPriceMinimumTest_gasPriceMinimumFloor_L2 is GasPriceMinimumTest_L2 { } contract GasPriceMinimumTest_targetDensity is GasPriceMinimumTest { - function test_shouldReturnTheTargetDensity() public { - uint256 realTargetDensity = gasPriceMinimum.targetDensity(); - assertEq(realTargetDensity, 
targetDensity); - } -} - -contract GasPriceMinimumTest_targetDensity_L2 is GasPriceMinimumTest_L2 { function test_shouldRevert_WhenCalledOnL2() public { vm.expectRevert("This method is no longer supported in L2."); gasPriceMinimum.targetDensity(); @@ -334,13 +119,6 @@ contract GasPriceMinimumTest_targetDensity_L2 is GasPriceMinimumTest_L2 { } contract GasPriceMinimumTest_adjustmentSpeed is GasPriceMinimumTest { - function test_shouldReturnTheAdjustementSpeed() public { - uint256 realAdjustementSpeed = gasPriceMinimum.adjustmentSpeed(); - assertEq(realAdjustementSpeed, adjustmentSpeed); - } -} - -contract GasPriceMinimumTest_adjustmentSpeed_L2 is GasPriceMinimumTest_L2 { function test_shouldRevert_WhenCalledOnL2() public { vm.expectRevert("This method is no longer supported in L2."); gasPriceMinimum.adjustmentSpeed(); @@ -348,20 +126,6 @@ contract GasPriceMinimumTest_adjustmentSpeed_L2 is GasPriceMinimumTest_L2 { } contract GasPriceMinimumTest_baseFeeOpCodeActivationBlock is GasPriceMinimumTest { - uint256 baseFeeOpCodeActivationBlock = 123; - - function setUp() public override { - super.setUp(); - gasPriceMinimum.setBaseFeeOpCodeActivationBlock(baseFeeOpCodeActivationBlock); - } - - function test_shouldReturnTheBaseFeeOpCodeActivationBlock() public { - uint256 realBaseFeeOpCodeActivationBlock = gasPriceMinimum.baseFeeOpCodeActivationBlock(); - assertEq(realBaseFeeOpCodeActivationBlock, baseFeeOpCodeActivationBlock); - } -} - -contract GasPriceMinimumTest_baseFeeOpCodeActivationBlock_L2 is GasPriceMinimumTest_L2 { function test_shouldRevert_WhenCalledOnL2() public { vm.expectRevert("This method is no longer supported in L2."); gasPriceMinimum.baseFeeOpCodeActivationBlock(); @@ -369,13 +133,6 @@ contract GasPriceMinimumTest_baseFeeOpCodeActivationBlock_L2 is GasPriceMinimumT } contract GasPriceMinimumTest_gasPriceMinimum is GasPriceMinimumTest { - function test_shouldReturnTheGasPriceMinimum() public { - uint256 realGasPriceMinimum = 
gasPriceMinimum.gasPriceMinimum(); - assertEq(realGasPriceMinimum, 100); - } -} - -contract GasPriceMinimumTest_gasPriceMinimum_L2 is GasPriceMinimumTest_L2 { function test_shouldRevert_WhenCalledOnL2() public { vm.expectRevert("This method is no longer supported in L2."); gasPriceMinimum.gasPriceMinimum(); @@ -383,13 +140,6 @@ contract GasPriceMinimumTest_gasPriceMinimum_L2 is GasPriceMinimumTest_L2 { } contract GasPriceMinimumTest_getGasPriceMinimum is GasPriceMinimumTest { - function test_shouldReturnTheGasPriceMinimum() public { - uint256 realGasPriceMinimum = gasPriceMinimum.getGasPriceMinimum(address(0)); - assertEq(realGasPriceMinimum, 100); - } -} - -contract GasPriceMinimumTest_getGasPriceMinimum_L2 is GasPriceMinimumTest_L2 { function test_shouldRevert_WhenCalledOnL2() public { vm.expectRevert("This method is no longer supported in L2."); gasPriceMinimum.getGasPriceMinimum(address(0)); diff --git a/packages/protocol/test-sol/unit/common/GoldToken.t.sol b/packages/protocol/test-sol/unit/common/GoldToken.t.sol index 8d3cbf9cb64..cf1b330ed70 100644 --- a/packages/protocol/test-sol/unit/common/GoldToken.t.sol +++ b/packages/protocol/test-sol/unit/common/GoldToken.t.sol @@ -4,7 +4,6 @@ pragma solidity ^0.5.13; import "@celo-contracts/common/GoldToken.sol"; import { TestWithUtils } from "@test-sol/TestWithUtils.sol"; -import "@test-sol/utils/WhenL2.sol"; contract CeloTokenTest is TestWithUtils { GoldToken celoToken; @@ -34,36 +33,18 @@ contract CeloTokenTest is TestWithUtils { sender = actor("sender"); randomAddress = actor("random"); - vm.prank(address(0)); - celoToken.mint(receiver, ONE_CELOTOKEN); // Increase total supply. - vm.prank(address(0)); - celoToken.mint(sender, ONE_CELOTOKEN); - vm.prank(address(0)); - celoToken.mint(randomAddress, L1_MINTED_CELO_SUPPLY - (2 * ONE_CELOTOKEN)); // Increase total supply. 
- vm.deal(receiver, ONE_CELOTOKEN); vm.deal(sender, ONE_CELOTOKEN); vm.deal(randomAddress, L1_MINTED_CELO_SUPPLY - (2 * ONE_CELOTOKEN)); // Increases balance. + vm.deal(celoUnreleasedTreasuryAddress, L2_INITIAL_STASH_BALANCE); // This step is required, as `vm.prank` funds the address, // and causes a safeMath overflow when getting the circulating supply. vm.deal(address(0), 0); + whenL2WithEpochManagerInitialization(); } } -contract CeloTokenTest_PreL2 is CeloTokenTest { - function setUp() public { - super.setUp(); - - vm.prank(address(0)); - celoToken.mint(celoUnreleasedTreasuryAddress, L2_INITIAL_STASH_BALANCE); - vm.deal(celoUnreleasedTreasuryAddress, L2_INITIAL_STASH_BALANCE); - - vm.deal(address(0), 0); - } -} -contract CeloTokenTest_L2 is CeloTokenTest_PreL2, WhenL2 {} - contract CeloTokenTest_general is CeloTokenTest { function test_name() public { assertEq(celoToken.name(), "Celo native asset"); @@ -110,8 +91,6 @@ contract CeloTokenTest_general is CeloTokenTest { } } -contract CeloTokenTest_general_L2 is CeloTokenTest_L2, CeloTokenTest_general {} - contract CeloTokenTest_transfer is CeloTokenTest { function setUp() public { super.setUp(); @@ -167,8 +146,6 @@ contract CeloTokenTest_transfer is CeloTokenTest { } } -contract CeloTokenTest_transfer_L2 is CeloTokenTest_L2, CeloTokenTest_transfer {} - contract CeloTokenTest_transferFrom is CeloTokenTest { function setUp() public { super.setUp(); @@ -214,8 +191,6 @@ contract CeloTokenTest_transferFrom is CeloTokenTest { } } -contract CeloTokenTest_transferFrom_L2 is CeloTokenTest_L2, CeloTokenTest_transferFrom {} - contract CeloTokenTest_burn is CeloTokenTest { uint256 startBurn; address burnAddress = address(0x000000000000000000000000000000000000dEaD); @@ -243,98 +218,7 @@ contract CeloTokenTest_burn is CeloTokenTest { } } -contract CeloTokenTest_burn_L2 is CeloTokenTest_L2, CeloTokenTest_burn {} - -contract CeloTokenTest_mint is CeloTokenTest { - function test_Reverts_whenCalledByOtherThanVm() public { - 
vm.prank(celoTokenOwner); - vm.expectRevert("Only VM can call"); - celoToken.mint(receiver, ONE_CELOTOKEN); - - vm.prank(celoUnreleasedTreasuryAddress); - vm.expectRevert("Only VM can call"); - celoToken.mint(receiver, ONE_CELOTOKEN); - } - - function test_Should_increaseCeloTokenTotalSupplyWhencalledByVm() public { - uint256 celoTokenSupplyBefore = celoToken.totalSupply(); - vm.prank(address(0)); - celoToken.mint(receiver, ONE_CELOTOKEN); - uint256 celoTokenSupplyAfter = celoToken.totalSupply(); - assertGt(celoTokenSupplyAfter, celoTokenSupplyBefore); - } - - function test_Emits_TransferEvent() public { - vm.prank(address(0)); - vm.expectEmit(true, true, true, true); - emit Transfer(address(0), receiver, ONE_CELOTOKEN); - celoToken.mint(receiver, ONE_CELOTOKEN); - } -} - -contract CeloTokenTest_mint_L2 is CeloTokenTest_L2 { - function test_Reverts_whenL2() public { - vm.expectRevert("This method is no longer supported in L2."); - vm.prank(celoUnreleasedTreasuryAddress); - celoToken.mint(receiver, ONE_CELOTOKEN); - vm.expectRevert("This method is no longer supported in L2."); - vm.prank(address(0)); - celoToken.mint(receiver, ONE_CELOTOKEN); - } -} - -contract CeloTokenTest_increaseSupply is CeloTokenTest { - function test_ShouldIncreaseTotalSupply() public { - uint256 celoTokenSupplyBefore = celoToken.totalSupply(); - vm.prank(address(0)); - celoToken.increaseSupply(ONE_CELOTOKEN); - uint256 celoTokenSupplyAfter = celoToken.totalSupply(); - assertGt(celoTokenSupplyAfter, celoTokenSupplyBefore); - } - - function test_Reverts_WhenCalledByOtherThanVm() public { - vm.prank(celoTokenOwner); - vm.expectRevert("Only VM can call"); - celoToken.increaseSupply(ONE_CELOTOKEN); - } -} - -contract CeloTokenTest_increaseSupply_L2 is CeloTokenTest_L2 { - function test_Reverts_WhenL2() public { - vm.prank(celoTokenOwner); - vm.expectRevert("This method is no longer supported in L2."); - celoToken.increaseSupply(ONE_CELOTOKEN); - } -} - -contract CeloTokenTest_circulatingSupply is 
CeloTokenTest { - function test_ShouldMatchCirculatingSupply_WhenNoBurn() public { - assertEq(celoToken.circulatingSupply(), celoToken.allocatedSupply()); - assertEq(celoToken.circulatingSupply(), L1_MINTED_CELO_SUPPLY); - } - - function test_ShouldDecreaseCirculatingSupply_WhenThereWasBurn() public { - vm.prank(randomAddress); - celoToken.burn(ONE_CELOTOKEN); - assertEq(celoToken.circulatingSupply(), L1_MINTED_CELO_SUPPLY - ONE_CELOTOKEN); - assertEq(celoToken.circulatingSupply(), celoToken.allocatedSupply() - ONE_CELOTOKEN); - } -} - -contract CeloTokenTest_circulatingSupply_L2 is CeloTokenTest_L2, CeloTokenTest_circulatingSupply { - function test_ShouldBeLessThanTheTotalSupply() public { - assertLt(celoToken.circulatingSupply(), celoToken.totalSupply()); - } -} - contract CeloTokenTest_AllocatedSupply is CeloTokenTest { - function test_ShouldReturnTotalSupply() public { - assertEq(celoToken.allocatedSupply(), L1_MINTED_CELO_SUPPLY); - assertEq(celoToken.allocatedSupply(), celoToken.totalSupply()); - } -} - -contract CeloTokenTest_AllocatedSupply_L2 is CeloTokenTest_L2 { function test_ShouldReturnTotalSupplyMinusCeloUnreleasedTreasuryBalance() public { assertEq(celoToken.allocatedSupply(), CELO_SUPPLY_CAP - L2_INITIAL_STASH_BALANCE); assertEq(celoToken.allocatedSupply(), celoToken.totalSupply() - L2_INITIAL_STASH_BALANCE); @@ -347,13 +231,7 @@ contract CeloTokenTest_AllocatedSupply_L2 is CeloTokenTest_L2 { } contract CeloTokenTest_TotalSupply is CeloTokenTest { - function test_ShouldReturnL1MintedSupply() public { - assertEq(celoToken.totalSupply(), L1_MINTED_CELO_SUPPLY); - } -} - -contract CeloTokenTest_TotalSupply_L2 is CeloTokenTest_L2 { - function test_ShouldReturnSupplyCap_WhenL2() public { + function test_ShouldReturnSupplyCap() public { assertEq(celoToken.totalSupply(), CELO_SUPPLY_CAP); } } diff --git a/packages/protocol/test-sol/unit/common/ProxyFactory08.t.sol b/packages/protocol/test-sol/unit/common/ProxyFactory08.t.sol index 0ad37c022b0..ee31011657a 
100644 --- a/packages/protocol/test-sol/unit/common/ProxyFactory08.t.sol +++ b/packages/protocol/test-sol/unit/common/ProxyFactory08.t.sol @@ -1,11 +1,16 @@ pragma solidity ^0.8.15; -import "@celo-contracts-8/common/ProxyFactory08.sol"; -import "@celo-contracts/common/interfaces/IProxy.sol"; +// Celo imports +import { ProxyFactory08 } from "@celo-contracts-8/common/ProxyFactory08.sol"; +import { IProxy } from "@celo-contracts/common/interfaces/IProxy.sol"; +// Test imports import { TestWithUtils08 } from "@test-sol/TestWithUtils08.sol"; +import { StringUtils } from "@test-sol/utils/StringUtils.sol"; contract ProxyFactoryTest is TestWithUtils08 { + using StringUtils for string; + ProxyFactory08 proxyFactory08; bytes proxyInitCode; address constant owner = address(0xAA963FC97281d9632d96700aB62A4D1340F9a28a); @@ -61,7 +66,7 @@ contract ProxyFactoryTest is TestWithUtils08 { string memory bytecodeToCompare = substring(bytecodeString, 0, compareLength); // Assert that the truncated bytecode matches - assert(compareStrings(bytecodeBackUpToCompare, bytecodeToCompare)); + assert(bytecodeBackUpToCompare.equals(bytecodeToCompare)); } function substring( diff --git a/packages/protocol/test-sol/unit/common/Registry.t.sol b/packages/protocol/test-sol/unit/common/Registry.t.sol index bcfda7012bc..340004fbc74 100644 --- a/packages/protocol/test-sol/unit/common/Registry.t.sol +++ b/packages/protocol/test-sol/unit/common/Registry.t.sol @@ -1,11 +1,11 @@ // SPDX-License-Identifier: UNLICENSED pragma solidity ^0.5.13; -import "@test-sol/utils/WhenL2.sol"; +import { TestWithUtils } from "@test-sol/TestWithUtils.sol"; import "@celo-contracts/common/Registry.sol"; -contract RegistryTest is Test { +contract RegistryTest is TestWithUtils { event RegistryUpdated(string identifier, bytes32 indexed identifierHash, address indexed addr); address constant SOME_ADDRESS = address(0x06012c8cf97BEaD5deAe237070F9587f8E7A266d); @@ -18,6 +18,8 @@ contract RegistryTest is Test { address owner; function 
setUp() public { + super.setUp(); + whenL2WithEpochManagerInitialization(); owner = address(this); vm.prank(owner); _registry = new Registry(true); @@ -25,13 +27,6 @@ contract RegistryTest is Test { } } -contract RegistryTest_L2 is WhenL2, RegistryTest { - function setUp() public { - super.setUp(); - registry = IRegistry(address(_registry)); - } -} - contract RegistryTest_initialize is RegistryTest { function test_SetsTheOwner() public { assertEq(_registry.owner(), owner); @@ -64,8 +59,6 @@ contract RegistryTest_setAddressFor is RegistryTest { } } -contract RegistryTest_setAddressFor_L2 is RegistryTest_L2, RegistryTest_setAddressFor {} - contract RegistryTest_getAddressFor is RegistryTest { function test_GetsRightAddress() public { _registry.setAddressFor(SOME_ID, SOME_ADDRESS); @@ -77,8 +70,6 @@ contract RegistryTest_getAddressFor is RegistryTest { } } -contract RegistryTest_getAddressFor_L2 is RegistryTest_L2, RegistryTest_getAddressFor {} - contract RegistryTest_getAddressForString is RegistryTest { function test_GetsRightAddress() public { _registry.setAddressFor(SOME_ID, SOME_ADDRESS); @@ -90,8 +81,6 @@ contract RegistryTest_getAddressForString is RegistryTest { } } -contract RegistryTest_getAddressForString_L2 is RegistryTest_L2, RegistryTest_getAddressForString {} - contract RegistryTest_getAddressForOrDie is RegistryTest { function test_GetsRightAddress() public { _registry.setAddressFor(SOME_ID, SOME_ADDRESS); @@ -104,8 +93,6 @@ contract RegistryTest_getAddressForOrDie is RegistryTest { } } -contract RegistryTest_getAddressForOrDie_L2 is RegistryTest_L2, RegistryTest_getAddressForOrDie {} - contract RegistryTest_getAddressForStringOrDie is RegistryTest { function test_GetAddressForStringOrDie_gets_address() public { _registry.setAddressFor(SOME_ID, SOME_ADDRESS); @@ -117,8 +104,3 @@ contract RegistryTest_getAddressForStringOrDie is RegistryTest { _registry.getAddressForStringOrDie(SOME_ID); } } - -contract RegistryTest_getAddressForStringOrDie_L2 is - 
RegistryTest_L2, - RegistryTest_getAddressForStringOrDie -{} diff --git a/packages/protocol/test-sol/unit/common/ScoreManager.t.sol b/packages/protocol/test-sol/unit/common/ScoreManager.t.sol index 98d6ddac653..5819405c25b 100644 --- a/packages/protocol/test-sol/unit/common/ScoreManager.t.sol +++ b/packages/protocol/test-sol/unit/common/ScoreManager.t.sol @@ -2,7 +2,6 @@ pragma solidity >=0.8.7 <0.8.20; import { TestWithUtils08 } from "@test-sol/TestWithUtils08.sol"; -import { WhenL2, WhenL2NoInitialization } from "@test-sol/utils/WhenL2-08.sol"; import "@celo-contracts/common/interfaces/IRegistry.sol"; import "@celo-contracts/common/interfaces/IScoreManagerGovernance.sol"; @@ -40,16 +39,11 @@ contract ScoreManagerTest is TestWithUtils08 { registry.setAddressFor("ScoreManager", address(scoreManager)); + whenL2WithEpochManagerInitialization(); scoreManagerImpl.initialize(); } } -contract ScoreManagerTest_L2 is ScoreManagerTest, WhenL2 { - function setUp() public virtual override(ScoreManagerTest, WhenL2) { - super.setUp(); - } -} - contract ScoreManagerTest_setGroupScore is ScoreManagerTest { function test_setGroupScore() public { scoreManager.setGroupScore(owner, 42); @@ -90,11 +84,6 @@ contract ScoreManagerTest_setGroupScore is ScoreManagerTest { assertEq(scoreManager.getGroupScore(owner), 42); } } -contract ScoreManagerTest_setGroupScore_L2 is ScoreManagerTest_L2, ScoreManagerTest_setGroupScore { - function setUp() public override(ScoreManagerTest, ScoreManagerTest_L2) { - super.setUp(); - } -} contract ScoreManagerTest_setValidatorScore is ScoreManagerTest { function test_setValidatorScore() public { @@ -136,14 +125,6 @@ contract ScoreManagerTest_setValidatorScore is ScoreManagerTest { assertEq(scoreManager.getValidatorScore(owner), 42); } } -contract ScoreManagerTest_setValidatorScore_L2 is - ScoreManagerTest_L2, - ScoreManagerTest_setValidatorScore -{ - function setUp() public override(ScoreManagerTest, ScoreManagerTest_L2) { - super.setUp(); - } -} contract 
ScoreManagerTest_setScoreManagerSetter is ScoreManagerTest { function test_onlyOwnwerCanSetScoreManager() public { @@ -163,11 +144,3 @@ contract ScoreManagerTest_setScoreManagerSetter is ScoreManagerTest { scoreManager.setScoreManagerSetter(nonOwner); } } -contract ScoreManagerTest_setScoreManagerSetter_L2 is - ScoreManagerTest_L2, - ScoreManagerTest_setScoreManagerSetter -{ - function setUp() public override(ScoreManagerTest, ScoreManagerTest_L2) { - super.setUp(); - } -} diff --git a/packages/protocol/test-sol/unit/common/SuperBridgeETHWrapper.t.sol b/packages/protocol/test-sol/unit/common/SuperBridgeETHWrapper.t.sol new file mode 100644 index 00000000000..bb8464f034d --- /dev/null +++ b/packages/protocol/test-sol/unit/common/SuperBridgeETHWrapper.t.sol @@ -0,0 +1,71 @@ +// SPDX-License-Identifier: LGPL-3.0-only +pragma solidity >=0.8.7 <0.8.20; + +import { Test } from "celo-foundry-8/Test.sol"; + +import { SuperBridgeETHWrapper } from "@celo-contracts-8/common/SuperBridgeETHWrapper.sol"; + +import { MockWETH } from "./mocks/MockWETH.sol"; +import { MockStandardBridge } from "./mocks/MockStandardBridge.sol"; +import { IWETH } from "@celo-contracts-8/common/interfaces/IWETH.sol"; + +contract SuperBridgeETHWrapperTestBase is Test { + SuperBridgeETHWrapper public wrapper; + MockWETH public mockWethLocal; + MockStandardBridge public mockBridge; + + address public wethLocalAddr; + address public wethRemoteAddr = address(0x0000000000000000000000000000000000000042); + address public bridgeAddr; + + address public user = actor("user"); + + event WrappedAndBridged(address indexed sender, uint256 amount); + + function setUp() public { + mockWethLocal = new MockWETH(); + mockBridge = new MockStandardBridge(); + + wethLocalAddr = address(mockWethLocal); + bridgeAddr = address(mockBridge); + + wrapper = new SuperBridgeETHWrapper(wethLocalAddr, wethRemoteAddr, bridgeAddr); + + vm.deal(user, 10 ether); + } +} + +contract SuperBridgeETHWrapper_WrapAndBridge is 
SuperBridgeETHWrapperTestBase { + function test_SuperBridge_ShouldWrapAndSend() public { + uint256 amountToSend = 1 ether; + + assertEq(mockWethLocal.balanceOf(address(wrapper)), 0); + assertEq(mockWethLocal.allowance(address(wrapper), bridgeAddr), 0); + + uint256 userBalanceBefore = address(user).balance; + + vm.expectEmit(true, true, false, true); + emit WrappedAndBridged(user, amountToSend); + vm.prank(user); + wrapper.wrapAndBridge{ value: amountToSend }(user, 200_000); + + assertEq(address(user).balance, userBalanceBefore - amountToSend); + assertEq(mockWethLocal.balanceOf(address(wrapper)), 0); + assertEq(mockWethLocal.balanceOf(bridgeAddr), amountToSend); + + assertEq(mockBridge.lastAmount(), amountToSend, "amount to send"); + assertEq(mockBridge.lastLocalToken(), wethLocalAddr, "local token"); + assertEq(mockBridge.lastRemoteToken(), wethRemoteAddr, "remote token"); + assertEq(mockBridge.lastTo(), user, "to"); + assertEq(mockBridge.lastMinGasLimit(), 200_000, "gas limit"); + assertEq(mockBridge.lastExtraData(), bytes(""), "bytes"); + + assertEq(mockWethLocal.allowance(address(wrapper), bridgeAddr), 0); + } + + function test_Revert_WhenNoValueSent() public { + vm.prank(user); + vm.expectRevert("No ETH sent"); + wrapper.wrapAndBridge{ value: 0 }(user, 200_000); + } +} diff --git a/packages/protocol/test-sol/unit/common/mocks/MockEpochManager.sol b/packages/protocol/test-sol/unit/common/mocks/MockEpochManager.sol index 5bc6899bdd0..77a1ca116d9 100644 --- a/packages/protocol/test-sol/unit/common/mocks/MockEpochManager.sol +++ b/packages/protocol/test-sol/unit/common/mocks/MockEpochManager.sol @@ -100,6 +100,14 @@ contract MockEpochManager is IEpochManager { numberOfElectedAccounts = value; } + function setElectedAccounts(address[] calldata _electedAccounts) external { + electedAccounts = _electedAccounts; + } + + function setElectedSigners(address[] calldata _electedSigners) external { + electedSigners = _electedSigners; + } + function getCurrentEpoch() 
external view returns (uint256, uint256, uint256, uint256) { return getEpochByNumber(currentEpochNumber); } diff --git a/packages/protocol/test-sol/unit/common/mocks/MockStandardBridge.sol b/packages/protocol/test-sol/unit/common/mocks/MockStandardBridge.sol new file mode 100644 index 00000000000..de22da1db37 --- /dev/null +++ b/packages/protocol/test-sol/unit/common/mocks/MockStandardBridge.sol @@ -0,0 +1,40 @@ +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.15; + +import "@openzeppelin/contracts8/token/ERC20/IERC20.sol"; + +contract MockStandardBridge { + address public lastLocalToken; + address public lastRemoteToken; + address public lastTo; + uint256 public lastAmount; + uint32 public lastMinGasLimit; + bytes public lastExtraData; + + /// @notice Sends ERC20 tokens to a receiver's address on the other chain. + /// @param _localToken Address of the ERC20 on this chain. + /// @param _remoteToken Address of the corresponding token on the remote chain. + /// @param _to Address of the receiver. + /// @param _amount Amount of local tokens to deposit. + /// @param _minGasLimit Minimum amount of gas that the bridge can be relayed with. + /// @param _extraData Extra data to be sent with the transaction. Note that the recipient will + /// not be triggered with this data, but it will be emitted and can be used + /// to identify the transaction. 
+ function bridgeERC20To( + address _localToken, + address _remoteToken, + address _to, + uint256 _amount, + uint32 _minGasLimit, + bytes calldata _extraData + ) external { + IERC20(_localToken).transferFrom(msg.sender, address(this), _amount); + + lastLocalToken = _localToken; + lastRemoteToken = _remoteToken; + lastTo = _to; + lastAmount = _amount; + lastMinGasLimit = _minGasLimit; + lastExtraData = _extraData; + } +} diff --git a/packages/protocol/test-sol/unit/common/mocks/MockWETH.sol b/packages/protocol/test-sol/unit/common/mocks/MockWETH.sol new file mode 100644 index 00000000000..56024412bb5 --- /dev/null +++ b/packages/protocol/test-sol/unit/common/mocks/MockWETH.sol @@ -0,0 +1,27 @@ +// SPDX-License-Identifier: LGPL-3.0-only +pragma solidity >=0.8.7 <0.8.20; + +import "@openzeppelin/contracts8/token/ERC20/ERC20.sol"; + +contract MockWETH is ERC20 { + event Deposit(address indexed dst, uint wad); + event Withdrawal(address indexed src, uint wad); + + uint256 public totalDeposited; + + constructor() ERC20("Mock Wrapped Ether", "MWETH") {} + + function deposit() external payable { + _mint(msg.sender, msg.value); + totalDeposited += msg.value; + emit Deposit(msg.sender, msg.value); + } + + function withdraw(uint256 wad) external { + require(balanceOf(msg.sender) >= wad, "MockWETH: insufficient balance"); + totalDeposited -= wad; + _burn(msg.sender, wad); + payable(msg.sender).transfer(wad); + emit Withdrawal(msg.sender, wad); + } +} diff --git a/packages/protocol/test-sol/unit/governance/mock/MockGovernance.sol b/packages/protocol/test-sol/unit/governance/mock/MockGovernance.sol index 7718a9d9082..10c543f3cf3 100644 --- a/packages/protocol/test-sol/unit/governance/mock/MockGovernance.sol +++ b/packages/protocol/test-sol/unit/governance/mock/MockGovernance.sol @@ -35,27 +35,65 @@ contract MockGovernance is IGovernance { totalVotes[voter] = votes; } - function removeVotesWhenRevokingDelegatedVotes( - address account, - uint256 maxAmountAllowed - ) external { - 
removeVotesCalledFor[account] = maxAmountAllowed; + function setConstitution(address, bytes4, uint256) external pure { + revert("not implemented"); } - function setConstitution(address, bytes4, uint256) external { + function getConstitution(address, bytes4) external pure returns (uint256) { revert("not implemented"); } - function votePartially(uint256, uint256, uint256, uint256, uint256) external returns (bool) { - return true; + function propose( + uint256[] calldata, + address[] calldata, + bytes calldata, + uint256[] calldata, + string calldata + ) external payable returns (uint256) { + return 0; } function getProposal( uint256 - ) external view returns (address, uint256, uint256, uint256, string memory, uint256, bool) { + ) external pure returns (address, uint256, uint256, uint256, string memory, uint256, bool) { return (address(0), 0, 0, 0, "", 0, false); } + function proposalCount() external pure returns (uint256) { + return 0; + } + + function upvote(uint256, uint256, uint256) external pure returns (bool) { + return true; + } + + function getUpvotes(uint256) external pure returns (uint256) { + return 0; + } + + function approve(uint256, uint256) external pure returns (bool) { + return true; + } + + function isApproved(uint256) external pure returns (bool) { + return true; + } + + function votePartially(uint256, uint256, uint256, uint256, uint256) external pure returns (bool) { + return true; + } + + function removeVotesWhenRevokingDelegatedVotes( + address account, + uint256 maxAmountAllowed + ) external { + removeVotesCalledFor[account] = maxAmountAllowed; + } + + function getVoteTotals(uint256) external pure returns (uint256, uint256, uint256) { + return (0, 0, 0); + } + function getAmountOfGoldUsedForVoting(address account) external view returns (uint256) { return totalVotes[account]; } @@ -63,4 +101,8 @@ contract MockGovernance is IGovernance { function getReferendumStageDuration() external pure returns (uint256) { return 0; } + + function 
execute(uint256, uint256) external pure returns (bool) { + return true; + } } diff --git a/packages/protocol/test-sol/unit/governance/network/BlockchainParameters.t.sol b/packages/protocol/test-sol/unit/governance/network/BlockchainParameters.t.sol index ebee8644d3d..0812ecaf662 100644 --- a/packages/protocol/test-sol/unit/governance/network/BlockchainParameters.t.sol +++ b/packages/protocol/test-sol/unit/governance/network/BlockchainParameters.t.sol @@ -17,60 +17,25 @@ contract BlockchainParametersTest is TestWithUtils { event OwnershipTransferred(address indexed previousOwner, address indexed newOwner); function setUp() public { + super.setUp(); nonOwner = actor("nonOwner"); ph.setEpochSize(DAY / 5); blockchainParameters = new BlockchainParameters(true); + whenL2WithEpochManagerInitialization(); } } contract BlockchainParametersTest_initialize is BlockchainParametersTest { uint256 constant lookbackWindow = 20; - function test_ShouldSetTheVariables() public { - blockchainParameters.initialize(gasForNonGoldCurrencies, gasLimit, lookbackWindow); - assertEq(blockchainParameters.blockGasLimit(), gasLimit); - blockTravel(ph.epochSize()); - assertEq(blockchainParameters.getUptimeLookbackWindow(), lookbackWindow); - } - - function test_Emits_IntrinsicGasForAlternativeFeeCurrencySet() public { - vm.expectEmit(true, true, true, true); - emit IntrinsicGasForAlternativeFeeCurrencySet(gasForNonGoldCurrencies); - blockchainParameters.initialize(gasForNonGoldCurrencies, gasLimit, lookbackWindow); - } - - function test_Emits_UptimeLookbackWindowSet() public { - vm.expectEmit(true, true, true, true); - emit UptimeLookbackWindowSet(lookbackWindow, 2); - blockchainParameters.initialize(gasForNonGoldCurrencies, gasLimit, lookbackWindow); - } - - function test_Emits_OwnershipTransferred() public { - vm.expectEmit(true, true, true, true); - emit OwnershipTransferred(address(this), address(this)); + function test_Reverts_WhenCalledOnL2() public { + vm.expectRevert("This method is no 
longer supported in L2."); blockchainParameters.initialize(gasForNonGoldCurrencies, gasLimit, lookbackWindow); } } contract BlockchainParametersTest_setBlockGasLimit is BlockchainParametersTest { - function test_ShouldSetTheVariable() public { - blockchainParameters.setBlockGasLimit(gasLimit); - assertEq(blockchainParameters.blockGasLimit(), gasLimit); - } - - function test_Emits_BlockGasLimitSet() public { - vm.expectEmit(true, true, true, true); - emit BlockGasLimitSet(gasLimit); - blockchainParameters.setBlockGasLimit(gasLimit); - } - - function test_Reverts_WhenCalledByNonOwner() public { - vm.prank(nonOwner); - vm.expectRevert("Ownable: caller is not the owner"); - blockchainParameters.setBlockGasLimit(gasLimit); - } function test_Reverts_WhenCalledOnL2() public { - _whenL2(); vm.expectRevert("This method is no longer supported in L2."); blockchainParameters.setBlockGasLimit(gasLimit); } @@ -79,44 +44,14 @@ contract BlockchainParametersTest_setBlockGasLimit is BlockchainParametersTest { contract BlockchainParametersTest_setIntrinsicGasForAlternativeFeeCurrency is BlockchainParametersTest { - function test_ShouldSetTheVariable() public { - blockchainParameters.setIntrinsicGasForAlternativeFeeCurrency(gasForNonGoldCurrencies); - assertEq(blockchainParameters.intrinsicGasForAlternativeFeeCurrency(), gasForNonGoldCurrencies); - } - - function test_Emits_intrinsicGasForAlternativeFeeCurrencySet() public { - vm.expectEmit(true, true, true, true); - emit IntrinsicGasForAlternativeFeeCurrencySet(gasForNonGoldCurrencies); - blockchainParameters.setIntrinsicGasForAlternativeFeeCurrency(gasForNonGoldCurrencies); - } - - function test_Revert_WhenCalledByNonOwner() public { - vm.prank(nonOwner); - vm.expectRevert("Ownable: caller is not the owner"); - blockchainParameters.setIntrinsicGasForAlternativeFeeCurrency(gasForNonGoldCurrencies); - } - function test_Reverts_WhenCalledOnL2() public { - _whenL2(); vm.expectRevert("This method is no longer supported in L2."); 
blockchainParameters.setIntrinsicGasForAlternativeFeeCurrency(gasForNonGoldCurrencies); } } contract BlockchainParametersTest_getUptimeLookbackWindow is BlockchainParametersTest { - function test_dRevert_WhenNotSet() public { - vm.expectRevert("UptimeLookbackWindow is not initialized"); - blockchainParameters.getUptimeLookbackWindow(); - } - - function test_Revert_WhenInitializedButOnCurrentEpoch() public { - blockchainParameters.setUptimeLookbackWindow(20); - vm.expectRevert("UptimeLookbackWindow is not initialized"); - blockchainParameters.getUptimeLookbackWindow(); - } - function test_Reverts_WhenCalledOnL2() public { - _whenL2(); vm.expectRevert("This method is no longer supported in L2."); blockchainParameters.getUptimeLookbackWindow(); } @@ -126,43 +61,7 @@ contract BlockchainParametersTest_setUptimeLookbackWindow is BlockchainParameter uint256 constant newValue = 20; uint256 constant otherValue = 50; - function test_ShouldSetTheValueForNextEpoch() public { - blockchainParameters.setUptimeLookbackWindow(newValue); - blockTravel(ph.epochSize()); - assertEq(blockchainParameters.getUptimeLookbackWindow(), newValue); - } - - function test_MultipleCallsWithinEpochOnlyAppliesLast() public { - blockchainParameters.setUptimeLookbackWindow(newValue); - blockchainParameters.setUptimeLookbackWindow(otherValue); - blockTravel(ph.epochSize()); - assertEq(blockchainParameters.getUptimeLookbackWindow(), otherValue); - } - - function test_Emits_UptimeLookbackWindowSet() public { - vm.expectEmit(true, true, true, true); - emit UptimeLookbackWindowSet(newValue, 2); - blockchainParameters.setUptimeLookbackWindow(newValue); - } - - function test_Revert_WhenCalledByNonOwner() public { - vm.prank(nonOwner); - vm.expectRevert("Ownable: caller is not the owner"); - blockchainParameters.setUptimeLookbackWindow(newValue); - } - - function test_Revert_WhenUsingValueLowerThanSafeMinimum() public { - vm.expectRevert("UptimeLookbackWindow must be within safe range"); - 
blockchainParameters.setUptimeLookbackWindow(2); - } - - function test_Revert_WhenUsingValueGreaterThanSafeMaximum() public { - vm.expectRevert("UptimeLookbackWindow must be within safe range"); - blockchainParameters.setUptimeLookbackWindow(721); - } - function test_Reverts_WhenCalledOnL2() public { - _whenL2(); vm.expectRevert("This method is no longer supported in L2."); blockchainParameters.setUptimeLookbackWindow(100); } @@ -170,7 +69,6 @@ contract BlockchainParametersTest_setUptimeLookbackWindow is BlockchainParameter contract BlockchainParametersTest_blockGasLimit is BlockchainParametersTest { function test_Reverts_WhenCalledOnL2() public { - _whenL2(); vm.expectRevert("This method is no longer supported in L2."); blockchainParameters.blockGasLimit(); } @@ -180,7 +78,6 @@ contract BlockchainParametersTest_intrinsicGasForAlternativeFeeCurrency is BlockchainParametersTest { function test_Reverts_WhenCalledOnL2() public { - _whenL2(); vm.expectRevert("This method is no longer supported in L2."); blockchainParameters.intrinsicGasForAlternativeFeeCurrency(); } diff --git a/packages/protocol/test-sol/unit/governance/network/EpochRewards.t.sol b/packages/protocol/test-sol/unit/governance/network/EpochRewards.t.sol index 8aaf7177457..1f360201e2e 100644 --- a/packages/protocol/test-sol/unit/governance/network/EpochRewards.t.sol +++ b/packages/protocol/test-sol/unit/governance/network/EpochRewards.t.sol @@ -13,7 +13,6 @@ import { MockStableToken } from "@celo-contracts/stability/test/MockStableToken. 
import { CeloTokenMock } from "@test-sol/unit/common/CeloTokenMock.sol"; import { TestWithUtils } from "@test-sol/TestWithUtils.sol"; -import "@test-sol/utils/WhenL2.sol"; contract EpochRewardsTest is TestWithUtils { uint256 constant targetVotingYieldParamsInitial = 0.00016e24; // 0.00016 @@ -61,6 +60,25 @@ contract EpochRewardsTest is TestWithUtils { function setUp() public { super.setUp(); + preEpochRewardsSetup(); + + epochRewards.initialize( + address(registry), + targetVotingYieldParamsInitial, + targetVotingYieldParamsMax, + targetVotingYieldParamsAdjustmentFactor, + rewardsMultiplierMax, + rewardsMultiplierAdjustmentsUnderspend, + rewardsMultiplierAdjustmentsOverspend, + targetVotingGoldFraction, + targetValidatorEpochPayment, + communityRewardFraction, + address(0), + carbonOffsettingFraction + ); + whenL2WithEpochManagerInitialization(); + } + function preEpochRewardsSetup() public { // Mocked contracts epochRewards = new EpochRewardsMock(); election = new MockElection(); @@ -69,7 +87,6 @@ contract EpochRewardsTest is TestWithUtils { mockCeloToken = new CeloTokenMock(); mockCeloToken.setRegistry(REGISTRY_ADDRESS); - mockCeloToken.setTotalSupply(L1_MINTED_CELO_SUPPLY); freezer = new Freezer(true); @@ -88,39 +105,15 @@ contract EpochRewardsTest is TestWithUtils { address(mockStableToken), sortedOraclesDenominator * exchangeRate ); - - epochRewards.initialize( - address(registry), - targetVotingYieldParamsInitial, - targetVotingYieldParamsMax, - targetVotingYieldParamsAdjustmentFactor, - rewardsMultiplierMax, - rewardsMultiplierAdjustmentsUnderspend, - rewardsMultiplierAdjustmentsOverspend, - targetVotingGoldFraction, - targetValidatorEpochPayment, - communityRewardFraction, - address(0), - carbonOffsettingFraction - ); } function _setNumberOfElectedInCurrentSetBaseOnLayer(uint256 numberValidators) internal { - if (isL2()) { - epochManager.setNumberOfElectedInCurrentSet(numberValidators); - } else { - 
epochRewards.setNumberValidatorsInCurrentSet(numberValidators); - } + epochManager.setNumberOfElectedInCurrentSet(numberValidators); } function _updateTargetVotingYieldBasedOnLayer() internal { - if (isL2()) { - vm.prank(address(epochManager)); - epochRewards.updateTargetVotingYield(); - } else { - vm.prank(address(0)); - epochRewards.updateTargetVotingYield(); - } + vm.prank(address(epochManager)); + epochRewards.updateTargetVotingYield(); } function getExpectedTargetTotalSupply(uint256 timeDelta) internal pure returns (uint256) { @@ -130,9 +123,25 @@ contract EpochRewardsTest is TestWithUtils { } } -contract EpochRewardsTest_L2 is WhenL2, EpochRewardsTest {} - contract EpochRewardsTest_initialize is EpochRewardsTest { + function setUp() public { + super.setUp(); + preEpochRewardsSetup(); + epochRewards.initialize( + address(registry), + targetVotingYieldParamsInitial, + targetVotingYieldParamsMax, + targetVotingYieldParamsAdjustmentFactor, + rewardsMultiplierMax, + rewardsMultiplierAdjustmentsUnderspend, + rewardsMultiplierAdjustmentsOverspend, + targetVotingGoldFraction, + targetValidatorEpochPayment, + communityRewardFraction, + address(0), + carbonOffsettingFraction + ); + } function test_ShouldHaveSetOwner() public { assertEq(epochRewards.owner(), caller); } @@ -217,11 +226,6 @@ contract EpochRewardsTest_setTargetVotingGoldFraction is EpochRewardsTest { } } -contract EpochRewardsTest_setTargetVotingGoldFraction_L2 is - EpochRewardsTest_L2, - EpochRewardsTest_setTargetVotingGoldFraction -{} - contract EpochRewardsTest_setCommunityRewardFraction is EpochRewardsTest { uint256 newFraction = communityRewardFraction + 1; @@ -261,11 +265,6 @@ contract EpochRewardsTest_setCommunityRewardFraction is EpochRewardsTest { } } -contract EpochRewardsTest_setCommunityRewardFraction_L2 is - EpochRewardsTest_L2, - EpochRewardsTest_setCommunityRewardFraction -{} - contract EpochRewardsTest_setTargetValidatorEpochPayment is EpochRewardsTest { uint256 newPayment = 
targetValidatorEpochPayment + 1; @@ -296,11 +295,6 @@ contract EpochRewardsTest_setTargetValidatorEpochPayment is EpochRewardsTest { } } -contract EpochRewardsTest_setTargetValidatorEpochPayment_L2 is - EpochRewardsTest_L2, - EpochRewardsTest_setTargetValidatorEpochPayment -{} - contract EpochRewardsTest_setRewardsMultiplierParameters is EpochRewardsTest { uint256 newRewardsMultiplierAdjustmentsUnderspend = rewardsMultiplierAdjustmentsUnderspend + 1; @@ -353,11 +347,6 @@ contract EpochRewardsTest_setRewardsMultiplierParameters is EpochRewardsTest { } } -contract EpochRewardsTest_setRewardsMultiplierParameters_L2 is - EpochRewardsTest_L2, - EpochRewardsTest_setRewardsMultiplierParameters -{} - contract EpochRewardsTest_setTargetVotingYieldParameters is EpochRewardsTest { uint256 newTargetVotingYieldParamsMax = targetVotingYieldParamsMax + 1; uint256 newTargetVotingYieldParamsAdjustmentFactor = targetVotingYieldParamsAdjustmentFactor + 1; @@ -404,11 +393,6 @@ contract EpochRewardsTest_setTargetVotingYieldParameters is EpochRewardsTest { } } -contract EpochRewardsTest_setTargetVotingYieldParameters_L2 is - EpochRewardsTest_L2, - EpochRewardsTest_setTargetVotingYieldParameters -{} - contract EpochRewardsTest_setTargetVotingYield is EpochRewardsTest { uint256 constant newTargetVotingYieldParamsInitial = targetVotingYieldParamsInitial + 1; @@ -432,11 +416,6 @@ contract EpochRewardsTest_setTargetVotingYield is EpochRewardsTest { } } -contract EpochRewardsTest_setTargetVotingYield_L2 is - EpochRewardsTest_L2, - EpochRewardsTest_setTargetVotingYield -{} - contract EpochRewardsTest_getTargetGoldTotalSupply is EpochRewardsTest { function test_ShouldReturn1B_WhenLessThan15YearsSinceGenesis() public { uint256 timeDelta = YEAR * 10; @@ -445,11 +424,6 @@ contract EpochRewardsTest_getTargetGoldTotalSupply is EpochRewardsTest { } } -contract EpochRewardsTest_getTargetGoldTotalSupply_L2 is - EpochRewardsTest_L2, - EpochRewardsTest_getTargetGoldTotalSupply -{} - contract 
EpochRewardsTest_getTargetVoterRewards is EpochRewardsTest { function test_ShouldReturnAPercentageOfActiveVotes_WhenThereAreActiveVotes() public { uint256 activeVotes = 1000000; @@ -460,11 +434,6 @@ contract EpochRewardsTest_getTargetVoterRewards is EpochRewardsTest { } } -contract EpochRewardsTest_getTargetVoterRewards_L2 is - EpochRewardsTest_L2, - EpochRewardsTest_getTargetVoterRewards -{} - contract EpochRewardsTest_getTargetTotalEpochPaymentsInGold is EpochRewardsTest { function test_ShouldgetTargetTotalEpochPaymentsInGold_WhenExchangeRateIsSet() public { uint256 numberValidators = 100; @@ -475,11 +444,6 @@ contract EpochRewardsTest_getTargetTotalEpochPaymentsInGold is EpochRewardsTest } } -contract EpochRewardsTest_getTargetTotalEpochPaymentsInGold_L2 is - EpochRewardsTest_L2, - EpochRewardsTest_getTargetTotalEpochPaymentsInGold -{} - contract EpochRewardsTest_getRewardsMultiplier is EpochRewardsTest { uint256 constant timeDelta = YEAR * 10; uint256 expectedTargetTotalSupply; @@ -498,12 +462,8 @@ contract EpochRewardsTest_getRewardsMultiplier is EpochRewardsTest { } function test_ShouldReturnOne_WhenTheTargetSupplyIsEqualToTheActualSupplyAfterRewards() public { - if (isL2()) { - uint256 celoUnreleasedTreasuryBalance = SUPPLY_CAP - expectedTargetTotalSupply; - vm.deal(celoUnreleasedTreasuryAddress, celoUnreleasedTreasuryBalance - targetEpochReward); - } else { - mockCeloToken.setTotalSupply(expectedTargetTotalSupply - targetEpochReward); - } + uint256 celoUnreleasedTreasuryBalance = SUPPLY_CAP - expectedTargetTotalSupply; + vm.deal(celoUnreleasedTreasuryAddress, celoUnreleasedTreasuryBalance - targetEpochReward); assertEq(epochRewards.getRewardsMultiplier(), FIXED1); } @@ -513,12 +473,7 @@ contract EpochRewardsTest_getRewardsMultiplier is EpochRewardsTest { { uint256 actualRemainingSupply = uint256((expectedTargetRemainingSupply * 11) / 10); - if (isL2()) { - vm.deal(celoUnreleasedTreasuryAddress, actualRemainingSupply - targetEpochReward); - } else { - 
uint256 totalSupply = SUPPLY_CAP - actualRemainingSupply - targetEpochReward; - mockCeloToken.setTotalSupply(totalSupply); - } + vm.deal(celoUnreleasedTreasuryAddress, actualRemainingSupply - targetEpochReward); uint256 actual = epochRewards.getRewardsMultiplier(); uint256 expected = uint256((FIXED1 + (rewardsMultiplierAdjustmentsUnderspend / 10))); @@ -530,12 +485,7 @@ contract EpochRewardsTest_getRewardsMultiplier is EpochRewardsTest { { uint256 actualRemainingSupply = uint256((expectedTargetRemainingSupply * 9) / 10); - if (isL2()) { - vm.deal(celoUnreleasedTreasuryAddress, actualRemainingSupply - targetEpochReward); - } else { - uint256 totalSupply = SUPPLY_CAP - actualRemainingSupply - targetEpochReward; - mockCeloToken.setTotalSupply(totalSupply); - } + vm.deal(celoUnreleasedTreasuryAddress, actualRemainingSupply - targetEpochReward); uint256 actual = epochRewards.getRewardsMultiplier(); uint256 expected = uint256((FIXED1 - (rewardsMultiplierAdjustmentsOverspend / 10))); @@ -543,11 +493,6 @@ contract EpochRewardsTest_getRewardsMultiplier is EpochRewardsTest { } } -contract EpochRewardsTest_getRewardsMultiplier_L2 is - EpochRewardsTest_L2, - EpochRewardsTest_getRewardsMultiplier -{} - contract EpochRewardsTest_updateTargetVotingYield is EpochRewardsTest { uint256 constant totalSupplyL1 = 6000000 ether; uint256 constant celoUnreleasedTreasuryBalance = SUPPLY_CAP - totalSupplyL1; @@ -578,11 +523,7 @@ contract EpochRewardsTest_updateTargetVotingYield is EpochRewardsTest { 2 * FIXED1 ); - if (isL2()) { - vm.deal(celoUnreleasedTreasuryAddress, celoUnreleasedTreasuryBalance); - } else { - mockCeloToken.setTotalSupply(totalSupplyL1); - } + vm.deal(celoUnreleasedTreasuryAddress, celoUnreleasedTreasuryBalance); vm.deal(address(reserve), reserveBalance); } @@ -804,11 +745,6 @@ contract EpochRewardsTest_updateTargetVotingYield is EpochRewardsTest { } } -contract EpochRewardsTest_updateTargetVotingYield_L2 is - EpochRewardsTest_L2, - 
EpochRewardsTest_updateTargetVotingYield -{} - contract EpochRewardsTest_WhenThereAreActiveVotesAStableTokenExchangeRateIsSetAndTheActualRemainingSupplyIs10pMoreThanTheTargetRemainingSupplyAfterRewards_calculateTargetEpochRewards is EpochRewardsTest { @@ -836,15 +772,10 @@ contract EpochRewardsTest_WhenThereAreActiveVotesAStableTokenExchangeRateIsSetAn uint256 expectedTargetRemainingSupply = SUPPLY_CAP - expectedTargetTotalSupply; uint256 actualRemainingSupply = (expectedTargetRemainingSupply * 11) / 10; - if (isL2()) { - vm.deal( - celoUnreleasedTreasuryAddress, - actualRemainingSupply + expectedTargetGoldSupplyIncrease - ); - } else { - uint256 totalSupply = SUPPLY_CAP - actualRemainingSupply - expectedTargetGoldSupplyIncrease; - mockCeloToken.setTotalSupply(totalSupply); - } + vm.deal( + celoUnreleasedTreasuryAddress, + actualRemainingSupply + expectedTargetGoldSupplyIncrease + ); expectedMultiplier = (FIXED1 + rewardsMultiplierAdjustmentsUnderspend / 10); validatorReward = (targetValidatorEpochPayment * numberValidators) / exchangeRate; @@ -892,140 +823,3 @@ contract EpochRewardsTest_WhenThereAreActiveVotesAStableTokenExchangeRateIsSetAn assertApproxEqRel(result, expected, 5e13); } } - -contract EpochRewardsTest_WhenThereAreActiveVotesAStableTokenExchangeRateIsSetAndTheActualRemainingSupplyIs10pMoreThanTheTargetRemainingSupplyAfterRewards_calculateTargetEpochRewards_L2 is - EpochRewardsTest_L2, - EpochRewardsTest_WhenThereAreActiveVotesAStableTokenExchangeRateIsSetAndTheActualRemainingSupplyIs10pMoreThanTheTargetRemainingSupplyAfterRewards_calculateTargetEpochRewards -{} - -contract EpochRewardsTest_isReserveLow is EpochRewardsTest { - uint256 constant stableBalance = 2397846127684712867321; - - function setUp() public { - super.setUp(); - - uint256 totalSupply = 129762987346298761037469283746; - reserve = new Reserve(true); - registry.setAddressFor("Reserve", address(reserve)); - - initialAssetAllocationWeights = new uint256[](2); - 
initialAssetAllocationWeights[0] = FIXED1 / 2; - initialAssetAllocationWeights[1] = FIXED1 / 2; - - initialAssetAllocationSymbols = new bytes32[](2); - initialAssetAllocationSymbols[0] = bytes32("cGLD"); - initialAssetAllocationSymbols[1] = bytes32("empty"); - - reserve.initialize( - address(registry), - 60, - FIXED1, - 0, - 0, - initialAssetAllocationSymbols, - initialAssetAllocationWeights, - 0.005e24, // 0.005 - 2 * FIXED1 - ); - reserve.addToken(address(mockStableToken)); - mockCeloToken.setTotalSupply(totalSupply); - mockStableToken.setTotalSupply(stableBalance); - } - - // reserve ratio of 0.5' - function test_ShouldBeLowAtStart_WhenReserveRatioIs05() public { - uint256 celoBalance = ((stableBalance / exchangeRate) / 2) / 2; - vm.deal(address(reserve), celoBalance); - // no time travel - assertEq(epochRewards.isReserveLow(), true); - } - - // reserve ratio of 1.5 - function test_ShouldBeLowAt15Years_WhenReserveRatioIs05() public { - uint256 celoBalance = ((stableBalance / exchangeRate) / 2) / 2; - vm.deal(address(reserve), celoBalance); - uint256 timeDelta = YEAR * 15; - timeTravel(timeDelta); - - assertEq(epochRewards.isReserveLow(), true); - } - - function test_ShouldBeLowAt25Years_WhenReserveRatioIs05() public { - uint256 celoBalance = ((stableBalance / exchangeRate) / 2) / 2; - vm.deal(address(reserve), celoBalance); - uint256 timeDelta = YEAR * 25; - timeTravel(timeDelta); - - assertEq(epochRewards.isReserveLow(), true); - } - - function test_ShouldBeLowAtStar_WhenReserveRatioIs1point5() public { - uint256 celoBalance = ((3 * stableBalance) / exchangeRate) / 4; - vm.deal(address(reserve), celoBalance); - // no time travel - assertEq(epochRewards.isReserveLow(), true); - } - - function test_ShouldBeLowAt12Years_WhenReserveRatioIs1point5() public { - uint256 celoBalance = ((3 * stableBalance) / exchangeRate) / 4; - vm.deal(address(reserve), celoBalance); - uint256 timeDelta = YEAR * 12; - timeTravel(timeDelta); - assertEq(epochRewards.isReserveLow(), 
true); - } - - function test_ShouldNotBeLowAt15Years_WhenReserveRatioIs1point5() public { - uint256 celoBalance = ((3 * stableBalance) / exchangeRate) / 4; - vm.deal(address(reserve), celoBalance); - uint256 timeDelta = YEAR * 15; - timeTravel(timeDelta); - assertEq(epochRewards.isReserveLow(), false); - } - - function test_ShouldNotBeLowAt25Years_WhenReserveRatioIs1point5() public { - uint256 celoBalance = ((3 * stableBalance) / exchangeRate) / 4; - vm.deal(address(reserve), celoBalance); - uint256 timeDelta = YEAR * 25; - timeTravel(timeDelta); - assertEq(epochRewards.isReserveLow(), false); - } - - function test_ShouldBeLowAtStar_WhenReserveRatioIs2point5() public { - uint256 celoBalance = ((5 * stableBalance) / exchangeRate) / 4; - vm.deal(address(reserve), celoBalance); - // no time travel - assertEq(epochRewards.isReserveLow(), false); - } - - function test_ShouldNotBeLowAt15Years_WhenReserveRatioIs2point5() public { - uint256 celoBalance = ((5 * stableBalance) / exchangeRate) / 4; - vm.deal(address(reserve), celoBalance); - uint256 timeDelta = YEAR * 15; - timeTravel(timeDelta); - assertEq(epochRewards.isReserveLow(), false); - } - - function test_ShouldNotBeLowAt25Years_WhenReserveRatioIs2point5() public { - uint256 celoBalance = ((5 * stableBalance) / exchangeRate) / 4; - vm.deal(address(reserve), celoBalance); - uint256 timeDelta = YEAR * 25; - timeTravel(timeDelta); - assertEq(epochRewards.isReserveLow(), false); - } - - // when the contract is frozen - function test_ShouldMakeUpdateTargetVotingyieldRevert_WhenTheContractIsFrozen() public { - freezer.freeze(address(epochRewards)); - if (isL2()) { - vm.prank(address(epochManager)); - vm.expectRevert("can't call when contract is frozen"); - epochRewards.updateTargetVotingYield(); - } else { - vm.prank(address(0)); - vm.expectRevert("can't call when contract is frozen"); - epochRewards.updateTargetVotingYield(); - } - } -} - -contract EpochRewardsTest_isReserveLow_L2 is EpochRewardsTest_L2, 
EpochRewardsTest_isReserveLow {} diff --git a/packages/protocol/test-sol/unit/governance/network/Governance.t.sol b/packages/protocol/test-sol/unit/governance/network/Governance.t.sol index ee4a8284cc3..d2ca64b52c2 100644 --- a/packages/protocol/test-sol/unit/governance/network/Governance.t.sol +++ b/packages/protocol/test-sol/unit/governance/network/Governance.t.sol @@ -1,7 +1,6 @@ pragma solidity ^0.5.13; import { TestWithUtils } from "@test-sol/TestWithUtils.sol"; -import "@test-sol/utils/WhenL2.sol"; import "solidity-bytes-utils/contracts/BytesLib.sol"; import "openzeppelin-solidity/contracts/cryptography/ECDSA.sol"; @@ -131,6 +130,7 @@ contract GovernanceTest is TestWithUtils { setUpVoterAccount(); setUpProposalStubs(); + whenL2WithEpochManagerInitialization(); } function assertNotEq(uint256 a, uint256 b) internal { @@ -188,7 +188,7 @@ contract GovernanceTest is TestWithUtils { vm.prank(account); accounts.authorizeVoteSigner(vm.addr(signerPk), v, r, s); } - function setUpVoterAccount() private { + function setUpVoterAccount() public { vm.prank(accVoter); accounts.createAccount(); @@ -196,7 +196,7 @@ contract GovernanceTest is TestWithUtils { mockLockedGold.setAccountTotalGovernancePower(accVoter, VOTER_GOLD); } - function setUpContracts() private { + function setUpContracts() public { vm.startPrank(accOwner); mockValidators = new MockValidators(); @@ -229,7 +229,7 @@ contract GovernanceTest is TestWithUtils { registry.setAddressFor("Accounts", address(accounts)); } - function setUpProposalStubs() private { + function setUpProposalStubs() public { testTransactions = new TestTransactions(); string memory setValueSignature = "setValue(uint256,uint256,bool)"; @@ -263,9 +263,16 @@ contract GovernanceTest is TestWithUtils { } } -contract GovernanceTest_L2 is GovernanceTest, WhenL2 {} - contract GovernanceTest_initialize is GovernanceTest { + function setUp() public { + super.setUp(); + + setUpContracts(); + + setUpVoterAccount(); + + setUpProposalStubs(); + } 
function test_SetsTheOwner() public { assertEq(governance.owner(), accOwner); } @@ -370,8 +377,6 @@ contract GovernanceTest_setApprover is GovernanceTest { } } -contract GovernanceTest_setApprover_L2 is GovernanceTest_L2, GovernanceTest_setApprover {} - contract GovernanceTest_setMinDeposit is GovernanceTest { uint256 NEW_MINDEPOSIT = 45; event MinDepositSet(uint256 minDeposit); @@ -402,8 +407,6 @@ contract GovernanceTest_setMinDeposit is GovernanceTest { } } -contract GovernanceTest_setMinDeposit_L2 is GovernanceTest_L2, GovernanceTest_setMinDeposit {} - contract GovernanceTest_setConcurrentProposals is GovernanceTest { uint256 NEW_CONCURRENT_PROPOSALS = 45; event ConcurrentProposalsSet(uint256 concurrentProposals); @@ -440,11 +443,6 @@ contract GovernanceTest_setConcurrentProposals is GovernanceTest { } } -contract GovernanceTest_setConcurrentProposals_L2 is - GovernanceTest_L2, - GovernanceTest_setConcurrentProposals -{} - contract GovernanceTest_setQueueExpiry is GovernanceTest { event QueueExpirySet(uint256 queueExpiry); @@ -480,8 +478,6 @@ contract GovernanceTest_setQueueExpiry is GovernanceTest { } } -contract GovernanceTest_setQueueExpiry_L2 is GovernanceTest_L2, GovernanceTest_setQueueExpiry {} - contract GovernanceTest_setDequeueFrequency is GovernanceTest { event DequeueFrequencySet(uint256 dequeueFrequency); @@ -517,11 +513,6 @@ contract GovernanceTest_setDequeueFrequency is GovernanceTest { } } -contract GovernanceTest_setDequeueFrequency_L2 is - GovernanceTest_L2, - GovernanceTest_setDequeueFrequency -{} - contract GovernanceTest_setReferendumStageDuration is GovernanceTest { event ReferendumStageDurationSet(uint256 value); @@ -557,11 +548,6 @@ contract GovernanceTest_setReferendumStageDuration is GovernanceTest { } } -contract GovernanceTest_setReferendumStageDuration_L2 is - GovernanceTest_L2, - GovernanceTest_setReferendumStageDuration -{} - contract GovernanceTest_setExecutionStageDuration is GovernanceTest { event 
ExecutionStageDurationSet(uint256 dequeueFrequency); @@ -597,11 +583,6 @@ contract GovernanceTest_setExecutionStageDuration is GovernanceTest { } } -contract GovernanceTest_setExecutionStageDuration_L2 is - GovernanceTest_L2, - GovernanceTest_setExecutionStageDuration -{} - contract GovernanceTest_setParticipationFloor is GovernanceTest { event ParticipationFloorSet(uint256 value); @@ -632,11 +613,6 @@ contract GovernanceTest_setParticipationFloor is GovernanceTest { } } -contract GovernanceTest_setParticipationFloor_L2 is - GovernanceTest_L2, - GovernanceTest_setParticipationFloor -{} - contract GovernanceTest_setBaselineUpdateFactor is GovernanceTest { event ParticipationBaselineUpdateFactorSet(uint256 value); @@ -667,11 +643,6 @@ contract GovernanceTest_setBaselineUpdateFactor is GovernanceTest { } } -contract GovernanceTest_setBaselineUpdateFactor_L2 is - GovernanceTest_L2, - GovernanceTest_setBaselineUpdateFactor -{} - contract GovernanceTest_setBaselineQuorumFactor is GovernanceTest { event ParticipationBaselineQuorumFactorSet(uint256 value); @@ -702,11 +673,6 @@ contract GovernanceTest_setBaselineQuorumFactor is GovernanceTest { } } -contract GovernanceTest_setBaselineQuorumFactor_L2 is - GovernanceTest_L2, - GovernanceTest_setBaselineQuorumFactor -{} - contract GovernanceTest_setConstitution is GovernanceTest { event ConstitutionSet(address indexed destination, bytes4 indexed functionId, uint256 threshold); @@ -783,8 +749,6 @@ contract GovernanceTest_setConstitution is GovernanceTest { } } -contract GovernanceTest_setConstitution_L2 is GovernanceTest_L2, GovernanceTest_setConstitution {} - contract GovernanceTest_setSecurityCouncil is GovernanceTest { event SecurityCouncilSet(address indexed council); @@ -830,11 +794,6 @@ contract GovernanceTest_setSecurityCouncil is GovernanceTest { } } -contract GovernanceTest_setSecurityCouncil_L2 is - GovernanceTest_L2, - GovernanceTest_setSecurityCouncil -{} - contract GovernanceTest_setHotfixExecutionTimeWindow is 
GovernanceTest { event HotfixExecutionTimeWindowSet(uint256 timeDelta); @@ -865,11 +824,6 @@ contract GovernanceTest_setHotfixExecutionTimeWindow is GovernanceTest { } } -contract GovernanceTest_setHotfixExecutionTimeWindow_L2 is - GovernanceTest_L2, - GovernanceTest_setHotfixExecutionTimeWindow -{} - contract GovernanceTest_propose is GovernanceTest { event ProposalQueued( uint256 indexed proposalId, @@ -1044,8 +998,6 @@ contract GovernanceTest_propose is GovernanceTest { } } -contract GovernanceTest_propose_L2 is GovernanceTest_L2, GovernanceTest_propose {} - contract GovernanceTest_upvote is GovernanceTest { event ProposalUpvoted(uint256 indexed proposalId, address indexed account, uint256 upvotes); event ProposalExpired(uint256 indexed proposalId); @@ -1245,8 +1197,6 @@ contract GovernanceTest_upvote is GovernanceTest { } } -contract GovernanceTest_upvote_L2 is GovernanceTest_L2, GovernanceTest_upvote {} - contract GovernanceTest_revokeUpvote is GovernanceTest { event ProposalExpired(uint256 indexed proposalId); event ProposalUpvoteRevoked( @@ -1336,8 +1286,6 @@ contract GovernanceTest_revokeUpvote is GovernanceTest { } } -contract GovernanceTest_revokeUpvote_L2 is GovernanceTest_L2, GovernanceTest_revokeUpvote {} - contract GovernanceTest_withdraw is GovernanceTest { address accProposer; @@ -1375,8 +1323,6 @@ contract GovernanceTest_withdraw is GovernanceTest { } } -contract GovernanceTest_withdraw_L2 is GovernanceTest_L2, GovernanceTest_withdraw {} - contract GovernanceTest_approve is GovernanceTest { uint256 INDEX = 0; // first proposal index @@ -1551,8 +1497,6 @@ contract GovernanceTest_approve is GovernanceTest { } } -contract GovernanceTest_approve_L2 is GovernanceTest_L2, GovernanceTest_approve {} - contract GovernanceTest_revokeVotes is GovernanceTest { uint256 numVoted; @@ -1681,8 +1625,6 @@ contract GovernanceTest_revokeVotes is GovernanceTest { } } -contract GovernanceTest_revokeVotes_L2 is GovernanceTest_L2, GovernanceTest_revokeVotes {} - contract 
GovernanceTest_vote_WhenProposalIsApproved is GovernanceTest { event ProposalVotedV2( uint256 indexed proposalId, @@ -1921,11 +1863,6 @@ contract GovernanceTest_vote_WhenProposalIsApproved is GovernanceTest { } } -contract GovernanceTest_vote_WhenProposalIsApproved_L2 is - GovernanceTest_L2, - GovernanceTest_vote_WhenProposalIsApproved -{} - contract GovernanceTest_vote_WhenProposalIsApprovedAndHaveSigner is GovernanceTest { address accSigner; @@ -2005,11 +1942,6 @@ contract GovernanceTest_vote_WhenProposalIsApprovedAndHaveSigner is GovernanceTe } } -contract GovernanceTest_vote_WhenProposalIsApprovedAndHaveSigner_L2 is - GovernanceTest_L2, - GovernanceTest_vote_WhenProposalIsApprovedAndHaveSigner -{} - contract GovernanceTest_vote_WhenProposalIsNotApproved is GovernanceTest { event ProposalVotedV2( uint256 indexed proposalId, @@ -2090,11 +2022,6 @@ contract GovernanceTest_vote_WhenProposalIsNotApproved is GovernanceTest { } } -contract GovernanceTest_vote_WhenProposalIsNotApproved_L2 is - GovernanceTest_L2, - GovernanceTest_vote_WhenProposalIsNotApproved -{} - contract GovernanceTest_vote_WhenVotingOnDifferentProposalWithSameIndex is GovernanceTest { function test_IgnoreVotesFromPreviousProposal() public { uint256 proposalId1 = makeValidProposal(); @@ -2145,11 +2072,6 @@ contract GovernanceTest_vote_WhenVotingOnDifferentProposalWithSameIndex is Gover } } -contract GovernanceTest_vote_WhenVotingOnDifferentProposalWithSameIndex_L2 is - GovernanceTest_L2, - GovernanceTest_vote_WhenVotingOnDifferentProposalWithSameIndex -{} - contract GovernanceTest_vote_PartiallyWhenProposalIsApproved is GovernanceTest { event ProposalVotedV2( uint256 indexed proposalId, @@ -2378,11 +2300,6 @@ contract GovernanceTest_vote_PartiallyWhenProposalIsApproved is GovernanceTest { } } -contract GovernanceTest_vote_PartiallyWhenProposalIsApproved_L2 is - GovernanceTest_L2, - GovernanceTest_vote_PartiallyWhenProposalIsApproved -{} - contract 
GovernanceTest_votePartially_WhenProposalIsApprovedAndHaveSigner is GovernanceTest { address accSigner; @@ -2497,11 +2414,6 @@ contract GovernanceTest_votePartially_WhenProposalIsApprovedAndHaveSigner is Gov } } -contract GovernanceTest_votePartially_WhenProposalIsApprovedAndHaveSigner_L2 is - GovernanceTest_L2, - GovernanceTest_votePartially_WhenProposalIsApprovedAndHaveSigner -{} - contract GovernanceTest_votePartially_WhenProposalIsNotApproved is GovernanceTest { event ProposalVotedV2( uint256 indexed proposalId, @@ -2582,11 +2494,6 @@ contract GovernanceTest_votePartially_WhenProposalIsNotApproved is GovernanceTes } } -contract GovernanceTest_votePartially_WhenProposalIsNotApproved_L2 is - GovernanceTest_L2, - GovernanceTest_votePartially_WhenProposalIsNotApproved -{} - contract GovernanceTest_votePartially_WhenVotingOnDifferentProposalWithSameIndex is GovernanceTest { function test_IgnoreVotesFromPreviousProposal() public { uint256 proposalId1 = makeValidProposal(); @@ -2637,11 +2544,6 @@ contract GovernanceTest_votePartially_WhenVotingOnDifferentProposalWithSameIndex } } -contract GovernanceTest_votePartially_WhenVotingOnDifferentProposalWithSameIndex_L2 is - GovernanceTest_L2, - GovernanceTest_votePartially_WhenVotingOnDifferentProposalWithSameIndex -{} - contract GovernanceTest_execute is GovernanceTest { event ParticipationBaselineUpdated(uint256 participationBaseline); event ProposalExecuted(uint256 indexed proposalId); @@ -3073,54 +2975,12 @@ contract GovernanceTest_execute is GovernanceTest { } } -contract GovernanceTest_execute_L2 is GovernanceTest_L2, GovernanceTest_execute {} - contract GovernanceTest_approveHotfix is GovernanceTest { bytes32 constant HOTFIX_HASH = bytes32(uint256(0x123456789)); event HotfixApproved(bytes32 indexed hash, address approver); - - function test_markHotfixRecordApprovedWhenCalledByApprover() public { - vm.prank(accApprover); - governance.approveHotfix(HOTFIX_HASH); - (bool approved, , ) = 
governance.getL1HotfixRecord(HOTFIX_HASH); - assertTrue(approved); - } - - function test_Emits_HotfixApprovedEvent() public { - vm.expectEmit(true, true, true, true); - emit HotfixApproved(HOTFIX_HASH, accApprover); - vm.prank(accApprover); - governance.approveHotfix(HOTFIX_HASH); - } - - function test_Reverts_WhenCalledByNonApproverOrCouncil() public { - vm.expectRevert("msg.sender not approver or Security Council"); - governance.approveHotfix(HOTFIX_HASH); - } - - function test_Reverts_WhenCalledBySecurityCouncilOnL1() public { - vm.prank(accOwner); - governance.setSecurityCouncil(accCouncil); - - vm.prank(accCouncil); - vm.expectRevert("Hotfix approval by security council is not available on L1."); - governance.approveHotfix(HOTFIX_HASH); - } - - function test_Reverts_WhenCalledByZeroAddressOnL1() public { - vm.prank(address(0)); - vm.expectRevert("msg.sender cannot be address zero"); - governance.approveHotfix(HOTFIX_HASH); - } -} - -contract GovernanceTest_approveHotfix_L2 is GovernanceTest { - bytes32 constant HOTFIX_HASH = bytes32(uint256(0x123456789)); - event HotfixApproved(bytes32 indexed hash, address approver); function setUp() public { super.setUp(); - _whenL2(); vm.prank(accOwner); governance.setHotfixExecutionTimeWindow(DAY); } @@ -3129,7 +2989,7 @@ contract GovernanceTest_approveHotfix_L2 is GovernanceTest { vm.prank(accApprover); governance.approveHotfix(HOTFIX_HASH); - (bool approved, , , ) = governance.getL2HotfixRecord(HOTFIX_HASH); + (bool approved, , , ) = governance.getHotfixRecord(HOTFIX_HASH); assertTrue(approved); } function test_markHotfixRecordApprovedWhenCalledBySecurityCouncil() public { @@ -3139,7 +2999,7 @@ contract GovernanceTest_approveHotfix_L2 is GovernanceTest { vm.prank(accCouncil); governance.approveHotfix(HOTFIX_HASH); - (, bool approved, , ) = governance.getL2HotfixRecord(HOTFIX_HASH); + (, bool approved, , ) = governance.getHotfixRecord(HOTFIX_HASH); assertTrue(approved); } @@ -3174,242 +3034,13 @@ contract 
GovernanceTest_approveHotfix_L2 is GovernanceTest { } } -contract GovernanceTest_whitelistHotfix_setup is GovernanceTest { - bytes32 constant HOTFIX_HASH = bytes32(uint256(0x123456789)); - event HotfixWhitelisted(bytes32 indexed hash, address whitelister); -} - -contract GovernanceTest_whitelistHotfix is GovernanceTest_whitelistHotfix_setup { - function test_ShouldWhitelistHotfixByValidator() public { - address validator = actor("validator1"); - governance.addValidator(validator); - vm.prank(validator); - governance.whitelistHotfix(HOTFIX_HASH); - - assertTrue(governance.isHotfixWhitelistedBy(HOTFIX_HASH, validator)); - } - function test_Emits_HotfixWhitelistEvent() public { - address validator = actor("validator1"); - governance.addValidator(validator); - governance.addValidator(actor("validator2")); - - vm.expectEmit(true, true, true, true); - emit HotfixWhitelisted(HOTFIX_HASH, validator); - vm.prank(validator); - governance.whitelistHotfix(HOTFIX_HASH); - } -} - -contract GovernanceTest_whitelistHotfix_L2 is - GovernanceTest_L2, - GovernanceTest_whitelistHotfix_setup -{ - function test_Reverts_WhenCalled() public { - address validator = actor("validator1"); - governance.addValidator(validator); - vm.expectRevert("This method is no longer supported in L2."); - vm.prank(validator); - governance.whitelistHotfix(HOTFIX_HASH); - } -} - -contract GovernanceTest_hotfixWhitelistValidatorTally_setup is GovernanceTest { - bytes32 constant HOTFIX_HASH = bytes32(uint256(0x123456789)); - - address[] validators; - address[] signers; - - function setUp() public { - super.setUp(); - for (uint256 i = 1; i < 4; i++) { - address validator = vm.addr(i); - uint256 signerPk = i * 10; - address signer = vm.addr(signerPk); - - vm.prank(validator); - accounts.createAccount(); - authorizeValidatorSigner(signerPk, validator); - - governance.addValidator(signer); - - validators.push(validator); - signers.push(signer); - } - } -} - -contract GovernanceTest_hotfixWhitelistValidatorTally is 
- GovernanceTest_hotfixWhitelistValidatorTally_setup -{ - function test_countValidatorAccountsThatHaveWhitelisted() public { - for (uint256 i = 0; i < 3; i++) { - vm.prank(validators[i]); - governance.whitelistHotfix(HOTFIX_HASH); - } - - assertEq(governance.hotfixWhitelistValidatorTally(HOTFIX_HASH), 3); - } - - function test_count_authorizedValidatorSignersThatHaveWhitelisted() public { - for (uint256 i = 0; i < 3; i++) { - vm.prank(signers[i]); - governance.whitelistHotfix(HOTFIX_HASH); - } - - assertEq(governance.hotfixWhitelistValidatorTally(HOTFIX_HASH), 3); - } - - function test_notDoubleCountValidatorAccountAndAuthorizedSignerAccounts() public { - for (uint256 i = 0; i < 3; i++) { - vm.prank(validators[i]); - governance.whitelistHotfix(HOTFIX_HASH); - vm.prank(signers[i]); - governance.whitelistHotfix(HOTFIX_HASH); - } - - assertEq(governance.hotfixWhitelistValidatorTally(HOTFIX_HASH), 3); - } - - function test_returnTheCorrectTallyAfterKeyRotation() public { - for (uint256 i = 0; i < 3; i++) { - vm.prank(signers[i]); - governance.whitelistHotfix(HOTFIX_HASH); - } - - // rotate signer - uint256 signerPk = 44; - authorizeValidatorSigner(signerPk, validators[0]); - - assertEq(governance.hotfixWhitelistValidatorTally(HOTFIX_HASH), 3); - } -} - -contract GovernanceTest_hotfixWhitelistValidatorTally_L2 is - GovernanceTest_L2, - GovernanceTest_hotfixWhitelistValidatorTally_setup -{ - function test_Reverts_WhenCalled() public { - address validator = actor("validator1"); - governance.addValidator(validator); - vm.expectRevert("This method is no longer supported in L2."); - governance.hotfixWhitelistValidatorTally(HOTFIX_HASH); - } -} - -contract GovernanceTest_isHotfixPassing_setup is GovernanceTest { - bytes32 constant HOTFIX_HASH = bytes32(uint256(0x123456789)); - address validator1; - address validator2; - - function setUp() public { - super.setUp(); - validator1 = actor("validator1"); - governance.addValidator(validator1); - vm.prank(validator1); - 
accounts.createAccount(); - - validator2 = actor("validator2"); - governance.addValidator(validator2); - vm.prank(validator2); - accounts.createAccount(); - } -} - -contract GovernanceTest_isHotfixPassing is GovernanceTest_isHotfixPassing_setup { - function test_returnFalseWhenHotfixHasNotBeenWhitelisted() public { - assertFalse(governance.isHotfixPassing(HOTFIX_HASH)); - } - - function test_returnFalseWhenHotfixHasBeenWhitelistedButNotByQuorum() public { - vm.prank(validator1); - governance.whitelistHotfix(HOTFIX_HASH); - assertFalse(governance.isHotfixPassing(HOTFIX_HASH)); - } - - function test_returnTrueWhenHotfixIsWhitelistedByQuorum() public { - vm.prank(validator1); - governance.whitelistHotfix(HOTFIX_HASH); - vm.prank(validator2); - governance.whitelistHotfix(HOTFIX_HASH); - assertTrue(governance.isHotfixPassing(HOTFIX_HASH)); - } -} - -contract GovernanceTest_isHotfixPassing_L2 is - GovernanceTest_L2, - GovernanceTest_isHotfixPassing_setup -{ - function test_Reverts_WhenCalled() public { - vm.expectRevert("This method is no longer supported in L2."); - governance.isHotfixPassing(HOTFIX_HASH); - } -} - contract GovernanceTest_prepareHotfix is GovernanceTest { bytes32 constant HOTFIX_HASH = bytes32(uint256(0x123456789)); - address validator1; event HotfixPrepared(bytes32 indexed hash, uint256 indexed epoch); function setUp() public { super.setUp(); - validator1 = actor("validator1"); - governance.addValidator(validator1); - vm.prank(validator1); - accounts.createAccount(); - } - - function test_markHotfixRecordPreparedEpoch_whenHotfixIsPassing() public { - vm.roll(block.number + governance.getEpochSize()); - vm.prank(validator1); - governance.whitelistHotfix(HOTFIX_HASH); - governance.prepareHotfix(HOTFIX_HASH); - (, , uint256 preparedEpoch) = governance.getL1HotfixRecord(HOTFIX_HASH); - - assertEq(preparedEpoch, governance.getEpochNumber()); - } - - function test_emitHotfixPreparedEvent_whenHotfixIsPassing() public { - vm.roll(block.number + 
governance.getEpochSize()); - vm.prank(validator1); - governance.whitelistHotfix(HOTFIX_HASH); - uint256 epoch = governance.getEpochNumber(); - vm.expectEmit(true, true, true, true); - emit HotfixPrepared(HOTFIX_HASH, epoch); - governance.prepareHotfix(HOTFIX_HASH); - } - - function test_succeedForEpochDifferentPreparedEpoch_whenHotfixIsPassing() public { - vm.roll(block.number + governance.getEpochSize()); - vm.prank(validator1); - governance.whitelistHotfix(HOTFIX_HASH); - governance.prepareHotfix(HOTFIX_HASH); - vm.roll(block.number + governance.getEpochSize()); - governance.prepareHotfix(HOTFIX_HASH); - } - - function test_Reverts_IfHotfixIsNotPassing() public { - vm.expectRevert("hotfix not whitelisted by 2f+1 validators"); - governance.prepareHotfix(HOTFIX_HASH); - } - - function test_Reverts_IfEpochEqualsPreparedEpoch_whenHotfixIsPassing() public { - vm.roll(block.number + governance.getEpochSize()); - vm.prank(validator1); - governance.whitelistHotfix(HOTFIX_HASH); - governance.prepareHotfix(HOTFIX_HASH); - vm.expectRevert("hotfix already prepared for this epoch"); - governance.prepareHotfix(HOTFIX_HASH); - } -} - -contract GovernanceTest_prepareHotfix_L2 is GovernanceTest { - bytes32 constant HOTFIX_HASH = bytes32(uint256(0x123456789)); - event HotfixPrepared(bytes32 indexed hash, uint256 indexed epoch); - - function setUp() public { - super.setUp(); - _whenL2(); vm.prank(accOwner); governance.setSecurityCouncil(accCouncil); } @@ -3424,7 +3055,7 @@ contract GovernanceTest_prepareHotfix_L2 is GovernanceTest { governance.approveHotfix(HOTFIX_HASH); governance.prepareHotfix(HOTFIX_HASH); - (, , , uint256 preparedTimeLimit) = governance.getL2HotfixRecord(HOTFIX_HASH); + (, , , uint256 preparedTimeLimit) = governance.getHotfixRecord(HOTFIX_HASH); assertEq(preparedTimeLimit, block.timestamp + DAY); } @@ -3446,7 +3077,7 @@ contract GovernanceTest_prepareHotfix_L2 is GovernanceTest { timeTravel(DAY + 3600); governance.resetHotFixRecord(HOTFIX_HASH); - (_approved, 
_councilApproved, , _preparedTimeLimit) = governance.getL2HotfixRecord(HOTFIX_HASH); + (_approved, _councilApproved, , _preparedTimeLimit) = governance.getHotfixRecord(HOTFIX_HASH); assertFalse(_approved); assertFalse(_councilApproved); @@ -3458,7 +3089,7 @@ contract GovernanceTest_prepareHotfix_L2 is GovernanceTest { governance.approveHotfix(HOTFIX_HASH); governance.prepareHotfix(HOTFIX_HASH); - (_approved, _councilApproved, , _preparedTimeLimit) = governance.getL2HotfixRecord(HOTFIX_HASH); + (_approved, _councilApproved, , _preparedTimeLimit) = governance.getHotfixRecord(HOTFIX_HASH); assertTrue(_approved); assertTrue(_councilApproved); @@ -3536,43 +3167,7 @@ contract GovernanceTest_resetHotfix_setup is GovernanceTest { } } -contract GovernanceTest_resetHotfix is GovernanceTest_resetHotfix_setup { - function setUp() public { - super.setUp(); - - validator1 = actor("validator1"); - governance.addValidator(validator1); - vm.prank(validator1); - accounts.createAccount(); - } - - function test_Reverts_whenCalledOnL1() public { - vm.prank(accOwner); - governance.setHotfixExecutionTimeWindow(DAY); - - vm.prank(accApprover); - governance.approveHotfix(HOTFIX_HASH); - - (bool approved, , ) = governance.getHotfixRecord(HOTFIX_HASH); - - assertTrue(approved); - - vm.roll(block.number + governance.getEpochSize()); - vm.prank(validator1); - governance.whitelistHotfix(HOTFIX_HASH); - - uint256 epoch = governance.getEpochNumber(); - - governance.prepareHotfix(HOTFIX_HASH); - - timeTravel(DAY + 1); - - vm.expectRevert("hotfix not prepared"); - governance.resetHotFixRecord(HOTFIX_HASH); - } -} - -contract GovernanceTest_resetHotfix_L2 is GovernanceTest_L2, GovernanceTest_resetHotfix_setup { +contract GovernanceTest_resetHotfix is GovernanceTest, GovernanceTest_resetHotfix_setup { function test_ShouldResetHotfixRecordWhenExecutionTimeLimitHasPassed() public { vm.prank(accOwner); governance.setHotfixExecutionTimeWindow(DAY); @@ -3584,7 +3179,7 @@ contract 
GovernanceTest_resetHotfix_L2 is GovernanceTest_L2, GovernanceTest_rese governance.approveHotfix(HOTFIX_HASH); (bool approved, bool councilApproved, , uint256 _preparedTimeLimit) = governance - .getL2HotfixRecord(HOTFIX_HASH); + .getHotfixRecord(HOTFIX_HASH); assertTrue(approved); assertTrue(councilApproved); @@ -3593,7 +3188,7 @@ contract GovernanceTest_resetHotfix_L2 is GovernanceTest_L2, GovernanceTest_rese timeTravel(DAY + 1); governance.resetHotFixRecord(HOTFIX_HASH); - (approved, councilApproved, , _preparedTimeLimit) = governance.getL2HotfixRecord(HOTFIX_HASH); + (approved, councilApproved, , _preparedTimeLimit) = governance.getHotfixRecord(HOTFIX_HASH); assertFalse(approved); assertFalse(councilApproved); } @@ -3671,105 +3266,7 @@ contract GovernanceTest_executeHotfix is GovernanceTest { function setUp() public { super.setUp(); - validator = actor("validator"); - vm.prank(validator); - accounts.createAccount(); - governance.addValidator(validator); - // call governance test method to generate proper hotfix (needs calldata arguments) - hotfixHash = governance.getHotfixHash( - okProp.values, - okProp.destinations, - okProp.data, - okProp.dataLengths, - SALT - ); - } - - function test_Reverts_IfHotfixNotApproved() public { - vm.expectRevert("hotfix not approved"); - executeHotfixTx(); - } - - function test_Reverts_IfHotfixNotPreparedForCurrentEpoch() public { - vm.roll(block.number + governance.getEpochSize()); - vm.prank(accApprover); - governance.approveHotfix(hotfixHash); - - vm.expectRevert("hotfix must be prepared for this epoch"); - executeHotfixTx(); - } - - function test_Reverts_IfHotfixPreparedButNotForCurrentEpoch() public { - vm.prank(accApprover); - governance.approveHotfix(hotfixHash); - vm.prank(validator); - governance.whitelistHotfix(hotfixHash); - governance.prepareHotfix(hotfixHash); - vm.roll(block.number + governance.getEpochSize()); - vm.expectRevert("hotfix must be prepared for this epoch"); - executeHotfixTx(); - } - - function 
test_executeHotfix_WhenApprovedAndPreparedForCurrentEpoch() public {
-    approveAndPrepareHotfix();
-    executeHotfixTx();
-    assertEq(testTransactions.getValue(1), 1);
-  }
-
-  function test_markHotfixAsExecuted_WhenApprovedAndPreparedForCurrentEpoch() public {
-    approveAndPrepareHotfix();
-    executeHotfixTx();
-    (, bool executed, ) = governance.getL1HotfixRecord(hotfixHash);
-    assertTrue(executed);
-  }
-
-  function test_emitHotfixExecutedEvent_WhenApprovedAndPreparedForCurrentEpoch() public {
-    approveAndPrepareHotfix();
-    vm.expectEmit(true, true, true, true);
-    emit HotfixExecuted(hotfixHash);
-    executeHotfixTx();
-  }
-
-  function test_notBeExecutableAgain_WhenApprovedAndPreparedForCurrentEpoch() public {
-    approveAndPrepareHotfix();
-    executeHotfixTx();
-    vm.expectRevert("hotfix already executed");
-    executeHotfixTx();
-  }
-
-  function executeHotfixTx() private {
-    governance.executeHotfix(
-      okProp.values,
-      okProp.destinations,
-      okProp.data,
-      okProp.dataLengths,
-      SALT
-    );
-  }
-
-  function approveAndPrepareHotfix() private {
-    vm.prank(accApprover);
-    governance.approveHotfix(hotfixHash);
-    vm.roll(block.number + governance.getEpochSize());
-    vm.prank(validator);
-    governance.whitelistHotfix(hotfixHash);
-    governance.prepareHotfix(hotfixHash);
-  }
-}
-
-contract GovernanceTest_executeHotfix_L2 is GovernanceTest {
-  bytes32 SALT = 0x657ed9d64e84fa3d1af43b3a307db22aba2d90a158015df1c588c02e24ca08f0;
-  bytes32 hotfixHash;
-
-  address validator;
-
-  event HotfixExecuted(bytes32 indexed hash);
-
-  function setUp() public {
-    super.setUp();
-
-    _whenL2();
     vm.prank(accOwner);
     governance.setSecurityCouncil(accCouncil);
     vm.prank(accOwner);
@@ -3795,7 +3292,7 @@ contract GovernanceTest_executeHotfix_L2 is GovernanceTest {
     approveAndPrepareHotfix();
     executeHotfixTx();
-    (, , bool executed, ) = governance.getL2HotfixRecord(hotfixHash);
+    (, , bool executed, ) = governance.getHotfixRecord(hotfixHash);
     assertTrue(executed);
   }
@@ -3916,8 +3413,6 @@ contract GovernanceTest_isVoting is GovernanceTest {
   }
 }
 
-contract GovernanceTest_isVoting_L2 is GovernanceTest_L2, GovernanceTest_isVoting {}
-
 contract GovernanceTest_isProposalPassing is GovernanceTest {
   address accSndVoter;
@@ -3960,11 +3455,6 @@ contract GovernanceTest_isProposalPassing is GovernanceTest {
   }
 }
 
-contract GovernanceTest_isProposalPassing_L2 is
-  GovernanceTest_L2,
-  GovernanceTest_isProposalPassing
-{}
-
 contract GovernanceTest_dequeueProposalsIfReady is GovernanceTest {
   function test_notUpdateLastDequeueWhenThereAreNoQueuedProposals() public {
     uint256 originalLastDequeue = governance.lastDequeue();
@@ -3998,11 +3488,6 @@ contract GovernanceTest_dequeueProposalsIfReady is GovernanceTest {
   }
 }
 
-contract GovernanceTest_dequeueProposalsIfReady_L2 is
-  GovernanceTest_L2,
-  GovernanceTest_dequeueProposalsIfReady
-{}
-
 contract GovernanceTest_getProposalStage is GovernanceTest {
   function test_returnNoneStageWhenProposalDoesNotExists() public {
     assertEq(uint256(governance.getProposalStage(0)), uint256(Proposals.Stage.None));
@@ -4135,8 +3620,6 @@ contract GovernanceTest_getProposalStage is GovernanceTest {
   }
 }
 
-contract GovernanceTest_getProposalStage_L2 is GovernanceTest_L2, GovernanceTest_getProposalStage {}
-
 contract GovernanceTest_getAmountOfGoldUsedForVoting is GovernanceTest {
   function test_showCorrectNumberOfVotes_whenVotingOn1ConcurrentProposal() public {
     makeAndApprove3ConcurrentProposals();
@@ -4265,11 +3748,6 @@ contract GovernanceTest_getAmountOfGoldUsedForVoting is GovernanceTest {
   }
 }
 
-contract GovernanceTest_getAmountOfGoldUsedForVoting_L2 is
-  GovernanceTest_L2,
-  GovernanceTest_getAmountOfGoldUsedForVoting
-{}
-
 contract GovernanceTest_removeVotesWhenRevokingDelegatedVotes is GovernanceTest {
   uint256[] proposalIds;
@@ -4430,8 +3908,3 @@ contract GovernanceTest_removeVotesWhenRevokingDelegatedVotes is GovernanceTest
     assertVoteRecord(2, proposalIds[2], 0, 0, 51);
   }
 }
-
-contract GovernanceTest_removeVotesWhenRevokingDelegatedVotes_L2 is
-  GovernanceTest_L2,
-  GovernanceTest_removeVotesWhenRevokingDelegatedVotes
-{}
diff --git a/packages/protocol/test-sol/unit/governance/network/GovernanceSlasher.t.sol b/packages/protocol/test-sol/unit/governance/network/GovernanceSlasher.t.sol
index eb8e8599c95..c4dd69111d8 100644
--- a/packages/protocol/test-sol/unit/governance/network/GovernanceSlasher.t.sol
+++ b/packages/protocol/test-sol/unit/governance/network/GovernanceSlasher.t.sol
@@ -2,7 +2,6 @@ pragma solidity ^0.5.13;
 
 import { TestWithUtils } from "@test-sol/TestWithUtils.sol";
-import "@test-sol/utils/WhenL2.sol";
 
 import "@celo-contracts/common/Accounts.sol";
 import "@celo-contracts/common/FixidityLib.sol";
@@ -13,8 +12,7 @@ import "@celo-contracts/governance/GovernanceSlasher.sol";
 
 contract GovernanceSlasherTest is TestWithUtils {
   event SlashingApproved(address indexed account, uint256 amount);
-  event GovernanceSlashPerformed(address indexed account, uint256 amount);
-  event GovernanceSlashL2Performed(address indexed account, address indexed group, uint256 amount);
+  event GovernanceSlashPerformed(address indexed account, address indexed group, uint256 amount);
   event HavelSlashingMultiplierHalved(address validator);
   event ValidatorDeaffiliatedCalled(address validator);
@@ -34,6 +32,13 @@ contract GovernanceSlasherTest is TestWithUtils {
 
   function setUp() public {
     super.setUp();
+    preSetup();
+    governanceSlasher.initialize(REGISTRY_ADDRESS);
+    mockLockedGold.setAccountTotalLockedGold(validator, 5000);
+    whenL2WithEpochManagerInitialization();
+  }
+
+  function preSetup() public {
     owner = address(this);
     nonOwner = actor("nonOwner");
     validator = actor("validator");
@@ -46,15 +51,16 @@ contract GovernanceSlasherTest is TestWithUtils {
 
     registry.setAddressFor("Accounts", address(accounts));
     registry.setAddressFor("LockedGold", address(mockLockedGold));
-
-    governanceSlasher.initialize(REGISTRY_ADDRESS);
-    mockLockedGold.setAccountTotalLockedGold(validator, 5000);
   }
 }
 
-contract GovernanceSlasherTest_L2 is GovernanceSlasherTest, WhenL2 {}
-
 contract GovernanceSlasherTest_initialize is GovernanceSlasherTest {
+  function setUp() public {
+    super.setUp();
+    preSetup();
+    governanceSlasher.initialize(REGISTRY_ADDRESS);
+    mockLockedGold.setAccountTotalLockedGold(validator, 5000);
+  }
   function test_shouldHaveSetOwner() public {
     assertEq(governanceSlasher.owner(), owner);
   }
@@ -83,7 +89,7 @@ contract GovernanceSlasherTest_approveSlashing is GovernanceSlasherTest {
     governanceSlasher.approveSlashing(slashedAddress, 1000);
   }
 
-  function test_EmitsSlashingApprovedEvent() public {
+  function test_Emits_SlashingApprovedEvent() public {
     vm.expectEmit(true, true, true, true);
     emit SlashingApproved(slashedAddress, 1000);
     governanceSlasher.approveSlashing(slashedAddress, 1000);
@@ -96,87 +102,40 @@
   }
 }
 
-contract GovernanceSlasherTest_approveSlashing_L2 is
-  GovernanceSlasherTest_L2,
-  GovernanceSlasherTest_approveSlashing
-{}
+contract GovernanceSlasherTest_slash_WhenNotGroup is GovernanceSlasherTest {
+  address group = address(0);
 
-contract GovernanceSlasherTest_slash is GovernanceSlasherTest {
-  function test_ShouldFailIfThereIsNothingToSlash() public {
-    vm.expectRevert("No penalty given by governance");
-    governanceSlasher.slash(validator, lessers, greaters, indices);
-  }
+  // only owner or multisig can call
 
   function test_ShouldDecrementCelo() public {
     governanceSlasher.approveSlashing(validator, 1000);
-    governanceSlasher.slash(validator, lessers, greaters, indices);
+    governanceSlasher.slashL2(validator, group, lessers, greaters, indices);
     assertEq(mockLockedGold.accountTotalLockedGold(validator), 4000);
   }
 
   function test_ShouldHaveSetTheApprovedSlashingToZero() public {
     governanceSlasher.approveSlashing(validator, 1000);
-    governanceSlasher.slash(validator, lessers, greaters, indices);
+    governanceSlasher.slashL2(validator, group, lessers, greaters, indices);
     assertEq(governanceSlasher.getApprovedSlashing(validator), 0);
   }
 
-  function test_EmitsGovernanceSlashPerformedEvent() public {
+  function test_Emits_GovernanceSlashPerformedEventWhenCallingSlashL2() public {
     governanceSlasher.approveSlashing(validator, 1000);
     vm.expectEmit(true, true, true, true);
-    emit GovernanceSlashPerformed(validator, 1000);
-    governanceSlasher.slash(validator, lessers, greaters, indices);
-  }
-}
-
-contract GovernanceSlasherTest_slash_L2 is GovernanceSlasherTest_L2 {
-  function test_Reverts_WhenL2() public {
-    governanceSlasher.approveSlashing(validator, 1000);
-    vm.expectRevert("This method is no longer supported in L2.");
-    governanceSlasher.slash(validator, lessers, greaters, indices);
-  }
-}
-
-contract GovernanceSlasherTest_slashL2_WhenL1 is GovernanceSlasherTest {
-  function test_Reverts() public {
-    governanceSlasher.approveSlashing(validator, 1000);
-    vm.expectRevert("This method is not supported in L1.");
-    governanceSlasher.slashL2(
-      validator,
-      validator,
-      new address[](0),
-      new address[](0),
-      new uint256[](0)
-    );
-  }
-}
-
-// should work just like the deprecated version
-contract GovernanceSlasherTest_slashL2_WhenNotGroup_L2 is GovernanceSlasherTest_L2 {
-  address group = address(0);
-
-  // only onwer or multisig can call
-
-  function test_ShouldDecrementCelo() public {
-    governanceSlasher.approveSlashing(validator, 1000);
+    emit GovernanceSlashPerformed(validator, group, 1000);
     governanceSlasher.slashL2(validator, group, lessers, greaters, indices);
-    assertEq(mockLockedGold.accountTotalLockedGold(validator), 4000);
   }
 
-  function test_ShouldHaveSetTheApprovedSlashingToZero() public {
-    governanceSlasher.approveSlashing(validator, 1000);
-    governanceSlasher.slashL2(validator, group, lessers, greaters, indices);
-    assertEq(governanceSlasher.getApprovedSlashing(validator), 0);
-  }
-
-  function test_EmitsGovernanceSlashPerformedEvent() public {
+  function test_Emits_GovernanceSlashPerformedEventWhenCallingSlash() public {
     governanceSlasher.approveSlashing(validator, 1000);
     vm.expectEmit(true, true, true, true);
-    emit GovernanceSlashL2Performed(validator, group, 1000);
-    governanceSlasher.slashL2(validator, group, lessers, greaters, indices);
+    emit GovernanceSlashPerformed(validator, group, 1000);
+    governanceSlasher.slash(validator, group, lessers, greaters, indices);
   }
 }
 
 // should work just like the deprecated version
-contract GovernanceSlasherTest_slashL2_WhenGroup_L2 is GovernanceSlasherTest_L2 {
+contract GovernanceSlasherTest_slash_WhenGroup is GovernanceSlasherTest {
   address group;
 
   MockValidators validators;
@@ -212,13 +171,20 @@
     assertEq(governanceSlasher.getApprovedSlashing(validator), 0);
   }
 
-  function test_EmitsGovernanceSlashPerformedEvent() public {
+  function test_Emits_GovernanceSlashPerformedEvent() public {
     governanceSlasher.approveSlashing(validator, 1000);
     vm.expectEmit(true, true, true, true);
-    emit GovernanceSlashL2Performed(validator, group, 1000);
+    emit GovernanceSlashPerformed(validator, group, 1000);
     governanceSlasher.slashL2(validator, group, lessers, greaters, indices);
   }
 
+  function test_Emits_GovernanceSlashPerformedEventWhenCallingSlash() public {
+    governanceSlasher.approveSlashing(validator, 1000);
+    vm.expectEmit(true, true, true, true);
+    emit GovernanceSlashPerformed(validator, group, 1000);
+    governanceSlasher.slash(validator, group, lessers, greaters, indices);
+  }
+
   function test_validatorDeAffiliatedAndScoreReduced() public {
     governanceSlasher.approveSlashing(validator, 100);
@@ -255,8 +221,3 @@ contract GovernanceSlasherTest_setSlasherExecuter is GovernanceSlasherTest {
     assertEq(governanceSlasher.getSlasherExecuter(), nonOwner, "Score Manager not set");
   }
 }
-
-contract GovernanceSlasherTest_setSlasherExecuter_L2 is
-  GovernanceSlasherTest_L2,
-  GovernanceSlasherTest_setSlasherExecuter
-{}
diff --git a/packages/protocol/test-sol/unit/governance/network/Proposal.t.sol b/packages/protocol/test-sol/unit/governance/network/Proposal.t.sol
index a065a005767..8bd5e4d713d 100644
--- a/packages/protocol/test-sol/unit/governance/network/Proposal.t.sol
+++ b/packages/protocol/test-sol/unit/governance/network/Proposal.t.sol
@@ -1,25 +1,24 @@
 // SPDX-License-Identifier: UNLICENSED
 pragma solidity ^0.5.13;
 
-import "celo-foundry/Test.sol";
-import "@test-sol/utils/WhenL2.sol";
+import { TestWithUtils } from "@test-sol/TestWithUtils.sol";
 
 import "@celo-contracts/governance/Proposals.sol";
 import "@celo-contracts/common/FixidityLib.sol";
 
-contract ProposalTest is Test {
+contract ProposalTest is TestWithUtils {
   using Proposals for Proposals.Proposal;
   using FixidityLib for FixidityLib.Fraction;
 
   Proposals.Proposal internal proposal;
 
   function setUp() public {
+    super.setUp();
     proposal.networkWeight = 100;
+    whenL2WithEpochManagerInitialization();
   }
 }
 
-contract ProposalTest_L2 is WhenL2, ProposalTest {}
-
 contract ProposalTest_getSupportWithQuorumPadding is ProposalTest {
   function test_ShouldReturnSupportRatioWhenParticipationAboveCriticalBaseline() public {
     proposal.votes.yes = 15;
@@ -52,8 +51,3 @@ contract ProposalTest_getSupportWithQuorumPadding is ProposalTest {
     assertEq(proposal.getSupportWithQuorumPadding(quorum).unwrap(), 0);
   }
 }
-
-contract ProposalTest_getSupportWithQuorumPadding_L2 is
-  ProposalTest_L2,
-  ProposalTest_getSupportWithQuorumPadding
-{}
diff --git a/packages/protocol/test-sol/unit/governance/validators/DoubleSigningSlasher.t.sol b/packages/protocol/test-sol/unit/governance/validators/DoubleSigningSlasher.t.sol
index 7edde24c64b..0456c3f229e 100644
--- a/packages/protocol/test-sol/unit/governance/validators/DoubleSigningSlasher.t.sol
+++ b/packages/protocol/test-sol/unit/governance/validators/DoubleSigningSlasher.t.sol
@@ -2,8 +2,7 @@ pragma solidity ^0.5.13;
 pragma experimental ABIEncoderV2;
 
-import "celo-foundry/Test.sol";
-import { TestConstants } from "@test-sol/constants.sol";
+import { TestWithUtils } from "@test-sol/TestWithUtils.sol";
 
 import "@celo-contracts/common/FixidityLib.sol";
 import "@celo-contracts/common/Registry.sol";
@@ -13,7 +12,11 @@
 import "@celo-contracts/governance/test/MockLockedGold.sol";
 import "@celo-contracts/governance/DoubleSigningSlasher.sol";
 import "@celo-contracts/governance/test/MockUsingPrecompiles.sol";
 
-contract DoubleSigningSlasherTest is DoubleSigningSlasher(true), MockUsingPrecompiles, Test {
+contract DoubleSigningSlasherTest is
+  DoubleSigningSlasher(true),
+  MockUsingPrecompiles,
+  TestWithUtils
+{
   struct SlashParams {
     address signer;
     uint256 index;
@@ -51,12 +54,11 @@
   }
 }
 
-contract DoubleSigningSlasherBaseTest is Test, TestConstants {
+contract DoubleSigningSlasherBaseTest is TestWithUtils {
   using FixidityLib for FixidityLib.Fraction;
 
   SlashingIncentives public expectedSlashingIncentives;
 
-  Registry registry;
   Accounts accounts;
   MockValidators validators;
   MockLockedGold lockedGold;
@@ -96,6 +98,7 @@
   event DoubleSigningSlashPerformed(address indexed validator, uint256 indexed blockNumber);
 
   function setUp() public {
+    super.setUp();
     ph.setEpochSize(100);
     (nonOwner, nonOwnerPK) = actorWithPK("nonOwner");
     (validator, validatorPK) = actorWithPK("validator");
@@ -111,8 +114,6 @@
     lockedGold = new MockLockedGold();
     slasher = new DoubleSigningSlasherTest();
 
-    registry = Registry(REGISTRY_ADDRESS);
-
     accounts.createAccount();
     vm.prank(nonOwner);
@@ -153,10 +154,7 @@
     lockedGold.setAccountTotalLockedGold(otherValidator, 50000);
     lockedGold.setAccountTotalLockedGold(group, 50000);
     lockedGold.setAccountTotalLockedGold(otherGroup, 50000);
-  }
-
-  function _whenL2() public {
-    deployCodeTo("Registry.sol", abi.encode(false), PROXY_ADMIN_ADDRESS);
+    whenL2WithEpochManagerInitialization();
   }
 }
@@ -178,40 +176,10 @@
 }
 
 contract DoubleSigningSlasherSetSlashingIncentives is DoubleSigningSlasherBaseTest {
-  function test_RevertWhen_CalledByNonOwner() public {
-    vm.prank(nonOwner);
-    vm.expectRevert("Ownable: caller is not the owner");
-    slasher.setSlashingIncentives(123, 67);
-  }
-
-  function test_RevertWhen_RewardGreaterThanPenalty() public {
-    vm.expectRevert("Penalty has to be larger than reward");
-    slasher.setSlashingIncentives(123, 678);
-  }
-
-  function test_ShouldSetSlashingIncentives() public {
-    uint256 newPenalty = 123;
-    uint256 newReward = 67;
-    slasher.setSlashingIncentives(newPenalty, newReward);
-
-    (uint256 actualPenalty, uint256 actualReward) = slasher.slashingIncentives();
-    assertEq(actualPenalty, newPenalty);
-    assertEq(actualReward, newReward);
-  }
-
-  function test_Emits_SlashingIncentivesSetEvent() public {
-    uint256 newPenalty = 123;
-    uint256 newReward = 67;
-    vm.expectEmit(true, true, true, true);
-    emit SlashingIncentivesSet(newPenalty, newReward);
-    slasher.setSlashingIncentives(newPenalty, newReward);
-  }
-
   function test_ShouldRevert_WhenInL2() public {
     uint256 newPenalty = 123;
     uint256 newReward = 67;
-    _whenL2();
     vm.expectRevert("This method is no longer supported in L2.");
     slasher.setSlashingIncentives(newPenalty, newReward);
   }
@@ -235,181 +203,7 @@ contract DoubleSigningSlasherSlash is DoubleSigningSlasherBaseTest {
   address[] groupElectionGreaters = new address[](0);
   uint256[] groupElectionIndices = new uint256[](0);
 
-  function setUp() public {
-    super.setUp();
-
-    slasher.setBlockNumber(headerA, blockNumber);
-    slasher.setBlockNumber(headerB, blockNumber + 1);
-    slasher.setBlockNumber(headerC, blockNumber);
-
-    epoch = slasher.getEpochNumberOfBlock(blockNumber);
-
-    slasher.setEpochSigner(epoch, validatorIndex, address(validator));
-    slasher.setEpochSigner(epoch, otherValidatorIndex, address(otherValidator));
-
-    slasher.setNumberValidators(7);
-    slasher.setVerifiedSealBitmap(headerA, bitmap);
-    slasher.setVerifiedSealBitmap(headerB, bitmap);
-    slasher.setVerifiedSealBitmap(headerC, bitmap);
-  }
-
-  function test_RevertIf_BlockNumbersDoNotMatch() public {
-    params = DoubleSigningSlasherTest.SlashParams({
-      signer: validator,
-      index: validatorIndex,
-      headerA: headerA,
-      headerB: headerB,
-      groupMembershipHistoryIndex: 0,
-      validatorElectionLessers: validatorElectionLessers,
-      validatorElectionGreaters: validatorElectionGreaters,
-      validatorElectionIndices: validatorElectionIndices,
-      groupElectionLessers: groupElectionLessers,
-      groupElectionGreaters: groupElectionGreaters,
-      groupElectionIndices: groupElectionIndices
-    });
-    vm.expectRevert("Block headers are from different height");
-    slasher.mockSlash(params, validator);
-  }
-
-  function test_RevertIf_NotSignedAtIndex() public {
-    params = DoubleSigningSlasherTest.SlashParams({
-      signer: otherValidator,
-      index: otherValidatorIndex,
-      headerA: headerA,
-      headerB: headerC,
-      groupMembershipHistoryIndex: 0,
-      validatorElectionLessers: validatorElectionLessers,
-      validatorElectionGreaters: validatorElectionGreaters,
-      validatorElectionIndices: validatorElectionIndices,
-      groupElectionLessers: groupElectionLessers,
-      groupElectionGreaters: groupElectionGreaters,
-      groupElectionIndices: groupElectionIndices
-    });
-
-    vm.expectRevert("Didn't sign first block");
-
-    slasher.mockSlash(params, otherValidator);
-  }
-
-  function test_RevertIf_EpochSignerIsWrong() public {
-    params = DoubleSigningSlasherTest.SlashParams({
-      signer: otherValidator,
-      index: validatorIndex,
-      headerA: headerA,
-      headerB: headerC,
-      groupMembershipHistoryIndex: 0,
-      validatorElectionLessers: validatorElectionLessers,
-      validatorElectionGreaters: validatorElectionGreaters,
-      validatorElectionIndices: validatorElectionIndices,
-      groupElectionLessers: groupElectionLessers,
-      groupElectionGreaters: groupElectionGreaters,
-      groupElectionIndices: groupElectionIndices
-    });
-
-    vm.expectRevert("Wasn't a signer with given index");
-    slasher.mockSlash(params, validator);
-  }
-
-  function test_RevertIf_NotEnoughSigners() public {
-    slasher.setNumberValidators(100);
-
-    params = DoubleSigningSlasherTest.SlashParams({
-      signer: validator,
-      index: validatorIndex,
-      headerA: headerA,
-      headerB: headerC,
-      groupMembershipHistoryIndex: 0,
-      validatorElectionLessers: validatorElectionLessers,
-      validatorElectionGreaters: validatorElectionGreaters,
-      validatorElectionIndices: validatorElectionIndices,
-      groupElectionLessers: groupElectionLessers,
-      groupElectionGreaters: groupElectionGreaters,
-      groupElectionIndices: groupElectionIndices
-    });
-
-    vm.expectRevert("Not enough signers in the first block");
-    slasher.mockSlash(params, validator);
-  }
-
-  function test_Emits_DoubleSigningSlashPerformedEvent() public {
-    params = DoubleSigningSlasherTest.SlashParams({
-      signer: validator,
-      index: validatorIndex,
-      headerA: headerA,
-      headerB: headerC,
-      groupMembershipHistoryIndex: 0,
-      validatorElectionLessers: validatorElectionLessers,
-      validatorElectionGreaters: validatorElectionGreaters,
-      validatorElectionIndices: validatorElectionIndices,
-      groupElectionLessers: groupElectionLessers,
-      groupElectionGreaters: groupElectionGreaters,
-      groupElectionIndices: groupElectionIndices
-    });
-    vm.expectEmit(true, true, true, true);
-    emit DoubleSigningSlashPerformed(validator, blockNumber);
-    slasher.mockSlash(params, validator);
-  }
-
-  function test_ShouldDecrementCELO() public {
-    params = DoubleSigningSlasherTest.SlashParams({
-      signer: validator,
-      index: validatorIndex,
-      headerA: headerA,
-      headerB: headerC,
-      groupMembershipHistoryIndex: 0,
-      validatorElectionLessers: validatorElectionLessers,
-      validatorElectionGreaters: validatorElectionGreaters,
-      validatorElectionIndices: validatorElectionIndices,
-      groupElectionLessers: groupElectionLessers,
-      groupElectionGreaters: groupElectionGreaters,
-      groupElectionIndices: groupElectionIndices
-    });
-
-    slasher.mockSlash(params, validator);
-
-    assertEq(lockedGold.accountTotalLockedGold(validator), 40000);
-  }
-
-  function test_ShouldAlsoSlashGroup() public {
-    params = DoubleSigningSlasherTest.SlashParams({
-      signer: validator,
-      index: validatorIndex,
-      headerA: headerA,
-      headerB: headerC,
-      groupMembershipHistoryIndex: 0,
-      validatorElectionLessers: validatorElectionLessers,
-      validatorElectionGreaters: validatorElectionGreaters,
-      validatorElectionIndices: validatorElectionIndices,
-      groupElectionLessers: groupElectionLessers,
-      groupElectionGreaters: groupElectionGreaters,
-      groupElectionIndices: groupElectionIndices
-    });
-
-    slasher.mockSlash(params, validator);
-    assertEq(lockedGold.accountTotalLockedGold(group), 40000);
-  }
-
-  function test_RevertWhen_SlashedTwice() public {
-    params = DoubleSigningSlasherTest.SlashParams({
-      signer: validator,
-      index: validatorIndex,
-      headerA: headerA,
-      headerB: headerC,
-      groupMembershipHistoryIndex: 0,
-      validatorElectionLessers: validatorElectionLessers,
-      validatorElectionGreaters: validatorElectionGreaters,
-      validatorElectionIndices: validatorElectionIndices,
-      groupElectionLessers: groupElectionLessers,
-      groupElectionGreaters: groupElectionGreaters,
-      groupElectionIndices: groupElectionIndices
-    });
-    slasher.mockSlash(params, validator);
-    vm.expectRevert("Already slashed");
-    slasher.mockSlash(params, validator);
-  }
-
   function test_Reverts_WhenL2() public {
-    _whenL2();
     params = DoubleSigningSlasherTest.SlashParams({
       signer: validator,
       index: validatorIndex,
diff --git a/packages/protocol/test-sol/unit/governance/validators/DowntimeSlasher.t.sol b/packages/protocol/test-sol/unit/governance/validators/DowntimeSlasher.t.sol
index 35416ebe177..63a9edb4085 100644
--- a/packages/protocol/test-sol/unit/governance/validators/DowntimeSlasher.t.sol
+++ b/packages/protocol/test-sol/unit/governance/validators/DowntimeSlasher.t.sol
@@ -2,8 +2,6 @@ pragma solidity ^0.5.13;
 pragma experimental ABIEncoderV2;
 
-import "celo-foundry/Test.sol";
-
 import "openzeppelin-solidity/contracts/math/SafeMath.sol";
 import "@celo-contracts/common/FixidityLib.sol";
 import "@celo-contracts/common/Registry.sol";
@@ -14,7 +12,7 @@
 import "@celo-contracts/governance/DowntimeSlasher.sol";
 import "@celo-contracts/governance/test/MockUsingPrecompiles.sol";
 import { TestWithUtils } from "@test-sol/TestWithUtils.sol";
 
-contract DowntimeSlasherMock is DowntimeSlasher(true), MockUsingPrecompiles, Test {
+contract DowntimeSlasherMock is DowntimeSlasher(true), MockUsingPrecompiles, TestWithUtils {
   struct SlashParams {
     uint256[] startBlocks;
     uint256[] endBlocks;
@@ -87,7 +85,6 @@ contract DowntimeSlasherTest is TestWithUtils {
     uint256 reward;
   }
 
-  Registry registry;
   Accounts accounts;
   MockValidators validators;
   MockLockedGold lockedGold;
@@ -162,6 +159,7 @@ contract DowntimeSlasherTest is TestWithUtils {
   );
 
   function setUp() public {
+    super.setUp();
     ph.setEpochSize(100);
     (nonOwner, nonOwnerPK) = actorWithPK("nonOwner");
     (validator, validatorPK) = actorWithPK("validator");
@@ -226,6 +224,7 @@ contract DowntimeSlasherTest is TestWithUtils {
     lockedGold.setAccountTotalLockedGold(otherValidator1, 50000);
     lockedGold.setAccountTotalLockedGold(group, 50000);
     lockedGold.setAccountTotalLockedGold(otherGroup, 50000);
+    whenL2WithEpochManagerInitialization();
   }
 
   // This function will wait until the middle of a new epoch is reached.
@@ -347,77 +346,22 @@ contract DowntimeSlasherTestInitialize is DowntimeSlasherTest {
 }
 
 contract DowntimeSlasherTestSetIncentives is DowntimeSlasherTest {
-  function test_CanOnlyBeCalledByOwner() public {
-    vm.expectRevert("Ownable: caller is not the owner");
-
-    vm.prank(nonOwner);
-    slasher.setSlashingIncentives(slashingPenalty, slashingReward);
-  }
-
-  function test_ShouldHaveSetSlashingIncentives() public {
-    uint256 _newPenalty = 123;
-    uint256 _newReward = 67;
-    slasher.setSlashingIncentives(_newPenalty, _newReward);
-
-    (uint256 _penalty, uint256 _reward) = slasher.slashingIncentives();
-
-    assertEq(_penalty, _newPenalty);
-    assertEq(_reward, _newReward);
-  }
-
   function test_Reverts_WhenInL2() public {
     uint256 _newPenalty = 123;
     uint256 _newReward = 67;
-    _whenL2();
+
     vm.expectRevert("This method is no longer supported in L2.");
     slasher.setSlashingIncentives(_newPenalty, _newReward);
   }
-
-  function test_Reverts_WhenRewardLargerThanPenalty() public {
-    vm.expectRevert("Penalty has to be larger than reward");
-    slasher.setSlashingIncentives(123, 678);
-  }
-
-  function test_Emits_SlashingIncentivesSetEvent() public {
-    vm.expectEmit(true, true, true, true);
-    emit SlashingIncentivesSet(123, 67);
-
-    slasher.setSlashingIncentives(123, 67);
-  }
 }
 
 contract DowntimeSlasherTestSetSlashableDowntime is DowntimeSlasherTest {
-  function test_CanOnlyBeCalledByOwner() public {
-    vm.expectRevert("Ownable: caller is not the owner");
-
-    vm.prank(nonOwner);
-    slasher.setSlashableDowntime(slashableDowntime);
-  }
-
-  function test_ShouldHaveSetSlashableDowntime() public {
-    uint256 _newSlashableDowntime = 23;
-
-    slasher.setSlashableDowntime(_newSlashableDowntime);
-
-    uint256 _slashableDowntime = slasher.slashableDowntime();
-
-    assertEq(_slashableDowntime, _newSlashableDowntime);
-  }
-
   function test_Reverts_WhenInL2() public {
     uint256 _newSlashableDowntime = 23;
-    _whenL2();
     vm.expectRevert("This method is no longer supported in L2.");
     slasher.setSlashableDowntime(_newSlashableDowntime);
   }
-
-  function test_Emits_SlashableDowntimeSetEvent() public {
-    vm.expectEmit(true, true, true, true);
-    emit SlashableDowntimeSet(23);
-
-    slasher.setSlashableDowntime(23);
-  }
 }
 
 contract DowntimeSlasherTestGetBitmapForInterval is DowntimeSlasherTest {
@@ -432,35 +376,15 @@ contract DowntimeSlasherTestGetBitmapForInterval is DowntimeSlasherTest {
     slasher.setEpochSigner(epoch, 0, validator);
   }
 
-  function test_Reverts_IfEndBlockIsLessThanStartBlock() public {
-    vm.expectRevert("endBlock must be greater or equal than startBlock");
-    slasher.getBitmapForInterval(3, 2);
-  }
-
-  function test_Reverts_IfCurrentBlockIsPartOfInterval() public {
-    vm.expectRevert("the signature bitmap for endBlock is not yet available");
-    slasher.getBitmapForInterval(blockNumber, blockNumber);
-  }
-
-  function test_Reverts_IfBlockIsOlderThan4Epochs() public {
+  function test_Reverts_WhenL2() public {
     epochSize = ph.epochSize();
     blockTravel(epochSize.mul(4).add(2));
 
     uint256 _blockNumber = block.number.sub(epochSize.mul(4));
 
-    vm.expectRevert("startBlock must be within 4 epochs of the current head");
+    vm.expectRevert("This method is no longer supported in L2.");
     slasher.getBitmapForInterval(_blockNumber, _blockNumber);
   }
-
-  function test_Reverts_IfStartBlockAndEndBlockAreNotFromSameEpoch() public {
-    blockTravel(ph.epochSize());
-
-    uint256 _currentEpoch = ph.epochSize();
-    uint256 _blockNumber = block.number.sub(2);
-
-    vm.expectRevert("startBlock and endBlock must be in the same epoch");
-    slasher.getBitmapForInterval(_blockNumber.sub(_currentEpoch), _blockNumber);
-  }
 }
 
 contract DowntimeSlasherTestSetBitmapForInterval is DowntimeSlasherTest {
@@ -491,32 +415,13 @@ contract DowntimeSlasherTestSetBitmapForInterval is DowntimeSlasherTest {
     );
   }
 
-  function test_Reverts_IfIntervalWasAlreadySet() public {
-    slasher.setBitmapForInterval(blockNumber, blockNumber.add(1));
-    vm.expectRevert("bitmap already set");
-
-    slasher.setBitmapForInterval(blockNumber, blockNumber.add(1));
-  }
-
-  function test_Emits_BitmapSetForIntervalEvent() public {
-    vm.expectEmit(true, true, true, true);
-    emit BitmapSetForInterval(
-      address(this),
-      blockNumber,
-      blockNumber.add(1),
-      bytes32(0x0000000000000000000000000000000000000000000000000000000000000003)
-    );
-    slasher.setBitmapForInterval(blockNumber, blockNumber.add(1));
-  }
-
   function test_Reverts_WhenInL2_SetBitmapForInterval() public {
-    _whenL2();
     vm.expectRevert("This method is no longer supported in L2.");
     slasher.setBitmapForInterval(blockNumber, blockNumber.add(1));
   }
 }
 
-contract DowntimeSlasherTestSlash_WhenIntervalInSameEpoch is DowntimeSlasherTest {
+contract DowntimeSlasherTestSlash_WhenSlashing is DowntimeSlasherTest {
   uint256[] private _signerIndices = new uint256[](1);
 
   address[] private _validatorsList = new address[](1);
   bytes32[] private _bitmaps0 = new bytes32[](1);
@@ -562,132 +467,6 @@ contract DowntimeSlasherTestSlash_WhenIntervalInSameEpoch is DowntimeSlasherTest
     slasher.mockSlash(slashParams, _validatorsList);
   }
 
-  function test_Reverts_IfFirstBlockWasSigned_WhenSlashableIntervalInSameEpoch() public {
-    slasher.setEpochSigner(epoch, validatorIndexInEpoch, validator);
-    uint256 startBlock = _getFirstBlockNumberOfEpoch(epoch);
-
-    _bitmaps0[0] = bitmapWithoutValidator[validatorIndexInEpoch];
-    _presetParentSealForBlock(startBlock.add(1), slashableDowntime.sub(1), _bitmaps0);
-    // First block with every validator signatures
-    _bitmaps1[0] = bitmapVI01;
-    _presetParentSealForBlock(startBlock, 1, _bitmaps1);
-
-    (uint256[] memory startBlocks, uint256[] memory endBlocks) = _calculateEverySlot(startBlock);
-
-    slashParams = DowntimeSlasherMock.SlashParams({
-      startBlocks: startBlocks,
-      endBlocks: endBlocks,
-      signerIndices: _signerIndices,
-      groupMembershipHistoryIndex: 0,
-      validatorElectionLessers: validatorElectionLessers,
-      validatorElectionGreaters: validatorElectionGreaters,
-      validatorElectionIndices: validatorElectionIndices,
-      groupElectionLessers: groupElectionLessers,
-      groupElectionGreaters: groupElectionGreaters,
-      groupElectionIndices: groupElectionIndices
-    });
-
-    vm.expectRevert("not down");
-
-    slasher.mockSlash(slashParams, _validatorsList);
-  }
-
-  function test_Reverts_IfLastBlockWasSigned() public {
-    uint256 startBlock = _getFirstBlockNumberOfEpoch(epoch);
-
-    _bitmaps0[0] = bitmapWithoutValidator[validatorIndexInEpoch];
-    _presetParentSealForBlock(startBlock, slashableDowntime.sub(1), _bitmaps0);
-
-    // Last block with every validator signatures
-    _bitmaps1[0] = bitmapVI01;
-    _presetParentSealForBlock(startBlock.add(slashableDowntime.sub(1)), 1, _bitmaps1);
-
-    (uint256[] memory startBlocks, uint256[] memory endBlocks) = _calculateEverySlot(startBlock);
-
-    slashParams = DowntimeSlasherMock.SlashParams({
-      startBlocks: startBlocks,
-      endBlocks: endBlocks,
-      signerIndices: _signerIndices,
-      groupMembershipHistoryIndex: 0,
-      validatorElectionLessers: validatorElectionLessers,
-      validatorElectionGreaters: validatorElectionGreaters,
-      validatorElectionIndices: validatorElectionIndices,
-      groupElectionLessers: groupElectionLessers,
-      groupElectionGreaters: groupElectionGreaters,
-      groupElectionIndices: groupElectionIndices
-    });
-
-    vm.expectRevert("not down");
-
-    slasher.mockSlash(slashParams, _validatorsList);
-  }
-
-  function test_Reverts_IfOneBlockInTheMiddleWasSigned() public {
-    uint256 startBlock = _getFirstBlockNumberOfEpoch(epoch);
-
-    // Set the parentSeal bitmaps for every block without the validator's signature
-    _bitmaps0[0] = bitmapWithoutValidator[validatorIndexInEpoch];
-    _presetParentSealForBlock(startBlock, slashableDowntime, _bitmaps0);
-
-    // Middle block with every validator signatures
-    _bitmaps1[0] = bitmapVI01;
-    _presetParentSealForBlock(startBlock.add(intervalSize), 1, _bitmaps1);
-
-    (uint256[] memory startBlocks, uint256[] memory endBlocks) = _calculateEverySlot(startBlock);
-
-    slashParams = DowntimeSlasherMock.SlashParams({
-      startBlocks: startBlocks,
-      endBlocks: endBlocks,
-      signerIndices: _signerIndices,
-      groupMembershipHistoryIndex: 0,
-      validatorElectionLessers: validatorElectionLessers,
-      validatorElectionGreaters: validatorElectionGreaters,
-      validatorElectionIndices: validatorElectionIndices,
-      groupElectionLessers: groupElectionLessers,
-      groupElectionGreaters: groupElectionGreaters,
-      groupElectionIndices: groupElectionIndices
-    });
-
-    vm.expectRevert("not down");
-
-    slasher.mockSlash(slashParams, _validatorsList);
-  }
-
-  function test_Reverts_IfFirstBlockSignedUsingBigIndex() public {
-    slasher.setNumberValidators(100);
-    slasher.setEpochSigner(epoch, 99, otherValidator1);
-
-    uint256 startBlock = _getFirstBlockNumberOfEpoch(epoch);
-    _signerIndices[0] = 99;
-    _validatorsList[0] = otherValidator1;
-    // Set the parentSeal bitmaps for every block without the validator's signature
-    _bitmaps0[0] = bitmapVI0;
-    _presetParentSealForBlock(startBlock.add(1), slashableDowntime.sub(1), _bitmaps0);
-
-    // Middle block with every validator signatures
-    _bitmaps1[0] = bitmapVI99;
-    _presetParentSealForBlock(startBlock, 1, _bitmaps1);
-
-    (uint256[] memory startBlocks, uint256[] memory endBlocks) = _calculateEverySlot(startBlock);
-
-    slashParams = DowntimeSlasherMock.SlashParams({
-      startBlocks: startBlocks,
-      endBlocks: endBlocks,
-      signerIndices: _signerIndices,
-      groupMembershipHistoryIndex: 0,
-      validatorElectionLessers: validatorElectionLessers,
-      validatorElectionGreaters: validatorElectionGreaters,
-      validatorElectionIndices: validatorElectionIndices,
-      groupElectionLessers: groupElectionLessers,
-      groupElectionGreaters: groupElectionGreaters,
-      groupElectionIndices: groupElectionIndices
-    });
-
-    vm.expectRevert("not down");
-
-    slasher.mockSlash(slashParams, _validatorsList);
-  }
-
   function test_Reverts_WhenL2_IfIntervalsOverlap_WhenIntervalCoverSlashableDowntimeWindow()
     public
   {
@@ -702,8 +481,6 @@ contract DowntimeSlasherTestSlash_WhenIntervalInSameEpoch is DowntimeSlasherTest
     _endBlocks[0] = startBlock.add(slashableDowntime.sub(3));
     _endBlocks[1] = startBlock.add(slashableDowntime.sub(1));
 
-    _generateProofs(_startBlocks, _endBlocks);
-
     slashParams = DowntimeSlasherMock.SlashParams({
       startBlocks: _startBlocks,
       endBlocks: _endBlocks,
@@ -717,479 +494,7 @@ contract DowntimeSlasherTestSlash_WhenIntervalInSameEpoch is DowntimeSlasherTest
       groupElectionIndices: groupElectionIndices
     });
 
-    _whenL2();
     vm.expectRevert("This method is no longer supported in L2.");
     slasher.mockSlash(slashParams, _validatorsList);
   }
-
-  function test_SucceedsIfIntervalsOverlap_WhenIntervalCoverSlashableDowntimeWindow() public {
-    uint256 startBlock = _getFirstBlockNumberOfEpoch(epoch);
-    _bitmaps0[0] = bitmapWithoutValidator[validatorIndexInEpoch];
-    _presetParentSealForBlock(startBlock, slashableDowntime, _bitmaps0);
-
-    uint256[] memory _startBlocks = new uint256[](2);
-    uint256[] memory _endBlocks = new uint256[](2);
-    _startBlocks[0] = startBlock;
-    _startBlocks[1] = startBlock.add(2);
-    _endBlocks[0] = startBlock.add(slashableDowntime.sub(3));
-    _endBlocks[1] = startBlock.add(slashableDowntime.sub(1));
-
-    _generateProofs(_startBlocks, _endBlocks);
-
-    slashParams = DowntimeSlasherMock.SlashParams({
-      startBlocks: _startBlocks,
-      endBlocks: _endBlocks,
-      signerIndices: _signerIndices,
-      groupMembershipHistoryIndex: 0,
-      validatorElectionLessers: validatorElectionLessers,
-      validatorElectionGreaters: validatorElectionGreaters,
-      validatorElectionIndices: validatorElectionIndices,
-      groupElectionLessers: groupElectionLessers,
-      groupElectionGreaters: groupElectionGreaters,
-      groupElectionIndices: groupElectionIndices
-    });
-
-    slasher.mockSlash(slashParams, _validatorsList);
-
-    uint256 balance = lockedGold.accountTotalLockedGold(validator);
-
-    assertEq(balance, 40000);
-  }
-
-  function test_SucceedsIfIntervalsCoverMoreThanSlashableDowntimeWindow() public {
-    uint256 startBlock = _getFirstBlockNumberOfEpoch(epoch);
-    _bitmaps0[0] = bitmapWithoutValidator[validatorIndexInEpoch];
-    _presetParentSealForBlock(startBlock, slashableDowntime, _bitmaps0);
-
-    uint256[] memory _startBlocks = new uint256[](2);
-    uint256[] memory _endBlocks = new uint256[](2);
-    _startBlocks[0] = startBlock;
-    _startBlocks[1] = startBlock.add(intervalSize);
-    _endBlocks[0] = startBlock.add(intervalSize.sub(1));
-    _endBlocks[1] = startBlock.add(slashableDowntime.add(3));
-
-    for (uint256 i = 0; i < _startBlocks.length; i++) {
-      _presetParentSealForBlock(_startBlocks[i], _endBlocks[i].sub(_startBlocks[i]), _bitmaps0);
-    }
-    _generateProofs(_startBlocks, _endBlocks);
-
-    slashParams = DowntimeSlasherMock.SlashParams({
-      startBlocks: _startBlocks,
-      endBlocks: _endBlocks,
-      signerIndices: _signerIndices,
-      groupMembershipHistoryIndex: 0,
-      validatorElectionLessers: validatorElectionLessers,
-      validatorElectionGreaters: validatorElectionGreaters,
-      validatorElectionIndices: validatorElectionIndices,
-      groupElectionLessers: groupElectionLessers,
-      groupElectionGreaters: groupElectionGreaters,
-      groupElectionIndices: groupElectionIndices
-    });
-
-    slasher.mockSlash(slashParams, _validatorsList);
-
-    uint256 balance = lockedGold.accountTotalLockedGold(validator);
-
-    assertEq(balance, 40000);
-  }
-
-  function test_Reverts_IfIntervalsAreNotContinuous() public {
-    // The slashableDowntime is covered with interval(0) and interval(2), but
-    // interval(1) breaks the interval contiguity.
-    uint256 startBlock = _getFirstBlockNumberOfEpoch(epoch);
-    _bitmaps0[0] = bitmapWithoutValidator[validatorIndexInEpoch];
-    _presetParentSealForBlock(startBlock, slashableDowntime, _bitmaps0);
-
-    uint256[] memory _startBlocks = new uint256[](3);
-    uint256[] memory _endBlocks = new uint256[](3);
-    _startBlocks[0] = startBlock;
-    _startBlocks[1] = startBlock.add(intervalSize.mul(2));
-    _startBlocks[2] = startBlock.add(intervalSize);
-    _endBlocks[0] = _startBlocks[0].add(intervalSize.sub(1));
-    _endBlocks[1] = _startBlocks[1].add(intervalSize.add(1));
-    _endBlocks[2] = startBlock.add(slashableDowntime.sub(1));
-
-    for (uint256 i = 0; i < _startBlocks.length; i++) {
-      _presetParentSealForBlock(
-        _startBlocks[i],
-        _endBlocks[i].sub(_startBlocks[i].add(1)),
-        _bitmaps0
-      );
-    }
-
-    _generateProofs(_startBlocks, _endBlocks);
-
-    slashParams = DowntimeSlasherMock.SlashParams({
-      startBlocks: _startBlocks,
-      endBlocks: _endBlocks,
-      signerIndices: _signerIndices,
-      groupMembershipHistoryIndex: 0,
-      validatorElectionLessers: validatorElectionLessers,
-      validatorElectionGreaters: validatorElectionGreaters,
-      validatorElectionIndices: validatorElectionIndices,
-      groupElectionLessers: groupElectionLessers,
-      groupElectionGreaters: groupElectionGreaters,
-      groupElectionIndices: groupElectionIndices
-    });
-
-    vm.expectRevert(
-      "each interval must start at most one block after the end of the previous interval"
-    );
-    slasher.mockSlash(slashParams, _validatorsList);
-  }
-
-  function test_Reverts_WhenIntervalDontCoverSlashableDowntimeWindow() public {
-    uint256 startBlock = _getFirstBlockNumberOfEpoch(epoch);
-    _bitmaps0[0] = bitmapWithoutValidator[validatorIndexInEpoch];
-    _presetParentSealForBlock(startBlock, slashableDowntime, _bitmaps0);
-
-    uint256[] memory _startBlocks = new uint256[](2);
-    uint256[] memory _endBlocks = new uint256[](3);
-    _startBlocks[0] = startBlock;
-    _startBlocks[1] = startBlock.add(intervalSize);
-    _endBlocks[0] =
_startBlocks[0].add(intervalSize.sub(1)); - _endBlocks[1] = _startBlocks[1].add(intervalSize.add(1)); - _endBlocks[2] = startBlock.add(slashableDowntime.sub(1)); - - for (uint256 i = 0; i < _startBlocks.length; i++) { - _presetParentSealForBlock( - _startBlocks[i], - _endBlocks[i].sub(_startBlocks[i].add(1)), - _bitmaps0 - ); - } - - _generateProofs(_startBlocks, _endBlocks); - - slashParams = DowntimeSlasherMock.SlashParams({ - startBlocks: _startBlocks, - endBlocks: _endBlocks, - signerIndices: _signerIndices, - groupMembershipHistoryIndex: 0, - validatorElectionLessers: validatorElectionLessers, - validatorElectionGreaters: validatorElectionGreaters, - validatorElectionIndices: validatorElectionIndices, - groupElectionLessers: groupElectionLessers, - groupElectionGreaters: groupElectionGreaters, - groupElectionIndices: groupElectionIndices - }); - - vm.expectRevert("startBlocks and endBlocks must have the same length"); - slasher.mockSlash(slashParams, _validatorsList); - } - - function test_Emits_DowntimeSlashPerformedEvent() public { - slasher.setEpochSigner(epoch, 0, validator); - uint256 startBlock = _getFirstBlockNumberOfEpoch(epoch); - - (uint256[] memory _startBlocks, uint256[] memory _endBlocks) = _ensureValidatorIsSlashable( - startBlock, - _signerIndices - ); - - uint256 endBlock = _endBlocks[_endBlocks.length.sub(1)]; - - slashParams = DowntimeSlasherMock.SlashParams({ - startBlocks: _startBlocks, - endBlocks: _endBlocks, - signerIndices: _signerIndices, - groupMembershipHistoryIndex: 0, - validatorElectionLessers: validatorElectionLessers, - validatorElectionGreaters: validatorElectionGreaters, - validatorElectionIndices: validatorElectionIndices, - groupElectionLessers: groupElectionLessers, - groupElectionGreaters: groupElectionGreaters, - groupElectionIndices: groupElectionIndices - }); - - vm.expectEmit(true, true, true, true); - emit DowntimeSlashPerformed(validator, startBlock, endBlock); - - slasher.mockSlash(slashParams, _validatorsList); - } 
- - function test_ShouldDecrementGold() public { - _setupSlashTest(); - uint256 _balance = lockedGold.accountTotalLockedGold(validator); - assertEq(_balance, 40000); - } - - function test_AlsoSlashesGroup() public { - _setupSlashTest(); - uint256 _balance = lockedGold.accountTotalLockedGold(group); - assertEq(_balance, 40000); - } - function test_ItCanBeSlashedTwiceInSameEpoch() public { - _setupSlashTest(); - uint256 _balance = lockedGold.accountTotalLockedGold(validator); - assertEq(_balance, 40000); - - uint256 newStartBlock = _getFirstBlockNumberOfEpoch(epoch).add(slashableDowntime.mul(2)); - - uint256[] memory validatorIndices = new uint256[](2); - validatorIndices[0] = validatorIndexInEpoch; - validatorIndices[1] = validatorIndexInEpoch; - (uint256[] memory _startBlocks, uint256[] memory _endBlocks) = _ensureValidatorIsSlashable( - newStartBlock, - validatorIndices - ); - - slashParams = DowntimeSlasherMock.SlashParams({ - startBlocks: _startBlocks, - endBlocks: _endBlocks, - signerIndices: _signerIndices, - groupMembershipHistoryIndex: 0, - validatorElectionLessers: validatorElectionLessers, - validatorElectionGreaters: validatorElectionGreaters, - validatorElectionIndices: validatorElectionIndices, - groupElectionLessers: groupElectionLessers, - groupElectionGreaters: groupElectionGreaters, - groupElectionIndices: groupElectionIndices - }); - slasher.mockSlash(slashParams, _validatorsList); - - _balance = lockedGold.accountTotalLockedGold(validator); - assertEq(_balance, 30000); - } -} - -contract DowntimeSlasherTestSlash_WhenIntervalCrossingEpoch is DowntimeSlasherTest { - uint256 startBlock; - - uint256[] private _signerIndices = new uint256[](2); - bytes32[] private _bitmaps0 = new bytes32[](2); - bytes32[] private _bitmaps1 = new bytes32[](1); - address[] private _validatorsList = new address[](2); - - function setUp() public { - super.setUp(); - - _signerIndices[0] = validatorIndexInEpoch; - _signerIndices[1] = validatorIndexInEpoch; - - 
_validatorsList[0] = validator; - _validatorsList[1] = validator; - - _setEpochSettings(); - - epoch = epoch.add(1); - - _waitUntilSafeBlocks(epoch); - slasher.setEpochSigner(epoch, validatorIndexInEpoch, validator); - startBlock = _getFirstBlockNumberOfEpoch(epoch).sub(intervalSize); - - slasher.setEpochSigner(epoch, validatorIndexInEpoch, validator); - } - - function test_Reverts_IfItDidNotSwitchIndices_WhenLastBlockWasSigned() public { - slasher.setEpochSigner(epoch.sub(1), validatorIndexInEpoch, validator); - - _bitmaps0[0] = bitmapWithoutValidator[validatorIndexInEpoch]; - _bitmaps0[1] = bitmapWithoutValidator[validatorIndexInEpoch]; - _bitmaps1[0] = bitmapVI01; - - _presetParentSealForBlock(startBlock, slashableDowntime.sub(1), _bitmaps0); - - _presetParentSealForBlock(startBlock.add(slashableDowntime).sub(1), 1, _bitmaps1); - - (uint256[] memory startBlocks, uint256[] memory endBlocks) = _calculateEverySlot(startBlock); - - slashParams = DowntimeSlasherMock.SlashParams({ - startBlocks: startBlocks, - endBlocks: endBlocks, - signerIndices: _signerIndices, - groupMembershipHistoryIndex: 0, - validatorElectionLessers: validatorElectionLessers, - validatorElectionGreaters: validatorElectionGreaters, - validatorElectionIndices: validatorElectionIndices, - groupElectionLessers: groupElectionLessers, - groupElectionGreaters: groupElectionGreaters, - groupElectionIndices: groupElectionIndices - }); - - vm.expectRevert("not down"); - - slasher.mockSlash(slashParams, _validatorsList); - } - - function test_Reverts_IfSwitchedIndices_WhenLastBlockWasSigned() public { - slasher.setEpochSigner(epoch.sub(1), 1, validator); - - _bitmaps0[0] = bitmapWithoutValidator[1]; - _bitmaps0[1] = bitmapWithoutValidator[validatorIndexInEpoch]; - _bitmaps1[0] = bitmapVI01; - - _presetParentSealForBlock(startBlock, slashableDowntime.sub(1), _bitmaps0); - - _presetParentSealForBlock(startBlock.add(slashableDowntime).sub(1), 1, _bitmaps1); - - (uint256[] memory startBlocks, uint256[] 
memory endBlocks) = _calculateEverySlot(startBlock); - - slashParams = DowntimeSlasherMock.SlashParams({ - startBlocks: startBlocks, - endBlocks: endBlocks, - signerIndices: _signerIndices, - groupMembershipHistoryIndex: 0, - validatorElectionLessers: validatorElectionLessers, - validatorElectionGreaters: validatorElectionGreaters, - validatorElectionIndices: validatorElectionIndices, - groupElectionLessers: groupElectionLessers, - groupElectionGreaters: groupElectionGreaters, - groupElectionIndices: groupElectionIndices - }); - - vm.expectRevert("not down"); - - slasher.mockSlash(slashParams, _validatorsList); - } - - function test_SucceedsWithValidatorIndexChange_WhenValidatorWasDown() public { - slasher.setEpochSigner(epoch.sub(1), validatorIndexInEpoch, validator); - - _signerIndices[0] = 1; - _signerIndices[1] = validatorIndexInEpoch; - - (uint256[] memory _startBlocks, uint256[] memory _endBlocks) = _ensureValidatorIsSlashable( - startBlock, - _signerIndices - ); - - slashParams = DowntimeSlasherMock.SlashParams({ - startBlocks: _startBlocks, - endBlocks: _endBlocks, - signerIndices: _signerIndices, - groupMembershipHistoryIndex: 0, - validatorElectionLessers: validatorElectionLessers, - validatorElectionGreaters: validatorElectionGreaters, - validatorElectionIndices: validatorElectionIndices, - groupElectionLessers: groupElectionLessers, - groupElectionGreaters: groupElectionGreaters, - groupElectionIndices: groupElectionIndices - }); - - slasher.mockSlash(slashParams, _validatorsList); - - uint256 balance = lockedGold.accountTotalLockedGold(validator); - - assertEq(balance, 40000); - } - - function test_SucceedsWithoutValidatorIndexChange_WhenValidatorWasDown() public { - slasher.setEpochSigner(epoch.sub(1), validatorIndexInEpoch, validator); - - (uint256[] memory _startBlocks, uint256[] memory _endBlocks) = _ensureValidatorIsSlashable( - startBlock, - _signerIndices - ); - - slashParams = DowntimeSlasherMock.SlashParams({ - startBlocks: _startBlocks, - 
endBlocks: _endBlocks, - signerIndices: _signerIndices, - groupMembershipHistoryIndex: 0, - validatorElectionLessers: validatorElectionLessers, - validatorElectionGreaters: validatorElectionGreaters, - validatorElectionIndices: validatorElectionIndices, - groupElectionLessers: groupElectionLessers, - groupElectionGreaters: groupElectionGreaters, - groupElectionIndices: groupElectionIndices - }); - - slasher.mockSlash(slashParams, _validatorsList); - - uint256 balance = lockedGold.accountTotalLockedGold(validator); - - assertEq(balance, 40000); - } - - function test_Reverts_IfIndicesDontMatchSameValidato_WhenValidatorWasDown() public { - slasher.setEpochSigner(epoch.sub(1), 1, validator); - slasher.setEpochSigner(epoch, 1, otherValidator0); - - _validatorsList[0] = validator; - _validatorsList[1] = otherValidator0; - - _signerIndices[0] = 1; - _signerIndices[1] = validatorIndexInEpoch; - - (uint256[] memory _startBlocks, uint256[] memory _endBlocks) = _ensureValidatorIsSlashable( - startBlock, - _signerIndices - ); - uint256[] memory _wrongSignerIndices = new uint256[](2); - _wrongSignerIndices[0] = 1; - _wrongSignerIndices[1] = 1; - - slashParams = DowntimeSlasherMock.SlashParams({ - startBlocks: _startBlocks, - endBlocks: _endBlocks, - signerIndices: _wrongSignerIndices, - groupMembershipHistoryIndex: 0, - validatorElectionLessers: validatorElectionLessers, - validatorElectionGreaters: validatorElectionGreaters, - validatorElectionIndices: validatorElectionIndices, - groupElectionLessers: groupElectionLessers, - groupElectionGreaters: groupElectionGreaters, - groupElectionIndices: groupElectionIndices - }); - vm.expectRevert("indices do not point to the same validator"); - slasher.mockSlash(slashParams, _validatorsList); - } - - function test_Reverts_IfValidatorHasNewerSlash_WhenSlashingSucceeds() public { - slasher.setEpochSigner(epoch.sub(1), validatorIndexInEpoch, validator); - (uint256[] memory _startBlocks, uint256[] memory _endBlocks) = 
_ensureValidatorIsSlashable( - startBlock, - _signerIndices - ); - - slashParams = DowntimeSlasherMock.SlashParams({ - startBlocks: _startBlocks, - endBlocks: _endBlocks, - signerIndices: _signerIndices, - groupMembershipHistoryIndex: 0, - validatorElectionLessers: validatorElectionLessers, - validatorElectionGreaters: validatorElectionGreaters, - validatorElectionIndices: validatorElectionIndices, - groupElectionLessers: groupElectionLessers, - groupElectionGreaters: groupElectionGreaters, - groupElectionIndices: groupElectionIndices - }); - - slasher.mockSlash(slashParams, _validatorsList); - - uint256 newStartBlock = _getFirstBlockNumberOfEpoch(epoch.sub(1)).add(1); - // Just to make sure that it was slashed - uint256 balance = lockedGold.accountTotalLockedGold(validator); - - assertEq(balance, 40000); - - uint256[] memory _newSignerIndices = new uint256[](1); - _newSignerIndices[0] = validatorIndexInEpoch; - address[] memory newValidatorsList = new address[](1); - newValidatorsList[0] = validator; - - ( - uint256[] memory _newStartBlocks, - uint256[] memory _newEndBlocks - ) = _ensureValidatorIsSlashable(newStartBlock, _newSignerIndices); - - slashParams = DowntimeSlasherMock.SlashParams({ - startBlocks: _newStartBlocks, - endBlocks: _newEndBlocks, - signerIndices: _newSignerIndices, - groupMembershipHistoryIndex: 0, - validatorElectionLessers: validatorElectionLessers, - validatorElectionGreaters: validatorElectionGreaters, - validatorElectionIndices: validatorElectionIndices, - groupElectionLessers: groupElectionLessers, - groupElectionGreaters: groupElectionGreaters, - groupElectionIndices: groupElectionIndices - }); - - vm.expectRevert( - "cannot slash validator for downtime for which they may already have been slashed" - ); - slasher.mockSlash(slashParams, newValidatorsList); - } } diff --git a/packages/protocol/test-sol/unit/governance/validators/Validators.t.sol b/packages/protocol/test-sol/unit/governance/validators/Validators.t.sol index 
9183fbd6692..9d3790e7466 100644 --- a/packages/protocol/test-sol/unit/governance/validators/Validators.t.sol +++ b/packages/protocol/test-sol/unit/governance/validators/Validators.t.sol @@ -21,8 +21,6 @@ import "@test-sol/unit/governance/validators/mocks/ValidatorsMockTunnel.sol"; import "@test-sol/utils/ECDSAHelper.sol"; import { TestWithUtils } from "@test-sol/TestWithUtils.sol"; -import "@test-sol/utils/WhenL2.sol"; - contract ValidatorsTest is TestWithUtils, ECDSAHelper { using FixidityLib for FixidityLib.Fraction; using SafeMath for uint256; @@ -88,7 +86,6 @@ contract ValidatorsTest is TestWithUtils, ECDSAHelper { uint256 public membershipHistoryLength = 5; uint256 public maxGroupSize = 5; uint256 public commissionUpdateDelay = 3; - uint256 public downtimeGracePeriod = 0; ValidatorsMockTunnel.InitParams public initParams; ValidatorsMockTunnel.InitParams2 public initParams2; @@ -101,7 +98,6 @@ contract ValidatorsTest is TestWithUtils, ECDSAHelper { ); event MaxGroupSizeSet(uint256 size); event CommissionUpdateDelaySet(uint256 delay); - event ValidatorScoreParametersSet(uint256 exponent, uint256 adjustmentSpeed); event GroupLockedGoldRequirementsSet(uint256 value, uint256 duration); event ValidatorLockedGoldRequirementsSet(uint256 value, uint256 duration); event MembershipHistoryLengthSet(uint256 length); @@ -110,8 +106,6 @@ contract ValidatorsTest is TestWithUtils, ECDSAHelper { event ValidatorAffiliated(address indexed validator, address indexed group); event ValidatorDeaffiliated(address indexed validator, address indexed group); event ValidatorEcdsaPublicKeyUpdated(address indexed validator, bytes ecdsaPublicKey); - event ValidatorBlsPublicKeyUpdated(address indexed validator, bytes blsPublicKey); - event ValidatorScoreUpdated(address indexed validator, uint256 score, uint256 epochScore); event ValidatorGroupRegistered(address indexed group, uint256 commission); event ValidatorGroupDeregistered(address indexed group); event ValidatorGroupMemberAdded(address 
indexed group, address indexed validator); @@ -164,38 +158,16 @@ contract ValidatorsTest is TestWithUtils, ECDSAHelper { lockedGold = new MockLockedGold(); election = new MockElection(); - address validatorsAddress = actor("Validators"); - - deployCodeTo("ValidatorsMock.sol", validatorsAddress); - validators = IValidators(validatorsAddress); - validatorsMockTunnel = new ValidatorsMockTunnel(address(validators)); stableToken = new MockStableToken(); registry.setAddressFor(AccountsContract, address(accounts)); registry.setAddressFor(ElectionContract, address(election)); registry.setAddressFor(LockedGoldContract, address(lockedGold)); - registry.setAddressFor(ValidatorsContract, address(validators)); registry.setAddressFor(StableTokenContract, address(stableToken)); - initParams = ValidatorsMockTunnel.InitParams({ - registryAddress: REGISTRY_ADDRESS, - groupRequirementValue: originalGroupLockedGoldRequirements.value, - groupRequirementDuration: originalGroupLockedGoldRequirements.duration, - validatorRequirementValue: originalValidatorLockedGoldRequirements.value, - validatorRequirementDuration: originalValidatorLockedGoldRequirements.duration, - validatorScoreExponent: originalValidatorScoreParameters.exponent, - validatorScoreAdjustmentSpeed: originalValidatorScoreParameters.adjustmentSpeed.unwrap() - }); - initParams2 = ValidatorsMockTunnel.InitParams2({ - _membershipHistoryLength: membershipHistoryLength, - _slashingMultiplierResetPeriod: slashingMultiplierResetPeriod, - _maxGroupSize: maxGroupSize, - _commissionUpdateDelay: commissionUpdateDelay, - _downtimeGracePeriod: downtimeGracePeriod - }); - - validatorsMockTunnel.MockInitialize(owner, initParams, initParams2); + address validatorsAddress = actor("Validators"); + deployAndInitValidatorsContract(validatorsAddress); vm.prank(validator); accounts.createAccount(); @@ -208,8 +180,35 @@ contract ValidatorsTest is TestWithUtils, ECDSAHelper { vm.prank(nonValidator); accounts.createAccount(); + + 
whenL2WithEpochManagerInitialization(); } + function deployAndInitValidatorsContract(address _validatorsContractAddress) public { + // "ValidatorsCompile" resolves to Validators.sol. + // It is deployed via deployCodeTo because the contract is written in an + // older Solidity version, and forge cannot deploy contracts that are not + // imported explicitly. + deployCodeTo("ValidatorsCompile", _validatorsContractAddress); + validators = IValidators(_validatorsContractAddress); + validatorsMockTunnel = new ValidatorsMockTunnel(address(validators)); + registry.setAddressFor(ValidatorsContract, address(validators)); + + initParams = ValidatorsMockTunnel.InitParams({ + registryAddress: REGISTRY_ADDRESS, + groupRequirementValue: originalGroupLockedGoldRequirements.value, + groupRequirementDuration: originalGroupLockedGoldRequirements.duration, + validatorRequirementValue: originalValidatorLockedGoldRequirements.value, + validatorRequirementDuration: originalValidatorLockedGoldRequirements.duration + }); + initParams2 = ValidatorsMockTunnel.InitParams2({ + _membershipHistoryLength: membershipHistoryLength, + _slashingMultiplierResetPeriod: slashingMultiplierResetPeriod, + _maxGroupSize: maxGroupSize, + _commissionUpdateDelay: commissionUpdateDelay + }); + + validatorsMockTunnel.MockInitialize(owner, initParams, initParams2); + } function _registerValidatorGroupWithMembers(address _group, uint256 _numMembers) public { _registerValidatorGroupHelper(_group, _numMembers); @@ -306,37 +305,13 @@ contract ValidatorsTest is TestWithUtils, ECDSAHelper { vm.prank(_validator); accounts.authorizeValidatorSigner(_signer, v, r, s); - if (isL2()) { - vm.prank(_validator); - validators.registerValidatorNoBls(_ecdsaPubKey); - } else { - ph.mockSuccess(ph.PROOF_OF_POSSESSION(), abi.encodePacked(_validator, blsPublicKey, blsPop)); - - vm.prank(_validator); - validators.registerValidator(_ecdsaPubKey, blsPublicKey, blsPop); - } + vm.prank(_validator); + validators.registerValidatorNoBls(_ecdsaPubKey);
validatorRegistrationEpochNumber = getEpochNumber(); return _ecdsaPubKey; } - function _registerValidatorWithSignerHelper_noBls() internal returns (bytes memory) { - lockedGold.setAccountTotalLockedGold(validator, originalValidatorLockedGoldRequirements.value); - - (bytes memory _ecdsaPubKey, uint8 v, bytes32 r, bytes32 s) = _generateEcdsaPubKeyWithSigner( - validator, - signerPk - ); - - vm.prank(validator); - accounts.authorizeValidatorSigner(signer, v, r, s); - - vm.prank(validator); - validators.registerValidatorNoBls(_ecdsaPubKey); - validatorRegistrationEpochNumber = epochManager.getCurrentEpochNumber(); - return _ecdsaPubKey; - } - function _generateEcdsaPubKey( address _account, uint256 _accountPk @@ -359,15 +334,8 @@ contract ValidatorsTest is TestWithUtils, ECDSAHelper { lockedGold.setAccountTotalLockedGold(_validator, originalValidatorLockedGoldRequirements.value); bytes memory _ecdsaPubKey = _generateEcdsaPubKey(_validator, _validatorPk); - if (isL2()) { - vm.prank(_validator); - validators.registerValidatorNoBls(_ecdsaPubKey); - } else { - ph.mockSuccess(ph.PROOF_OF_POSSESSION(), abi.encodePacked(_validator, blsPublicKey, blsPop)); - - vm.prank(_validator); - validators.registerValidator(_ecdsaPubKey, blsPublicKey, blsPop); - } + vm.prank(_validator); + validators.registerValidatorNoBls(_ecdsaPubKey); validatorRegistrationEpochNumber = getEpochNumber(); return _ecdsaPubKey; @@ -429,9 +397,12 @@ contract ValidatorsTest is TestWithUtils, ECDSAHelper { } } -contract ValidatorsTest_L2 is ValidatorsTest, WhenL2 {} - contract ValidatorsTest_Initialize is ValidatorsTest { + function setUp() public { + super.setUp(); + address newValidatorsContractAddress = actor("ValidatorsContract"); + deployAndInitValidatorsContract(newValidatorsContractAddress); + } function test_ShouldhaveSetTheOwner() public { assertEq(Ownable(address(validators)).owner(), owner, "Incorrect Owner."); } @@ -469,20 +440,6 @@ contract ValidatorsTest_Initialize is ValidatorsTest { ); } - 
function test_shouldHaveSetValidatorScoreParameters() public { - (uint256 exponent, uint256 adjustmentSpeed) = validators.getValidatorScoreParameters(); - assertEq( - exponent, - originalValidatorScoreParameters.exponent, - "Wrong validatorScoreParameters exponent." - ); - assertEq( - adjustmentSpeed, - originalValidatorScoreParameters.adjustmentSpeed.unwrap(), - "Wrong validatorScoreParameters adjustmentSpeed." - ); - } - function test_shouldHaveSetMembershipHistory() public { uint256 actual = validators.getMembershipHistoryLength(); assertEq(actual, membershipHistoryLength, "Wrong membershipHistoryLength."); @@ -497,11 +454,6 @@ contract ValidatorsTest_Initialize is ValidatorsTest { uint256 actual = validators.getCommissionUpdateDelay(); assertEq(actual, commissionUpdateDelay, "Wrong commissionUpdateDelay."); } - - function test_shouldHaveSetDowntimeGracePeriod() public { - uint256 actual = validators.downtimeGracePeriod(); - assertEq(actual, downtimeGracePeriod, "Wrong downtimeGracePeriod."); - } } contract ValidatorsTest_setCommissionUpdateDelay is ValidatorsTest { @@ -513,26 +465,6 @@ contract ValidatorsTest_setCommissionUpdateDelay is ValidatorsTest { } } -contract ValidatorsTest_setCommissionUpdateDelay_L2 is - ValidatorsTest_L2, - ValidatorsTest_setCommissionUpdateDelay -{} - -contract ValidatorsTest_setDowntimeGracePeriod is ValidatorsTest { - function test_shouldSetDowntimeGracePeriod() public { - validators.setDowntimeGracePeriod(downtimeGracePeriod + 1); - uint256 actual = validators.downtimeGracePeriod(); - assertEq(actual, downtimeGracePeriod + 1, "Wrong downtime grace period."); - } -} - -contract ValidatorsTest_setDowntimeGracePeriod_L2 is ValidatorsTest_L2 { - function test_shouldRevert() public { - vm.expectRevert("This method is no longer supported in L2."); - validators.setDowntimeGracePeriod(downtimeGracePeriod + 1); - } -} - contract ValidatorsTest_SetMembershipHistoryLength is ValidatorsTest { uint256 newLength = membershipHistoryLength + 1; 
@@ -559,10 +491,15 @@ contract ValidatorsTest_SetMembershipHistoryLength is ValidatorsTest { } } -contract ValidatorsTest_SetMembershipHistoryLength_L2 is - ValidatorsTest_L2, - ValidatorsTest_SetMembershipHistoryLength -{} +contract ValidatorsTest_ComputeEpochReward is ValidatorsTest { + function test_returnsZero_WhenNotAValidator() public { + assertEq( + validators.computeEpochReward(nonValidator, 1e24, 150e18), + 0, + "Should return zero reward for non-validator" + ); + } +} contract ValidatorsTest_SetMaxGroupSize is ValidatorsTest { uint256 newSize = maxGroupSize + 1; @@ -587,8 +524,6 @@ contract ValidatorsTest_SetMaxGroupSize is ValidatorsTest { } } -contract ValidatorsTest_SetMaxGroupSize_L2 is ValidatorsTest_L2, ValidatorsTest_SetMaxGroupSize {} - contract ValidatorsTest_SetGroupLockedGoldRequirements is ValidatorsTest { GroupLockedGoldRequirements private newRequirements = GroupLockedGoldRequirements({ @@ -624,11 +559,6 @@ contract ValidatorsTest_SetGroupLockedGoldRequirements is ValidatorsTest { } } -contract ValidatorsTest_SetGroupLockedGoldRequirements_L2 is - ValidatorsTest_L2, - ValidatorsTest_SetGroupLockedGoldRequirements -{} - contract ValidatorsTest_SetValidatorLockedGoldRequirements is ValidatorsTest { ValidatorLockedGoldRequirements private newRequirements = ValidatorLockedGoldRequirements({ @@ -664,63 +594,7 @@ contract ValidatorsTest_SetValidatorLockedGoldRequirements is ValidatorsTest { } } -contract ValidatorsTest_SetValidatorLockedGoldRequirements_L2 is - ValidatorsTest_L2, - ValidatorsTest_SetValidatorLockedGoldRequirements -{} - -contract ValidatorsTest_SetValidatorScoreParameters_Setup is ValidatorsTest { - ValidatorScoreParameters newParams = - ValidatorScoreParameters({ - exponent: originalValidatorScoreParameters.exponent + 1, - adjustmentSpeed: FixidityLib.newFixedFraction(6, 20) - }); - - event ValidatorScoreParametersSet(uint256 exponent, uint256 adjustmentSpeed); -} - -contract ValidatorsTest_SetValidatorScoreParameters_L1 is - 
ValidatorsTest_SetValidatorScoreParameters_Setup -{ - function test_ShouldSetExponentAndAdjustmentSpeed() public { - validators.setValidatorScoreParameters(newParams.exponent, newParams.adjustmentSpeed.unwrap()); - (uint256 _exponent, uint256 _adjustmentSpeed) = validators.getValidatorScoreParameters(); - assertEq(_exponent, newParams.exponent, "Incorrect Exponent"); - assertEq(_adjustmentSpeed, newParams.adjustmentSpeed.unwrap(), "Incorrect AdjustmentSpeed"); - } - - function test_Emits_ValidatorScoreParametersSet() public { - vm.expectEmit(true, true, true, true); - emit ValidatorScoreParametersSet(newParams.exponent, newParams.adjustmentSpeed.unwrap()); - validators.setValidatorScoreParameters(newParams.exponent, newParams.adjustmentSpeed.unwrap()); - } - - function test_Reverts_WhenCalledByNonOwner() public { - vm.prank(nonOwner); - vm.expectRevert("Ownable: caller is not the owner"); - validators.setValidatorScoreParameters(newParams.exponent, newParams.adjustmentSpeed.unwrap()); - } - - function test_Reverts_WhenLockupsAreUnchanged() public { - vm.expectRevert("Adjustment speed and exponent not changed"); - validators.setValidatorScoreParameters( - originalValidatorScoreParameters.exponent, - originalValidatorScoreParameters.adjustmentSpeed.unwrap() - ); - } -} - -contract ValidatorsTest_SetValidatorScoreParameters_L2 is - ValidatorsTest_L2, - ValidatorsTest_SetValidatorScoreParameters_Setup -{ - function test_Reverts() public { - vm.expectRevert("This method is no longer supported in L2."); - validators.setValidatorScoreParameters(newParams.exponent, newParams.adjustmentSpeed.unwrap()); - } -} - -contract ValidatorsTest_RegisterValidator is ValidatorsTest { +contract ValidatorsTest_RegisterValidatorNoBls is ValidatorsTest { function setUp() public { super.setUp(); @@ -739,7 +613,7 @@ contract ValidatorsTest_RegisterValidator is ValidatorsTest { vm.expectRevert("Cannot vote for more than max number of groups"); vm.prank(validator); - 
validators.registerValidator(pubKey, blsPublicKey, blsPop); + validators.registerValidatorNoBls(pubKey); } function test_Reverts_WhenDelagatingCELO() public { @@ -751,7 +625,7 @@ contract ValidatorsTest_RegisterValidator is ValidatorsTest { vm.expectRevert("Cannot delegate governance power"); vm.prank(validator); - validators.registerValidator(pubKey, blsPublicKey, blsPop); + validators.registerValidatorNoBls(pubKey); } function test_ShouldMarkAccountAsValidator_WhenAccountHasAuthorizedValidatorSigner() public { @@ -779,11 +653,11 @@ contract ValidatorsTest_RegisterValidator is ValidatorsTest { assertEq(actualEcdsaPubKey, _registeredEcdsaPubKey); } - function test_ShouldSetValidatorBlsPublicKey_WhenAccountHasAuthorizedValidatorSigner() public { + function test_ShouldNotSetValidatorBlsPublicKey_WhenAccountHasAuthorizedValidatorSigner() public { _registerValidatorWithSignerHelper(validator, signer, signerPk); (, bytes memory actualBlsPubKey, , , ) = validators.getValidator(validator); - assertEq(actualBlsPubKey, blsPublicKey); + assertEq(actualBlsPubKey, ""); } function test_ShouldSetValidatorSigner_WhenAccountHasAuthorizedValidatorSigner() public { @@ -816,7 +690,7 @@ contract ValidatorsTest_RegisterValidator is ValidatorsTest { assertEq(_membershipGroups, expectedMembershipGroups); } - function test_Emits_ValidatorBlsPublicKeyUpdatedEvent() public { + function _performRegistrationNoBls() internal { (bytes memory _ecdsaPubKey, uint8 v, bytes32 r, bytes32 s) = _generateEcdsaPubKeyWithSigner( validator, signerPk @@ -824,14 +698,12 @@ contract ValidatorsTest_RegisterValidator is ValidatorsTest { vm.prank(validator); accounts.authorizeValidatorSigner(signer, v, r, s); - - ph.mockSuccess(ph.PROOF_OF_POSSESSION(), abi.encodePacked(validator, blsPublicKey, blsPop)); - - vm.expectEmit(true, true, true, true); - emit ValidatorBlsPublicKeyUpdated(validator, blsPublicKey); - vm.prank(validator); - validators.registerValidator(_ecdsaPubKey, blsPublicKey, blsPop); + 
validators.registerValidatorNoBls(_ecdsaPubKey); + } + + function test_DoesNotEmit_ValidatorBlsPublicKeyUpdatedEvent() public { + assertDoesNotEmit(_performRegistrationNoBls, "ValidatorBlsPublicKeyUpdated(address,bytes)"); } function test_Emits_ValidatorRegisteredEvent() public { @@ -843,13 +715,11 @@ contract ValidatorsTest_RegisterValidator is ValidatorsTest { vm.prank(validator); accounts.authorizeValidatorSigner(signer, v, r, s); - ph.mockSuccess(ph.PROOF_OF_POSSESSION(), abi.encodePacked(validator, blsPublicKey, blsPop)); - vm.expectEmit(true, true, true, true); emit ValidatorRegistered(validator); vm.prank(validator); - validators.registerValidator(_ecdsaPubKey, blsPublicKey, blsPop); + validators.registerValidatorNoBls(_ecdsaPubKey); } function test_Reverts_WhenAccountAlreadyRegisteredAsValidator() public { @@ -858,19 +728,17 @@ contract ValidatorsTest_RegisterValidator is ValidatorsTest { signer, signerPk ); - vm.expectRevert("Already registered"); vm.prank(validator); - validators.registerValidator(_registeredEcdsaPubKey, blsPublicKey, blsPop); + vm.expectRevert("Already registered"); + validators.registerValidatorNoBls(_registeredEcdsaPubKey); } function test_Reverts_WhenAccountAlreadyRegisteredAsValidatorGroup() public { _registerValidatorGroupHelper(validator, 1); - vm.expectRevert("Already registered"); vm.prank(validator); - validators.registerValidator( - abi.encodePacked(bytes32(0x0101010101010101010101010101010101010101010101010101010101010101)), - blsPublicKey, - blsPop + vm.expectRevert("Already registered"); + validators.registerValidatorNoBls( + abi.encodePacked(bytes32(0x0101010101010101010101010101010101010101010101010101010101010101)) ); } @@ -881,292 +749,94 @@ contract ValidatorsTest_RegisterValidator is ValidatorsTest { ); vm.expectRevert("Deposit too small"); vm.prank(validator); - validators.registerValidator( - abi.encodePacked(bytes32(0x0101010101010101010101010101010101010101010101010101010101010101)), - blsPublicKey, - blsPop + 
validators.registerValidatorNoBls( + abi.encodePacked(bytes32(0x0101010101010101010101010101010101010101010101010101010101010101)) ); } } -contract ValidatorsTest_RegisterValidator_L2 is ValidatorsTest_L2 { - function test_shouldRevert() public { - lockedGold.setAccountTotalLockedGold(validator, originalValidatorLockedGoldRequirements.value); - - (bytes memory _ecdsaPubKey, uint8 v, bytes32 r, bytes32 s) = _generateEcdsaPubKeyWithSigner( - validator, - signerPk - ); +contract ValidatorsTest_DeregisterValidator_WhenAccountHasNeverBeenMemberOfValidatorGroup is + ValidatorsTest +{ + uint256 public constant INDEX = 0; - ph.mockSuccess(ph.PROOF_OF_POSSESSION(), abi.encodePacked(validator, blsPublicKey, blsPop)); + function setUp() public { + super.setUp(); - vm.prank(validator); - accounts.authorizeValidatorSigner(signer, v, r, s); + _registerValidatorHelper(validator, validatorPk); - vm.prank(validator); - vm.expectRevert("This method is no longer supported in L2."); - validators.registerValidator(_ecdsaPubKey, blsPublicKey, blsPop); + timeTravel(originalValidatorLockedGoldRequirements.duration); } -} - -contract ValidatorsTest_RegisterValidatorNoBls is ValidatorsTest { - function test_ShouldRevert_WhenInL1() public { - lockedGold.setAccountTotalLockedGold(validator, originalValidatorLockedGoldRequirements.value); - (bytes memory _ecdsaPubKey, uint8 v, bytes32 r, bytes32 s) = _generateEcdsaPubKeyWithSigner( - validator, - signerPk - ); + function test_ShouldMarkAccountAsNotValidator_WhenAccountHasNeverBeenMemberOfValidatorGroup() + public + { + assertTrue(validators.isValidator(validator)); - vm.prank(validator); - accounts.authorizeValidatorSigner(signer, v, r, s); + _deregisterValidator(validator); - vm.expectRevert("This method is not supported in L1."); - vm.prank(validator); - validators.registerValidatorNoBls(_ecdsaPubKey); + assertFalse(validators.isValidator(validator)); } -} -contract ValidatorsTest_RegisterValidatorNoBls_L2 is ValidatorsTest_L2 { - function 
setUp() public { - super.setUp(); + function test_ShouldRemoveAccountFromValidatorList_WhenAccountHasNeverBeenMemberOfValidatorGroup() + public + { + address[] memory ExpectedRegisteredValidators = new address[](0); - lockedGold.setAccountTotalLockedGold(validator, originalValidatorLockedGoldRequirements.value); + assertTrue(validators.isValidator(validator)); + _deregisterValidator(validator); + assertEq(validators.getRegisteredValidators().length, ExpectedRegisteredValidators.length); } - function test_Reverts_WhenVoteOverMaxNumberOfGroupsSetToTrue() public { - vm.prank(validator); - election.setAllowedToVoteOverMaxNumberOfGroups(validator, true); - - (uint8 v, bytes32 r, bytes32 s) = getParsedSignatureOfAddress(validator, signerPk); - - vm.prank(validator); - accounts.authorizeValidatorSigner(signer, v, r, s); - bytes memory pubKey = addressToPublicKey("random msg", v, r, s); + function test_ShouldResetAccountBalanceRequirements_WhenAccountHasNeverBeenMemberOfValidatorGroup() + public + { + assertTrue(validators.isValidator(validator)); + _deregisterValidator(validator); + assertEq(validators.getAccountLockedGoldRequirement(validator), 0); + } - vm.expectRevert("Cannot vote for more than max number of groups"); - vm.prank(validator); - validators.registerValidatorNoBls(pubKey); + function test_Emits_ValidatorDeregisteredEvent_WhenAccountHasNeverBeenMemberOfValidatorGroup() + public + { + vm.expectEmit(true, true, true, true); + emit ValidatorDeregistered(validator); + _deregisterValidator(validator); } - function test_Reverts_WhenDelagatingCELO() public { - lockedGold.setAccountTotalDelegatedAmountInPercents(validator, 10); - (uint8 v, bytes32 r, bytes32 s) = getParsedSignatureOfAddress(validator, signerPk); - vm.prank(validator); - accounts.authorizeValidatorSigner(signer, v, r, s); - bytes memory pubKey = addressToPublicKey("random msg", v, r, s); + function test_Reverts_WhenAccountNotRegisteredValidator() public { + vm.expectRevert("Not a validator"); + 
vm.prank(nonValidator); + validators.deregisterValidator(INDEX); + } - vm.expectRevert("Cannot delegate governance power"); + function test_Reverts_WhenWrongIndexProvided() public { + timeTravel(originalValidatorLockedGoldRequirements.duration); + vm.expectRevert("deleteElement: index out of range"); vm.prank(validator); - validators.registerValidatorNoBls(pubKey); + validators.deregisterValidator(INDEX + 1); } - function test_ShouldMarkAccountAsValidator_WhenAccountHasAuthorizedValidatorSigner() public { - _registerValidatorWithSignerHelper_noBls(); - - assertTrue(validators.isValidator(validator)); + function _deregisterValidator(address _validator) internal { + vm.prank(_validator); + validators.deregisterValidator(INDEX); } +} - function test_ShouldAddAccountToValidatorList_WhenAccountHasAuthorizedValidatorSigner() public { - address[] memory ExpectedRegisteredValidators = new address[](1); - ExpectedRegisteredValidators[0] = validator; - _registerValidatorWithSignerHelper_noBls(); - assertEq(validators.getRegisteredValidators().length, ExpectedRegisteredValidators.length); - assertEq(validators.getRegisteredValidators()[0], ExpectedRegisteredValidators[0]); - } +contract ValidatorsTest_DeregisterValidator_WhenAccountHasBeenMemberOfValidatorGroup is + ValidatorsTest +{ + uint256 public constant INDEX = 0; - function test_ShouldSetValidatorEcdsaPublicKey_WhenAccountHasAuthorizedValidatorSigner() public { - bytes memory _registeredEcdsaPubKey = _registerValidatorWithSignerHelper_noBls(); - (bytes memory actualEcdsaPubKey, , , , ) = validators.getValidator(validator); + function setUp() public { + super.setUp(); - assertEq(actualEcdsaPubKey, _registeredEcdsaPubKey); - } + _registerValidatorHelper(validator, validatorPk); - function test_ShouldNotSetValidatorBlsPublicKey_WhenAccountHasAuthorizedValidatorSigner() public { - _registerValidatorWithSignerHelper_noBls(); - (, bytes memory actualBlsPubKey, , , ) = validators.getValidator(validator); + 
_registerValidatorGroupHelper(group, 1); - assertEq(actualBlsPubKey, ""); - } - - function test_ShouldSetValidatorSigner_WhenAccountHasAuthorizedValidatorSigner() public { - _registerValidatorWithSignerHelper_noBls(); - (, , , , address ActualSigner) = validators.getValidator(validator); - - assertEq(ActualSigner, signer); - } - - function test_ShouldSetLockGoldRequirements_WhenAccountHasAuthorizedValidatorSigner() public { - _registerValidatorWithSignerHelper_noBls(); - uint256 _lockedGoldReq = validators.getAccountLockedGoldRequirement(validator); - - assertEq(_lockedGoldReq, originalValidatorLockedGoldRequirements.value); - } - - function test_ShouldSetValidatorMembershipHistory_WhenAccountHasAuthorizedValidatorSigner() - public - { - _registerValidatorWithSignerHelper_noBls(); - (uint256[] memory _epoch, address[] memory _membershipGroups, , ) = validators - .getMembershipHistory(validator); - - uint256[] memory validatorRegistrationEpochNumberList = new uint256[](1); - validatorRegistrationEpochNumberList[0] = validatorRegistrationEpochNumber; - address[] memory expectedMembershipGroups = new address[](1); - expectedMembershipGroups[0] = address(0); - - assertEq(_epoch, validatorRegistrationEpochNumberList); - assertEq(_membershipGroups, expectedMembershipGroups); - } - - function testFail_DoesNotEmit_ValidatorBlsPublicKeyUpdatedEvent() public { - (bytes memory _ecdsaPubKey, uint8 v, bytes32 r, bytes32 s) = _generateEcdsaPubKeyWithSigner( - validator, - signerPk - ); - - vm.prank(validator); - accounts.authorizeValidatorSigner(signer, v, r, s); - - vm.expectEmit(true, true, true, true); - emit ValidatorBlsPublicKeyUpdated(validator, blsPublicKey); - - vm.prank(validator); - validators.registerValidatorNoBls(_ecdsaPubKey); - } - - function test_Emits_ValidatorRegisteredEvent() public { - (bytes memory _ecdsaPubKey, uint8 v, bytes32 r, bytes32 s) = _generateEcdsaPubKeyWithSigner( - validator, - signerPk - ); - - vm.prank(validator); - 
accounts.authorizeValidatorSigner(signer, v, r, s); - - vm.expectEmit(true, true, true, true); - emit ValidatorRegistered(validator); - - vm.prank(validator); - validators.registerValidatorNoBls(_ecdsaPubKey); - } - - function test_Reverts_WhenAccountAlreadyRegisteredAsValidator() public { - bytes memory _registeredEcdsaPubKey = _registerValidatorWithSignerHelper_noBls(); - vm.prank(validator); - vm.expectRevert("Already registered"); - validators.registerValidatorNoBls(_registeredEcdsaPubKey); - } - - function test_Reverts_WhenAccountAlreadyRegisteredAsValidatorGroup() public { - _registerValidatorGroupHelper(validator, 1); - vm.prank(validator); - vm.expectRevert("Already registered"); - validators.registerValidatorNoBls( - abi.encodePacked(bytes32(0x0101010101010101010101010101010101010101010101010101010101010101)) - ); - } - - function test_Reverts_WhenAccountDoesNotMeetLockedGoldRequirements() public { - lockedGold.setAccountTotalLockedGold( - validator, - originalValidatorLockedGoldRequirements.value.sub(11) - ); - vm.expectRevert("Deposit too small"); - vm.prank(validator); - validators.registerValidatorNoBls( - abi.encodePacked(bytes32(0x0101010101010101010101010101010101010101010101010101010101010101)) - ); - } -} - -contract ValidatorsTest_DeregisterValidator_WhenAccountHasNeverBeenMemberOfValidatorGroup is - ValidatorsTest -{ - uint256 public constant INDEX = 0; - - function setUp() public { - super.setUp(); - - _registerValidatorHelper(validator, validatorPk); - - timeTravel(originalValidatorLockedGoldRequirements.duration); - } - - function test_ShouldMarkAccountAsNotValidator_WhenAccountHasNeverBeenMemberOfValidatorGroup() - public - { - assertTrue(validators.isValidator(validator)); - - _deregisterValidator(validator); - - assertFalse(validators.isValidator(validator)); - } - - function test_ShouldRemoveAccountFromValidatorList_WhenAccountHasNeverBeenMemberOfValidatorGroup() - public - { - address[] memory ExpectedRegisteredValidators = new 
address[](0); - - assertTrue(validators.isValidator(validator)); - _deregisterValidator(validator); - assertEq(validators.getRegisteredValidators().length, ExpectedRegisteredValidators.length); - } - - function test_ShouldResetAccountBalanceRequirements_WhenAccountHasNeverBeenMemberOfValidatorGroup() - public - { - assertTrue(validators.isValidator(validator)); - _deregisterValidator(validator); - assertEq(validators.getAccountLockedGoldRequirement(validator), 0); - } - - function test_Emits_ValidatorDeregisteredEvent_WhenAccountHasNeverBeenMemberOfValidatorGroup() - public - { - vm.expectEmit(true, true, true, true); - emit ValidatorDeregistered(validator); - _deregisterValidator(validator); - } - - function test_Reverts_WhenAccountNotRegisteredValidator() public { - vm.expectRevert("Not a validator"); - vm.prank(nonValidator); - validators.deregisterValidator(INDEX); - } - - function test_Reverts_WhenWrongIndexProvided() public { - timeTravel(originalValidatorLockedGoldRequirements.duration); - vm.expectRevert("deleteElement: index out of range"); - vm.prank(validator); - validators.deregisterValidator(INDEX + 1); - } - - function _deregisterValidator(address _validator) internal { - vm.prank(_validator); - validators.deregisterValidator(INDEX); - } -} - -contract ValidatorsTest_DeregisterValidator_WhenAccountHasNeverBeenMemberOfValidatorGroup_L2 is - ValidatorsTest_L2, - ValidatorsTest_DeregisterValidator_WhenAccountHasNeverBeenMemberOfValidatorGroup -{} - -contract ValidatorsTest_DeregisterValidator_WhenAccountHasBeenMemberOfValidatorGroup is - ValidatorsTest -{ - uint256 public constant INDEX = 0; - - function setUp() public { - super.setUp(); - - _registerValidatorHelper(validator, validatorPk); - - _registerValidatorGroupHelper(group, 1); - - vm.prank(validator); - validators.affiliate(group); + vm.prank(validator); + validators.affiliate(group); vm.prank(group); validators.addFirstMember(validator, address(0), address(0)); @@ -1248,11 +918,6 @@ contract 
ValidatorsTest_DeregisterValidator_WhenAccountHasBeenMemberOfValidatorG } } -contract ValidatorsTest_DeregisterValidator_WhenAccountHasBeenMemberOfValidatorGroup_L2 is - ValidatorsTest_L2, - ValidatorsTest_DeregisterValidator_WhenAccountHasBeenMemberOfValidatorGroup -{} - contract ValidatorsTest_Affiliate_WhenGroupAndValidatorMeetLockedGoldRequirements is ValidatorsTest { @@ -1316,12 +981,7 @@ contract ValidatorsTest_Affiliate_WhenGroupAndValidatorMeetLockedGoldRequirement } } -contract ValidatorsTest_Affiliate_WhenGroupAndValidatorMeetLockedGoldRequirements_L2 is - ValidatorsTest_L2, - ValidatorsTest_Affiliate_WhenGroupAndValidatorMeetLockedGoldRequirements -{} - -contract ValidatorsTest_Affiliate_WhenValidatorIsAlreadyAffiliatedWithValidatorGroup_Setup is +contract ValidatorsTest_Affiliate_WhenValidatorIsAlreadyAffiliatedWithValidatorGroup is ValidatorsTest { address otherGroup; @@ -1344,11 +1004,6 @@ contract ValidatorsTest_Affiliate_WhenValidatorIsAlreadyAffiliatedWithValidatorG vm.prank(validator); validators.affiliate(group); } -} - -contract ValidatorsTest_Affiliate_WhenValidatorIsAlreadyAffiliatedWithValidatorGroup is - ValidatorsTest_Affiliate_WhenValidatorIsAlreadyAffiliatedWithValidatorGroup_Setup -{ function test_ShouldSetAffiliate_WhenValidatorNotMemberOfThatValidatorGroup() public { vm.prank(validator); validators.affiliate(otherGroup); @@ -1441,25 +1096,7 @@ contract ValidatorsTest_Affiliate_WhenValidatorIsAlreadyAffiliatedWithValidatorG assertTrue(election.isIneligible(group)); } -} -contract ValidatorsTest_Affiliate_WhenValidatorIsAlreadyAffiliatedWithValidatorGroup_L1 is - ValidatorsTest_Affiliate_WhenValidatorIsAlreadyAffiliatedWithValidatorGroup_Setup -{ - function _performAffiliation() internal { - vm.prank(validator); - validators.affiliate(group); - } - - function test_ShouldNotTryToSendValidatorPayment() public { - assertDoesNotEmit(_performAffiliation, "SendValidatorPaymentCalled(address)"); - } -} - -contract 
ValidatorsTest_Affiliate_WhenValidatorIsAlreadyAffiliatedWithValidatorGroup_L2 is - ValidatorsTest_L2, - ValidatorsTest_Affiliate_WhenValidatorIsAlreadyAffiliatedWithValidatorGroup -{ function test_ShouldSendValidatorPayment() public { vm.expectEmit(true, true, true, true); emit SendValidatorPaymentCalled(validator); @@ -1468,7 +1105,7 @@ contract ValidatorsTest_Affiliate_WhenValidatorIsAlreadyAffiliatedWithValidatorG } } -contract ValidatorsTest_Deaffiliate_Setup is ValidatorsTest { +contract ValidatorsTest_Deaffiliate is ValidatorsTest { uint256 additionEpoch; uint256 deaffiliationEpoch; @@ -1484,9 +1121,6 @@ contract ValidatorsTest_Deaffiliate_Setup is ValidatorsTest { require(_affiliation == group, "Affiliation failed."); } -} - -contract ValidatorsTest_Deaffiliate is ValidatorsTest_Deaffiliate_Setup { function test_ShouldClearAffiliate() public { vm.prank(validator); validators.deaffiliate(); @@ -1593,20 +1227,7 @@ contract ValidatorsTest_Deaffiliate is ValidatorsTest_Deaffiliate_Setup { validators.deaffiliate(); assertTrue(election.isIneligible(group)); } -} - -contract ValidatorsTest_Deaffiliate_L1 is ValidatorsTest_Deaffiliate_Setup { - function _performDeaffiliation() internal { - vm.prank(validator); - validators.deaffiliate(); - } - - function test_ShouldNotTryToSendValidatorPayment() public { - assertDoesNotEmit(_performDeaffiliation, "SendValidatorPaymentCalled(address)"); - } -} -contract ValidatorsTest_Deaffiliate_L2 is ValidatorsTest_Deaffiliate, ValidatorsTest_L2 { function test_ShouldSendValidatorPayment() public { vm.expectEmit(true, true, true, true); emit SendValidatorPaymentCalled(validator); @@ -1675,230 +1296,6 @@ contract ValidatorsTest_UpdateEcdsaPublicKey is ValidatorsTest { } } -contract ValidatorsTest_UpdateEcdsaPublicKey_L2 is - ValidatorsTest_L2, - ValidatorsTest_UpdateEcdsaPublicKey -{} - -contract ValidatorsTest_UpdatePublicKeys_Setup is ValidatorsTest { - bytes validatorEcdsaPubKey; - - bytes public constant newBlsPublicKey = - 
abi.encodePacked( - bytes32(0x0101010101010101010101010101010101010101010101010101010101010102), - bytes32(0x0202020202020202020202020202020202020202020202020202020202020203), - bytes32(0x0303030303030303030303030303030303030303030303030303030303030304) - ); - bytes public constant newBlsPop = - abi.encodePacked( - bytes16(0x04040404040404040404040404040405), - bytes16(0x05050505050505050505050505050506), - bytes16(0x06060606060606060606060606060607) - ); - - function setUp() public { - super.setUp(); - - vm.prank(address(accounts)); - accounts.createAccount(); - - validatorEcdsaPubKey = _registerValidatorHelper(validator, validatorPk); - } -} - -contract ValidatorsTest_UpdatePublicKeys_L1 is ValidatorsTest_UpdatePublicKeys_Setup { - function test_ShouldSetValidatorNewBlsPubKeyAndEcdsaPubKey_WhenCalledByRegisteredAccountsContract() - public - { - (bytes memory _newEcdsaPubKey, , , ) = _generateEcdsaPubKeyWithSigner( - address(accounts), - signerPk - ); - - ph.mockSuccess( - ph.PROOF_OF_POSSESSION(), - abi.encodePacked(validator, newBlsPublicKey, newBlsPop) - ); - - vm.prank(address(accounts)); - validators.updatePublicKeys(validator, signer, _newEcdsaPubKey, newBlsPublicKey, newBlsPop); - - (bytes memory actualEcdsaPubKey, bytes memory actualBlsPublicKey, , , ) = validators - .getValidator(validator); - - assertEq(actualEcdsaPubKey, _newEcdsaPubKey); - assertEq(actualBlsPublicKey, newBlsPublicKey); - } - - function test_Emits_ValidatorEcdsaPublicKeyUpdatedAndValidatorBlsPublicKeyUpdatedEvent_WhenCalledByRegisteredAccountsContract() - public - { - (bytes memory _newEcdsaPubKey, , , ) = _generateEcdsaPubKeyWithSigner( - address(accounts), - signerPk - ); - - ph.mockSuccess( - ph.PROOF_OF_POSSESSION(), - abi.encodePacked(validator, newBlsPublicKey, newBlsPop) - ); - - vm.expectEmit(true, true, true, true); - emit ValidatorEcdsaPublicKeyUpdated(validator, _newEcdsaPubKey); - - vm.expectEmit(true, true, true, true); - emit ValidatorBlsPublicKeyUpdated(validator, 
newBlsPublicKey); - - vm.prank(address(accounts)); - validators.updatePublicKeys(validator, signer, _newEcdsaPubKey, newBlsPublicKey, newBlsPop); - } - - function test_Reverts_WhenPublicKeyDoesNotMatchSigner_WhenCalledByRegisteredAccountsContract() - public - { - (bytes memory _newEcdsaPubKey, , , ) = _generateEcdsaPubKeyWithSigner( - address(accounts), - otherValidatorPk - ); - - ph.mockSuccess( - ph.PROOF_OF_POSSESSION(), - abi.encodePacked(validator, newBlsPublicKey, newBlsPop) - ); - - vm.expectRevert("ECDSA key does not match signer"); - vm.prank(address(accounts)); - validators.updatePublicKeys(validator, signer, _newEcdsaPubKey, newBlsPublicKey, newBlsPop); - } - - function test_Reverts_WhenPublicKeyMatchesSigner_WhenNotCalledByRegisteredAccountsContract() - public - { - (bytes memory _newEcdsaPubKey, , , ) = _generateEcdsaPubKeyWithSigner(validator, signerPk); - - vm.expectRevert("only registered contract"); - vm.prank(validator); - validators.updatePublicKeys(validator, signer, _newEcdsaPubKey, newBlsPublicKey, newBlsPop); - } -} - -contract ValidatorsTest_UpdatePublicKeys_L2 is - ValidatorsTest_UpdatePublicKeys_Setup, - ValidatorsTest_L2 -{ - function test_Reverts() public { - (bytes memory _newEcdsaPubKey, , , ) = _generateEcdsaPubKeyWithSigner( - address(accounts), - signerPk - ); - - vm.expectRevert("This method is no longer supported in L2."); - vm.prank(address(accounts)); - validators.updatePublicKeys(validator, signer, _newEcdsaPubKey, newBlsPublicKey, newBlsPop); - } -} - -contract ValidatorsTest_UpdateBlsPublicKey_Setup is ValidatorsTest { - bytes validatorEcdsaPubKey; - - bytes public constant newBlsPublicKey = - abi.encodePacked( - bytes32(0x0101010101010101010101010101010101010101010101010101010101010102), - bytes32(0x0202020202020202020202020202020202020202020202020202020202020203), - bytes32(0x0303030303030303030303030303030303030303030303030303030303030304) - ); - - bytes public constant newBlsPop = - abi.encodePacked( - 
bytes16(0x04040404040404040404040404040405), - bytes16(0x05050505050505050505050505050506), - bytes16(0x06060606060606060606060606060607) - ); - - bytes public constant wrongBlsPublicKey = - abi.encodePacked( - bytes32(0x0101010101010101010101010101010101010101010101010101010101010102), - bytes32(0x0202020202020202020202020202020202020202020202020202020202020203), - bytes16(0x06060606060606060606060606060607) - ); - - bytes public constant wrongBlsPop = - abi.encodePacked( - bytes32(0x0101010101010101010101010101010101010101010101010101010101010102), - bytes16(0x05050505050505050505050505050506), - bytes16(0x06060606060606060606060606060607) - ); - - function setUp() public { - super.setUp(); - - validatorEcdsaPubKey = _registerValidatorHelper(validator, validatorPk); - } -} - -contract ValidatorsTest_UpdateBlsPublicKey_L1 is ValidatorsTest_UpdateBlsPublicKey_Setup { - function test_ShouldSetNewValidatorBlsPubKey() public { - ph.mockSuccess( - ph.PROOF_OF_POSSESSION(), - abi.encodePacked(validator, newBlsPublicKey, newBlsPop) - ); - - vm.prank(validator); - validators.updateBlsPublicKey(newBlsPublicKey, newBlsPop); - - (, bytes memory actualBlsPublicKey, , , ) = validators.getValidator(validator); - - assertEq(actualBlsPublicKey, newBlsPublicKey); - } - - function test_Emits_ValidatorValidatorBlsPublicKeyUpdatedEvent() public { - ph.mockSuccess( - ph.PROOF_OF_POSSESSION(), - abi.encodePacked(validator, newBlsPublicKey, newBlsPop) - ); - - vm.expectEmit(true, true, true, true); - emit ValidatorBlsPublicKeyUpdated(validator, newBlsPublicKey); - - vm.prank(validator); - validators.updateBlsPublicKey(newBlsPublicKey, newBlsPop); - } - - function test_Reverts_WhenPublicKeyIsNot96Bytes() public { - ph.mockSuccess( - ph.PROOF_OF_POSSESSION(), - abi.encodePacked(validator, wrongBlsPublicKey, newBlsPop) - ); - - vm.expectRevert("Wrong BLS public key length"); - vm.prank(validator); - validators.updateBlsPublicKey(wrongBlsPublicKey, newBlsPop); - } - - function 
test_Reverts_WhenProofOfPossessionIsNot48Bytes() public { - ph.mockSuccess( - ph.PROOF_OF_POSSESSION(), - abi.encodePacked(validator, newBlsPublicKey, wrongBlsPop) - ); - - vm.expectRevert("Wrong BLS PoP length"); - vm.prank(validator); - validators.updateBlsPublicKey(newBlsPublicKey, wrongBlsPop); - } -} - -contract ValidatorsTest_UpdateBlsPublicKey_L2 is - ValidatorsTest_UpdateBlsPublicKey_Setup, - ValidatorsTest_L2 -{ - function test_Reverts() public { - vm.expectRevert("This method is no longer supported in L2."); - - vm.prank(validator); - validators.updateBlsPublicKey(newBlsPublicKey, newBlsPop); - } -} - contract ValidatorsTest_RegisterValidatorGroup is ValidatorsTest { function setUp() public { super.setUp(); @@ -1991,11 +1388,6 @@ contract ValidatorsTest_RegisterValidatorGroup is ValidatorsTest { } } -contract ValidatorsTest_RegisterValidatorGroup_L2 is - ValidatorsTest_L2, - ValidatorsTest_RegisterValidatorGroup -{} - contract ValidatorsTest_DeregisterValidatorGroup_WhenGroupHasNeverHadMembers is ValidatorsTest { uint256 public constant INDEX = 0; @@ -2048,11 +1440,6 @@ contract ValidatorsTest_DeregisterValidatorGroup_WhenGroupHasNeverHadMembers is } } -contract ValidatorsTest_DeregisterValidatorGroup_WhenGroupHasNeverHadMembers_L2 is - ValidatorsTest_L2, - ValidatorsTest_DeregisterValidatorGroup_WhenGroupHasNeverHadMembers -{} - contract ValidatorsTest_DeregisterValidatorGroup_WhenGroupHasHadMembers is ValidatorsTest { uint256 public constant INDEX = 0; @@ -2147,11 +1534,6 @@ contract ValidatorsTest_DeregisterValidatorGroup_WhenGroupHasHadMembers is Valid } } -contract ValidatorsTest_DeregisterValidatorGroup_WhenGroupHasHadMembers_L2 is - ValidatorsTest_L2, - ValidatorsTest_DeregisterValidatorGroup_WhenGroupHasHadMembers -{} - contract ValidatorsTest_AddMember is ValidatorsTest { uint256 _registrationEpoch; uint256 _additionEpoch; @@ -2349,8 +1731,6 @@ contract ValidatorsTest_AddMember is ValidatorsTest { } } -contract ValidatorsTest_AddMember_L2 is 
ValidatorsTest_L2, ValidatorsTest_AddMember {} - contract ValidatorsTest_RemoveMember is ValidatorsTest { uint256 _registrationEpoch; uint256 _additionEpoch; @@ -2439,8 +1819,6 @@ contract ValidatorsTest_RemoveMember is ValidatorsTest { } } -contract ValidatorsTest_RemoveMember_L2 is ValidatorsTest_L2, ValidatorsTest_RemoveMember {} - contract ValidatorsTest_ReorderMember is ValidatorsTest { function setUp() public { super.setUp(); @@ -2490,8 +1868,6 @@ contract ValidatorsTest_ReorderMember is ValidatorsTest { } } -contract ValidatorsTest_ReorderMember_L2 is ValidatorsTest_L2, ValidatorsTest_ReorderMember {} - contract ValidatorsTest_SetNextCommissionUpdate is ValidatorsTest { uint256 newCommission = commission.unwrap().add(1); @@ -2543,12 +1919,9 @@ contract ValidatorsTest_SetNextCommissionUpdate is ValidatorsTest { } } -contract ValidatorsTest_SetNextCommissionUpdate_L2 is - ValidatorsTest_L2, - ValidatorsTest_SetNextCommissionUpdate -{} +contract ValidatorsTest_UpdateCommission_Setup is ValidatorsTest {} -contract ValidatorsTest_UpdateCommission_Setup is ValidatorsTest { +contract ValidatorsTest_UpdateCommission is ValidatorsTest { uint256 newCommission = commission.unwrap().add(1); function setUp() public { @@ -2570,9 +1943,7 @@ contract ValidatorsTest_UpdateCommission_Setup is ValidatorsTest { require(_affiliation1 == group, "Affiliation failed."); require(_affiliation2 == group, "Affiliation failed."); } -} -contract ValidatorsTest_UpdateCommission is ValidatorsTest_UpdateCommission_Setup { function test_ShouldSetValidatorGroupCommission() public { vm.prank(group); validators.setNextCommissionUpdate(newCommission); @@ -2587,448 +1958,64 @@ contract ValidatorsTest_UpdateCommission is ValidatorsTest_UpdateCommission_Setu assertEq(_commission, newCommission); } - function test_Emits_ValidatorGroupCommissionUpdated() public { - vm.prank(group); - validators.setNextCommissionUpdate(newCommission); - - blockTravel(commissionUpdateDelay); - - vm.expectEmit(true, 
true, true, true); - emit ValidatorGroupCommissionUpdated(group, newCommission); - - vm.prank(group); - validators.updateCommission(); - } - - function test_Reverts_WhenActivationBlockHasNotPassed() public { - vm.prank(group); - validators.setNextCommissionUpdate(newCommission); - - vm.expectRevert("Can't apply commission update yet"); - vm.prank(group); - validators.updateCommission(); - } - - function test_Reverts_WhenNoCommissionHasBeenQueued() public { - vm.expectRevert("No commission update queued"); - - vm.prank(group); - validators.updateCommission(); - } - - function test_Reverts_WhenApplyingAlreadyAppliedCommission() public { - vm.prank(group); - validators.setNextCommissionUpdate(newCommission); - blockTravel(commissionUpdateDelay); - - vm.prank(group); - validators.updateCommission(); - - vm.expectRevert("No commission update queued"); - - vm.prank(group); - validators.updateCommission(); - } -} - -contract ValidatorsTest_UpdateCommission_L1 is ValidatorsTest_UpdateCommission_Setup { - function _performCommissionUpdate() internal { - vm.prank(group); - validators.addFirstMember(validator, address(0), address(0)); - - vm.prank(group); - validators.setNextCommissionUpdate(newCommission); - blockTravel(commissionUpdateDelay); - - vm.prank(group); - validators.updateCommission(); - } - - function test_ShouldNotTryTodSendMultipleValidatorPayments_WhenL1() public { - assertDoesNotEmit(_performCommissionUpdate, "SendValidatorPaymentCalled(address)"); - } -} - -contract ValidatorsTest_UpdateCommission_L2 is ValidatorsTest_L2, ValidatorsTest_UpdateCommission { - function test_ShouldSendMultipleValidatorPayments_WhenL2() public { - vm.prank(group); - validators.addFirstMember(validator, address(0), address(0)); - vm.prank(group); - validators.addMember(otherValidator); - vm.prank(group); - validators.setNextCommissionUpdate(newCommission); - blockTravel(commissionUpdateDelay); - - vm.expectEmit(true, true, true, true); - emit SendValidatorPaymentCalled(validator); 
- vm.expectEmit(true, true, true, true); - emit SendValidatorPaymentCalled(otherValidator); - vm.prank(group); - validators.updateCommission(); - } -} - -contract ValidatorsTest_CalculateEpochScore is ValidatorsTest { - function setUp() public { - super.setUp(); - - _registerValidatorGroupHelper(group, 1); - } - - function test_ShouldCalculateScoreCorrectly_WhenUptimeInInterval0AND1() public { - FixidityLib.Fraction memory uptime = FixidityLib.newFixedFraction(99, 100); - FixidityLib.Fraction memory gracePeriod = FixidityLib.newFixedFraction( - validators.downtimeGracePeriod(), - 1 - ); - - uint256 _expectedScore0 = _calculateScore(uptime.unwrap(), gracePeriod.unwrap()); - - ph.mockReturn( - ph.FRACTION_MUL(), - abi.encodePacked( - FixidityLib.fixed1().unwrap(), - FixidityLib.fixed1().unwrap(), - uptime.unwrap(), - FixidityLib.fixed1().unwrap(), - originalValidatorScoreParameters.exponent, - uint256(18) - ), - abi.encodePacked(uint256(950990049900000000000000), FixidityLib.fixed1().unwrap()) - ); - uint256 _score0 = validators.calculateEpochScore(uptime.unwrap()); - - uint256 _expectedScore1 = _calculateScore(0, gracePeriod.unwrap()); - uint256 _expectedScore2 = 1; - - ph.mockReturn( - ph.FRACTION_MUL(), - abi.encodePacked( - FixidityLib.fixed1().unwrap(), - FixidityLib.fixed1().unwrap(), - uint256(0), - FixidityLib.fixed1().unwrap(), - originalValidatorScoreParameters.exponent, - uint256(18) - ), - abi.encodePacked(uint256(0), FixidityLib.fixed1().unwrap()) - ); - - uint256 _score1 = validators.calculateEpochScore(0); - - ph.mockReturn( - ph.FRACTION_MUL(), - abi.encodePacked( - FixidityLib.fixed1().unwrap(), - FixidityLib.fixed1().unwrap(), - FixidityLib.fixed1().unwrap(), - FixidityLib.fixed1().unwrap(), - originalValidatorScoreParameters.exponent, - uint256(18) - ), - abi.encodePacked(uint256(1), FixidityLib.fixed1().unwrap()) - ); - - uint256 _score2 = validators.calculateEpochScore(FixidityLib.fixed1().unwrap()); - - assertEq(_score0, _expectedScore0); - 
assertEq(_score1, _expectedScore1);
-    assertEq(_score2, _expectedScore2);
-  }
-
-  function test_Reverts_WhenUptimeGreaterThan1() public {
-    FixidityLib.Fraction memory uptime = FixidityLib.add(
-      FixidityLib.fixed1(),
-      FixidityLib.newFixedFraction(1, 10)
-    );
-
-    ph.mockRevert(
-      ph.FRACTION_MUL(),
-      abi.encodePacked(
-        FixidityLib.fixed1().unwrap(),
-        FixidityLib.fixed1().unwrap(),
-        uptime.unwrap(),
-        FixidityLib.fixed1().unwrap(),
-        originalValidatorScoreParameters.exponent,
-        uint256(18)
-      )
-    );
-
-    vm.expectRevert("Uptime cannot be larger than one");
-    validators.calculateEpochScore(uptime.unwrap());
-  }
-}
-
-contract ValidatorsTest_CalculateEpochScore_L2 is ValidatorsTest_L2 {
-  function test_Reverts_WhenL2() public {
-    vm.expectRevert("This method is no longer supported in L2.");
-    validators.calculateEpochScore(1);
-  }
-}
-
-contract ValidatorsTest_CalculateGroupEpochScore_Setup is ValidatorsTest {
-  function setUp() public {
-    super.setUp();
-
-    _registerValidatorGroupHelper(group, 1);
-  }
-
-  function _computeGroupUptimeCalculation(
-    FixidityLib.Fraction[] memory _uptimes
-  ) public returns (uint256[] memory, uint256) {
-    FixidityLib.Fraction memory gracePeriod = FixidityLib.newFixedFraction(
-      validators.downtimeGracePeriod(),
-      1
-    );
-    uint256 expectedScore;
-    uint256[] memory unwrapedUptimes = new uint256[](_uptimes.length);
-
-    uint256 sum = 0;
-    for (uint256 i = 0; i < _uptimes.length; i++) {
-      uint256 _currentscore = _calculateScore(_uptimes[i].unwrap(), gracePeriod.unwrap());
-
-      sum = sum.add(_calculateScore(_uptimes[i].unwrap(), gracePeriod.unwrap()));
-
-      ph.mockReturn(
-        ph.FRACTION_MUL(),
-        abi.encodePacked(
-          FixidityLib.fixed1().unwrap(),
-          FixidityLib.fixed1().unwrap(),
-          _uptimes[i].unwrap(),
-          FixidityLib.fixed1().unwrap(),
-          originalValidatorScoreParameters.exponent,
-          uint256(18)
-        ),
-        abi.encodePacked(_currentscore, FixidityLib.fixed1().unwrap())
-      );
-      unwrapedUptimes[i] = _uptimes[i].unwrap();
-    }
-
-    expectedScore = sum.div(_uptimes.length);
-
-    return (unwrapedUptimes, expectedScore);
-  }
-}
-
-contract ValidatorsTest_CalculateGroupEpochScore_L1 is
-  ValidatorsTest_CalculateGroupEpochScore_Setup
-{
-  function test_ShouldCalculateGroupScoreCorrectly_WhenThereIs1ValidatorGroup() public {
-    FixidityLib.Fraction[] memory uptimes = new FixidityLib.Fraction[](1);
-    uptimes[0] = FixidityLib.newFixedFraction(969, 1000);
-
-    (uint256[] memory unwrapedUptimes, uint256 expectedScore) = _computeGroupUptimeCalculation(
-      uptimes
-    );
-    uint256 _actualScore = validators.calculateGroupEpochScore(unwrapedUptimes);
-    assertEq(_actualScore, expectedScore);
-  }
-
-  function test_ShouldCalculateGroupScoreCorrectly_WhenThereAre3ValidatorGroup() public {
-    FixidityLib.Fraction[] memory uptimes = new FixidityLib.Fraction[](3);
-    uptimes[0] = FixidityLib.newFixedFraction(969, 1000);
-    uptimes[1] = FixidityLib.newFixedFraction(485, 1000);
-    uptimes[2] = FixidityLib.newFixedFraction(456, 1000);
-
-    (uint256[] memory unwrapedUptimes, uint256 expectedScore) = _computeGroupUptimeCalculation(
-      uptimes
-    );
-    uint256 _actualScore = validators.calculateGroupEpochScore(unwrapedUptimes);
-    assertEq(_actualScore, expectedScore);
-  }
-
-  function test_ShouldCalculateGroupScoreCorrectly_WhenThereAre5ValidatorGroup() public {
-    FixidityLib.Fraction[] memory uptimes = new FixidityLib.Fraction[](5);
-    uptimes[0] = FixidityLib.newFixedFraction(969, 1000);
-    uptimes[1] = FixidityLib.newFixedFraction(485, 1000);
-    uptimes[2] = FixidityLib.newFixedFraction(456, 1000);
-    uptimes[3] = FixidityLib.newFixedFraction(744, 1000);
-    uptimes[4] = FixidityLib.newFixedFraction(257, 1000);
-
-    (uint256[] memory unwrapedUptimes, uint256 expectedScore) = _computeGroupUptimeCalculation(
-      uptimes
-    );
-    uint256 _actualScore = validators.calculateGroupEpochScore(unwrapedUptimes);
-    assertEq(_actualScore, expectedScore);
-  }
-
-  function test_ShouldCalculateGroupScoreCorrectly_WhenOnlyZerosAreProvided() public {
-    FixidityLib.Fraction[] memory uptimes = new FixidityLib.Fraction[](5);
-    uptimes[0] = FixidityLib.newFixed(0);
-    uptimes[1] = FixidityLib.newFixed(0);
-    uptimes[2] = FixidityLib.newFixed(0);
-    uptimes[3] = FixidityLib.newFixed(0);
-    uptimes[4] = FixidityLib.newFixed(0);
-
-    (uint256[] memory unwrapedUptimes, uint256 expectedScore) = _computeGroupUptimeCalculation(
-      uptimes
-    );
-    uint256 _actualScore = validators.calculateGroupEpochScore(unwrapedUptimes);
-    assertEq(_actualScore, expectedScore);
-  }
-
-  function test_ShouldCalculateGroupScoreCorrectly_WhenThereAreZerosInUptimes() public {
-    FixidityLib.Fraction[] memory uptimes = new FixidityLib.Fraction[](3);
-    uptimes[0] = FixidityLib.newFixedFraction(75, 100);
-    uptimes[1] = FixidityLib.newFixed(0);
-    uptimes[2] = FixidityLib.newFixedFraction(95, 100);
-
-    (uint256[] memory unwrapedUptimes, uint256 expectedScore) = _computeGroupUptimeCalculation(
-      uptimes
-    );
-    uint256 _actualScore = validators.calculateGroupEpochScore(unwrapedUptimes);
-    assertEq(_actualScore, expectedScore);
-  }
-
-  function test_Reverts_WhenMoreUptimesThanMaxGroupSize() public {
-    FixidityLib.Fraction[] memory uptimes = new FixidityLib.Fraction[](6);
-    uptimes[0] = FixidityLib.newFixedFraction(9, 10);
-    uptimes[1] = FixidityLib.newFixedFraction(9, 10);
-    uptimes[2] = FixidityLib.newFixedFraction(9, 10);
-    uptimes[3] = FixidityLib.newFixedFraction(9, 10);
-    uptimes[4] = FixidityLib.newFixedFraction(9, 10);
-    uptimes[5] = FixidityLib.newFixedFraction(9, 10);
-
-    (uint256[] memory unwrapedUptimes, ) = _computeGroupUptimeCalculation(uptimes);
-    vm.expectRevert("Uptime array larger than maximum group size");
-    validators.calculateGroupEpochScore(unwrapedUptimes);
-  }
-
-  function test_Reverts_WhenNoUptimesProvided() public {
-    uint256[] memory uptimes = new uint256[](0);
+  function test_Emits_ValidatorGroupCommissionUpdated() public {
+    vm.prank(group);
+    validators.setNextCommissionUpdate(newCommission);
-    vm.expectRevert("Uptime array empty");
-    validators.calculateGroupEpochScore(uptimes);
-  }
+    blockTravel(commissionUpdateDelay);
-  function test_Reverts_WhenUptimesGreaterThan1() public {
-    FixidityLib.Fraction[] memory uptimes = new FixidityLib.Fraction[](5);
-    uptimes[0] = FixidityLib.newFixedFraction(9, 10);
-    uptimes[1] = FixidityLib.newFixedFraction(9, 10);
-    uptimes[2] = FixidityLib.add(FixidityLib.fixed1(), FixidityLib.newFixedFraction(1, 10));
-    uptimes[3] = FixidityLib.newFixedFraction(9, 10);
-    uptimes[4] = FixidityLib.newFixedFraction(9, 10);
+    vm.expectEmit(true, true, true, true);
+    emit ValidatorGroupCommissionUpdated(group, newCommission);
-    (uint256[] memory unwrapedUptimes, ) = _computeGroupUptimeCalculation(uptimes);
-    vm.expectRevert("Uptime cannot be larger than one");
-    validators.calculateGroupEpochScore(unwrapedUptimes);
+    vm.prank(group);
+    validators.updateCommission();
   }
-}
-contract ValidatorsTest_CalculateGroupEpochScore_L2 is
-  ValidatorsTest_CalculateGroupEpochScore_Setup,
-  ValidatorsTest_L2
-{
-  function test_Reverts() public {
-    FixidityLib.Fraction[] memory uptimes = new FixidityLib.Fraction[](5);
-    uptimes[0] = FixidityLib.newFixedFraction(9, 10);
-    uptimes[1] = FixidityLib.newFixedFraction(9, 10);
-    uptimes[3] = FixidityLib.newFixedFraction(9, 10);
-    uptimes[4] = FixidityLib.newFixedFraction(9, 10);
+  function test_Reverts_WhenActivationBlockHasNotPassed() public {
+    vm.prank(group);
+    validators.setNextCommissionUpdate(newCommission);
-    (uint256[] memory unwrapedUptimes, ) = _computeGroupUptimeCalculation(uptimes);
-    vm.expectRevert("This method is no longer supported in L2.");
-    validators.calculateGroupEpochScore(unwrapedUptimes);
+    vm.expectRevert("Can't apply commission update yet");
+    vm.prank(group);
+    validators.updateCommission();
   }
-}
-contract ValidatorsTest_UpdateValidatorScoreFromSigner_Setup is ValidatorsTest {
-  FixidityLib.Fraction public gracePeriod;
-  FixidityLib.Fraction public uptime;
-  uint256 public _epochScore;
-
-  function setUp() public {
-    super.setUp();
+  function test_Reverts_WhenNoCommissionHasBeenQueued() public {
+    vm.expectRevert("No commission update queued");
-    _registerValidatorHelper(validator, validatorPk);
-    gracePeriod = FixidityLib.newFixedFraction(validators.downtimeGracePeriod(), 1);
-
-    uptime = FixidityLib.newFixedFraction(99, 100);
-
-    _epochScore = _calculateScore(uptime.unwrap(), gracePeriod.unwrap());
-
-    ph.mockReturn(
-      ph.FRACTION_MUL(),
-      abi.encodePacked(
-        FixidityLib.fixed1().unwrap(),
-        FixidityLib.fixed1().unwrap(),
-        uptime.unwrap(),
-        FixidityLib.fixed1().unwrap(),
-        originalValidatorScoreParameters.exponent,
-        uint256(18)
-      ),
-      abi.encodePacked(_epochScore, FixidityLib.fixed1().unwrap())
-    );
+    vm.prank(group);
+    validators.updateCommission();
   }
-}
-contract ValidatorsTest_UpdateValidatorScoreFromSigner_L1 is
-  ValidatorsTest_UpdateValidatorScoreFromSigner_Setup
-{
-  function test_ShouldUpdateValidatorScore_WhenUptimeInRange0And1() public {
-    uint256 _expectedScore = FixidityLib
-      .multiply(
-        originalValidatorScoreParameters.adjustmentSpeed,
-        FixidityLib.newFixedFraction(_epochScore, FixidityLib.fixed1().unwrap())
-      )
-      .unwrap();
-
-    vm.prank(address(0));
-    validators.updateValidatorScoreFromSigner(validator, uptime.unwrap());
-
-    (, , , uint256 _actualScore, ) = validators.getValidator(validator);
-
-    assertEq(_actualScore, _expectedScore);
-  }
-
-  function test_ShouldUpdateValidatorScore_WhenValidatorHasNonZeroScore() public {
-    vm.prank(address(0));
-    validators.updateValidatorScoreFromSigner(validator, uptime.unwrap());
-
-    uint256 _expectedScore = FixidityLib
-      .multiply(
-        originalValidatorScoreParameters.adjustmentSpeed,
-        FixidityLib.newFixedFraction(_epochScore, FixidityLib.fixed1().unwrap())
-      )
-      .unwrap();
-
-    _expectedScore = FixidityLib
-      .add(
-        FixidityLib.multiply(
-          FixidityLib.subtract(
-            FixidityLib.fixed1(),
-            originalValidatorScoreParameters.adjustmentSpeed
-          ),
-          FixidityLib.newFixedFraction(_expectedScore, FixidityLib.fixed1().unwrap())
-        ),
-        FixidityLib.newFixedFraction(_expectedScore, FixidityLib.fixed1().unwrap())
-      )
-      .unwrap();
+  function test_Reverts_WhenApplyingAlreadyAppliedCommission() public {
+    vm.prank(group);
+    validators.setNextCommissionUpdate(newCommission);
+    blockTravel(commissionUpdateDelay);
-    vm.prank(address(0));
-    validators.updateValidatorScoreFromSigner(validator, uptime.unwrap());
-    (, , , uint256 _actualScore, ) = validators.getValidator(validator);
+    vm.prank(group);
+    validators.updateCommission();
-    assertEq(_actualScore, _expectedScore);
-  }
+    vm.expectRevert("No commission update queued");
-  function test_Reverts_WhenUptimeGreaterThan1() public {
-    uptime = FixidityLib.add(FixidityLib.fixed1(), FixidityLib.newFixedFraction(1, 10));
-    vm.prank(address(0));
-    vm.expectRevert("Uptime cannot be larger than one");
-    validators.updateValidatorScoreFromSigner(validator, uptime.unwrap());
+    vm.prank(group);
+    validators.updateCommission();
   }
-}
-contract ValidatorsTest_UpdateValidatorScoreFromSigner is
-  ValidatorsTest_UpdateValidatorScoreFromSigner_Setup,
-  ValidatorsTest_L2
-{
-  function test_Reverts_WhenL2() public {
-    vm.expectRevert("This method is no longer supported in L2.");
+  function test_ShouldSendMultipleValidatorPayments() public {
+    vm.prank(group);
+    validators.addFirstMember(validator, address(0), address(0));
+    vm.prank(group);
+    validators.addMember(otherValidator);
+    vm.prank(group);
+    validators.setNextCommissionUpdate(newCommission);
+    blockTravel(commissionUpdateDelay);
-    vm.prank(address(0));
-    validators.updateValidatorScoreFromSigner(validator, uptime.unwrap());
+    vm.expectEmit(true, true, true, true);
+    emit SendValidatorPaymentCalled(validator);
+    vm.expectEmit(true, true, true, true);
+    emit SendValidatorPaymentCalled(otherValidator);
+    vm.prank(group);
+    validators.updateCommission();
   }
 }
 
@@ -3136,11 +2123,6 @@ contract ValidatorsTest_UpdateMembershipHistory is ValidatorsTest {
   }
 }
 
-contract ValidatorsTest_UpdateMembershipHistory_L2 is
-  ValidatorsTest_L2,
-  ValidatorsTest_UpdateMembershipHistory
-{}
-
 contract ValidatorsTest_GetMembershipInLastEpoch_Setup is ValidatorsTest {
   function setUp() public {
     super.setUp();
@@ -3175,42 +2157,6 @@ contract ValidatorsTest_GetMembershipInLastEpoch is ValidatorsTest_GetMembership
   }
 }
 
-contract ValidatorsTest_GetMembershipInLastEpoch_L1 is
-  ValidatorsTest_GetMembershipInLastEpoch_Setup
-{
-  function test_MaintainsMembershipAfterL2Transition() public {
-    address lastValidatorGroup;
-    address nextValidatorGroup;
-    for (uint256 i = 0; i < membershipHistoryLength.add(1); i++) {
-      blockTravel(ph.epochSize());
-
-      vm.prank(validator);
-      validators.affiliate(vm.addr(i + 1));
-      vm.prank(vm.addr(i + 1));
-      validators.addFirstMember(validator, address(0), address(0));
-
-      if (i == 0) {
-        assertEq(validators.getMembershipInLastEpoch(validator), address(0));
-      } else {
-        lastValidatorGroup = vm.addr(i);
-        nextValidatorGroup = vm.addr(i + 1);
-        assertEq(validators.getMembershipInLastEpoch(validator), vm.addr(i));
-      }
-    }
-
-    whenL2WithEpochManagerInitialization();
-
-    assertEq(validators.getMembershipInLastEpoch(validator), lastValidatorGroup);
-    epochManager.setCurrentEpochNumber(epochManager.getCurrentEpochNumber() + 1);
-    assertEq(validators.getMembershipInLastEpoch(validator), nextValidatorGroup);
-  }
-}
-
-contract ValidatorsTest_GetMembershipInLastEpoch_L2 is
-  ValidatorsTest_L2,
-  ValidatorsTest_GetMembershipInLastEpoch
-{}
-
 contract ValidatorsTest_GetTopGroupValidators is ValidatorsTest {
   function setUp() public {
     super.setUp();
@@ -3226,11 +2172,6 @@ contract ValidatorsTest_GetTopGroupValidators is ValidatorsTest {
   }
 }
 
-contract ValidatorsTest_GetTopGroupValidators_L2 is
-  ValidatorsTest_L2,
-  ValidatorsTest_GetTopGroupValidators
-{}
-
 contract ValidatorsTest_GetTopGroupValidatorsAccounts is ValidatorsTest {
   function setUp() public {
     super.setUp();
@@ -3246,17 +2187,6 @@ contract ValidatorsTest_GetTopGroupValidatorsAccounts is ValidatorsTest {
   }
 }
 
-contract ValidatorsTest_GetTopGroupValidatorsAccounts_L2 is
-  ValidatorsTest_GetTopGroupValidatorsAccounts,
-  ValidatorsTest_L2
-{}
-
-contract ValidatorsTest_GetEpochSize is ValidatorsTest {
-  function test_ShouldReturn17280() public {
-    assertEq(IPrecompiles(address(validators)).getEpochSize(), 17280);
-  }
-}
-
 contract ValidatorsTest_GetAccountLockedGoldRequirement is ValidatorsTest {
   uint256 public numMembers = 5;
   uint256[] public actualRequirements;
@@ -3318,325 +2248,7 @@ contract ValidatorsTest_GetAccountLockedGoldRequirement is ValidatorsTest {
   }
 }
 
-contract ValidatorsTest_GetAccountLockedGoldRequirement_L2 is
-  ValidatorsTest_L2,
-  ValidatorsTest_GetAccountLockedGoldRequirement
-{}
-
-contract ValidatorsTest_DistributeEpochPaymentsFromSigner is ValidatorsTest {
-  uint256 public numMembers = 5;
-  uint256 public maxPayment = 20122394876;
-  uint256 public expectedTotalPayment;
-  uint256 public expectedGroupPayment;
-  uint256 public expectedDelegatedPayment;
-  uint256 public expectedValidatorPayment;
-  uint256 public halfExpectedTotalPayment;
-  uint256 public halfExpectedGroupPayment;
-  uint256 public halfExpectedValidatorPayment;
-  uint256 public halfExpectedDelegatedPayment;
-
-  uint256[] public actualRequirements;
-  uint256[] public removalTimestamps;
-
-  FixidityLib.Fraction public expectedScore;
-  FixidityLib.Fraction public gracePeriod;
-  FixidityLib.Fraction public uptime;
-  FixidityLib.Fraction public delegatedFraction;
-
-  function setUp() public {
-    super.setUp();
-
-    delegatedFraction = FixidityLib.newFixedFraction(10, 100);
-    _registerValidatorGroupWithMembers(group, 1);
-    blockTravel(ph.epochSize());
-
-    lockedGold.addSlasherTest(paymentDelegatee);
-
-    vm.prank(validator);
-    accounts.setPaymentDelegation(paymentDelegatee, delegatedFraction.unwrap());
-
-    uptime = FixidityLib.newFixedFraction(99, 100);
-
-    expectedScore = FixidityLib.multiply(
-      originalValidatorScoreParameters.adjustmentSpeed,
-      FixidityLib.newFixed(_calculateScore(uptime.unwrap(), validators.downtimeGracePeriod()))
-    );
-
-    expectedTotalPayment = FixidityLib.fromFixed(
-      FixidityLib.multiply(
-        expectedScore,
-        FixidityLib.newFixedFraction(maxPayment, FixidityLib.fixed1().unwrap())
-      )
-    );
-
-    expectedGroupPayment = FixidityLib.fromFixed(
-      FixidityLib.multiply(commission, FixidityLib.newFixed(expectedTotalPayment))
-    );
-
-    uint256 remainingPayment = expectedTotalPayment.sub(expectedGroupPayment);
-
-    expectedDelegatedPayment = FixidityLib.fromFixed(
-      FixidityLib.multiply(FixidityLib.newFixed(remainingPayment), delegatedFraction)
-    );
-
-    expectedValidatorPayment = remainingPayment.sub(expectedDelegatedPayment);
-
-    halfExpectedTotalPayment = FixidityLib
-      .fromFixed(
-        FixidityLib.multiply(
-          expectedScore,
-          FixidityLib.newFixedFraction(maxPayment, FixidityLib.fixed1().unwrap())
-        )
-      )
-      .div(2);
-
-    halfExpectedGroupPayment = FixidityLib.fromFixed(
-      FixidityLib.multiply(commission, FixidityLib.newFixed(halfExpectedTotalPayment))
-    );
-
-    remainingPayment = halfExpectedTotalPayment.sub(halfExpectedGroupPayment);
-
-    halfExpectedDelegatedPayment = FixidityLib.fromFixed(
-      FixidityLib.multiply(FixidityLib.newFixed(remainingPayment), delegatedFraction)
-    );
-
-    halfExpectedValidatorPayment = remainingPayment.sub(halfExpectedDelegatedPayment);
-
-    ph.mockReturn(
-      ph.FRACTION_MUL(),
-      abi.encodePacked(
-        FixidityLib.fixed1().unwrap(),
-        FixidityLib.fixed1().unwrap(),
-        uptime.unwrap(),
-        FixidityLib.fixed1().unwrap(),
-        originalValidatorScoreParameters.exponent,
-        uint256(18)
-      ),
-      abi.encodePacked(
-        _calculateScore(uptime.unwrap(), validators.downtimeGracePeriod()),
-        FixidityLib.fixed1().unwrap()
-      )
-    );
-
-    vm.prank(address(0));
-    validators.updateValidatorScoreFromSigner(validator, uptime.unwrap());
-  }
-
-  function test_ShouldPayValidator_WhenValidatorAndGroupMeetBalanceRequirements() public {
-    vm.prank(address(0));
-    validators.distributeEpochPaymentsFromSigner(validator, maxPayment);
-    assertEq(stableToken.balanceOf(validator), expectedValidatorPayment);
-  }
-
-  function test_ShouldPayGroup_WhenValidatorAndGroupMeetBalanceRequirements() public {
-    vm.prank(address(0));
-    validators.distributeEpochPaymentsFromSigner(validator, maxPayment);
-    assertEq(stableToken.balanceOf(group), expectedGroupPayment);
-  }
-
-  function test_ShouldPayDelegatee_WhenValidatorAndGroupMeetBalanceRequirements() public {
-    vm.prank(address(0));
-    validators.distributeEpochPaymentsFromSigner(validator, maxPayment);
-    assertEq(stableToken.balanceOf(paymentDelegatee), expectedDelegatedPayment);
-  }
-
-  function test_ShouldReturnTheExpectedTotalPayment_WhenValidatorAndGroupMeetBalanceRequirements()
-    public
-  {
-    // validators.distributeEpochPaymentsFromSigner(validator, maxPayment);
-    vm.prank(address(0));
-    assertEq(
-      validators.distributeEpochPaymentsFromSigner(validator, maxPayment),
-      expectedTotalPayment
-    );
-  }
-
-  function test_ShouldPayValidator_WhenValidatorAndGroupMeetBalanceRequirementsAndNoPaymentDelegated()
-    public
-  {
-    expectedDelegatedPayment = 0;
-    expectedValidatorPayment = expectedTotalPayment.sub(expectedGroupPayment);
-
-    vm.prank(validator);
-    accounts.deletePaymentDelegation();
-
-    vm.prank(address(0));
-    validators.distributeEpochPaymentsFromSigner(validator, maxPayment);
-    assertEq(stableToken.balanceOf(validator), expectedValidatorPayment);
-  }
-
-  function test_ShouldPayGroup_WhenValidatorAndGroupMeetBalanceRequirementsAndNoPaymentDelegated()
-    public
-  {
-    expectedDelegatedPayment = 0;
-    expectedValidatorPayment = expectedTotalPayment.sub(expectedGroupPayment);
-
-    vm.prank(validator);
-    accounts.deletePaymentDelegation();
-
-    // validators.distributeEpochPaymentsFromSigner(validator, maxPayment);
-    vm.prank(address(0));
-    assertEq(
-      validators.distributeEpochPaymentsFromSigner(validator, maxPayment),
-      expectedTotalPayment
-    );
-  }
-
-  function test_ShouldReturnTheExpectedTotalPayment_WhenValidatorAndGroupMeetBalanceRequirementsAndNoPaymentDelegated()
-    public
-  {
-    expectedDelegatedPayment = 0;
-    expectedValidatorPayment = expectedTotalPayment.sub(expectedGroupPayment);
-
-    vm.prank(validator);
-    accounts.deletePaymentDelegation();
-
-    vm.prank(address(0));
-    validators.distributeEpochPaymentsFromSigner(validator, maxPayment);
-    assertEq(stableToken.balanceOf(group), expectedGroupPayment);
-  }
-
-  function test_shouldPayValidatorOnlyHalf_WhenSlashingMultiplierIsHalved() public {
-    vm.prank(paymentDelegatee);
-    validators.halveSlashingMultiplier(group);
-    vm.prank(address(0));
-    validators.distributeEpochPaymentsFromSigner(validator, maxPayment);
-
-    assertEq(stableToken.balanceOf(validator), halfExpectedValidatorPayment);
-  }
-
-  function test_shouldPayGroupOnlyHalf_WhenSlashingMultiplierIsHalved() public {
-    vm.prank(paymentDelegatee);
-    validators.halveSlashingMultiplier(group);
-    vm.prank(address(0));
-    validators.distributeEpochPaymentsFromSigner(validator, maxPayment);
-
-    assertEq(stableToken.balanceOf(group), halfExpectedGroupPayment);
-  }
-
-  function test_shouldPayDelegateeOnlyHalf_WhenSlashingMultiplierIsHalved() public {
-    vm.prank(paymentDelegatee);
-    validators.halveSlashingMultiplier(group);
-    vm.prank(address(0));
-    validators.distributeEpochPaymentsFromSigner(validator, maxPayment);
-
-    assertEq(stableToken.balanceOf(paymentDelegatee), halfExpectedDelegatedPayment);
-  }
-
-  function test_shouldReturnHalfExpectedTotalPayment_WhenSlashingMultiplierIsHalved() public {
-    vm.prank(paymentDelegatee);
-    validators.halveSlashingMultiplier(group);
-
-    vm.prank(address(0));
-    assertEq(
-      validators.distributeEpochPaymentsFromSigner(validator, maxPayment),
-      halfExpectedTotalPayment
-    );
-  }
-
-  function test_ShouldNotPayValidator_WhenValidatorDoesNotMeetBalanceRequirement() public {
-    lockedGold.setAccountTotalLockedGold(
-      validator,
-      originalValidatorLockedGoldRequirements.value.sub(11)
-    );
-
-    vm.prank(address(0));
-    validators.distributeEpochPaymentsFromSigner(validator, maxPayment);
-    assertEq(stableToken.balanceOf(validator), 0);
-  }
-
-  function test_ShouldNotPayGroup_WhenValidatorDoesNotMeetBalanceRequirement() public {
-    lockedGold.setAccountTotalLockedGold(
-      validator,
-      originalValidatorLockedGoldRequirements.value.sub(11)
-    );
-
-    vm.prank(address(0));
-    validators.distributeEpochPaymentsFromSigner(validator, maxPayment);
-    assertEq(stableToken.balanceOf(group), 0);
-  }
-
-  function test_ShouldNotPayDelegatee_WhenValidatorDoesNotMeetBalanceRequirement() public {
-    lockedGold.setAccountTotalLockedGold(
-      validator,
-      originalValidatorLockedGoldRequirements.value.sub(11)
-    );
-
-    vm.prank(address(0));
-    validators.distributeEpochPaymentsFromSigner(validator, maxPayment);
-    assertEq(stableToken.balanceOf(paymentDelegatee), 0);
-  }
-
-  function test_ShouldReturnZero_WhenValidatorDoesNotMeetBalanceRequirement() public {
-    lockedGold.setAccountTotalLockedGold(
-      validator,
-      originalValidatorLockedGoldRequirements.value.sub(11)
-    );
-
-    vm.prank(address(0));
-    assertEq(validators.distributeEpochPaymentsFromSigner(validator, maxPayment), 0);
-  }
-
-  function test_ShouldNotPayValidator_WhenGroupDoesNotMeetBalanceRequirement() public {
-    lockedGold.setAccountTotalLockedGold(
-      validator,
-      originalGroupLockedGoldRequirements.value.sub(11)
-    );
-
-    vm.prank(address(0));
-    validators.distributeEpochPaymentsFromSigner(validator, maxPayment);
-    assertEq(stableToken.balanceOf(validator), 0);
-  }
-
-  function test_ShouldNotPayGroup_WhenGroupDoesNotMeetBalanceRequirement() public {
-    lockedGold.setAccountTotalLockedGold(
-      validator,
-      originalGroupLockedGoldRequirements.value.sub(11)
-    );
-
-    vm.prank(address(0));
-    validators.distributeEpochPaymentsFromSigner(validator, maxPayment);
-    assertEq(stableToken.balanceOf(group), 0);
-  }
-
-  function test_ShouldNotPayDelegatee_WhenGroupDoesNotMeetBalanceRequirement() public {
-    lockedGold.setAccountTotalLockedGold(
-      validator,
-      originalGroupLockedGoldRequirements.value.sub(11)
-    );
-
-    vm.prank(address(0));
-    validators.distributeEpochPaymentsFromSigner(validator, maxPayment);
-    assertEq(stableToken.balanceOf(paymentDelegatee), 0);
-  }
-
-  function test_ShouldReturnZero_WhenGroupDoesNotMeetBalanceRequirement() public {
-    lockedGold.setAccountTotalLockedGold(
-      validator,
-      originalGroupLockedGoldRequirements.value.sub(11)
-    );
-
-    vm.prank(address(0));
-    assertEq(validators.distributeEpochPaymentsFromSigner(validator, maxPayment), 0);
-  }
-}
-
-contract ValidatorsTest_DistributeEpochPaymentsFromSigner_L2 is ValidatorsTest_L2 {
-  function test_Reverts_WhenL2() public {
-    vm.prank(address(0));
-    vm.expectRevert("This method is no longer supported in L2.");
-    validators.distributeEpochPaymentsFromSigner(validator, 100);
-  }
-}
-
-contract ValidatorsTest_MintStableToEpochManager_L1 is ValidatorsTest {
-  function test_Reverts_WhenL1() public {
-    vm.expectRevert("This method is not supported in L1.");
-    validators.mintStableToEpochManager(5);
-  }
-}
-
-contract ValidatorsTest_MintStableToEpochManager_L2 is ValidatorsTest_L2 {
+contract ValidatorsTest_MintStableToEpochManager is ValidatorsTest {
   function test_Reverts_WhenCalledByOtherThanEpochManager() public {
     vm.expectRevert("only registered contract");
     validators.mintStableToEpochManager(5);
@@ -3654,7 +2266,7 @@ contract ValidatorsTest_MintStableToEpochManager_L2 is ValidatorsTest_L2 {
   }
 }
 
-contract ValidatorsTest_ForceDeaffiliateIfValidator_Setup is ValidatorsTest {
+contract ValidatorsTest_ForceDeaffiliateIfValidator is ValidatorsTest {
   function setUp() public {
     super.setUp();
@@ -3666,11 +2278,7 @@ contract ValidatorsTest_ForceDeaffiliateIfValidator_Setup is ValidatorsTest {
     lockedGold.addSlasherTest(paymentDelegatee);
   }
-}
-contract ValidatorsTest_ForceDeaffiliateIfValidator is
-  ValidatorsTest_ForceDeaffiliateIfValidator_Setup
-{
   function test_ShouldSucceed_WhenSenderIsWhitelistedSlashingAddress() public {
     vm.prank(paymentDelegatee);
     validators.forceDeaffiliateIfValidator(validator);
@@ -3682,26 +2290,8 @@ contract ValidatorsTest_ForceDeaffiliateIfValidator is
     vm.expectRevert("Only registered slasher can call");
     validators.forceDeaffiliateIfValidator(validator);
   }
-}
-
-contract ValidatorsTest_ForceDeaffiliateIfValidator_L1 is
-  ValidatorsTest_ForceDeaffiliateIfValidator_Setup
-{
-  function _performForcedDeaffiliation() internal {
-    vm.prank(paymentDelegatee);
-    validators.forceDeaffiliateIfValidator(validator);
-  }
-
-  function test_ShouldNotTryToSendValidatorPayment_WhenL1() public {
-    assertDoesNotEmit(_performForcedDeaffiliation, "SendValidatorPaymentCalled(address)");
-  }
-}
-contract ValidatorsTest_ForceDeaffiliateIfValidator_L2 is
-  ValidatorsTest_ForceDeaffiliateIfValidator,
-  ValidatorsTest_L2
-{
-  function test_ShouldSendValidatorPayment_WhenL2() public {
+  function test_ShouldSendValidatorPayment() public {
     vm.expectEmit(true, true, true, true);
     emit SendValidatorPaymentCalled(validator);
     vm.prank(paymentDelegatee);
@@ -3732,8 +2322,7 @@ contract ValidatorsTest_GroupMembershipInEpoch is ValidatorsTest {
     // Start at 1 since we can't start with deaffiliate
     for (uint256 i = 1; i < totalEpochs; i++) {
-      blockTravel(ph.epochSize());
-
+      travelNL2Epoch(1);
       uint256 epochNumber = getEpochNumber();
 
       if (i % gapSize == 0) {
@@ -3823,11 +2412,6 @@ contract ValidatorsTest_GroupMembershipInEpoch is ValidatorsTest {
   }
 }
 
-contract ValidatorsTest_GroupMembershipInEpoch_L2 is
-  ValidatorsTest_GroupMembershipInEpoch,
-  ValidatorsTest_L2
-{}
-
 contract ValidatorsTest_HalveSlashingMultiplier is ValidatorsTest {
   function setUp() public {
     super.setUp();
@@ -3865,11 +2449,6 @@ contract ValidatorsTest_HalveSlashingMultiplier is ValidatorsTest {
   }
 }
 
-contract ValidatorsTest_HalveSlashingMultiplier_L2 is
-  ValidatorsTest_HalveSlashingMultiplier,
-  ValidatorsTest_L2
-{}
-
 contract ValidatorsTest_ResetSlashingMultiplier is ValidatorsTest {
   function setUp() public {
     super.setUp();
@@ -3917,8 +2496,3 @@ contract ValidatorsTest_ResetSlashingMultiplier is ValidatorsTest {
     assertEq(actualMultiplier, FixidityLib.fixed1().unwrap());
   }
 }
-
-contract ValidatorsTest_ResetSlashingMultiplier_L2 is
-  ValidatorsTest_ResetSlashingMultiplier,
-  ValidatorsTest_L2
-{}
diff --git a/packages/protocol/test-sol/unit/governance/validators/mocks/CompileValidatorIntegrationMock.t.sol b/packages/protocol/test-sol/unit/governance/validators/mocks/CompileValidatorIntegrationMock.t.sol
index 46209af1d95..13a521afb64 100644
--- a/packages/protocol/test-sol/unit/governance/validators/mocks/CompileValidatorIntegrationMock.t.sol
+++ b/packages/protocol/test-sol/unit/governance/validators/mocks/CompileValidatorIntegrationMock.t.sol
@@ -4,8 +4,8 @@ pragma solidity >=0.8.7 <0.8.20;
 import "celo-foundry-8/Test.sol";
 import "forge-std/console.sol";
 
-// here only to forge compile of ValidatorsMock
-import "@test-sol/unit/governance/validators/mocks/ValidatorsMock.sol";
+// here only to force compile of ValidatorsMock
+import "@test-sol/unit/governance/validators/mocks/ValidatorsCompile.sol";
 
 contract CompileValidatorIntegrationMock is Test {
   function test_nop() public view {
diff --git a/packages/protocol/test-sol/unit/governance/validators/mocks/ValidatorsCompile.sol b/packages/protocol/test-sol/unit/governance/validators/mocks/ValidatorsCompile.sol
new file mode 100644
index 00000000000..46d350d06bb
--- /dev/null
+++ b/packages/protocol/test-sol/unit/governance/validators/mocks/ValidatorsCompile.sol
@@ -0,0 +1,7 @@
+// SPDX-License-Identifier: UNLICENSED
+pragma solidity >=0.8.7 <0.8.20;
+
+import "@celo-contracts-8/governance/Validators.sol";
+
+// Hack to force forge to compile the Validators contract
+contract ValidatorsCompile is Validators(true) {}
diff --git a/packages/protocol/test-sol/unit/governance/validators/mocks/ValidatorsMock.sol b/packages/protocol/test-sol/unit/governance/validators/mocks/ValidatorsMock.sol
deleted file mode 100644
index 55470bc89c4..00000000000
--- a/packages/protocol/test-sol/unit/governance/validators/mocks/ValidatorsMock.sol
+++ /dev/null
@@ -1,14 +0,0 @@
-// SPDX-License-Identifier: UNLICENSED
-pragma solidity >=0.8.7 <0.8.20;
-
-import "@celo-contracts-8/governance/Validators.sol";
-import "@celo-contracts/common/FixidityLib.sol";
-
-/**
- * @title A wrapper around Validators that exposes onlyVm functions for testing.
- */
-contract ValidatorsMock is Validators(true) {
-  function computeEpochReward(address, uint256, uint256) external pure override returns (uint256) {
-    return 1;
-  }
-}
diff --git a/packages/protocol/test-sol/unit/governance/validators/mocks/ValidatorsMockTunnel.sol b/packages/protocol/test-sol/unit/governance/validators/mocks/ValidatorsMockTunnel.sol
index be98aa05d75..51e8d4a5fab 100644
--- a/packages/protocol/test-sol/unit/governance/validators/mocks/ValidatorsMockTunnel.sol
+++ b/packages/protocol/test-sol/unit/governance/validators/mocks/ValidatorsMockTunnel.sol
@@ -12,7 +12,6 @@ contract ValidatorsMockTunnel is ForgeTest {
   struct InitParamsTunnel {
     // The number of blocks to delay a ValidatorGroup's commission
     uint256 commissionUpdateDelay;
-    uint256 downtimeGracePeriod;
   }
 
   constructor(address _validatorContractAddress) public {
@@ -26,8 +25,6 @@ contract ValidatorsMockTunnel is ForgeTest {
     uint256 groupRequirementDuration;
     uint256 validatorRequirementValue;
     uint256 validatorRequirementDuration;
-    uint256 validatorScoreExponent;
-    uint256 validatorScoreAdjustmentSpeed;
   }
 
   struct InitParams2 {
@@ -35,7 +32,6 @@ contract ValidatorsMockTunnel is ForgeTest {
     uint256 _slashingMultiplierResetPeriod;
     uint256 _maxGroupSize;
     uint256 _commissionUpdateDelay;
-    uint256 _downtimeGracePeriod;
   }
 
   // TODO move this to a generic Tunnel helper contract, add to other tunnels.
@@ -45,7 +41,7 @@ contract ValidatorsMockTunnel is ForgeTest {
    * See https://docs.soliditylang.org/en/v0.5.17/control-structures.html#revert
    * for details.
    */
-  function recoverErrorString(bytes memory errorData) internal returns (string memory) {
+  function recoverErrorString(bytes memory errorData) internal pure returns (string memory) {
     // Offset in `errorData` due to it starting with the signature for Error(string)
     uint256 signatureLength = 4;
     uint256 stringEncodingLength = errorData.length - signatureLength;
@@ -72,19 +68,16 @@ contract ValidatorsMockTunnel is ForgeTest {
     InitParams2 calldata params2
   ) external returns (bool, bytes memory) {
     InitParamsTunnel memory initParamsTunnel = InitParamsTunnel({
-      commissionUpdateDelay: params2._commissionUpdateDelay,
-      downtimeGracePeriod: params2._downtimeGracePeriod
+      commissionUpdateDelay: params2._commissionUpdateDelay
     });
 
     bytes memory data = abi.encodeWithSignature(
-      "initialize(address,uint256,uint256,uint256,uint256,uint256,uint256,uint256,uint256,uint256,(uint256,uint256))",
+      "initialize(address,uint256,uint256,uint256,uint256,uint256,uint256,uint256,(uint256))",
       params.registryAddress,
       params.groupRequirementValue,
       params.groupRequirementDuration,
       params.validatorRequirementValue,
       params.validatorRequirementDuration,
-      params.validatorScoreExponent,
-      params.validatorScoreAdjustmentSpeed,
       params2._membershipHistoryLength,
       params2._slashingMultiplierResetPeriod,
       params2._maxGroupSize,
diff --git a/packages/protocol/test-sol/unit/governance/voting/Election.t.sol b/packages/protocol/test-sol/unit/governance/voting/Election.t.sol
index 3a58b4a26f8..89442f1b0c7 100644
--- a/packages/protocol/test-sol/unit/governance/voting/Election.t.sol
+++ b/packages/protocol/test-sol/unit/governance/voting/Election.t.sol
@@ -13,8 +13,6 @@ import "@celo-contracts/common/Accounts.sol";
 import "@celo-contracts/common/linkedlists/AddressSortedLinkedList.sol";
 import "@celo-contracts/identity/test/MockRandom.sol";
 import "@celo-contracts/common/Freezer.sol";
-import "@test-sol/unit/common/mocks/MockEpochManager.sol";
-import "@test-sol/utils/WhenL2.sol";
 
 import { TestBlocker } from "@test-sol/unit/common/Blockable.t.sol";
 
@@ -57,8 +55,6 @@ contract ElectionTest is TestWithUtils {
   address account9 = actor("account9");
   address account10 = actor("account10");
 
-  address epochManagerAddress = actor("epochManagerAddress");
-
   address[] accountsArray;
 
   TestBlocker blocker;
@@ -110,6 +106,22 @@ contract ElectionTest is TestWithUtils {
   }
 
   function setUp() public {
+    preElectionSetup();
+
+    election.initialize(
+      REGISTRY_ADDRESS,
+      electableValidatorsMin,
+      electableValidatorsMax,
+      maxNumGroupsVotedFor,
+      electabilityThreshold
+    );
+
+    blocker = new TestBlocker();
+    election.setBlockedByContract(address(blocker));
+    whenL2WithEpochManagerInitialization();
+  }
+
+  function preElectionSetup() public {
     ph.setEpochSize(DAY / 5);
     setupRegistry();
     setupEpochManager();
@@ -144,6 +156,12 @@ contract ElectionTest is TestWithUtils {
     registry.setAddressFor("LockedGold", address(lockedGold));
     registry.setAddressFor("Validators", address(validators));
     registry.setAddressFor("Random", address(random));
+  }
+}
+
+contract ElectionTest_Initialize is ElectionTest {
+  function setUp() public {
+    preElectionSetup();
 
     election.initialize(
       REGISTRY_ADDRESS,
@@ -156,11 +174,7 @@ contract ElectionTest is TestWithUtils {
     blocker = new TestBlocker();
     election.setBlockedByContract(address(blocker));
   }
-}
-
-contract ElectionTest_L2 is ElectionTest, WhenL2 {}
 
-contract ElectionTest_Initialize is ElectionTest {
   function test_shouldHaveSetOwner() public {
     assertEq(election.owner(), owner);
   }
@@ -204,11 +218,6 @@ contract ElectionTest_SetElectabilityThreshold is ElectionTest {
   }
 }
 
-contract ElectionTest_SetElectabilityThreshold_L2 is
-  ElectionTest_SetElectabilityThreshold,
-  ElectionTest_L2
-{}
-
 contract ElectionTest_SetElectableValidators is ElectionTest {
   function test_shouldSetElectableValidators() public {
     uint256 newElectableValidatorsMin = 2;
@@ -249,11 +258,6 @@ contract ElectionTest_SetElectableValidators is ElectionTest {
   }
 }
 
-contract ElectionTest_SetElectableValidators_L2 is
-  ElectionTest_SetElectableValidators,
-  ElectionTest_L2
-{}
-
 contract ElectionTest_SetMaxNumGroupsVotedFor is ElectionTest {
   function test_shouldSetMaxNumGroupsVotedFor() public {
     uint256 newMaxNumGroupsVotedFor = 4;
@@ -280,11 +284,6 @@ contract ElectionTest_SetMaxNumGroupsVotedFor is ElectionTest {
   }
 }
 
-contract ElectionTest_SetMaxNumGroupsVotedFor_L2 is
-  ElectionTest_SetMaxNumGroupsVotedFor,
-  ElectionTest_L2
-{}
-
 contract ElectionTest_SetAllowedToVoteOverMaxNumberOfGroups is ElectionTest {
   function test_shouldSetAllowedToVoteOverMaxNumberOfGroups() public {
     election.setAllowedToVoteOverMaxNumberOfGroups(true);
@@ -324,11 +323,6 @@ contract ElectionTest_SetAllowedToVoteOverMaxNumberOfGroups is ElectionTest {
   }
 }
 
-contract ElectionTest_SetAllowedToVoteOverMaxNumberOfGroups_L2 is
-  ElectionTest_SetAllowedToVoteOverMaxNumberOfGroups,
-  ElectionTest_L2
-{}
-
 contract ElectionTest_MarkGroupEligible is ElectionTest {
   function setUp() public {
     super.setUp();
@@ -365,8 +359,6 @@ contract ElectionTest_MarkGroupEligible is ElectionTest {
   }
 }
 
-contract ElectionTest_MarkGroupEligible_L2 is ElectionTest_MarkGroupEligible, ElectionTest_L2 {}
-
 contract ElectionTest_MarkGroupInEligible is ElectionTest {
   function setUp() public {
     super.setUp();
@@ -403,8 +395,6 @@ contract ElectionTest_MarkGroupInEligible is ElectionTest {
   }
 }
 
-contract ElectionTest_MarkGroupInEligible_L2 is ElectionTest_MarkGroupInEligible, ElectionTest_L2 {}
-
 contract ElectionTest_Vote_WhenGroupEligible is ElectionTest {
   address voter = address(this);
   address group = account1;
@@ -604,11 +594,6 @@ contract ElectionTest_Vote_WhenGroupEligible is ElectionTest {
   }
 }
 
-contract ElectionTest_Vote_WhenGroupEligible_L2 is
-  ElectionTest_L2,
-  ElectionTest_Vote_WhenGroupEligible
-{}
-
 contract ElectionTest_Vote_WhenGroupEligible_WhenGroupCanReceiveVotes is ElectionTest {
   address voter = address(this);
   address group = account1;
@@ -763,11 +748,6 @@ contract ElectionTest_Vote_WhenGroupEligible_WhenGroupCanReceiveVotes is Electio
   }
 }
 
-contract ElectionTest_Vote_WhenGroupEligible_WhenGroupCanReceiveVotes_L2 is
-  ElectionTest_Vote_WhenGroupEligible_WhenGroupCanReceiveVotes,
-  ElectionTest_L2
-{}
-
 contract ElectionTest_Vote_GroupNotEligible is ElectionTest {
   address voter = address(this);
   address group = account1;
@@ -791,11 +771,6 @@ contract ElectionTest_Vote_GroupNotEligible is ElectionTest {
   }
 }
 
-contract ElectionTest_Vote_GroupNotEligible_L2 is
-  ElectionTest_Vote_GroupNotEligible,
-  ElectionTest_L2
-{}
-
 contract ElectionTest_Activate is ElectionTest {
   address voter = address(this);
   address group = account1;
@@ -960,8 +935,6 @@ contract ElectionTest_Activate is ElectionTest {
   }
 }
 
-contract ElectionTest_Activate_L2 is ElectionTest_L2, ElectionTest_Activate {}
-
 contract ElectionTest_ActivateForAccount is ElectionTest {
   address voter = address(this);
   address group = account1;
@@ -1116,8 +1089,6 @@ contract ElectionTest_ActivateForAccount is ElectionTest {
   }
 }
 
-contract ElectionTest_ActivateForAccount_L2 is ElectionTest_L2, ElectionTest_ActivateForAccount {}
-
 contract ElectionTest_RevokePending is ElectionTest {
   address voter = address(this);
   address group = account1;
@@ -1274,8 +1245,6 @@ contract ElectionTest_RevokePending is ElectionTest {
   }
 }
 
-contract ElectionTest_RevokePending_L2 is ElectionTest_RevokePending, ElectionTest_L2 {}
-
 contract ElectionTest_RevokeActive is ElectionTest {
   address voter0 = address(this);
   address voter1 = account1;
@@ -1554,8 +1523,6 @@ contract ElectionTest_RevokeActive is ElectionTest {
   }
 }
 
-contract ElectionTest_RevokeActive_L2 is ElectionTest_L2, ElectionTest_RevokeActive {}
-
 contract ElectionTest_ElectValidatorsAbstract is ElectionTest {
   struct MemberWithVotes {
     address member;
@@ -1816,11 +1783,6 @@ contract ElectionTest_ElectValidatorSigners is ElectionTest_ElectValidatorsAbstr
   }
 }
 
-contract ElectionTest_ElectValidatorSignersL2 is
-  ElectionTest_ElectValidatorSigners,
-  ElectionTest_L2
-{}
-
 contract ElectionTest_ElectValidatorsAccounts is ElectionTest_ElectValidatorsAbstract {
   function test_ShouldElectCorrectValidators_WhenThereIsALargeNumberOfGroups() public {
     WhenThereIsALargeNumberOfGroups();
@@ -1932,134 +1894,16 @@ contract ElectionTest_ElectValidatorsAccounts is ElectionTest_ElectValidatorsAbs
   }
 }
 
-contract ElectionTest_ElectValidatorsAccountsL2 is
-  ElectionTest_ElectValidatorsAccounts,
-  ElectionTest_L2
-{}
-
-contract ElectionTest_GetGroupEpochRewards is ElectionTest {
-  address voter = address(this);
-  address group1 = account2;
-  address group2 = account3;
-  uint256 voteValue1 = 2000000000;
-  uint256 voteValue2 = 1000000000;
-  uint256 totalRewardValue = 3000000000;
-
-  uint256 expectedGroup1EpochRewards =
-    FixidityLib
-      .newFixedFraction(voteValue1, voteValue1 + voteValue2)
-      .multiply(FixidityLib.newFixed(totalRewardValue))
-      .fromFixed();
-
-  function setUp() public {
-    super.setUp();
-
-    vm.prank(address(validators));
-    election.markGroupEligible(group1, address(0), address(0));
-    vm.prank(address(validators));
-    election.markGroupEligible(group2, address(0), group1);
-    registry.setAddressFor("Validators", address(validators));
-    lockedGold.setTotalLockedGold(voteValue1 + voteValue2);
-
-    address[] memory membersGroup1 = new address[](1);
-    membersGroup1[0] = account8;
-
-    validators.setMembers(group1, membersGroup1);
-
-    address[] memory membersGroup2 = new address[](1);
-    membersGroup2[0] = account9;
-    validators.setMembers(group2, membersGroup2);
-    validators.setNumRegisteredValidators(2);
-    lockedGold.incrementNonvotingAccountBalance(voter, voteValue1 + voteValue2);
-    election.vote(group1, voteValue1, group2, address(0));
-    election.vote(group2, voteValue2, address(0), group1);
-  }
-
-  function WhenOneGroupHasActiveVotes() public {
-    travelNEpoch(1);
-    election.activate(group1);
-  }
-
-  function test_ShouldReturnTheTotalRewardValue_WhenGroupUptimeIs100Percent_WhenOneGroupHasActiveVotes()
-    public
-  {
WhenOneGroupHasActiveVotes(); - - uint256[] memory uptimes = new uint256[](1); - uptimes[0] = FIXED1; - assertEq(election.getGroupEpochRewards(group1, totalRewardValue, uptimes), totalRewardValue); - } - - function test_ShouldReturnPartOfTheTotalRewardValue_WhenWhenGroupUptimeIsLessThan100Percent_WhenOneGroupHasActiveVotes() - public - { - WhenOneGroupHasActiveVotes(); - - uint256[] memory uptimes = new uint256[](1); - uptimes[0] = FIXED1 / 2; - assertEq( - election.getGroupEpochRewards(group1, totalRewardValue, uptimes), - totalRewardValue / 2 - ); - } - - function test_ShouldReturnZero_WhenTheGroupDoesNotMeetTheLockedGoldRequirements_WhenOneGroupHasActiveVotes() - public - { - WhenOneGroupHasActiveVotes(); - - validators.setDoesNotMeetAccountLockedGoldRequirements(group1); - uint256[] memory uptimes = new uint256[](1); - uptimes[0] = FIXED1; - assertEq(election.getGroupEpochRewards(group1, totalRewardValue, uptimes), 0); - } - - function WhenTwoGroupsHaveActiveVotes() public { - travelNEpoch(1); - election.activate(group1); - election.activate(group2); - } - - function test_ShouldReturn0_WhenOneGroupDoesNotMeetLockedGoldRequirements_WhenTwoGroupsHaveActiveVotes() - public - { - WhenTwoGroupsHaveActiveVotes(); - - validators.setDoesNotMeetAccountLockedGoldRequirements(group2); - uint256[] memory uptimes = new uint256[](1); - uptimes[0] = FIXED1; - assertEq(election.getGroupEpochRewards(group2, totalRewardValue, uptimes), 0); - } - - function test_ShouldReturnProportionalRewardValueForOtherGroup_WhenOneGroupDoesNotMeetLockedGoldRequirements_WhenTwoGroupsHaveActiveVotes() - public - { - WhenTwoGroupsHaveActiveVotes(); - - validators.setDoesNotMeetAccountLockedGoldRequirements(group2); - uint256[] memory uptimes = new uint256[](1); - uptimes[0] = FIXED1; - - assertEq( - election.getGroupEpochRewards(group1, totalRewardValue, uptimes), - expectedGroup1EpochRewards - ); - } - - function 
test_ShouldReturn0_WhenTheGroupMeetsLockedGoldRequirements_WhenThenGroupDoesNotHaveActiveVotes() - public - { - uint256[] memory uptimes = new uint256[](1); - uptimes[0] = FIXED1; - assertEq(election.getGroupEpochRewards(group1, totalRewardValue, uptimes), 0); - } +contract ElectionTest_GetCurrentValidatorSigners is ElectionTest_ElectValidatorsAbstract { + function test_ShouldReturnValidatorSigners() public { + WhenThereIsALargeNumberOfGroups(); + address[] memory elected = election.electValidatorAccounts(); + epochManager.setElectedAccounts(elected); + epochManager.setElectedSigners(elected); + epochManager.setNumberOfElectedInCurrentSet(epochManager.getElectedAccounts().length); + address[] memory electedInCurrentSet = election.getCurrentValidatorSigners(); - function test_Reverts_WhenL2() public { - _whenL2(); - uint256[] memory uptimes = new uint256[](1); - uptimes[0] = FIXED1; - vm.expectRevert("This method is no longer supported in L2."); - election.getGroupEpochRewards(group1, totalRewardValue, uptimes); + assertEq(elected, electedInCurrentSet); } } @@ -2221,11 +2065,6 @@ contract ElectionTest_DistributeEpochRewards is ElectionTest { } } -contract ElectionTest_DistributeEpochRewards_L2 is - ElectionTest_L2, - ElectionTest_DistributeEpochRewards -{} - contract ElectionTest_ForceDecrementVotes is ElectionTest { address voter = address(this); address group = account2; @@ -2651,8 +2490,6 @@ contract ElectionTest_ForceDecrementVotes is ElectionTest { } } -contract ElectionTest_ForceDecrementVotes_L2 is ElectionTest_L2, ElectionTest_ForceDecrementVotes {} - contract ElectionTest_ConsistencyChecks is ElectionTest { struct AccountStruct { address account; @@ -2780,11 +2617,7 @@ contract ElectionTest_ConsistencyChecks is ElectionTest { checkVoterInvariants(_accounts[j], 0); checkGroupInvariants(0); - if (isL2()) { - epochManager.setCurrentEpochNumber(i + 1); - } else { - vm.roll((i + 1) * ph.epochSize() + (i + 1)); - } + epochManager.setCurrentEpochNumber(i + 1); } } 
revokeAllAndCheckInvariants(0); @@ -2820,11 +2653,8 @@ contract ElectionTest_ConsistencyChecks is ElectionTest { } distributeEpochRewards(i); - if (isL2()) { - epochManager.setCurrentEpochNumber(i + 1); - } else { - vm.roll((i + 1) * ph.epochSize() + (i + 1)); - } + + epochManager.setCurrentEpochNumber(i + 1); for (uint256 j = 0; j < _accounts.length; j++) { checkVoterInvariants(_accounts[j], 100); @@ -2881,8 +2711,6 @@ contract ElectionTest_ConsistencyChecks is ElectionTest { } } -contract ElectionTest_ConsistencyChecks_L2 is ElectionTest_L2, ElectionTest_ConsistencyChecks {} - contract ElectionTest_HasActivatablePendingVotes is ElectionTest { address voter = address(this); address group = account1; @@ -2910,8 +2738,3 @@ contract ElectionTest_HasActivatablePendingVotes is ElectionTest { assertTrue(election.hasActivatablePendingVotes(voter, group)); } } - -contract ElectionTest_HasActivatablePendingVotes_L2 is - ElectionTest_L2, - ElectionTest_HasActivatablePendingVotes -{} diff --git a/packages/protocol/test-sol/unit/governance/voting/GovernanceDelegation.t.sol b/packages/protocol/test-sol/unit/governance/voting/GovernanceDelegation.t.sol new file mode 100644 index 00000000000..9ca9cf404d6 --- /dev/null +++ b/packages/protocol/test-sol/unit/governance/voting/GovernanceDelegation.t.sol @@ -0,0 +1,223 @@ +// SPDX-License-Identifier: UNLICENSED +pragma solidity ^0.5.13; + +import { TestWithUtils } from "@test-sol/TestWithUtils.sol"; + +import "@celo-contracts/common/FixidityLib.sol"; +import "@celo-contracts/common/Accounts.sol"; +import "@celo-contracts/common/Registry.sol"; +import "@celo-contracts/governance/LockedGold.sol"; +import "@celo-contracts/governance/Governance.sol"; +import "@celo-contracts/governance/test/MockElection.sol"; +import "@celo-contracts/governance/test/MockValidators.sol"; + +contract GovernanceHarness is Governance(true) { + address[] internal validatorSet; + + function addValidator(address validator) external { + 
validatorSet.push(validator); + } + + function numberValidatorsInCurrentSet() public view returns (uint256) { + return validatorSet.length; + } + + function numberValidatorsInSet(uint256) public view returns (uint256) { + return validatorSet.length; + } + + function validatorSignerAddressFromCurrentSet(uint256 index) public view returns (address) { + return validatorSet[index]; + } +} + +contract GovernanceDelegationTest is TestWithUtils { + using FixidityLib for FixidityLib.Fraction; + + Accounts accounts; + LockedGold lockedGold; + GovernanceHarness governance; + MockElection election; + MockValidators validators; + + address delegator = actor("delegator"); + address delegatee1 = actor("delegatee1"); + address delegatee2 = actor("delegatee2"); + address approver = actor("approver"); + + uint256 constant LOCKED_AMOUNT = 1000 ether; + uint256 constant SMALL_LOCK = 1; + uint256 constant MIN_DEPOSIT = 1 ether; + uint256 constant QUEUE_EXPIRY = 30 days; + uint256 constant DEQUEUE_FREQUENCY = 1 seconds; + uint256 constant REFERENDUM_DURATION = 3 days; + uint256 constant EXECUTION_DURATION = 3 days; + + function setUp() public { + super.setUp(); + + accounts = new Accounts(true); + lockedGold = new LockedGold(true); + governance = new GovernanceHarness(); + election = new MockElection(); + validators = new MockValidators(); + + registry.setAddressFor(AccountsContract, address(accounts)); + registry.setAddressFor(LockedGoldContract, address(lockedGold)); + registry.setAddressFor(GovernanceContract, address(governance)); + registry.setAddressFor(ElectionContract, address(election)); + registry.setAddressFor(ValidatorsContract, address(validators)); + + accounts.initialize(address(registry)); + lockedGold.initialize(address(registry), 1 weeks); + governance.initialize( + address(registry), + approver, + 1, + MIN_DEPOSIT, + QUEUE_EXPIRY, + DEQUEUE_FREQUENCY, + REFERENDUM_DURATION, + EXECUTION_DURATION, + FixidityLib.newFixedFraction(5, 10).unwrap(), + 
FixidityLib.newFixedFraction(5, 10).unwrap(), + FixidityLib.newFixedFraction(1, 5).unwrap(), + FixidityLib.newFixedFraction(1, 2).unwrap() + ); + + governance.addValidator(actor("validator")); + + vm.deal(delegator, LOCKED_AMOUNT + 10 ether); + vm.deal(delegatee1, SMALL_LOCK + 1 ether); + vm.deal(delegatee2, 1 ether); + vm.deal(approver, 1 ether); + + vm.prank(delegator); + accounts.createAccount(); + vm.prank(delegatee1); + accounts.createAccount(); + vm.prank(delegatee2); + accounts.createAccount(); + + vm.prank(delegator); + lockedGold.lock.value(LOCKED_AMOUNT)(); + vm.prank(delegatee1); + lockedGold.lock.value(SMALL_LOCK)(); + } + + function test_ShouldReturnCorrectVotingAmount_WhenBothQueueAndReferendumActive() public { + uint256 prop1 = _makeProposal(delegator); + uint256 prop2 = _makeProposal(delegator); + + vm.prank(delegator); + lockedGold.delegateGovernanceVotes(delegatee1, FixidityLib.fixed1().unwrap()); + + vm.warp(block.timestamp + DEQUEUE_FREQUENCY + 1); + governance.dequeueProposalsIfReady(); + uint256 idx = _getDequeuedIndex(prop1); + vm.prank(approver); + governance.approve(prop1, idx); + + vm.prank(delegatee1); + governance.upvote(prop2, 0, 0); + + vm.prank(delegatee1); + governance.votePartially(prop1, idx, LOCKED_AMOUNT, 0, 0); + + uint256 reported = governance.getAmountOfGoldUsedForVoting(delegatee1); + assertEq(reported, LOCKED_AMOUNT, "should return referendum votes (higher than queue upvote)"); + } + + function test_ShouldReduceVotes_WhenRevokingDelegation() public { + uint256 prop1 = _makeProposal(delegator); + uint256 prop2 = _makeProposal(delegator); + + vm.prank(delegator); + lockedGold.delegateGovernanceVotes(delegatee1, FixidityLib.fixed1().unwrap()); + + vm.warp(block.timestamp + DEQUEUE_FREQUENCY + 1); + governance.dequeueProposalsIfReady(); + uint256 idx = _getDequeuedIndex(prop1); + vm.prank(approver); + governance.approve(prop1, idx); + + vm.prank(delegatee1); + governance.upvote(prop2, 0, 0); + + vm.prank(delegatee1); + 
governance.votePartially(prop1, idx, LOCKED_AMOUNT, 0, 0); + + (uint256 before, , ) = governance.getVoteTotals(prop1); + assertEq(before, LOCKED_AMOUNT); + + vm.prank(delegator); + lockedGold.revokeDelegatedGovernanceVotes(delegatee1, FixidityLib.fixed1().unwrap()); + + (uint256 after_, , ) = governance.getVoteTotals(prop1); + assertEq(after_, SMALL_LOCK); + assertEq(lockedGold.getAccountTotalGovernanceVotingPower(delegatee1), SMALL_LOCK); + assertEq(lockedGold.totalDelegatedCelo(delegatee1), 0); + } + + function test_ShouldMaintainCorrectTotals_WhenRedelegating() public { + uint256 prop1 = _makeProposal(delegator); + uint256 prop2 = _makeProposal(delegator); + + vm.prank(delegator); + lockedGold.delegateGovernanceVotes(delegatee1, FixidityLib.fixed1().unwrap()); + + vm.warp(block.timestamp + DEQUEUE_FREQUENCY + 1); + governance.dequeueProposalsIfReady(); + uint256 idx = _getDequeuedIndex(prop1); + vm.prank(approver); + governance.approve(prop1, idx); + + vm.prank(delegatee1); + governance.upvote(prop2, 0, 0); + + vm.prank(delegatee1); + governance.votePartially(prop1, idx, LOCKED_AMOUNT, 0, 0); + + vm.prank(delegator); + lockedGold.revokeDelegatedGovernanceVotes(delegatee1, FixidityLib.fixed1().unwrap()); + + vm.prank(delegator); + lockedGold.delegateGovernanceVotes(delegatee2, FixidityLib.fixed1().unwrap()); + + vm.prank(delegatee2); + governance.votePartially(prop1, idx, LOCKED_AMOUNT, 0, 0); + + (uint256 total, , ) = governance.getVoteTotals(prop1); + assertEq(total, LOCKED_AMOUNT + SMALL_LOCK); + } + + function test_ShouldReturnLockedGold_WhenDelegatorUpvotesAfterDelegating() public { + uint256 prop1 = _makeProposal(delegator); + + vm.prank(delegator); + lockedGold.delegateGovernanceVotes(delegatee1, FixidityLib.fixed1().unwrap()); + + vm.prank(delegator); + governance.upvote(prop1, 0, 0); + + uint256 reported = governance.getAmountOfGoldUsedForVoting(delegator); + assertEq(reported, LOCKED_AMOUNT, "delegator upvote should report locked gold weight"); + } + + 
function _makeProposal(address proposer) private returns (uint256) { + uint256[] memory vals = new uint256[](0); + address[] memory dests = new address[](0); + bytes memory data = ""; + uint256[] memory lens = new uint256[](0); + vm.prank(proposer); + return governance.propose.value(MIN_DEPOSIT)(vals, dests, data, lens, "url"); + } + + function _getDequeuedIndex(uint256 propId) private view returns (uint256) { + uint256[] memory ids = governance.getDequeue(); + for (uint256 i = 0; i < ids.length; i++) { + if (ids[i] == propId) return i; + } + revert("not dequeued"); + } +} diff --git a/packages/protocol/test-sol/unit/governance/voting/LockedGold.t.sol b/packages/protocol/test-sol/unit/governance/voting/LockedGold.t.sol index d0a6cb8f57d..bd47bc0fc4e 100644 --- a/packages/protocol/test-sol/unit/governance/voting/LockedGold.t.sol +++ b/packages/protocol/test-sol/unit/governance/voting/LockedGold.t.sol @@ -3,7 +3,6 @@ pragma solidity ^0.5.13; pragma experimental ABIEncoderV2; import { TestWithUtils } from "@test-sol/TestWithUtils.sol"; -import "@test-sol/utils/WhenL2.sol"; import "@celo-contracts/common/FixidityLib.sol"; import "@celo-contracts/common/Accounts.sol"; @@ -112,6 +111,7 @@ contract LockedGoldTest is TestWithUtils { (delegateeSigner2, delegateeSigner2PK) = actorWithPK("delegateeSigner2"); vm.deal(delegator, 10 ether); vm.deal(delegator2, 10 ether); + whenL2WithEpochManagerInitialization(); } function getParsedSignatureOfAddress( @@ -270,8 +270,6 @@ contract LockedGoldTest is TestWithUtils { } } -contract LockedGoldTest_L2 is WhenL2, LockedGoldTest {} - contract LockedGoldTest_initialize is LockedGoldTest { function setUp() public { super.setUp(); @@ -313,8 +311,6 @@ contract LockedGoldTest_setRegistry is LockedGoldTest { } } -contract LockedGoldTest_setRegistry_L2 is LockedGoldTest_L2, LockedGoldTest_setRegistry {} - contract LockedGoldTest_setUnlockingPeriod is LockedGoldTest { function setUp() public { super.setUp(); @@ -345,11 +341,6 @@ contract 
LockedGoldTest_setUnlockingPeriod is LockedGoldTest { } } -contract LockedGoldTest_setUnlockingPeriod_L2 is - LockedGoldTest_L2, - LockedGoldTest_setUnlockingPeriod -{} - contract LockedGoldTest_lock is LockedGoldTest { uint256 value = 1000; function setUp() public { @@ -388,6 +379,10 @@ contract LockedGoldTest_lock is LockedGoldTest { lockedGold.lock(); } + // This test fails at the same call depth as expectRevert, since it fails due to not having enough + // CELO to call with a non-0 value. + // See https://book.getfoundry.sh/cheatcodes/expect-revert#description + /// forge-config: default.allow_internal_expect_revert = true function test_ShouldRevertWhenUserDoesntHaveEnoughBalance() public { vm.expectRevert(); vm.prank(randomAddress); @@ -395,8 +390,6 @@ contract LockedGoldTest_lock is LockedGoldTest { } } -contract LockedGoldTest_lock_L2 is LockedGoldTest_L2, LockedGoldTest_lock {} - contract LockedGoldTest_unlock is LockedGoldTest { uint256 value = 1000; uint256 availabilityTime = unlockingPeriod + block.timestamp; @@ -579,8 +572,6 @@ contract LockedGoldTest_unlock is LockedGoldTest { } } -contract LockedGoldTest_unlock_L2 is LockedGoldTest_L2, LockedGoldTest_unlock {} - contract LockedGoldTest_unlockDelegation is LockedGoldTest { uint256 value = 1000; uint256 availabilityTime = unlockingPeriod + block.timestamp; @@ -656,8 +647,6 @@ contract LockedGoldTest_unlockDelegation is LockedGoldTest { } } -contract LockedGoldTest_unlockDelegation_L2 is LockedGoldTest_L2, LockedGoldTest_unlockDelegation {} - contract LockedGoldTest_unlock_WhenDelegation2Delegatees is LockedGoldTest { uint256 value = 1000; uint256 availabilityTime = unlockingPeriod + block.timestamp; @@ -723,11 +712,6 @@ contract LockedGoldTest_unlock_WhenDelegation2Delegatees is LockedGoldTest { } } -contract LockedGoldTest_unlock_WhenDelegation2Delegatees_L2 is - LockedGoldTest_L2, - LockedGoldTest_unlock_WhenDelegation2Delegatees -{} - contract LockedGoldTest_unlock_WhenDelegatingTo3Delegatees is 
LockedGoldTest { uint256 value = 5; uint256 availabilityTime = unlockingPeriod + block.timestamp; @@ -861,11 +845,6 @@ contract LockedGoldTest_unlock_WhenDelegatingTo3Delegatees is LockedGoldTest { } } -contract LockedGoldTest_unlock_WhenDelegatingTo3Delegatees_L2 is - LockedGoldTest_L2, - LockedGoldTest_unlock_WhenDelegatingTo3Delegatees -{} - contract LockedGoldTest_lock_AfterUnlocking is LockedGoldTest { uint256 pendingWithdrawalValue = 100; uint256 index = 0; @@ -1029,11 +1008,6 @@ contract LockedGoldTest_lock_AfterUnlocking is LockedGoldTest { } } -contract LockedGoldTest_lock_AfterUnlocking_L2 is - LockedGoldTest_L2, - LockedGoldTest_lock_AfterUnlocking -{} - contract LockedGoldTest_withdraw is LockedGoldTest { uint256 value = 1000; uint256 index = 0; @@ -1074,8 +1048,6 @@ contract LockedGoldTest_withdraw is LockedGoldTest { function() external payable {} } -contract LockedGoldTest_withdraw_L2 is LockedGoldTest_L2, LockedGoldTest_withdraw {} - contract LockedGoldTest_addSlasher is LockedGoldTest { string slasherName = "DowntimeSlasher"; address downtimeSlasher = actor(slasherName); @@ -1104,8 +1076,6 @@ contract LockedGoldTest_addSlasher is LockedGoldTest { } } -contract LockedGoldTest_addSlasher_L2 is LockedGoldTest_L2, LockedGoldTest_addSlasher {} - contract LockedGoldTest_removeSlasher is LockedGoldTest { string slasherName = "DowntimeSlasher"; string governanceSlasherName = "GovernanceSlasher"; @@ -1148,8 +1118,6 @@ contract LockedGoldTest_removeSlasher is LockedGoldTest { } } -contract LockedGoldTest_removeSlasher_L2 is LockedGoldTest_L2, LockedGoldTest_removeSlasher {} - contract LockedGoldTest_slash is LockedGoldTest { uint256 value = 1000; address group = actor("group"); @@ -1438,8 +1406,6 @@ contract LockedGoldTest_slash is LockedGoldTest { } } -contract LockedGoldTest_slash_L2 is LockedGoldTest_L2, LockedGoldTest_slash {} - contract LockedGoldTest_delegateGovernanceVotes is LockedGoldTest { uint256 value = 1000; uint256 percentToDelegate = 30; @@ 
-1823,11 +1789,6 @@ contract LockedGoldTest_delegateGovernanceVotes is LockedGoldTest { } } -contract LockedGoldTest_delegateGovernanceVotes_L2 is - LockedGoldTest_L2, - LockedGoldTest_delegateGovernanceVotes -{} - contract LockedGoldTest_revokeDelegatedGovernanceVotes is LockedGoldTest { uint256 value = 1000; uint256 percentageToRevoke = 2; @@ -2182,11 +2143,6 @@ contract LockedGoldTest_revokeDelegatedGovernanceVotes is LockedGoldTest { } } -contract LockedGoldTest_revokeDelegatedGovernanceVotes_L2 is - LockedGoldTest_L2, - LockedGoldTest_revokeDelegatedGovernanceVotes -{} - contract LockedGoldTest_getAccountTotalGovernanceVotingPower is LockedGoldTest { address delegator = actor("delegator"); address delegatee = actor("delegatee"); @@ -2234,11 +2190,6 @@ contract LockedGoldTest_getAccountTotalGovernanceVotingPower is LockedGoldTest { } } -contract LockedGoldTest_getAccountTotalGovernanceVotingPower_L2 is - LockedGoldTest_L2, - LockedGoldTest_getAccountTotalGovernanceVotingPower -{} - contract LockedGoldTest_getDelegatorDelegateeInfo is LockedGoldTest { address delegator = actor("delegator"); address delegatee = actor("delegatee"); @@ -2281,11 +2232,6 @@ contract LockedGoldTest_getDelegatorDelegateeInfo is LockedGoldTest { } } -contract LockedGoldTest_getDelegatorDelegateeInfo_L2 is - LockedGoldTest_L2, - LockedGoldTest_getDelegatorDelegateeInfo -{} - contract LockedGoldTest_getDelegatorDelegateeExpectedAndRealAmount is LockedGoldTest { address delegator = actor("delegator"); address delegatee = actor("delegatee"); @@ -2402,11 +2348,6 @@ contract LockedGoldTest_getDelegatorDelegateeExpectedAndRealAmount is LockedGold } } -contract LockedGoldTest_getDelegatorDelegateeExpectedAndRealAmount_L2 is - LockedGoldTest_L2, - LockedGoldTest_getDelegatorDelegateeExpectedAndRealAmount -{} - contract LockedGoldTest_updateDelegatedAmount is LockedGoldTest { address delegator = actor("delegator"); address delegatee = actor("delegatee"); @@ -2484,11 +2425,6 @@ contract 
LockedGoldTest_updateDelegatedAmount is LockedGoldTest { } } -contract LockedGoldTest_updateDelegatedAmount_L2 is - LockedGoldTest_L2, - LockedGoldTest_updateDelegatedAmount -{} - contract LockedGoldTest_getTotalPendingWithdrawalsCount is LockedGoldTest { uint256 value = 1000; address account = actor("account"); @@ -2519,11 +2455,6 @@ contract LockedGoldTest_getTotalPendingWithdrawalsCount is LockedGoldTest { } } -contract LockedGoldTest_getTotalPendingWithdrawalsCount_L2 is - LockedGoldTest_L2, - LockedGoldTest_getTotalPendingWithdrawalsCount -{} - contract LockedGoldTestGetPendingWithdrawalsInBatch is LockedGoldTest { uint256 value = 1000; @@ -2606,8 +2537,3 @@ contract LockedGoldTestGetPendingWithdrawalsInBatch is LockedGoldTest { assertEq(timestamps.length, 0); } } - -contract LockedGoldTestGetPendingWithdrawalsInBatch_L2 is - LockedGoldTest_L2, - LockedGoldTestGetPendingWithdrawalsInBatch -{} diff --git a/packages/protocol/test-sol/unit/governance/voting/ReleaseGold.t.sol b/packages/protocol/test-sol/unit/governance/voting/ReleaseGold.t.sol index f29213431a5..a1ee41e2b95 100644 --- a/packages/protocol/test-sol/unit/governance/voting/ReleaseGold.t.sol +++ b/packages/protocol/test-sol/unit/governance/voting/ReleaseGold.t.sol @@ -3,7 +3,6 @@ pragma solidity ^0.5.13; pragma experimental ABIEncoderV2; import { TestWithUtils } from "@test-sol/TestWithUtils.sol"; -import "@test-sol/utils/WhenL2.sol"; import { ECDSAHelper } from "@test-sol/utils/ECDSAHelper.sol"; import "@celo-contracts/identity/Escrow.sol"; @@ -140,11 +139,11 @@ contract ReleaseGoldTest is TestWithUtils, ECDSAHelper { }); vm.deal(randomAddress, 1000 ether); + + whenL2WithEpochManagerInitialization(); } } -contract ReleaseGoldTest_L2 is WhenL2, ReleaseGoldTest {} - contract ReleaseGoldTest_Initialize is ReleaseGoldTest { function test_ShouldIndicateIsFundedIfDeploymentIsPrefunded() public { newReleaseGold(true, false); @@ -157,8 +156,6 @@ contract ReleaseGoldTest_Initialize is ReleaseGoldTest { } } 
-contract ReleaseGoldTest_Initialize_L2 is ReleaseGoldTest_L2, ReleaseGoldTest_Initialize {} - contract ReleaseGoldTest_Payable is ReleaseGoldTest { function test_ShouldAcceptGoldTransferByDefaultFromAnyone() public { newReleaseGold(true, false); @@ -194,8 +191,6 @@ contract ReleaseGoldTest_Payable is ReleaseGoldTest { } } -contract ReleaseGoldTest_Payable_L2 is ReleaseGoldTest_L2, ReleaseGoldTest_Payable {} - contract ReleaseGoldTest_Transfer is ReleaseGoldTest { address receiver = actor("receiver"); uint256 transferAmount = 10; @@ -214,8 +209,6 @@ contract ReleaseGoldTest_Transfer is ReleaseGoldTest { } } -contract ReleaseGoldTest_Transfer_L2 is ReleaseGoldTest_L2, ReleaseGoldTest_Transfer {} - contract ReleaseGoldTest_GenericTransfer is ReleaseGoldTest { address receiver = actor("receiver"); uint256 transferAmount = 10; @@ -249,11 +242,6 @@ contract ReleaseGoldTest_GenericTransfer is ReleaseGoldTest { } } -contract ReleaseGoldTest_GenericTransfer_L2 is - ReleaseGoldTest_L2, - ReleaseGoldTest_GenericTransfer -{} - contract ReleaseGoldTest_Creation is ReleaseGoldTest { uint256 public maxUint256 = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF; @@ -375,8 +363,6 @@ contract ReleaseGoldTest_Creation is ReleaseGoldTest { } } -contract ReleaseGoldTest_Creation_L2 is ReleaseGoldTest_L2, ReleaseGoldTest_Creation {} - contract ReleaseGoldTest_SetBeneficiary is ReleaseGoldTest { function setUp() public { super.setUp(); @@ -401,8 +387,6 @@ contract ReleaseGoldTest_SetBeneficiary is ReleaseGoldTest { } } -contract ReleaseGoldTest_SetBeneficiary_L2 is ReleaseGoldTest_L2, ReleaseGoldTest_SetBeneficiary {} - contract ReleaseGoldTest_CreateAccount is ReleaseGoldTest { function setUp() public { super.setUp(); @@ -431,8 +415,6 @@ contract ReleaseGoldTest_CreateAccount is ReleaseGoldTest { } } -contract ReleaseGoldTest_CreateAccount_L2 is ReleaseGoldTest_L2, ReleaseGoldTest_CreateAccount {} - contract ReleaseGoldTest_SetAccount is ReleaseGoldTest { uint8 v; 
bytes32 r; @@ -479,8 +461,6 @@ contract ReleaseGoldTest_SetAccount is ReleaseGoldTest { } } -contract ReleaseGoldTest_SetAccount_L2 is ReleaseGoldTest_L2, ReleaseGoldTest_SetAccount {} - contract ReleaseGoldTest_SetAccountName is ReleaseGoldTest { function setUp() public { super.setUp(); @@ -522,8 +502,6 @@ contract ReleaseGoldTest_SetAccountName is ReleaseGoldTest { } } -contract ReleaseGoldTest_SetAccountName_L2 is ReleaseGoldTest_L2, ReleaseGoldTest_SetAccountName {} - contract ReleaseGoldTest_SetAccountWalletAddress is ReleaseGoldTest { uint8 v; bytes32 r; @@ -582,11 +560,6 @@ contract ReleaseGoldTest_SetAccountWalletAddress is ReleaseGoldTest { } } -contract ReleaseGoldTest_SetAccountWalletAddress_L2 is - ReleaseGoldTest_L2, - ReleaseGoldTest_SetAccountWalletAddress -{} - contract ReleaseGoldTest_SetAccountMetadataURL is ReleaseGoldTest { function setUp() public { super.setUp(); @@ -632,11 +605,6 @@ contract ReleaseGoldTest_SetAccountMetadataURL is ReleaseGoldTest { } } -contract ReleaseGoldTest_SetAccountMetadataURL_L2 is - ReleaseGoldTest_L2, - ReleaseGoldTest_SetAccountMetadataURL -{} - contract ReleaseGoldTest_SetAccountDataEncryptionKey is ReleaseGoldTest { bytes dataEncryptionKey = hex"02f2f48ee19680706196e2e339e5da3491186e0c4c5030670656b0e01611111111"; bytes longDataEncryptionKey = @@ -687,11 +655,6 @@ contract ReleaseGoldTest_SetAccountDataEncryptionKey is ReleaseGoldTest { } } -contract ReleaseGoldTest_SetAccountDataEncryptionKey_L2 is - ReleaseGoldTest_L2, - ReleaseGoldTest_SetAccountDataEncryptionKey -{} - contract ReleaseGoldTest_SetMaxDistribution is ReleaseGoldTest { function setUp() public { super.setUp(); @@ -722,11 +685,6 @@ contract ReleaseGoldTest_SetMaxDistribution is ReleaseGoldTest { } } -contract ReleaseGoldTest_SetMaxDistribution_L2 is - ReleaseGoldTest_L2, - ReleaseGoldTest_SetMaxDistribution -{} - contract ReleaseGoldTest_AuthorizationTests is ReleaseGoldTest { uint256 initialReleaseGoldAmount; @@ -1046,11 +1004,6 @@ contract 
ReleaseGoldTest_AuthorizationTests is ReleaseGoldTest { } } -contract ReleaseGoldTest_AuthorizationTests_L2 is - ReleaseGoldTest_L2, - ReleaseGoldTest_AuthorizationTests -{} - contract ReleaseGoldTest_AuthorizeWithPublicKeys_setup is ReleaseGoldTest { uint8 v; bytes32 r; @@ -1106,88 +1059,6 @@ contract ReleaseGoldTest_AuthorizeWithPublicKeys is ReleaseGoldTest_AuthorizeWit assertEq(accounts.getValidatorSigner(address(releaseGold)), authorized); assertEq(accounts.validatorSignerToAccount(authorized), address(releaseGold)); } - - function test_ShouldSetTheAuthorizedKeys_WhenUsingBLSKeys() public { - bytes32 newBlsPublicKeyPart1 = _randomBytes32(); - bytes32 newBlsPublicKeyPart2 = _randomBytes32(); - bytes32 newBlsPublicKeyPart3 = _randomBytes32(); - bytes memory newBlsPublicKey = abi.encodePacked( - newBlsPublicKeyPart1, - newBlsPublicKeyPart2, - newBlsPublicKeyPart3 - ); - newBlsPublicKey = _truncateBytes(newBlsPublicKey, 96); - - bytes32 newBlsPoPPart1 = _randomBytes32(); - bytes32 newBlsPoPPart2 = _randomBytes32(); - bytes memory newBlsPoP = abi.encodePacked(newBlsPoPPart1, newBlsPoPPart2); - newBlsPoP = _truncateBytes(newBlsPoP, 48); - - vm.prank(beneficiary); - releaseGold.authorizeValidatorSignerWithKeys( - address(uint160(authorized)), - v, - r, - s, - ecdsaPublicKey, - newBlsPublicKey, - newBlsPoP - ); - - assertEq(accounts.authorizedBy(authorized), address(releaseGold)); - assertEq(accounts.getValidatorSigner(address(releaseGold)), authorized); - assertEq(accounts.validatorSignerToAccount(authorized), address(releaseGold)); - } -} - -contract ReleaseGoldTest_AuthorizeWithPublicKeys_L2 is - ReleaseGoldTest_L2, - ReleaseGoldTest_AuthorizeWithPublicKeys_setup -{ - function test_ShouldSetTheAuthorizedKeys_WhenUsingECDSAPublickKey() public { - vm.prank(beneficiary); - releaseGold.authorizeValidatorSignerWithPublicKey( - address(uint160(authorized)), - v, - r, - s, - ecdsaPublicKey - ); - - assertEq(accounts.authorizedBy(authorized), address(releaseGold)); - 
assertEq(accounts.getValidatorSigner(address(releaseGold)), authorized); - assertEq(accounts.validatorSignerToAccount(authorized), address(releaseGold)); - } - - function test_Reverts_WhenAuthorizeValidatorSignerWithKeys() public { - bytes32 newBlsPublicKeyPart1 = _randomBytes32(); - bytes32 newBlsPublicKeyPart2 = _randomBytes32(); - bytes32 newBlsPublicKeyPart3 = _randomBytes32(); - bytes memory newBlsPublicKey = abi.encodePacked( - newBlsPublicKeyPart1, - newBlsPublicKeyPart2, - newBlsPublicKeyPart3 - ); - newBlsPublicKey = _truncateBytes(newBlsPublicKey, 96); - - bytes32 newBlsPoPPart1 = _randomBytes32(); - bytes32 newBlsPoPPart2 = _randomBytes32(); - bytes memory newBlsPoP = abi.encodePacked(newBlsPoPPart1, newBlsPoPPart2); - newBlsPoP = _truncateBytes(newBlsPoP, 48); - - vm.expectRevert("This method is no longer supported in L2."); - - vm.prank(beneficiary); - releaseGold.authorizeValidatorSignerWithKeys( - address(uint160(authorized)), - v, - r, - s, - ecdsaPublicKey, - newBlsPublicKey, - newBlsPoP - ); - } } contract ReleaseGoldTest_Revoke is ReleaseGoldTest { @@ -1228,8 +1099,6 @@ contract ReleaseGoldTest_Revoke is ReleaseGoldTest { } } -contract ReleaseGoldTest_Revoke_L2 is ReleaseGoldTest_L2, ReleaseGoldTest_Revoke {} - contract ReleaseGoldTest_Expire is ReleaseGoldTest { function setUp() public { super.setUp(); @@ -1365,8 +1234,6 @@ contract ReleaseGoldTest_Expire is ReleaseGoldTest { } } -contract ReleaseGoldTest_Expire_L2 is ReleaseGoldTest_L2, ReleaseGoldTest_Expire {} - contract ReleaseGoldTest_RefundAndFinalize is ReleaseGoldTest { function setUp() public { super.setUp(); @@ -1423,11 +1290,6 @@ contract ReleaseGoldTest_RefundAndFinalize is ReleaseGoldTest { } } -contract ReleaseGoldTest_RefundAndFinalize_L2 is - ReleaseGoldTest_L2, - ReleaseGoldTest_RefundAndFinalize -{} - contract ReleaseGoldTest_ExpireSelfDestructTest is ReleaseGoldTest { function setUp() public { super.setUp(); @@ -1440,6 +1302,9 @@ contract ReleaseGoldTest_ExpireSelfDestructTest 
is ReleaseGoldTest { releaseGold.refundAndFinalize(); } + // This test fails at the same call depth as expectRevert, since it fails due to the contract no + // longer existing. Same with other selfdestruct tests in this suite. + /// forge-config: default.allow_internal_expect_revert = true function test_ShouldDestructReleaseGoldInstanceAfterFinalizingAndPreventCallingFurtherActions_WhenRevoked() public { @@ -1448,11 +1313,6 @@ contract ReleaseGoldTest_ExpireSelfDestructTest is ReleaseGoldTest { } } -contract ReleaseGoldTest_ExpireSelfDestructTest_L2 is - ReleaseGoldTest_L2, - ReleaseGoldTest_ExpireSelfDestructTest -{} - contract ReleaseGoldTest_LockGold is ReleaseGoldTest { uint256 lockAmount; function setUp() public { @@ -1495,8 +1355,6 @@ contract ReleaseGoldTest_LockGold is ReleaseGoldTest { } } -contract ReleaseGoldTest_LockGold_L2 is ReleaseGoldTest_L2, ReleaseGoldTest_LockGold {} - contract ReleaseGoldTest_UnlockGold is ReleaseGoldTest { uint256 lockAmount; function setUp() public { @@ -1560,8 +1418,6 @@ contract ReleaseGoldTest_UnlockGold is ReleaseGoldTest { } } -contract ReleaseGoldTest_UnlockGold_L2 is ReleaseGoldTest_L2, ReleaseGoldTest_UnlockGold {} - contract ReleaseGoldTest_WithdrawLockedGold is ReleaseGoldTest { uint256 value = 1000; uint256 index = 0; @@ -1632,11 +1488,6 @@ contract ReleaseGoldTest_WithdrawLockedGold is ReleaseGoldTest { } } -contract ReleaseGoldTest_WithdrawLockedGold_L2 is - ReleaseGoldTest_L2, - ReleaseGoldTest_WithdrawLockedGold -{} - contract ReleaseGoldTest_RelockGold is ReleaseGoldTest { uint256 pendingWithdrawalValue = 1000; uint256 index = 0; @@ -1743,8 +1594,6 @@ contract ReleaseGoldTest_RelockGold is ReleaseGoldTest { } } -contract ReleaseGoldTest_RelockGold_L2 is ReleaseGoldTest_L2, ReleaseGoldTest_RelockGold {} - contract ReleaseGoldTest_Withdraw is ReleaseGoldTest { uint256 initialReleaseGoldAmount; @@ -2041,8 +1890,6 @@ contract ReleaseGoldTest_Withdraw is ReleaseGoldTest { } } -contract ReleaseGoldTest_Withdraw_L2 is 
ReleaseGoldTest_L2, ReleaseGoldTest_Withdraw {} - contract ReleaseGoldTest_WithdrawSelfDestruct_WhenNotRevoked is ReleaseGoldTest { uint256 initialReleaseGoldAmount; @@ -2063,17 +1910,13 @@ contract ReleaseGoldTest_WithdrawSelfDestruct_WhenNotRevoked is ReleaseGoldTest releaseGold.withdraw(expectedWithdrawalAmount); } + /// forge-config: default.allow_internal_expect_revert = true function test_ShouldSelfDestructIfBeneficiaryWithdrawsTheEntireAmount() public { vm.expectRevert(); releaseGold.totalWithdrawn(); } } -contract ReleaseGoldTest_WithdrawSelfDestruct_WhenNotRevoked_L2 is - ReleaseGoldTest_L2, - ReleaseGoldTest_WithdrawSelfDestruct_WhenNotRevoked -{} - contract ReleaseGoldTest_WithdrawSelfDestruct_WhenRevoked is ReleaseGoldTest { uint256 initialReleaseGoldAmount; @@ -2096,17 +1939,13 @@ contract ReleaseGoldTest_WithdrawSelfDestruct_WhenRevoked is ReleaseGoldTest { releaseGold.withdraw(expectedWithdrawalAmount); } + /// forge-config: default.allow_internal_expect_revert = true function test_ShouldSelfDestructIfBeneficiaryWithdrawsTheEntireAmount() public { vm.expectRevert(); releaseGold.totalWithdrawn(); } } -contract ReleaseGoldTest_WithdrawSelfDestruct_WhenRevoked_L2 is - ReleaseGoldTest_L2, - ReleaseGoldTest_WithdrawSelfDestruct_WhenRevoked -{} - contract ReleaseGoldTest_GetCurrentReleasedTotalAmount is ReleaseGoldTest { uint256 initialReleaseGoldAmount; @@ -2150,11 +1989,6 @@ contract ReleaseGoldTest_GetCurrentReleasedTotalAmount is ReleaseGoldTest { } } -contract ReleaseGoldTest_GetCurrentReleasedTotalAmount_L2 is - ReleaseGoldTest_L2, - ReleaseGoldTest_GetCurrentReleasedTotalAmount -{} - contract ReleaseGoldTest_GetWithdrawableAmount is ReleaseGoldTest { uint256 initialReleaseGoldAmount; @@ -2257,8 +2091,3 @@ contract ReleaseGoldTest_GetWithdrawableAmount is ReleaseGoldTest { assertEq(releaseGold.getWithdrawableAmount(), initialReleaseGoldAmount / 4); } } - -contract ReleaseGoldTest_GetWithdrawableAmount_L2 is - ReleaseGoldTest_L2, - 
ReleaseGoldTest_GetWithdrawableAmount -{} diff --git a/packages/protocol/test-sol/unit/identity/Random.t.sol b/packages/protocol/test-sol/unit/identity/Random.t.sol index 3a6b33eb0c2..558528f9784 100644 --- a/packages/protocol/test-sol/unit/identity/Random.t.sol +++ b/packages/protocol/test-sol/unit/identity/Random.t.sol @@ -12,8 +12,10 @@ contract RandomTest_ is TestWithUtils { event RandomnessBlockRetentionWindowSet(uint256 value); function setUp() public { + super.setUp(); random = new RandomTest(); random.initialize(256); + whenL2WithEpochManagerInitialization(); } function commitmentFor(uint256 value) internal pure returns (bytes32) { @@ -22,199 +24,14 @@ contract RandomTest_ is TestWithUtils { } contract RandomTest_SetRandomnessRetentionWindow is RandomTest_ { - function test_ShouldSetTheVariable() public { - random.setRandomnessBlockRetentionWindow(1000); - assertEq(random.randomnessBlockRetentionWindow(), 1000); - } - - function test_Emits_TheEvent() public { - vm.expectEmit(true, true, true, true); - emit RandomnessBlockRetentionWindowSet(1000); - random.setRandomnessBlockRetentionWindow(1000); - } - - function testRevert_OnlyOwnerCanSet() public { - vm.expectRevert("Ownable: caller is not the owner"); - vm.prank(address(0x45)); - random.setRandomnessBlockRetentionWindow(1000); - } - function test_Reverts_WhenCalledOnL2() public { - _whenL2(); vm.expectRevert("This method is no longer supported in L2."); random.setRandomnessBlockRetentionWindow(1000); } } contract RandomTest_AddTestRandomness is RandomTest_ { - uint256 constant RETENTION_WINDOW = 5; - uint256 constant EPOCH_SIZE = 10; - - function test_ShouldBeAbleToSimulateAddingRandomness() public { - random.addTestRandomness(1, 0x0000000000000000000000000000000000000000000000000000000000000001); - random.addTestRandomness(2, 0x0000000000000000000000000000000000000000000000000000000000000002); - random.addTestRandomness(3, 0x0000000000000000000000000000000000000000000000000000000000000003); - 
random.addTestRandomness(4, 0x0000000000000000000000000000000000000000000000000000000000000004); - assertEq( - 0x0000000000000000000000000000000000000000000000000000000000000001, - random.getTestRandomness(1, 4) - ); - assertEq( - 0x0000000000000000000000000000000000000000000000000000000000000002, - random.getTestRandomness(2, 4) - ); - assertEq( - 0x0000000000000000000000000000000000000000000000000000000000000003, - random.getTestRandomness(3, 4) - ); - assertEq( - 0x0000000000000000000000000000000000000000000000000000000000000004, - random.getTestRandomness(4, 4) - ); - } - - function setUpWhenChangingHistorySmaller() private { - random.addTestRandomness(1, 0x0000000000000000000000000000000000000000000000000000000000000001); - random.addTestRandomness(2, 0x0000000000000000000000000000000000000000000000000000000000000002); - random.addTestRandomness(3, 0x0000000000000000000000000000000000000000000000000000000000000003); - random.addTestRandomness(4, 0x0000000000000000000000000000000000000000000000000000000000000004); - random.setRandomnessBlockRetentionWindow(2); - } - - function test_canStillAddRandomness_whenChangingHistorySmaller() public { - setUpWhenChangingHistorySmaller(); - random.addTestRandomness(5, 0x0000000000000000000000000000000000000000000000000000000000000005); - assertEq( - 0x0000000000000000000000000000000000000000000000000000000000000005, - random.getTestRandomness(5, 5) - ); - } - - function test_cannotReadOldBlocks_whenChangingHistorySmaller() public { - setUpWhenChangingHistorySmaller(); - vm.expectRevert("Cannot query randomness older than the stored history"); - random.getTestRandomness(3, 5); - } - - function setUpWhenChangingHistoryLarger() private { - random.setRandomnessBlockRetentionWindow(2); - random.addTestRandomness(1, 0x0000000000000000000000000000000000000000000000000000000000000001); - random.addTestRandomness(2, 0x0000000000000000000000000000000000000000000000000000000000000002); - random.addTestRandomness(3, 
0x0000000000000000000000000000000000000000000000000000000000000003); - random.addTestRandomness(4, 0x0000000000000000000000000000000000000000000000000000000000000004); - random.setRandomnessBlockRetentionWindow(4); - } - - function test_CanStillAddRandomness_WhenChangingHistoryLarger() public { - setUpWhenChangingHistoryLarger(); - random.addTestRandomness(5, 0x0000000000000000000000000000000000000000000000000000000000000005); - assertEq( - 0x0000000000000000000000000000000000000000000000000000000000000005, - random.getTestRandomness(5, 5) - ); - } - - function test_CannotReadOldBlocks_WhenChangingHistoryLarger() public { - setUpWhenChangingHistoryLarger(); - vm.expectRevert("Cannot query randomness older than the stored history"); - random.getTestRandomness(1, 5); - } - - function test_OldValuesArePreserved_WhenChangingHistoryLarger() public { - setUpWhenChangingHistoryLarger(); - random.addTestRandomness(5, 0x0000000000000000000000000000000000000000000000000000000000000005); - random.addTestRandomness(6, 0x0000000000000000000000000000000000000000000000000000000000000006); - assertEq( - 0x0000000000000000000000000000000000000000000000000000000000000003, - random.getTestRandomness(3, 6) - ); - } - - function setUpWhenRelyingOnTheLastBlockOfEachEpochsRandomness() - private - returns (uint256 lastBlockOfEpoch) - { - bytes32 defaultValue = 0x0000000000000000000000000000000000000000000000000000000000000002; - bytes32 valueForLastBlockOfEpoch = 0x0000000000000000000000000000000000000000000000000000000000000001; - - ph.setEpochSize(EPOCH_SIZE); - random.setRandomnessBlockRetentionWindow(RETENTION_WINDOW); - - // Epoch - // [1 , 2 , 2 , 3 ] - // Blocks - // [EPOCH_SIZE, EPOCH_SIZE+1... EPOCH_SIZE+n, 2 * EPOCH_SIZE, 2 * EPOCH_SIZE + 1... 
2 * EPOCH_SIZE + RETENTION_WINDOW-1] - - // go to last block of epoch 1 - vm.roll(EPOCH_SIZE); - // Add randomness to epoch's last block - random.addTestRandomness(block.number, valueForLastBlockOfEpoch); - - // Add a different randomness to all but last epoch blocks - for (uint256 i = 0; i < EPOCH_SIZE - 1; i++) { - blockTravel(1); - random.addTestRandomness(block.number, defaultValue); - } - - // Add randomness to epoch's last block - blockTravel(1); - random.addTestRandomness(block.number, valueForLastBlockOfEpoch); - - // Now we add `RETENTION_WINDOW` worth of blocks' randomness to flush out the new lastEpochBlock - // This means we can test `lastEpochBlock` stores epoch i+1's last block, - // and we test that epoch i's last block is not retained. - for (uint256 i = 0; i < RETENTION_WINDOW + 1; i++) { - blockTravel(1); - random.addTestRandomness(block.number, defaultValue); - } - - return EPOCH_SIZE * 2; - } - - function test_shouldRetainTheLastEpochBlocksRandomness_WhenRelyingOnTheLastBlockOfEachEpochsRandomness() - public - { - uint256 lastBlockOfEpoch = setUpWhenRelyingOnTheLastBlockOfEachEpochsRandomness(); - - // Get start of epoch and then subtract one for last block of previous epoch - assertEq( - 0x0000000000000000000000000000000000000000000000000000000000000001, - random.getTestRandomness(lastBlockOfEpoch, block.number) - ); - } - - function test_shouldRetainTheUsualRetentionWindowWorthOfBlocks_WhenRelyingOnTheLastBlockOfEachEpochsRandomness() - public - { - setUpWhenRelyingOnTheLastBlockOfEachEpochsRandomness(); - - for (uint256 i = 0; i < RETENTION_WINDOW; i++) { - assertEq( - random.getTestRandomness(block.number - i, block.number), - 0x0000000000000000000000000000000000000000000000000000000000000002 - ); - } - } - - function test_shouldStillNotRetainOtherBlocksNotCoveredByTheRetentionWindow_WhenRelyingOnTheLastBlockOfEachEpochsRandomness() - public - { - setUpWhenRelyingOnTheLastBlockOfEachEpochsRandomness(); - vm.expectRevert("Cannot query 
randomness older than the stored history"); - random.getTestRandomness(block.number - RETENTION_WINDOW, block.number); - } - - function test_shouldNotRetainTheLastEpochBlockOfPreviousEpochs_WhenRelyingOnTheLastBlockOfEachEpochsRandomness() - public - { - uint256 lastBlockOfEpoch = setUpWhenRelyingOnTheLastBlockOfEachEpochsRandomness(); - - vm.expectRevert("Cannot query randomness older than the stored history"); - random.getTestRandomness(lastBlockOfEpoch - EPOCH_SIZE, block.number); - } - function test_Reverts_WhenCalledOnL2() public { - _whenL2(); vm.expectRevert("This method is no longer supported in L2."); random.addTestRandomness(1, 0x0000000000000000000000000000000000000000000000000000000000000001); vm.expectRevert("This method is no longer supported in L2."); @@ -226,34 +43,7 @@ contract RandomTest_RevealAndCommit is RandomTest_ { address constant ACCOUNT = address(0x01); bytes32 constant RANDONMESS = bytes32(uint256(0x00)); - function setUp() public { - super.setUp(); - random.setRandomnessBlockRetentionWindow(256); - } - - function testRevert_CannotAddZeroCommitment() public { - vm.expectRevert("cannot commit zero randomness"); - random.testRevealAndCommit(RANDONMESS, commitmentFor(0x00), ACCOUNT); - } - - function test_CanAddInitialCommitment() public { - random.testRevealAndCommit(RANDONMESS, commitmentFor(0x01), ACCOUNT); - } - - function test_CanRevealInitialCommitment() public { - blockTravel(2); - random.testRevealAndCommit(RANDONMESS, commitmentFor(0x01), ACCOUNT); - blockTravel(1); - random.testRevealAndCommit(bytes32(uint256(0x01)), commitmentFor(0x02), ACCOUNT); - - bytes32 lastRandomness = random.getBlockRandomness(block.number - 1); - bytes32 expected = keccak256(abi.encodePacked(lastRandomness, bytes32(uint256(0x01)))); - - assertEq(random.getBlockRandomness(block.number), expected); - } - function test_Reverts_WhenCalledOnL2() public { - _whenL2(); vm.expectRevert("This method is no longer supported in L2."); blockTravel(2); 
random.testRevealAndCommit(RANDONMESS, commitmentFor(0x01), ACCOUNT); @@ -262,40 +52,15 @@ contract RandomTest_RevealAndCommit is RandomTest_ { contract RandomTest_Commitments is RandomTest_ { address constant ACCOUNT = address(0x01); - bytes32 constant RANDONMESS = bytes32(uint256(0x00)); - uint256 randomness2 = 0x01; - bytes32 commitment2 = commitmentFor(randomness2); - - function setUp() public { - super.setUp(); - random.testRevealAndCommit(RANDONMESS, commitment2, ACCOUNT); - } - - function test_returnsACommtiment() public { - bytes32 commitment = random.commitments(ACCOUNT); - assertEq(commitment, commitment2); - } function test_Reverts_WhenCalledOnL2() public { - _whenL2(); vm.expectRevert("This method is no longer supported in L2."); random.commitments(ACCOUNT); } } contract RandomTest_RandomnessBlockRetentionWindow is RandomTest_ { - function setUp() public { - super.setUp(); - random.setRandomnessBlockRetentionWindow(256); - } - - function test_getsTheRandomnessBlockRetentionWindow() public { - uint256 randomnessBlockRetentionWindow = random.randomnessBlockRetentionWindow(); - assertEq(randomnessBlockRetentionWindow, 256); - } - function test_Reverts_WhenCalledOnL2() public { - _whenL2(); vm.expectRevert("This method is no longer supported in L2."); random.randomnessBlockRetentionWindow(); } diff --git a/packages/protocol/test-sol/unit/stability/SortedOracles.mento.t.sol b/packages/protocol/test-sol/unit/stability/SortedOracles.mento.t.sol index 9f6ad19e9c8..a377adfbfc8 100644 --- a/packages/protocol/test-sol/unit/stability/SortedOracles.mento.t.sol +++ b/packages/protocol/test-sol/unit/stability/SortedOracles.mento.t.sol @@ -2,8 +2,10 @@ // solhint-disable func-name-mixedcase, var-name-mixedcase, state-visibility // solhint-disable const-name-snakecase, max-states-count, contract-name-camelcase pragma solidity ^0.5.13; +pragma experimental ABIEncoderV2; -import { Test, console2 as console } from "celo-foundry/Test.sol"; +import { console2 as console } 
from "celo-foundry/Test.sol"; +import { TestWithUtils } from "@test-sol/TestWithUtils.sol"; import { SortedLinkedListWithMedian } from "contracts/common/linkedlists/SortedLinkedListWithMedian.sol"; import { FixidityLib } from "contracts/common/FixidityLib.sol"; @@ -33,7 +35,7 @@ contract MockBreakerBox is IBreakerBox { function checkAndSetBreakers(address) external {} } -contract SortedOraclesTest is Test { +contract SortedOraclesTest is TestWithUtils { // Declare SortedOracles events for matching event ReportExpirySet(uint256 reportExpiry); event TokenReportExpirySet(address token, uint256 reportExpiry); @@ -301,19 +303,20 @@ contract SortedOracles_RemoveOracles is SortedOraclesTest { assertEq(medianTimestampBefore, medianTimestampAfter); } - function testFail_removeOracle_whenOneReportExists_shouldNotEmitTheOracleReportedAndMedianUpdatedEvent() - public - { - // testFail feals impricise here. - // TODO: Better way of testing this case :) - submitNReports(1); - vm.expectEmit(true, true, true, true, address(sortedOracles)); - emit OracleReportRemoved(token, oracle); - vm.expectEmit(true, true, true, true, address(sortedOracles)); - emit MedianUpdated(token, 0); + function _performOracleRemoval() internal { sortedOracles.removeOracle(token, oracle, 0); } + function test_removeOracle_whenOneReportExists_shouldNotEmitTheMedianUpdatedEvent() public { + submitNReports(1); + assertDoesNotEmit(_performOracleRemoval, "MedianUpdated(address,uint256)"); + } + + function test_removeOracle_whenOneReportExists_shouldNotEmitTheOracleReportedEvent() public { + submitNReports(1); + assertDoesNotEmit(_performOracleRemoval, "OracleReportRemoved(address,address)"); + } + function test_removeOracle_whenOneReportExists_shouldEmitTheOracleRemovedEvent() public { submitNReports(1); vm.expectEmit(true, true, true, true, address(sortedOracles)); diff --git a/packages/protocol/test-sol/utils/ConstitutionHelper.sol b/packages/protocol/test-sol/utils/ConstitutionHelper.sol new file mode 
100644 index 00000000000..3ebb626189c --- /dev/null +++ b/packages/protocol/test-sol/utils/ConstitutionHelper.sol @@ -0,0 +1,140 @@ +// SPDX-License-Identifier: Unlicensed +pragma solidity >=0.5.13 <0.9.0; + +// Foundry imports +import { Vm } from "forge-std-8/Vm.sol"; +import { stdJson } from "forge-std-8/StdJson.sol"; + +// Celo imports +import { IRegistry } from "@celo-contracts/common/interfaces/IRegistry.sol"; + +// Test imports +import { SelectorParser } from "@test-sol/utils/SelectorParser.sol"; +import { StringUtils } from "@test-sol/utils/StringUtils.sol"; + +library ConstitutionHelper { + using stdJson for string; + using SelectorParser for string; + using StringUtils for string; + + struct JsonFiles { + string constitutionJson; + string proxySelectors; + } + + struct FileProps { + string[] contractNames; + string[] proxyNames; + string[] proxySigs; + } + + struct ConstitutionEntry { + string contractName; + address contractAddress; + string functionName; + bytes4 functionSelector; + uint256 threshold; + } + + function readConstitution( + ConstitutionEntry[] storage _entries, + IRegistry _registry, + Vm _vm + ) external { + // get contracts from constitution + JsonFiles memory files_ = JsonFiles( + _vm.readFile("./governanceConstitution.json"), // constitution json + _vm.readFile("./.tmp/selectors/Proxy.json") // proxy selectors + ); + FileProps memory props_ = FileProps( + _vm.parseJsonKeys(files_.constitutionJson, ""), // contract names + _vm.parseJsonKeys(files_.constitutionJson, ".Proxy"), // proxy names + _vm.parseJsonKeys(files_.proxySelectors, "") // proxy sigs + ); + + // vars for looping + ConstitutionEntry memory entry_; + string memory contractSelectors_; + string[] memory functionsWithTypes_; + string[] memory functionNames_; + + // loop over contract names + for (uint256 i = 0; i < props_.contractNames.length; i++) { + entry_.contractName = props_.contractNames[i]; + if (entry_.contractName.equals("Proxy")) { + // skip proxy address + 
continue; + } else { + // set address from registry + entry_.contractAddress = _registry.getAddressForStringOrDie(entry_.contractName); + } + + // load selectors for given contract from file + contractSelectors_ = _vm.readFile( + string.concat("./.tmp/selectors/", entry_.contractName, ".json") + ); + + // get function names with types + functionsWithTypes_ = _vm.parseJsonKeys(contractSelectors_, ""); + + // get functions names from constitution for contract + functionNames_ = _vm.parseJsonKeys( + files_.constitutionJson, + string.concat(".", entry_.contractName) + ); + + // loop over function names + uint256 functionsCount_ = functionNames_.length + props_.proxyNames.length; + for (uint256 j = 0; j < functionsCount_; j++) { + if (j < functionNames_.length) { + // get function from contract implementation + entry_.functionName = functionNames_[j]; + } else { + // get function from proxy contract + entry_.functionName = props_.proxyNames[j - functionNames_.length]; + } + + if (entry_.functionName.equals("default")) { + // use empty selector as default + entry_.functionSelector = hex"00000000"; + } else if (j < functionNames_.length) { + // retrieve selector from contract selectors + entry_.functionSelector = contractSelectors_.getSelector( + functionsWithTypes_, + entry_.functionName, + _vm + ); + } else { + // retrieve selector from proxy selectors + entry_.functionSelector = files_.proxySelectors.getSelector( + props_.proxySigs, + entry_.functionName, + _vm + ); + } + + // determine treshold from constitution + if (j < functionNames_.length) { + entry_.threshold = files_.constitutionJson.readUint( + string.concat(".", entry_.contractName, ".", entry_.functionName) + ); + } else { + entry_.threshold = files_.constitutionJson.readUint( + string.concat(".Proxy.", entry_.functionName) + ); + } + + // push constitution to return array + _entries.push( + ConstitutionEntry( + entry_.contractName, + entry_.contractAddress, + entry_.functionName, + entry_.functionSelector, 
+ entry_.threshold + ) + ); + } + } + } +} diff --git a/packages/protocol/test-sol/utils/PrecompileHandler.sol b/packages/protocol/test-sol/utils/PrecompileHandler.sol index 286eb1f4c5c..9a909f18295 100644 --- a/packages/protocol/test-sol/utils/PrecompileHandler.sol +++ b/packages/protocol/test-sol/utils/PrecompileHandler.sol @@ -4,8 +4,8 @@ pragma solidity >=0.5.13 <0.8.20; // Note: This is contract is a copy of `PrecompileHandler` in celo-foundry, but uses `UsingPrecompile` instead of `Precompiles`. // This contract is to be removed/deprecated once the transition to L2 is live on mainnet. -import "@lib/celo-foundry-8/lib/forge-std/src/Vm.sol"; -import "@lib/celo-foundry-8/lib/forge-std/src/console2.sol"; +import "forge-std-8/Vm.sol"; +import "forge-std-8/console2.sol"; import "@celo-contracts-8/common/UsingPrecompiles.sol"; contract PrecompileHandler is UsingPrecompiles { diff --git a/packages/protocol/test-sol/utils/SelectorParser.sol b/packages/protocol/test-sol/utils/SelectorParser.sol new file mode 100644 index 00000000000..b1dd5304cc0 --- /dev/null +++ b/packages/protocol/test-sol/utils/SelectorParser.sol @@ -0,0 +1,64 @@ +// SPDX-License-Identifier: Unlicensed +pragma solidity >=0.5.13 <0.9.0; + +// Foundry imports +import { Vm } from "forge-std-8/Vm.sol"; + +// Migrations imports +import { StringUtils } from "./StringUtils.sol"; + +library SelectorParser { + using StringUtils for string; + + // internal function to convert value of string to selector + function _parseSelector(string memory _str) internal pure returns (bytes4) { + // cast string to raw bytes + bytes memory bytes_ = bytes(_str); + // loop over string and interpret as bytes + uint256 value_ = 0; + for (uint i = 0; i < bytes_.length; i++) { + uint8 c = uint8(bytes_[i]); + // interpret ASCII character as byte + value_ = + value_ * + 16 + + ( + c >= 48 && c <= 57 + ? c - 48 // 0-9 + : c >= 97 && c <= 102 + ? c - 87 // a-f + : c >= 65 && c <= 70 + ? 
c - 55 // A-F + : 0 + ); + } + return bytes4(uint32(value_)); + } + + // internal function to read function selector from prepared file + function getSelector( + string memory _json, + string[] memory _functionsWithTypes, + string memory _functionName, + Vm _vm + ) internal pure returns (bytes4) { + // loop over functions with types (eg: transfer(address,uint256)) + for (uint256 i = 0; i < _functionsWithTypes.length; i++) { + string memory functionWithTypes_ = _functionsWithTypes[i]; + // check if function with type starts with desired function name (eg: transfer) + if (functionWithTypes_.startsWith(_functionName)) { + // load value of selector from json + bytes memory selectorValue_ = _vm.parseJson( + _json, + string.concat("['", functionWithTypes_, "']") + ); + // decode value to string + string memory selectorString_ = abi.decode(selectorValue_, (string)); + // return parsed value as bytes4 + return _parseSelector(selectorString_); + } + } + // revert if selector not found + revert(string.concat("selector for function ", _functionName, " not present in contract")); + } +} diff --git a/packages/protocol/test-sol/utils/StringUtils.sol b/packages/protocol/test-sol/utils/StringUtils.sol new file mode 100644 index 00000000000..a8ee7ead36f --- /dev/null +++ b/packages/protocol/test-sol/utils/StringUtils.sol @@ -0,0 +1,28 @@ +// SPDX-License-Identifier: MIT +pragma solidity >=0.5.13 <0.9.0; + +library StringUtils { + // This function can also be found in OpenZeppelin's library, but in a newer version than the one we use. + function equals(string memory a, string memory b) internal pure returns (bool) { + // compare keccak256 of encoded string + return (keccak256(abi.encodePacked((a))) == keccak256(abi.encodePacked((b)))); + } + + // This function is a simplification of a function present in the StringUtils library, but in a newer version.
+ function startsWith(string memory a, string memory b) internal pure returns (bool equal) { + // false if first string is shorter (cannot contain string b) + if (bytes(a).length < bytes(b).length) return false; + + // determine in assembly if strings are equal + assembly { + // load length of string b stored in memory location of b + let bLength := mload(b) + // load pointer to string in memory location 32 bytes (0x20) after a + let aPointer := add(a, 0x20) + // load pointer to string in memory location 32 bytes (0x20) after b + let bPointer := add(b, 0x20) + // compare and return keccak of part of string a (using string b length) to keccak of string b + equal := eq(keccak256(aPointer, bLength), keccak256(bPointer, bLength)) + } + } +} diff --git a/packages/protocol/test/compatibility/ast-code.ts b/packages/protocol/test-ts/compatibility/ast-code.ts similarity index 87% rename from packages/protocol/test/compatibility/ast-code.ts rename to packages/protocol/test-ts/compatibility/ast-code.ts index 300177786d1..8b61e5b5ff2 100644 --- a/packages/protocol/test/compatibility/ast-code.ts +++ b/packages/protocol/test-ts/compatibility/ast-code.ts @@ -9,7 +9,7 @@ import { MethodVisibilityChange, NewContractChange, } from '@celo/protocol/lib/compatibility/change' -import { getTestArtifacts } from '@celo/protocol/test/compatibility/common' +import { getTestArtifacts } from '@celo/protocol/test-ts/util/compatibility' import { assert } from 'chai' const testCases = { @@ -22,7 +22,6 @@ const testCases = { big_original_modified: getTestArtifacts('big_original_modified'), } -// @ts-ignore const comp = (c1: Change, c2: Change): number => { const v1 = JSON.stringify(c1) const v2 = JSON.stringify(c2) @@ -35,21 +34,21 @@ const comp = (c1: Change, c2: Change): number => { describe('#reportASTIncompatibilities()', () => { describe('when the contracts are the same', () => { it('reports no changes', () => { - const report = reportASTIncompatibilities([testCases.original], 
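The `SelectorParser._parseSelector` and `StringUtils.startsWith` helpers added above encode two small tricks: interpreting an ASCII hex string character by character as a 4-byte function selector, and checking a prefix by hashing only the first `len(b)` bytes of each string. A minimal Python sketch of the same logic, for reference (SHA-256 stands in for keccak256, which is not in the Python standard library; the equality result is the same either way):

```python
import hashlib

def parse_selector(hex_str: str) -> bytes:
    """Interpret an ASCII hex string (e.g. "a9059cbb") as a 4-byte selector,
    mirroring SelectorParser._parseSelector."""
    value = 0
    for ch in hex_str:
        c = ord(ch)
        if 48 <= c <= 57:      # '0'-'9'
            value = value * 16 + (c - 48)
        elif 97 <= c <= 102:   # 'a'-'f'
            value = value * 16 + (c - 87)
        elif 65 <= c <= 70:    # 'A'-'F'
            value = value * 16 + (c - 55)
        else:                  # non-hex characters contribute zero, as in the Solidity version
            value = value * 16
    # bytes4(uint32(value_)) keeps only the low 32 bits, big-endian
    return (value & 0xFFFFFFFF).to_bytes(4, "big")

def starts_with(a: str, b: str) -> bool:
    """Prefix check via hashing a length-bounded slice, mirroring
    StringUtils.startsWith (which keccak-hashes the first len(b) bytes of each)."""
    a_bytes, b_bytes = a.encode(), b.encode()
    if len(a_bytes) < len(b_bytes):
        return False
    digest = lambda data: hashlib.sha256(data).digest()  # stand-in for keccak256
    return digest(a_bytes[: len(b_bytes)]) == digest(b_bytes)
```

Note that the prefix check shares the Solidity helper's quirk: `starts_with("transferFrom(address,address,uint256)", "transfer")` also matches, so `getSelector` returns the first signature in the JSON whose name prefix matches the requested function name.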
[testCases.original_copy]) + const report = reportASTIncompatibilities(testCases.original, testCases.original_copy) assert.isEmpty(report.getChanges()) }) }) describe('when only metadata has changed', () => { it('reports no changes', () => { - const report = reportASTIncompatibilities([testCases.original], [testCases.metadata_changed]) + const report = reportASTIncompatibilities(testCases.original, testCases.metadata_changed) assert.isEmpty(report.getChanges()) }) }) describe('when a contract storage is changed', () => { it('reports only bytecode changes', () => { - const report = reportASTIncompatibilities([testCases.original], [testCases.inserted_constant]) + const report = reportASTIncompatibilities(testCases.original, testCases.inserted_constant) const expected = [new DeployedBytecodeChange('TestContract')] assert.deepEqual(report.getChanges(), expected) }) @@ -58,8 +57,8 @@ describe('#reportASTIncompatibilities()', () => { describe('when a contract and methods are added', () => { it('reports proper changes', () => { const report = reportASTIncompatibilities( - [testCases.original], - [testCases.added_methods_and_contracts] + testCases.original, + testCases.added_methods_and_contracts ) const expected = [ new NewContractChange('TestContractNew'), @@ -76,8 +75,8 @@ describe('#reportASTIncompatibilities()', () => { describe('when methods are removed', () => { it('reports proper changes', () => { const report = reportASTIncompatibilities( - [testCases.added_methods_and_contracts], - [testCases.original] + testCases.added_methods_and_contracts, + testCases.original ) const expected = [ new DeployedBytecodeChange('TestContract'), @@ -93,8 +92,8 @@ describe('#reportASTIncompatibilities()', () => { describe('when many changes are made', () => { it('reports proper changes', () => { const report = reportASTIncompatibilities( - [testCases.big_original], - [testCases.big_original_modified] + testCases.big_original, + testCases.big_original_modified ) const expected = [ 
new NewContractChange('NewContract'), diff --git a/packages/protocol/test/compatibility/ast-layout.ts b/packages/protocol/test-ts/compatibility/ast-layout.ts similarity index 77% rename from packages/protocol/test/compatibility/ast-layout.ts rename to packages/protocol/test-ts/compatibility/ast-layout.ts index 9f4f0c0a126..acfcbc0b87f 100644 --- a/packages/protocol/test/compatibility/ast-layout.ts +++ b/packages/protocol/test-ts/compatibility/ast-layout.ts @@ -1,5 +1,8 @@ -import { reportLayoutIncompatibilities } from '@celo/protocol/lib/compatibility/ast-layout' -import { getTestArtifacts } from '@celo/protocol/test/compatibility/common' +import { + reportLayoutIncompatibilities, + ASTStorageCompatibilityReport, +} from '@celo/protocol/lib/compatibility/ast-layout' +import { getTestArtifacts } from '@celo/protocol/test-ts/util/compatibility' import { assert } from 'chai' const testCases = { @@ -38,15 +41,18 @@ const testCases = { deprecated_prefixed_variable: getTestArtifacts('deprecated_prefixed_variable'), } -const assertCompatible = (report) => { +const assertCompatible = (report: ASTStorageCompatibilityReport[]) => { assert.isTrue(report.every((contractReport) => contractReport.compatible)) } -const assertNotCompatible = (report) => { +const assertNotCompatible = (report: ASTStorageCompatibilityReport[]) => { assert.isFalse(report.every((contractReport) => contractReport.compatible)) } -const selectReportFor = (report, contractName) => { +const selectReportFor = ( + report: ASTStorageCompatibilityReport[], + contractName: string +): ASTStorageCompatibilityReport => { return report.find((contractReport) => contractReport.contract === contractName) } @@ -56,7 +62,11 @@ const selectReportFor = (report, contractName) => { * @param expectedMatches The regular expressions that each successive error for * `contractName` should match. 
*/ -const assertContractErrorsMatch = (report, contractName: string, expectedMatches) => { +const assertContractErrorsMatch = ( + report: ASTStorageCompatibilityReport[], + contractName: string, + expectedMatches: RegExp[] +) => { const contractReport = selectReportFor(report, contractName) assert.equal(contractReport.errors.length, 1) @@ -68,31 +78,28 @@ const assertContractErrorsMatch = (report, contractName: string, expectedMatches describe('#reportLayoutIncompatibilities()', () => { describe('when the contracts are the same', () => { it('reports no incompatibilities', () => { - const report = reportLayoutIncompatibilities([testCases.original], [testCases.original]) + const report = reportLayoutIncompatibilities(testCases.original, testCases.original) assertCompatible(report) }) }) describe('when a constant is inserted in a contract', () => { it('reports no incompatibilities', () => { - const report = reportLayoutIncompatibilities( - [testCases.original], - [testCases.inserted_constant] - ) + const report = reportLayoutIncompatibilities(testCases.original, testCases.inserted_constant) assertCompatible(report) }) }) describe('when a variable is appended in a contract', () => { it('reports no incompatibilities', () => { - const report = reportLayoutIncompatibilities([testCases.original], [testCases.appended]) + const report = reportLayoutIncompatibilities(testCases.original, testCases.appended) assertCompatible(report) }) }) describe('when a variable is inserted in a contract', () => { it('reports an inserted variable', () => { - const report = reportLayoutIncompatibilities([testCases.original], [testCases.inserted]) + const report = reportLayoutIncompatibilities(testCases.original, testCases.inserted) assertNotCompatible(report) assertContractErrorsMatch(report, 'TestContract', [/inserted/]) }) @@ -100,10 +107,7 @@ describe('#reportLayoutIncompatibilities()', () => { describe('when a variable is appended in a parent contract', () => { it('reports an inserted 
variable', () => { - const report = reportLayoutIncompatibilities( - [testCases.original], - [testCases.appended_in_parent] - ) + const report = reportLayoutIncompatibilities(testCases.original, testCases.appended_in_parent) assertNotCompatible(report) assertContractErrorsMatch(report, 'TestContract', [/inserted/]) }) @@ -111,7 +115,7 @@ describe('#reportLayoutIncompatibilities()', () => { describe('when a variable is removed in a contract', () => { it('reports a removed variable', () => { - const report = reportLayoutIncompatibilities([testCases.original], [testCases.removed]) + const report = reportLayoutIncompatibilities(testCases.original, testCases.removed) assertNotCompatible(report) assertContractErrorsMatch(report, 'TestContract', [/removed/]) }) @@ -120,8 +124,8 @@ describe('#reportLayoutIncompatibilities()', () => { describe('when a variable is removed in a parent contract', () => { it('reports a removed variable', () => { const report = reportLayoutIncompatibilities( - [testCases.original], - [testCases.removed_from_parent] + testCases.original, + testCases.removed_from_parent ) assertNotCompatible(report) assertContractErrorsMatch(report, 'TestContract', [/removed/]) @@ -130,7 +134,7 @@ describe('#reportLayoutIncompatibilities()', () => { describe(`when a variable's type changes in a contract`, () => { it('reports a typechanged variable', () => { - const report = reportLayoutIncompatibilities([testCases.original], [testCases.typechange]) + const report = reportLayoutIncompatibilities(testCases.original, testCases.typechange) assertNotCompatible(report) assertContractErrorsMatch(report, 'TestContract', [/had type/]) }) @@ -139,8 +143,8 @@ describe('#reportLayoutIncompatibilities()', () => { describe(`when a variable's type changes in a parent contract`, () => { it('reports a typechanged variable', () => { const report = reportLayoutIncompatibilities( - [testCases.original], - [testCases.typechange_in_parent] + testCases.original, + 
testCases.typechange_in_parent ) assertNotCompatible(report) assertContractErrorsMatch(report, 'TestContract', [/had type/]) @@ -150,8 +154,8 @@ describe('#reportLayoutIncompatibilities()', () => { describe('when a field is added to a struct in mapping', () => { it('reports no incompatibilities', () => { const report = reportLayoutIncompatibilities( - [testCases.original_struct_in_mapping], - [testCases.inserted_in_struct_mapping] + testCases.original_struct_in_mapping, + testCases.inserted_in_struct_mapping ) assertCompatible(report) }) @@ -160,8 +164,8 @@ describe('#reportLayoutIncompatibilities()', () => { describe('when a field is added to a library struct in mapping', () => { it('reports no incompatibilities', () => { const report = reportLayoutIncompatibilities( - [testCases.original_struct_in_mapping], - [testCases.inserted_in_library_struct_mapping] + testCases.original_struct_in_mapping, + testCases.inserted_in_library_struct_mapping ) assertCompatible(report) }) @@ -170,8 +174,8 @@ describe('#reportLayoutIncompatibilities()', () => { describe('when a field is prefixed with deprecated to a library struct in mapping', () => { it('reports no incompatibilities', () => { const report = reportLayoutIncompatibilities( - [testCases.original_struct_in_mapping], - [testCases.deprecated_prefixed_in_library_struct_mapping] + testCases.original_struct_in_mapping, + testCases.deprecated_prefixed_in_library_struct_mapping ) assertCompatible(report) }) @@ -180,8 +184,8 @@ describe('#reportLayoutIncompatibilities()', () => { describe('when a field is prefixed with deprecated to struct variable', () => { it('reports no incompatibilities', () => { const report = reportLayoutIncompatibilities( - [testCases.original], - [testCases.deprecated_prefixed_in_struct] + testCases.original, + testCases.deprecated_prefixed_in_struct ) assertCompatible(report) }) @@ -190,8 +194,8 @@ describe('#reportLayoutIncompatibilities()', () => { describe('when a variable is prefixed with 
deprecated', () => { it('reports no incompatibilities', () => { const report = reportLayoutIncompatibilities( - [testCases.original], - [testCases.deprecated_prefixed_variable] + testCases.original, + testCases.deprecated_prefixed_variable ) assertCompatible(report) }) @@ -199,10 +203,7 @@ describe('#reportLayoutIncompatibilities()', () => { describe('when a field is added to a struct', () => { it('reports a struct change', () => { - const report = reportLayoutIncompatibilities( - [testCases.original], - [testCases.inserted_in_struct] - ) + const report = reportLayoutIncompatibilities(testCases.original, testCases.inserted_in_struct) assertNotCompatible(report) assertContractErrorsMatch(report, 'TestContract', [/struct.*changed/]) }) @@ -211,8 +212,8 @@ describe('#reportLayoutIncompatibilities()', () => { describe('when a field changes type in a struct', () => { it('reports a struct change', () => { const report = reportLayoutIncompatibilities( - [testCases.original], - [testCases.typechange_in_struct] + testCases.original, + testCases.typechange_in_struct ) assertNotCompatible(report) assertContractErrorsMatch(report, 'TestContract', [/struct.*changed/]) @@ -222,8 +223,8 @@ describe('#reportLayoutIncompatibilities()', () => { describe('when a field changes type in a library struct', () => { it('reports a struct change', () => { const report = reportLayoutIncompatibilities( - [testCases.original], - [testCases.typechange_in_library_struct] + testCases.original, + testCases.typechange_in_library_struct ) assertNotCompatible(report) assertContractErrorsMatch(report, 'TestContract', [/struct.*changed/]) @@ -233,8 +234,8 @@ describe('#reportLayoutIncompatibilities()', () => { describe('when a field is removed from a struct', () => { it('reports a struct change', () => { const report = reportLayoutIncompatibilities( - [testCases.original], - [testCases.removed_from_struct] + testCases.original, + testCases.removed_from_struct ) assertNotCompatible(report) 
assertContractErrorsMatch(report, 'TestContract', [/struct.*changed/]) @@ -244,8 +245,8 @@ describe('#reportLayoutIncompatibilities()', () => { describe('when a field is removed from a library struct', () => { it('reports a struct change', () => { const report = reportLayoutIncompatibilities( - [testCases.original], - [testCases.removed_from_library_struct] + testCases.original, + testCases.removed_from_library_struct ) assertNotCompatible(report) assertContractErrorsMatch(report, 'TestContract', [/struct.*changed/]) @@ -255,8 +256,8 @@ describe('#reportLayoutIncompatibilities()', () => { describe('when a field is inserted in a library struct', () => { it('reports a struct change', () => { const report = reportLayoutIncompatibilities( - [testCases.original], - [testCases.inserted_in_library_struct] + testCases.original, + testCases.inserted_in_library_struct ) assertNotCompatible(report) assertContractErrorsMatch(report, 'TestContract', [/struct.*changed/]) @@ -266,8 +267,8 @@ describe('#reportLayoutIncompatibilities()', () => { describe('when a fixed array has length increased', () => { it('reports a typechanged variable', () => { const report = reportLayoutIncompatibilities( - [testCases.original_complex], - [testCases.longer_fixed_array] + testCases.original_complex, + testCases.longer_fixed_array ) assertNotCompatible(report) assertContractErrorsMatch(report, 'TestContract', [/had type/]) @@ -277,8 +278,8 @@ describe('#reportLayoutIncompatibilities()', () => { describe('when a fixed array has length decreased', () => { it('reports a typechanged variable', () => { const report = reportLayoutIncompatibilities( - [testCases.original_complex], - [testCases.shorter_fixed_array] + testCases.original_complex, + testCases.shorter_fixed_array ) assertNotCompatible(report) assertContractErrorsMatch(report, 'TestContract', [/had type/]) @@ -288,8 +289,8 @@ describe('#reportLayoutIncompatibilities()', () => { describe('when a fixed array becomes dynamic', () => { 
it('reports a typechanged variable', () => { const report = reportLayoutIncompatibilities( - [testCases.original_complex], - [testCases.fixed_to_dynamic_array] + testCases.original_complex, + testCases.fixed_to_dynamic_array ) assertNotCompatible(report) assertContractErrorsMatch(report, 'TestContract', [/had type/]) @@ -299,8 +300,8 @@ describe('#reportLayoutIncompatibilities()', () => { describe('when a dynamic array becomes fixed', () => { it('reports a typechanged variable', () => { const report = reportLayoutIncompatibilities( - [testCases.original_complex], - [testCases.dynamic_to_fixed_array] + testCases.original_complex, + testCases.dynamic_to_fixed_array ) assertNotCompatible(report) assertContractErrorsMatch(report, 'TestContract', [/had type/]) @@ -314,8 +315,8 @@ describe('#reportLayoutIncompatibilities()', () => { describe.skip('when the source of a mapping changes', () => { it('reports a typechanged variable', () => { const report = reportLayoutIncompatibilities( - [testCases.original_complex], - [testCases.mapping_source_changed] + testCases.original_complex, + testCases.mapping_source_changed ) assertNotCompatible(report) assertContractErrorsMatch(report, 'TestContract', [/had type/]) @@ -325,8 +326,8 @@ describe('#reportLayoutIncompatibilities()', () => { describe.skip('when the source of a nested mapping changes', () => { it('reports a typechanged variable', () => { const report = reportLayoutIncompatibilities( - [testCases.original_complex], - [testCases.internal_mapping_source_changed] + testCases.original_complex, + testCases.internal_mapping_source_changed ) assertNotCompatible(report) assertContractErrorsMatch(report, 'TestContract', [/had type/]) @@ -336,8 +337,8 @@ describe('#reportLayoutIncompatibilities()', () => { describe('when the target of a mapping changes', () => { it('reports a typechanged variable', () => { const report = reportLayoutIncompatibilities( - [testCases.original_complex], - [testCases.mapping_target_changed] + 
testCases.original_complex, + testCases.mapping_target_changed ) assertNotCompatible(report) assertContractErrorsMatch(report, 'TestContract', [/had type/]) diff --git a/packages/protocol/test/compatibility/ast-version.ts b/packages/protocol/test-ts/compatibility/ast-version.ts similarity index 68% rename from packages/protocol/test/compatibility/ast-version.ts rename to packages/protocol/test-ts/compatibility/ast-version.ts index 98de2807288..e538300eeb7 100644 --- a/packages/protocol/test/compatibility/ast-version.ts +++ b/packages/protocol/test-ts/compatibility/ast-version.ts @@ -1,6 +1,7 @@ import { getContractVersion } from '@celo/protocol/lib/compatibility/ast-version' +import { getArtifactByName } from '@celo/protocol/lib/compatibility/internal' import { DEFAULT_VERSION_STRING } from '@celo/protocol/lib/compatibility/version' -import { getTestArtifacts } from '@celo/protocol/test/compatibility/common' +import { getTestArtifacts } from '@celo/protocol/test-ts/util/compatibility' import { assert } from 'chai' const testCases = { @@ -12,7 +13,8 @@ describe('#getContractVersion()', () => { describe('when the contract implements getVersionNumber()', () => { it('returns the correct version number', async () => { const version = await getContractVersion( - testCases.versioned.getArtifactByName('TestContract') + getArtifactByName('TestContract', testCases.versioned[0]), + false ) assert.equal(version.toString(), '1.2.3.4') }) @@ -20,7 +22,10 @@ describe('#getContractVersion()', () => { describe('when the contract does not implement getVersionNumber()', () => { it('returns the default version number', async () => { - const version = await getContractVersion(testCases.original.getArtifactByName('TestContract')) + const version = await getContractVersion( + getArtifactByName('TestContract', testCases.original[0]), + false + ) assert.equal(version.toString(), DEFAULT_VERSION_STRING) }) }) diff --git a/packages/protocol/test/compatibility/library-linking.ts 
b/packages/protocol/test-ts/compatibility/library-linking.ts similarity index 89% rename from packages/protocol/test/compatibility/library-linking.ts rename to packages/protocol/test-ts/compatibility/library-linking.ts index b3c446bac69..f78a3993dc2 100644 --- a/packages/protocol/test/compatibility/library-linking.ts +++ b/packages/protocol/test-ts/compatibility/library-linking.ts @@ -2,7 +2,7 @@ import { assert } from 'chai' import { reportASTIncompatibilities } from '@celo/protocol/lib/compatibility/ast-code' import { reportLibraryLinkingIncompatibilities } from '@celo/protocol/lib/compatibility/library-linking' -import { getTestArtifacts } from '@celo/protocol/test/compatibility/common' +import { getTestArtifacts } from '@celo/protocol/test-ts/util/compatibility' const testCases = { linked_libraries: getTestArtifacts('linked_libraries'), @@ -12,8 +12,8 @@ const testCases = { describe('reportLibraryLinkingIncompatibilities', () => { it('detects when a linked library has changed', () => { const codeReport = reportASTIncompatibilities( - [testCases.linked_libraries], - [testCases.linked_libraries_upgraded_lib] + testCases.linked_libraries, + testCases.linked_libraries_upgraded_lib ) const libraryLinksReport = reportLibraryLinkingIncompatibilities( { diff --git a/packages/protocol/test-ts/compatibility/verify-bytecode.ts b/packages/protocol/test-ts/compatibility/verify-bytecode.ts new file mode 100644 index 00000000000..a7bd59f1f6e --- /dev/null +++ b/packages/protocol/test-ts/compatibility/verify-bytecode.ts @@ -0,0 +1,571 @@ +import { assert } from 'chai' +import { BuildArtifacts } from '@openzeppelin/upgrades' +import { AbiFunction, Abi, encodeFunctionData, GetProofReturnType } from 'viem' +import { readJsonSync } from 'fs-extra' + +import { ProposalTx } from '@celo/protocol/scripts/truffle/make-release' +import { + ArtifactLibraryLinking, + LibraryLinks, + LibraryLinkingInfo, + getPlaceholderHash, + linkLibraries, +} from '@celo/protocol/lib/bytecode-foundry' 
+import { + getArtifactByName, + getBytecode, + getDeployedBytecode, + getSourceFile, +} from '@celo/protocol/lib/compatibility/internal' +import { + verifyBytecodes, + ChainLookup, + RegistryLookup, + ProxyLookup, +} from '@celo/protocol/lib/compatibility/verify-bytecode-foundry' +import { assertThrowsAsync } from '@celo/protocol/lib/test-utils' +import { startNetwork } from '@celo/protocol/test-ts/util/anvil' +import { getTestArtifacts } from '@celo/protocol/test-ts/util/compatibility' +import { deployViemContract } from '@celo/protocol/test-ts/util/viem' + +const registryAbi = readJsonSync(`./out/Registry.sol/Registry.json`).abi as Abi +const registryBytecode = readJsonSync(`./out/Registry.sol/Registry.json`).bytecode.object as string + +const proxyAbi = readJsonSync(`./out/Proxy.sol/Proxy.json`).abi as Abi +const proxyBytecode = readJsonSync(`./out/Proxy.sol/Proxy.json`).bytecode.object as string + +const deployContractWithLinking = async ( + contract: string, + artifacts: BuildArtifacts, + client, + links: LibraryLinks +) => { + const artifact = getArtifactByName(contract, artifacts) + const bytecode = getBytecode(artifact) + const linkedBytecode = linkLibraries(bytecode, links) + const address = await deployViemContract(artifact.abi, linkedBytecode, client) + + links[contract] = { + address: address.slice(2), + placeholderHash: getPlaceholderHash(`${getSourceFile(artifact)}:${contract}`), + } + + return address +} + +const deployProxiedContract = async ( + contract: string, + artifacts: BuildArtifacts, + client, + links: LibraryLinks +) => { + const proxyAddress = await deployViemContract(proxyAbi, proxyBytecode, client) + const implementationAddress = await deployContractWithLinking(contract, artifacts, client, links) + + await client.writeContract({ + address: proxyAddress, + abi: proxyAbi, + functionName: '_setImplementation', + args: [implementationAddress], + }) + + return proxyAddress +} + +const setPlaceholderHash = (links: LibraryLinks, library: 
string, artifacts: BuildArtifacts) => { + links[library].placeholderHash = getPlaceholderHash( + `${getSourceFile(getArtifactByName(library, artifacts))}:${library}` + ) +} + +const buildArtifacts = getTestArtifacts('linked_libraries')[0] +const upgradedLibBuildArtifacts = getTestArtifacts('linked_libraries_upgraded_lib')[0] +const upgradedContractBuildArtifacts = getTestArtifacts('linked_libraries_upgraded_contract')[0] + +describe('', () => { + const artifact = getArtifactByName('TestContract', buildArtifacts) + const placeholderHashes: { [library: string]: string } = {} + + before(() => { + const libraryNames = ['LinkedLibrary1', 'LinkedLibrary2', 'LinkedLibrary3'] + libraryNames.forEach((library: string) => { + const libArtifact = getArtifactByName(library, buildArtifacts) + const placeholderHash = getPlaceholderHash(`${getSourceFile(libArtifact)}:${library}`) + placeholderHashes[library] = placeholderHash + }) + }) + + describe('ArtifactLibraryLinking()', () => { + it('collects the right number of positions for each library', () => { + const linking = new ArtifactLibraryLinking(artifact) + assert.equal(linking.links['LinkedLibrary1'].positions.length, 2) + assert.equal(linking.links['LinkedLibrary2'].positions.length, 2) + }) + }) + + describe('LibraryLinkingInfo.collect()', () => { + describe('when libraries are linked correctly', () => { + it('collects the correct addresses', () => { + const linking = new ArtifactLibraryLinking(artifact) + const links: LibraryLinks = { + LinkedLibrary1: { + address: '0000000000000000000000000000000000000001', + placeholderHash: placeholderHashes['LinkedLibrary1'], + }, + LinkedLibrary2: { + address: '0000000000000000000000000000000000000002', + placeholderHash: placeholderHashes['LinkedLibrary2'], + }, + } + const linkedBytecode = linkLibraries(getDeployedBytecode(artifact), links) + const linkingInfo = new LibraryLinkingInfo() + linkingInfo.collect(linkedBytecode, linking) + + assert.equal( + 
linkingInfo.info['LinkedLibrary1'].address, + '0000000000000000000000000000000000000001' + ) + assert.equal( + linkingInfo.info['LinkedLibrary2'].address, + '0000000000000000000000000000000000000002' + ) + }) + }) + + describe('when libraries are not linked correctly', () => { + it('detects incorrect linking', () => { + const linking = new ArtifactLibraryLinking(artifact) + const links: LibraryLinks = { + LinkedLibrary1: { + address: '0000000000000000000000000000000000000001', + placeholderHash: placeholderHashes['LinkedLibrary1'], + }, + LinkedLibrary2: { + address: '0000000000000000000000000000000000000002', + placeholderHash: placeholderHashes['LinkedLibrary2'], + }, + } + const linkedBytecode = linkLibraries(getDeployedBytecode(artifact), links) + const incorrectBytecode = + linkedBytecode.slice(0, linking.links['LinkedLibrary1'].positions[0] - 1) + + '0000000000000000000000000000000000000003' + + linkedBytecode.slice( + linking.links['LinkedLibrary1'].positions[0] - 1 + 40, + linkedBytecode.length + ) + + const errors = new LibraryLinkingInfo().collect(incorrectBytecode, linking) + assert.isAbove(errors.length, 0) + assert.match(errors[0], /Mismatched addresses for LinkedLibrary1/) + }) + }) + }) + + describe('on a test contract deployment', () => { + let network + let registryLookup: RegistryLookup + let proxyLookup: ProxyLookup + let chainLookup: ChainLookup + const links: LibraryLinks = {} + + beforeEach(async () => { + network = await startNetwork() + + const registryAddress = await deployViemContract( + registryAbi, + registryBytecode, + network.client, + [true] + ) + + await deployContractWithLinking('LinkedLibrary1', buildArtifacts, network.client, links) + await deployContractWithLinking('LinkedLibrary3', buildArtifacts, network.client, links) + await deployContractWithLinking('LinkedLibrary2', buildArtifacts, network.client, links) + + const testContractAddress = await deployProxiedContract( + 'TestContract', + buildArtifacts, + network.client, + 
links + ) + + registryLookup = { + getAddressForString: async (name: string) => { + return (await network.client.readContract({ + address: registryAddress as `0x${string}`, + abi: registryAbi, + functionName: 'getAddressForString', + args: [name], + })) as string + }, + } + + proxyLookup = { + getImplementation: async (address: string): Promise<string> => { + return (await network.client.readContract({ + address, + abi: proxyAbi, + functionName: '_getImplementation', + args: [], + })) as string + }, + } + + chainLookup = { + getCode: async (address: `0x${string}`): Promise<string> => { + return (await network.client.getCode({ address })) as string + }, + encodeFunctionCall: (abi: AbiFunction, args: any[]) => { + return encodeFunctionData({ + abi: [abi], + functionName: abi.name, + args, + }) + }, + getProof: async ( + address: `0x${string}`, + slots: `0x${string}`[] + ): Promise<GetProofReturnType> => { + return (await network.client.getProof({ + address, + storageKeys: slots, + })) as GetProofReturnType + }, + } + + await network.client.writeContract({ + address: registryAddress as `0x${string}`, + abi: registryAbi, + functionName: 'setAddressFor', + args: ['TestContract', testContractAddress], + }) + }) + + afterEach(() => { + network.anvil.kill() + }) + + describe('verifyBytecodes', () => { + it(`doesn't throw on matching contracts`, async () => { + await verifyBytecodes( + ['TestContract'], + [buildArtifacts], + registryLookup, + [], + proxyLookup, + chainLookup + ) + }) + + it(`throws when a contract's bytecodes don't match`, async () => { + const oldBytecode = (artifact as any).deployedBytecode.object + ;(artifact as any).deployedBytecode.object = + '0x0' + oldBytecode.slice(3, oldBytecode.length) + await assertThrowsAsync( + verifyBytecodes( + ['TestContract'], + [buildArtifacts], + registryLookup, + [], + proxyLookup, + chainLookup + ) + ) + ;(artifact as any).deployedBytecode.object = oldBytecode + }) + + it(`throws when a library's bytecodes don't match`, async () => { + const
libraryArtifact = getArtifactByName('LinkedLibrary1', buildArtifacts) + const oldBytecode = (libraryArtifact as any).deployedBytecode.object + ;(libraryArtifact as any).deployedBytecode.object = + oldBytecode.slice(0, 44) + '00' + oldBytecode.slice(46, oldBytecode.length) + await assertThrowsAsync( + verifyBytecodes( + ['TestContract'], + [buildArtifacts], + registryLookup, + [], + proxyLookup, + chainLookup + ) + ) + ;(libraryArtifact as any).deployedBytecode.object = oldBytecode + }) + + describe(`when a proposal upgrades a library's implementation`, () => { + let testContractAddress + + beforeEach(async () => { + await deployContractWithLinking( + 'LinkedLibrary3', + upgradedLibBuildArtifacts, + network.client, + links + ) + + await deployContractWithLinking( + 'LinkedLibrary2', + upgradedLibBuildArtifacts, + network.client, + links + ) + + // The new linking placeholders are source path dependent. This doesn't matter in a real + // deployment where contract source paths remain consistent between releases, but our test + // cases are organized in separate directories, so this needs to be updated. 
+ setPlaceholderHash(links, 'LinkedLibrary1', upgradedLibBuildArtifacts) + + testContractAddress = await deployContractWithLinking( + 'TestContract', + upgradedLibBuildArtifacts, + network.client, + links + ) + }) + + it(`doesn't throw on matching contracts`, async () => { + const proposal = [ + { + contract: 'TestContractProxy', + function: '_setImplementation', + args: [testContractAddress], + value: '0', + }, + ] + + await verifyBytecodes( + ['TestContract'], + [upgradedLibBuildArtifacts], + registryLookup, + proposal, + proxyLookup, + chainLookup + ) + assert(true) + }) + + it(`throws on different contracts`, async () => { + const proposal = [ + { + contract: 'TestContractProxy', + function: '_setImplementation', + args: [testContractAddress], + value: '0', + }, + ] + + await assertThrowsAsync( + verifyBytecodes( + ['TestContract'], + [buildArtifacts], + registryLookup, + proposal, + proxyLookup, + chainLookup + ) + ) + }) + + it(`throws when the proposed address is wrong`, async () => { + const proposal = [ + { + contract: 'TestContractProxy', + function: '_setImplementation', + args: [network.accounts[1].address], + value: '0', + }, + ] + + await assertThrowsAsync( + verifyBytecodes( + ['TestContract'], + [upgradedLibBuildArtifacts], + registryLookup, + proposal, + proxyLookup, + chainLookup + ) + ) + }) + }) + + describe(`when a proposal upgrades a contract's implementation`, () => { + let testContractAddress + beforeEach(async () => { + // The new linking placeholders are source path dependent. This doesn't matter in a real + // deployment where contract source paths remain consistent between releases, but our test + // cases are organized in separate directories, so this needs to be updated. 
+ setPlaceholderHash(links, 'LinkedLibrary1', upgradedContractBuildArtifacts) + setPlaceholderHash(links, 'LinkedLibrary2', upgradedContractBuildArtifacts) + + testContractAddress = await deployContractWithLinking( + 'TestContract', + upgradedContractBuildArtifacts, + network.client, + links + ) + }) + + it(`doesn't throw on matching contracts`, async () => { + const proposal = [ + { + contract: 'TestContractProxy', + function: '_setImplementation', + args: [testContractAddress], + value: '0', + }, + ] + + await verifyBytecodes( + ['TestContract'], + [upgradedContractBuildArtifacts], + registryLookup, + proposal, + proxyLookup, + chainLookup + ) + assert(true) + }) + + it(`throws on different contracts`, async () => { + const proposal = [ + { + contract: 'TestContractProxy', + function: '_setImplementation', + args: [testContractAddress], + value: '0', + }, + ] + + await assertThrowsAsync( + verifyBytecodes( + ['TestContract'], + [buildArtifacts], + registryLookup, + proposal, + proxyLookup, + chainLookup + ) + ) + }) + + it(`throws when the proposed address is wrong`, async () => { + const proposal = [ + { + contract: 'TestContractProxy', + function: '_setImplementation', + args: [network.accounts[1].address], + value: '0', + }, + ] + + await assertThrowsAsync( + verifyBytecodes( + ['TestContract'], + [upgradedContractBuildArtifacts], + registryLookup, + proposal, + proxyLookup, + chainLookup + ) + ) + }) + + it(`throws when there is no proposal`, async () => { + const proposal: ProposalTx[] = [] + + await assertThrowsAsync( + verifyBytecodes( + ['TestContract'], + [upgradedContractBuildArtifacts], + registryLookup, + proposal, + proxyLookup, + chainLookup + ) + ) + }) + }) + + describe(`when a proposal changes a contract's proxy`, () => { + let testContractProxyAddress + beforeEach(async () => { + setPlaceholderHash(links, 'LinkedLibrary1', upgradedContractBuildArtifacts) + setPlaceholderHash(links, 'LinkedLibrary2', upgradedContractBuildArtifacts) + + 
testContractProxyAddress = await deployProxiedContract( + 'TestContract', + upgradedContractBuildArtifacts, + network.client, + links + ) + }) + + it(`doesn't throw on matching contracts`, async () => { + const proposal = [ + { + contract: 'Registry', + function: 'setAddressFor', + args: ['TestContract', testContractProxyAddress], + value: '0', + }, + ] + + await verifyBytecodes( + ['TestContract'], + [upgradedContractBuildArtifacts], + registryLookup, + proposal, + proxyLookup, + chainLookup + ) + assert(true) + }) + + it(`throws on different contracts`, async () => { + const proposal = [ + { + contract: 'Registry', + function: 'setAddressFor', + args: ['TestContract', testContractProxyAddress], + value: '0', + }, + ] + + await assertThrowsAsync( + verifyBytecodes( + ['TestContract'], + [buildArtifacts], + registryLookup, + proposal, + proxyLookup, + chainLookup + ) + ) + }) + + it(`throws when the proposed address is wrong`, async () => { + const proposal = [ + { + contract: 'Registry', + function: 'setAddressFor', + args: ['TestContract', network.accounts[0].address], + value: '0', + }, + ] + + await assertThrowsAsync( + verifyBytecodes( + ['TestContract'], + [upgradedContractBuildArtifacts], + registryLookup, + proposal, + proxyLookup, + chainLookup + ) + ) + }) + }) + }) + }) +}) diff --git a/packages/protocol/test/compatibility/version.ts b/packages/protocol/test-ts/compatibility/version.ts similarity index 100% rename from packages/protocol/test/compatibility/version.ts rename to packages/protocol/test-ts/compatibility/version.ts diff --git a/packages/protocol/test/resources/compatibility/contracts_added_methods_and_contracts/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_added_methods_and_contracts/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_added_methods_and_contracts/TestContract.sol rename to 
packages/protocol/test-ts/resources/compatibility/contracts_added_methods_and_contracts/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_added_methods_and_contracts/TestContractNew.sol b/packages/protocol/test-ts/resources/compatibility/contracts_added_methods_and_contracts/TestContractNew.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_added_methods_and_contracts/TestContractNew.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_added_methods_and_contracts/TestContractNew.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_added_methods_and_contracts/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_added_methods_and_contracts/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_added_methods_and_contracts/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_added_methods_and_contracts/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_added_methods_and_contracts/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_added_methods_and_contracts/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_added_methods_and_contracts/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_added_methods_and_contracts/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_appended/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_appended/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_appended/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_appended/TestContract.sol diff --git 
a/packages/protocol/test/resources/compatibility/contracts_appended/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_appended/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_appended/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_appended/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_appended/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_appended/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_appended/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_appended/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_appended_in_parent/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_appended_in_parent/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_appended_in_parent/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_appended_in_parent/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_appended_in_parent/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_appended_in_parent/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_appended_in_parent/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_appended_in_parent/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_appended_in_parent/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_appended_in_parent/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_appended_in_parent/TestParent.sol rename to 
packages/protocol/test-ts/resources/compatibility/contracts_appended_in_parent/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_big_original/ImplementationChangeContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_big_original/ImplementationChangeContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_big_original/ImplementationChangeContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_big_original/ImplementationChangeContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_big_original/MethodsAddedContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_big_original/MethodsAddedContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_big_original/MethodsAddedContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_big_original/MethodsAddedContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_big_original/MethodsModifiedContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_big_original/MethodsModifiedContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_big_original/MethodsModifiedContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_big_original/MethodsModifiedContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_big_original/MethodsRemovedContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_big_original/MethodsRemovedContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_big_original/MethodsRemovedContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_big_original/MethodsRemovedContract.sol diff --git 
a/packages/protocol/test/resources/compatibility/contracts_big_original/UnmodifiedContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_big_original/UnmodifiedContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_big_original/UnmodifiedContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_big_original/UnmodifiedContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_big_original_modified/ImplementationChangeContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_big_original_modified/ImplementationChangeContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_big_original_modified/ImplementationChangeContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_big_original_modified/ImplementationChangeContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_big_original_modified/MethodsAddedContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_big_original_modified/MethodsAddedContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_big_original_modified/MethodsAddedContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_big_original_modified/MethodsAddedContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_big_original_modified/MethodsModifiedContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_big_original_modified/MethodsModifiedContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_big_original_modified/MethodsModifiedContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_big_original_modified/MethodsModifiedContract.sol diff --git 
a/packages/protocol/test/resources/compatibility/contracts_big_original_modified/MethodsRemovedContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_big_original_modified/MethodsRemovedContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_big_original_modified/MethodsRemovedContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_big_original_modified/MethodsRemovedContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_big_original_modified/NewContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_big_original_modified/NewContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_big_original_modified/NewContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_big_original_modified/NewContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_big_original_modified/UnmodifiedContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_big_original_modified/UnmodifiedContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_big_original_modified/UnmodifiedContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_big_original_modified/UnmodifiedContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_in_library_struct_mapping/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_deprecated_prefixed_in_library_struct_mapping/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_in_library_struct_mapping/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_deprecated_prefixed_in_library_struct_mapping/TestContract.sol diff --git 
a/packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_in_library_struct_mapping/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_deprecated_prefixed_in_library_struct_mapping/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_in_library_struct_mapping/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_deprecated_prefixed_in_library_struct_mapping/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_in_library_struct_mapping/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_deprecated_prefixed_in_library_struct_mapping/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_in_library_struct_mapping/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_deprecated_prefixed_in_library_struct_mapping/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_in_struct/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_deprecated_prefixed_in_struct/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_in_struct/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_deprecated_prefixed_in_struct/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_in_struct/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_deprecated_prefixed_in_struct/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_in_struct/TestLibrary.sol rename to 
packages/protocol/test-ts/resources/compatibility/contracts_deprecated_prefixed_in_struct/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_in_struct/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_deprecated_prefixed_in_struct/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_in_struct/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_deprecated_prefixed_in_struct/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_variable/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_deprecated_prefixed_variable/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_variable/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_deprecated_prefixed_variable/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_variable/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_deprecated_prefixed_variable/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_variable/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_deprecated_prefixed_variable/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_variable/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_deprecated_prefixed_variable/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_variable/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_deprecated_prefixed_variable/TestParent.sol 
diff --git a/packages/protocol/test/resources/compatibility/contracts_dynamic_to_fixed_array/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_dynamic_to_fixed_array/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_dynamic_to_fixed_array/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_dynamic_to_fixed_array/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_dynamic_to_fixed_array/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_dynamic_to_fixed_array/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_dynamic_to_fixed_array/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_dynamic_to_fixed_array/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_dynamic_to_fixed_array/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_dynamic_to_fixed_array/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_dynamic_to_fixed_array/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_dynamic_to_fixed_array/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_fixed_to_dynamic_array/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_fixed_to_dynamic_array/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_fixed_to_dynamic_array/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_fixed_to_dynamic_array/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_fixed_to_dynamic_array/TestLibrary.sol 
b/packages/protocol/test-ts/resources/compatibility/contracts_fixed_to_dynamic_array/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_fixed_to_dynamic_array/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_fixed_to_dynamic_array/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_fixed_to_dynamic_array/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_fixed_to_dynamic_array/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_fixed_to_dynamic_array/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_fixed_to_dynamic_array/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_inserted/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_inserted/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_inserted/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_inserted/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_inserted/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_inserted/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_inserted/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_inserted/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_inserted/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_inserted/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_inserted/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_inserted/TestParent.sol diff --git 
a/packages/protocol/test/resources/compatibility/contracts_inserted_constant/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_inserted_constant/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_inserted_constant/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_inserted_constant/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_inserted_constant/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_inserted_constant/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_inserted_constant/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_inserted_constant/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_inserted_constant/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_inserted_constant/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_inserted_constant/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_inserted_constant/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_inserted_in_library_struct/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_library_struct/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_inserted_in_library_struct/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_library_struct/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_inserted_in_library_struct/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_library_struct/TestLibrary.sol similarity index 100% rename from 
packages/protocol/test/resources/compatibility/contracts_inserted_in_library_struct/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_library_struct/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_inserted_in_library_struct/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_library_struct/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_inserted_in_library_struct/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_library_struct/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_inserted_in_library_struct_mapping/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_library_struct_mapping/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_inserted_in_library_struct_mapping/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_library_struct_mapping/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_inserted_in_library_struct_mapping/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_library_struct_mapping/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_inserted_in_library_struct_mapping/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_library_struct_mapping/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_inserted_in_library_struct_mapping/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_library_struct_mapping/TestParent.sol similarity index 100% rename from 
packages/protocol/test/resources/compatibility/contracts_inserted_in_library_struct_mapping/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_library_struct_mapping/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_inserted_in_struct/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_struct/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_inserted_in_struct/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_struct/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_inserted_in_struct/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_struct/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_inserted_in_struct/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_struct/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_inserted_in_struct/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_struct/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_inserted_in_struct/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_struct/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_inserted_in_struct_mapping/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_struct_mapping/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_inserted_in_struct_mapping/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_struct_mapping/TestContract.sol diff 
--git a/packages/protocol/test/resources/compatibility/contracts_inserted_in_struct_mapping/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_struct_mapping/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_inserted_in_struct_mapping/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_struct_mapping/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_inserted_in_struct_mapping/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_struct_mapping/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_inserted_in_struct_mapping/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_inserted_in_struct_mapping/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_internal_mapping_source_changed/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_internal_mapping_source_changed/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_internal_mapping_source_changed/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_internal_mapping_source_changed/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_internal_mapping_source_changed/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_internal_mapping_source_changed/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_internal_mapping_source_changed/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_internal_mapping_source_changed/TestLibrary.sol diff --git 
a/packages/protocol/test/resources/compatibility/contracts_internal_mapping_source_changed/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_internal_mapping_source_changed/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_internal_mapping_source_changed/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_internal_mapping_source_changed/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_linked_libraries/LinkedLibrary1.sol b/packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries/LinkedLibrary1.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_linked_libraries/LinkedLibrary1.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries/LinkedLibrary1.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_linked_libraries/LinkedLibrary2.sol b/packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries/LinkedLibrary2.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_linked_libraries/LinkedLibrary2.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries/LinkedLibrary2.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_linked_libraries/LinkedLibrary3.sol b/packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries/LinkedLibrary3.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_linked_libraries/LinkedLibrary3.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries/LinkedLibrary3.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_linked_libraries/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries/TestContract.sol similarity index 
100% rename from packages/protocol/test/resources/compatibility/contracts_linked_libraries/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_contract/LinkedLibrary1.sol b/packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_contract/LinkedLibrary1.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_contract/LinkedLibrary1.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_contract/LinkedLibrary1.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_contract/LinkedLibrary2.sol b/packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_contract/LinkedLibrary2.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_contract/LinkedLibrary2.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_contract/LinkedLibrary2.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_contract/LinkedLibrary3.sol b/packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_contract/LinkedLibrary3.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_contract/LinkedLibrary3.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_contract/LinkedLibrary3.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_linked_libraries/Migrations.sol b/packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_contract/Migrations.sol similarity index 100% rename from 
packages/protocol/test/resources/compatibility/contracts_linked_libraries/Migrations.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_contract/Migrations.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_contract/Proxy.sol b/packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_contract/Proxy.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_contract/Proxy.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_contract/Proxy.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_contract/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_contract/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_contract/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_contract/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_contract/TestContractProxy.sol b/packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_contract/TestContractProxy.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_contract/TestContractProxy.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_contract/TestContractProxy.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_lib/LinkedLibrary1.sol b/packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_lib/LinkedLibrary1.sol similarity index 100% rename from 
packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_lib/LinkedLibrary1.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_lib/LinkedLibrary1.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_lib/LinkedLibrary2.sol b/packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_lib/LinkedLibrary2.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_lib/LinkedLibrary2.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_lib/LinkedLibrary2.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_lib/LinkedLibrary3.sol b/packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_lib/LinkedLibrary3.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_lib/LinkedLibrary3.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_lib/LinkedLibrary3.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_lib/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_lib/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_lib/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_linked_libraries_upgraded_lib/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_longer_fixed_array/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_longer_fixed_array/TestContract.sol similarity index 100% rename from 
packages/protocol/test/resources/compatibility/contracts_longer_fixed_array/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_longer_fixed_array/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_longer_fixed_array/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_longer_fixed_array/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_longer_fixed_array/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_longer_fixed_array/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_longer_fixed_array/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_longer_fixed_array/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_longer_fixed_array/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_longer_fixed_array/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_mapping_source_changed/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_mapping_source_changed/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_mapping_source_changed/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_mapping_source_changed/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_mapping_source_changed/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_mapping_source_changed/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_mapping_source_changed/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_mapping_source_changed/TestLibrary.sol diff --git 
a/packages/protocol/test/resources/compatibility/contracts_mapping_source_changed/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_mapping_source_changed/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_mapping_source_changed/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_mapping_source_changed/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_mapping_target_changed/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_mapping_target_changed/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_mapping_target_changed/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_mapping_target_changed/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_mapping_target_changed/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_mapping_target_changed/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_mapping_target_changed/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_mapping_target_changed/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_mapping_target_changed/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_mapping_target_changed/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_mapping_target_changed/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_mapping_target_changed/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_metadata_changed/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_metadata_changed/TestContract.sol similarity index 
100% rename from packages/protocol/test/resources/compatibility/contracts_metadata_changed/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_metadata_changed/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_metadata_changed/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_metadata_changed/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_metadata_changed/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_metadata_changed/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_metadata_changed/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_metadata_changed/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_metadata_changed/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_metadata_changed/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_original/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_original/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_original/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_original/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_original/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_original/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_original/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_original/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_original/TestParent.sol 
b/packages/protocol/test-ts/resources/compatibility/contracts_original/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_original/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_original/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_original_complex/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_original_complex/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_original_complex/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_original_complex/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_original_complex/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_original_complex/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_original_complex/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_original_complex/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_original_complex/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_original_complex/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_original_complex/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_original_complex/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_original_copy/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_original_copy/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_original_copy/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_original_copy/TestContract.sol diff --git 
a/packages/protocol/test/resources/compatibility/contracts_original_copy/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_original_copy/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_original_copy/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_original_copy/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_original_copy/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_original_copy/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_original_copy/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_original_copy/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_original_struct_in_mapping/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_original_struct_in_mapping/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_original_struct_in_mapping/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_original_struct_in_mapping/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_original_struct_in_mapping/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_original_struct_in_mapping/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_original_struct_in_mapping/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_original_struct_in_mapping/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_original_struct_in_mapping/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_original_struct_in_mapping/TestParent.sol similarity index 100% rename from 
packages/protocol/test/resources/compatibility/contracts_original_struct_in_mapping/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_original_struct_in_mapping/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_removed/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_removed/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_removed/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_removed/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_removed/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_removed/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_removed/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_removed/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_removed/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_removed/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_removed/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_removed/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_removed_from_library_struct/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_removed_from_library_struct/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_removed_from_library_struct/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_removed_from_library_struct/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_removed_from_library_struct/TestLibrary.sol 
b/packages/protocol/test-ts/resources/compatibility/contracts_removed_from_library_struct/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_removed_from_library_struct/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_removed_from_library_struct/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_removed_from_library_struct/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_removed_from_library_struct/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_removed_from_library_struct/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_removed_from_library_struct/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_removed_from_parent/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_removed_from_parent/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_removed_from_parent/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_removed_from_parent/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_removed_from_parent/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_removed_from_parent/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_removed_from_parent/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_removed_from_parent/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_removed_from_parent/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_removed_from_parent/TestParent.sol similarity index 100% rename from 
packages/protocol/test/resources/compatibility/contracts_removed_from_parent/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_removed_from_parent/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_removed_from_struct/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_removed_from_struct/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_removed_from_struct/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_removed_from_struct/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_removed_from_struct/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_removed_from_struct/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_removed_from_struct/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_removed_from_struct/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_removed_from_struct/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_removed_from_struct/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_removed_from_struct/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_removed_from_struct/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_shorter_fixed_array/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_shorter_fixed_array/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_shorter_fixed_array/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_shorter_fixed_array/TestContract.sol diff --git 
a/packages/protocol/test/resources/compatibility/contracts_shorter_fixed_array/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_shorter_fixed_array/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_shorter_fixed_array/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_shorter_fixed_array/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_shorter_fixed_array/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_shorter_fixed_array/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_shorter_fixed_array/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_shorter_fixed_array/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_typechange/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_typechange/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_typechange/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_typechange/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_typechange/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_typechange/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_typechange/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_typechange/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_typechange/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_typechange/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_typechange/TestParent.sol rename to 
packages/protocol/test-ts/resources/compatibility/contracts_typechange/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_typechange_in_library_struct/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_typechange_in_library_struct/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_typechange_in_library_struct/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_typechange_in_library_struct/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_typechange_in_library_struct/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_typechange_in_library_struct/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_typechange_in_library_struct/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_typechange_in_library_struct/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_typechange_in_library_struct/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_typechange_in_library_struct/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_typechange_in_library_struct/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_typechange_in_library_struct/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_typechange_in_parent/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_typechange_in_parent/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_typechange_in_parent/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_typechange_in_parent/TestContract.sol diff --git 
a/packages/protocol/test/resources/compatibility/contracts_typechange_in_parent/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_typechange_in_parent/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_typechange_in_parent/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_typechange_in_parent/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_typechange_in_parent/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_typechange_in_parent/TestParent.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_typechange_in_parent/TestParent.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_typechange_in_parent/TestParent.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_typechange_in_struct/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_typechange_in_struct/TestContract.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_typechange_in_struct/TestContract.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_typechange_in_struct/TestContract.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_typechange_in_struct/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_typechange_in_struct/TestLibrary.sol similarity index 100% rename from packages/protocol/test/resources/compatibility/contracts_typechange_in_struct/TestLibrary.sol rename to packages/protocol/test-ts/resources/compatibility/contracts_typechange_in_struct/TestLibrary.sol diff --git a/packages/protocol/test/resources/compatibility/contracts_typechange_in_struct/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_typechange_in_struct/TestParent.sol similarity index 100% rename from 
packages/protocol/test/resources/compatibility/contracts_typechange_in_struct/TestParent.sol
rename to packages/protocol/test-ts/resources/compatibility/contracts_typechange_in_struct/TestParent.sol
diff --git a/packages/protocol/test/resources/compatibility/contracts_versioned/TestContract.sol b/packages/protocol/test-ts/resources/compatibility/contracts_versioned/TestContract.sol
similarity index 100%
rename from packages/protocol/test/resources/compatibility/contracts_versioned/TestContract.sol
rename to packages/protocol/test-ts/resources/compatibility/contracts_versioned/TestContract.sol
diff --git a/packages/protocol/test/resources/compatibility/contracts_versioned/TestLibrary.sol b/packages/protocol/test-ts/resources/compatibility/contracts_versioned/TestLibrary.sol
similarity index 100%
rename from packages/protocol/test/resources/compatibility/contracts_versioned/TestLibrary.sol
rename to packages/protocol/test-ts/resources/compatibility/contracts_versioned/TestLibrary.sol
diff --git a/packages/protocol/test/resources/compatibility/contracts_versioned/TestParent.sol b/packages/protocol/test-ts/resources/compatibility/contracts_versioned/TestParent.sol
similarity index 100%
rename from packages/protocol/test/resources/compatibility/contracts_versioned/TestParent.sol
rename to packages/protocol/test-ts/resources/compatibility/contracts_versioned/TestParent.sol
diff --git a/packages/protocol/test-ts/resources/compatibility/foundry.toml b/packages/protocol/test-ts/resources/compatibility/foundry.toml
new file mode 100644
index 00000000000..3fa1dd86e80
--- /dev/null
+++ b/packages/protocol/test-ts/resources/compatibility/foundry.toml
@@ -0,0 +1,3 @@
+[profile.default]
+optimizer = true
+libs = ['../../../lib', '../../../node_modules']
diff --git a/packages/protocol/test-ts/util/anvil.ts b/packages/protocol/test-ts/util/anvil.ts
new file mode 100644
index 00000000000..af482698e6b
--- /dev/null
+++ b/packages/protocol/test-ts/util/anvil.ts
@@ -0,0 +1,57 @@
+import { spawn, ChildProcess } from 'node:child_process'
+import { createTestClient, http, publicActions, walletActions } from 'viem'
+import { foundry } from 'viem/chains'
+import { privateKeyToAccount } from 'viem/accounts'
+
+const defaultPrivateKeys = [
+  '0xac0974bec39a17e36ba4a6b4d238ff944bacb478cbed5efcae784d7bf4f2ff80',
+  '0x59c6995e998f97a5a0044966f0945389dc9e86dae88c7a8412f4603b6b78690d',
+  '0x5de4111afa1a4b94908f83103eb1f1706367c2e68ca870fc3fb9a804cdab365a',
+  '0x7c852118294e51e653712a81e05800f419141751be58f605c371e15141b007a6',
+  '0x47e179ec197488593b187f80a00eb0da91f1b9d0b13f8733639f19c30a34926a',
+  '0x8b3a350cf5c34c9194ca85829a2df0ec3153be0318b5e2d3348e872092edffba',
+  '0x92db14e403b83dfe3df233f83dfa3a0d7096f21ca9b0d6d6b8d88b2b4ec1564e',
+  '0x4bbbf85ce3377467afe5d46f804f221813b2bb87f24d81f60f1fcdbf7cbf4356',
+  '0xdbda1821b80551c9d65939329250298aa3472ba22feea921c0cf5d620ea67b97',
+  '0x2a871d0798f97d79848a013d4936a73bf4cc922c825d33c1cf7073dff6d409c6',
+]
+
+const containsListeningMessage = (output: string) => {
+  return /Listening on /.test(output)
+}
+
+const waitForListening = (proc: ChildProcess) => {
+  return new Promise((resolve) => {
+    const resolveIfListening = (output: string) => {
+      if (containsListeningMessage(output.toString())) {
+        resolve(null)
+        proc.stdout.off('data', resolveIfListening)
+      }
+    }
+
+    proc.stdout.on('data', resolveIfListening)
+  })
+}
+
+// Spawns a new process running the Anvil devnet.
+// Returns an object with
+// - `client`: a viem client with test, public, and wallet actions
+// - `anvil`: a NodeJS `ChildProcess` object to the Anvil process
+// - `accounts`: the 10 default pre-funded Anvil accounts.
+export const startNetwork = async () => {
+  const anvil = spawn('anvil')
+  await waitForListening(anvil)
+
+  const accounts = defaultPrivateKeys.map((key: `0x${string}`) => privateKeyToAccount(key))
+
+  const client = createTestClient({
+    account: accounts[0],
+    chain: foundry,
+    mode: 'anvil',
+    transport: http(),
+  })
+    .extend(publicActions)
+    .extend(walletActions)
+
+  return { client, anvil, accounts }
+}
diff --git a/packages/protocol/test/compatibility/common.ts b/packages/protocol/test-ts/util/compatibility.ts
similarity index 59%
rename from packages/protocol/test/compatibility/common.ts
rename to packages/protocol/test-ts/util/compatibility.ts
index 663d6ec246e..f9452dadf9f 100644
--- a/packages/protocol/test/compatibility/common.ts
+++ b/packages/protocol/test-ts/util/compatibility.ts
@@ -1,19 +1,36 @@
-import { getBuildArtifacts } from '@openzeppelin/upgrades'
+import { instantiateArtifactsFromForge } from '@celo/protocol/lib/compatibility/utils'
 import { execSync } from 'child_process'
 import fs from 'fs'
 import path from 'path'
 
-// Measured in millis
 const recompileThresholdTime = 1000 * 60 * 60 * 24 // One day
 
 const ROOT_DIR = path.normalize(path.join(__dirname, '../../'))
 
+export function getTestArtifacts(caseName: string) {
+  const base = 'test-ts/resources/compatibility'
+  const srcDirectory = `${base}/contracts_${caseName}`
+  // Forge takes a path relative to the config
+  const forgeBuildDirectory = `build/out_${caseName}`
+  const buildDirectory = `${base}/${forgeBuildDirectory}`
+  const configPath = `${base}/foundry.toml`
+
+  if (needsCompiling(srcDirectory, buildDirectory)) {
+    exec(`rm -rf ${buildDirectory}`)
+
+    exec(
+      `forge build --config-path "${configPath}" --out "${forgeBuildDirectory}" --ast ${srcDirectory}`
+    )
+  }
+  return instantiateArtifactsFromForge(`${buildDirectory}`)
+}
+
 function exec(cmd: string) {
   return execSync(cmd, { cwd: ROOT_DIR, stdio: 'inherit' })
 }
 
 // Return the latest modification time of a file in the given directory
-function getLatestUpdateTime(dir) {
+function getLatestUpdateTime(dir: string) {
   if (!fs.existsSync(dir)) {
     return 0
   }
@@ -41,26 +58,3 @@ function needsCompiling(src: string, build: string): boolean {
   }
   return false
 }
-
-export function getTestArtifacts(caseName: string) {
-  const back = './test/resources/compatibility'
-  const srcDirectory = `${back}/contracts_${caseName}`
-  const buildDirectory = `${back}/build/b_${caseName}`
-
-  if (needsCompiling(srcDirectory, buildDirectory)) {
-    // We force all contracts compiled from the same source folder
-    // To minimize differences in the artifact files
-    const tmpSrcDirectory = `${back}/build/src`
-    // Clean folders
-    exec(`rm -rf ${tmpSrcDirectory}`)
-    exec(`rm -rf ${buildDirectory}`)
-    exec(`mkdir -p ./${tmpSrcDirectory}`)
-    // Copy the contracts source code to the src folder
-    exec(`cp -r ./${srcDirectory}/* ./${tmpSrcDirectory}`)
-
-    exec(
-      `yarn run --silent truffle compile --all --contracts_directory=${tmpSrcDirectory} --contracts_build_directory=${buildDirectory}`
-    )
-  }
-  return getBuildArtifacts(`${buildDirectory}`)
-}
diff --git a/packages/protocol/test-ts/util/viem.ts b/packages/protocol/test-ts/util/viem.ts
new file mode 100644
index 00000000000..5fc0de72116
--- /dev/null
+++ b/packages/protocol/test-ts/util/viem.ts
@@ -0,0 +1,21 @@
+import { Abi, getContractAddress, Transaction } from 'viem'
+
+export const deployViemContract = async (
+  abi: Abi,
+  bytecode: string,
+  client,
+  args = []
+): Promise<`0x${string}`> => {
+  const hash = await client.deployContract({
+    abi,
+    bytecode,
+    args,
+  })
+  const tx: Transaction = await client.getTransaction({ hash })
+  const address = getContractAddress({
+    from: tx.from,
+    nonce: BigInt(tx.nonce),
+  })
+
+  return address
+}
diff --git a/packages/protocol/test/common/integration.ts b/packages/protocol/test/common/integration.ts
deleted file mode 100644
index 0e8b775a3f6..00000000000
--- a/packages/protocol/test/common/integration.ts
+++ /dev/null
@@ -1,748 +0,0 @@
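The startup detection in the new `anvil.ts` (blocking until the child process prints its `Listening on` banner) can be exercised standalone. In this sketch the helper names mirror the source, but the demo wiring is illustrative: an `echo` process stands in for a real `anvil` binary.

```typescript
import { spawn, type ChildProcess } from 'node:child_process'

// Matches Anvil's startup banner, e.g. "Listening on 127.0.0.1:8545".
const containsListeningMessage = (output: string): boolean => /Listening on /.test(output)

// Resolve once the child's stdout emits the listening banner, then detach.
// Works for any process that prints such a line, not only Anvil.
const waitForListening = (proc: ChildProcess): Promise<void> =>
  new Promise((resolve) => {
    const onData = (chunk: Buffer) => {
      if (containsListeningMessage(chunk.toString())) {
        proc.stdout?.off('data', onData)
        resolve()
      }
    }
    proc.stdout?.on('data', onData)
  })

// Demo with a stand-in process instead of a real `anvil` binary.
async function main() {
  const fake = spawn('echo', ['Listening on 127.0.0.1:8545'])
  await waitForListening(fake)
  console.log('ready')
}
main()
```

Attaching the listener synchronously, before any I/O callbacks run, avoids a race where a fast-starting process prints the banner first.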
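The `needsCompiling` check that `getTestArtifacts` relies on compares directory modification times, but its body falls outside the hunks shown above. The following is a minimal reconstruction of the idea only, not the file's exact code; the `maxAgeMs` parameter stands in for the module-level `recompileThresholdTime`.

```typescript
import fs from 'node:fs'
import path from 'node:path'

// Latest mtime (ms) of any file directly inside `dir`; 0 if the dir is missing.
function getLatestUpdateTime(dir: string): number {
  if (!fs.existsSync(dir)) {
    return 0
  }
  let latest = 0
  for (const entry of fs.readdirSync(dir)) {
    const t = fs.statSync(path.join(dir, entry)).mtimeMs
    if (t > latest) latest = t
  }
  return latest
}

// Rebuild when sources are newer than the build output, or the build is stale.
function needsCompiling(src: string, build: string, maxAgeMs: number): boolean {
  const srcTime = getLatestUpdateTime(src)
  const buildTime = getLatestUpdateTime(build)
  return buildTime < srcTime || Date.now() - buildTime > maxAgeMs
}
```

A missing build directory yields `buildTime === 0`, so the staleness test always triggers a compile on first run.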
-import { ensureLeading0x, NULL_ADDRESS } from '@celo/base/lib/address' -import { constitution } from '@celo/protocol/governanceConstitution' -import { - addressMinedLatestBlock, - assertEqualBN, - assertTransactionRevertWithReason, - assumeOwnershipWithTruffle, - stripHexEncoding, - timeTravel, -} from '@celo/protocol/lib/test-utils' -import { - getDeployedProxiedContract, - getFunctionSelectorsForContract, - makeTruffleContractForMigration, -} from '@celo/protocol/lib/web3-utils' -import { config } from '@celo/protocol/migrationsConfig' -import { linkedListChanges, zip } from '@celo/utils/lib/collections' -import { fixed1, toFixed } from '@celo/utils/lib/fixidity' -import BigNumber from 'bignumber.js' -import { - ElectionInstance, - FeeCurrencyWhitelistInstance, - FreezerInstance, - GoldTokenInstance, - GovernanceApproverMultiSigInstance, - GovernanceInstance, - GovernanceSlasherInstance, - LockedGoldInstance, - RegistryInstance, -} from 'types' -import { - ExchangeContract, - ExchangeInstance, - ReserveInstance, - ReserveSpenderMultiSigInstance, - SortedOraclesInstance, - StableTokenContract, - StableTokenInstance, -} from 'types/mento' -import { MENTO_PACKAGE } from '../../contractPackages' -import { ArtifactsSingleton } from '../../lib/artifactsSingleton' -import { SECONDS_IN_A_WEEK } from '../constants' - -enum VoteValue { - None = 0, - Abstain, - No, - Yes, -} - -async function getGroups(election: ElectionInstance) { - const response = await election.getTotalVotesForEligibleValidatorGroups() - console.info('response', response) - const lst1 = response[0] - const lst2 = response[1] - return zip( - (address, value) => { - return { address, value } - }, - lst1, - lst2 - ) -} - -// Returns how much voting gold will be decremented from the groups voted by an account -async function slashingOfGroups( - account: string, - penalty: BigNumber, - lockedGold: LockedGoldInstance, - election: ElectionInstance -) { - // first check how much voting gold has to be slashed - 
const nonVoting = await lockedGold.getAccountNonvotingLockedGold(account) - if (penalty.isLessThan(nonVoting)) { - return [] - } - let difference = penalty.minus(nonVoting) - // find voted groups - const groups = await election.getGroupsVotedForByAccount(account) - const res = [] - // - for (let i = groups.length - 1; i >= 0; i--) { - const group = groups[i] - const totalVotes = await election.getTotalVotesForGroup(group) - const votes = await election.getTotalVotesForGroupByAccount(group, account) - const slashedVotes = votes.lt(difference) ? votes : difference - res.push({ address: group, value: totalVotes.minus(slashedVotes), index: i }) - difference = difference.minus(slashedVotes) - if (difference.eq(new BigNumber(0))) { - break - } - } - return res -} - -async function findLessersAndGreaters( - account: string, - penalty: BigNumber, - lockedGold: LockedGoldInstance, - election: ElectionInstance -) { - const groups = await getGroups(election) - const changed = await slashingOfGroups(account, penalty, lockedGold, election) - const changes = linkedListChanges(groups, changed) - return { ...changes, indices: changed.map((a) => a.index) } -} - -contract('Integration: Running elections', (_accounts: string[]) => { - let election: ElectionInstance - - before(async () => { - election = await getDeployedProxiedContract('Election', artifacts) - }) - - describe('When getting the elected validators', () => { - it('should elect all 30 validators', async () => { - const elected = await election.electValidatorSigners() - assert.equal(elected.length, 30) - }) - it('should elect specified number validators with electNValidatorSigners', async () => { - const elected = await election.electNValidatorSigners(1, 20) - assert.equal(elected.length, 20) - }) - }) -}) - -contract('Integration: Governance slashing', (accounts: string[]) => { - const proposalId = 1 - const dequeuedIndex = 0 - let lockedGold: LockedGoldInstance - let election: ElectionInstance - let multiSig: 
GovernanceApproverMultiSigInstance - let governance: GovernanceInstance - let governanceSlasher: GovernanceSlasherInstance - let proposalTransactions: any - let value: BigNumber - let valueOfSlashed: BigNumber - const penalty = new BigNumber('100') - const slashedAccount = accounts[9] - - before(async () => { - lockedGold = await getDeployedProxiedContract('LockedGold', artifacts) - election = await getDeployedProxiedContract('Election', artifacts) - // @ts-ignore - await lockedGold.lock({ value: '10000000000000000000000000' }) - - multiSig = await getDeployedProxiedContract('GovernanceApproverMultiSig', artifacts) - governance = await getDeployedProxiedContract('Governance', artifacts) - governanceSlasher = await getDeployedProxiedContract('GovernanceSlasher', artifacts) - value = await lockedGold.getAccountTotalLockedGold(accounts[0]) - - proposalTransactions = [ - { - value: 0, - destination: governanceSlasher.address, - data: Buffer.from( - stripHexEncoding( - // @ts-ignore - governanceSlasher.contract.methods.approveSlashing(slashedAccount, 100).encodeABI() - ), - 'hex' - ), - }, - ] - }) - - describe('When making a governance proposal', () => { - before(async () => { - await governance.propose( - proposalTransactions.map((x: any) => x.value), - proposalTransactions.map((x: any) => x.destination), - // @ts-ignore - Buffer.concat(proposalTransactions.map((x: any) => x.data)), - proposalTransactions.map((x: any) => x.data.length), - 'URL', - // @ts-ignore: TODO(mcortesi) fix typings for TransactionDetails - { value: web3.utils.toWei(config.governance.minDeposit.toString(), 'ether') } - ) - }) - - it('should increment the proposal count', async () => { - assert.equal((await governance.proposalCount()).toNumber(), proposalId) - }) - }) - - describe('When upvoting that proposal', () => { - before(async () => { - await governance.upvote(proposalId, 0, 0) - }) - - it('should increase the number of upvotes for the proposal', async () => { - assertEqualBN(await 
governance.getUpvotes(proposalId), value) - }) - }) - - describe('When approving that proposal', () => { - before(async () => { - await timeTravel(config.governance.dequeueFrequency, web3) - // @ts-ignore - const txData = governance.contract.methods.approve(proposalId, dequeuedIndex).encodeABI() - await multiSig.submitTransaction(governance.address, 0, txData, { - from: accounts[0], - }) - }) - - it('should set the proposal to approved', async () => { - assert.isTrue(await governance.isApproved(proposalId)) - }) - }) - - describe('When voting on that proposal', () => { - before(async () => { - await timeTravel(config.governance.approvalStageDuration, web3) - await governance.vote(proposalId, dequeuedIndex, VoteValue.Yes) - }) - - it('should increment the vote totals', async () => { - const response = await governance.getVoteTotals(proposalId) - assertEqualBN(response[0], value) - }) - }) - - describe('When executing that proposal', () => { - before(async () => { - await timeTravel(config.governance.referendumStageDuration, web3) - await governance.execute(proposalId, dequeuedIndex) - }) - - it('should execute the proposal', async () => { - assertEqualBN(await governanceSlasher.getApprovedSlashing(slashedAccount), penalty) - }) - }) - - describe('When performing slashing', () => { - before(async () => { - await timeTravel(config.governance.referendumStageDuration, web3) - valueOfSlashed = await lockedGold.getAccountTotalLockedGold(slashedAccount) - const { lessers, greaters, indices } = await findLessersAndGreaters( - slashedAccount, - penalty, - lockedGold, - election - ) - await governanceSlasher.slash(slashedAccount, lessers, greaters, indices) - }) - - it('should set approved slashing to zero', async () => { - assert.equal((await governanceSlasher.getApprovedSlashing(slashedAccount)).toNumber(), 0) - }) - - it('should slash the account', async () => { - assertEqualBN( - await lockedGold.getAccountTotalLockedGold(slashedAccount), - valueOfSlashed.minus(penalty) - 
) - }) - }) -}) - -contract('Integration: Governance', (accounts: string[]) => { - const proposalId = 1 - const dequeuedIndex = 0 - let lockedGold: LockedGoldInstance - let multiSig: GovernanceApproverMultiSigInstance - let governance: GovernanceInstance - let registry: RegistryInstance - let proposalTransactions: any - let value: BigNumber - - before(async () => { - lockedGold = await getDeployedProxiedContract('LockedGold', artifacts) - // @ts-ignore - await lockedGold.lock({ value: '10000000000000000000000000' }) - value = await lockedGold.getAccountTotalLockedGold(accounts[0]) - multiSig = await getDeployedProxiedContract('GovernanceApproverMultiSig', artifacts) - governance = await getDeployedProxiedContract('Governance', artifacts) - registry = await getDeployedProxiedContract('Registry', artifacts) - proposalTransactions = [ - { - value: 0, - destination: registry.address, - data: Buffer.from( - stripHexEncoding( - // @ts-ignore - registry.contract.methods.setAddressFor('test1', accounts[1]).encodeABI() - ), - 'hex' - ), - }, - { - value: 0, - destination: registry.address, - data: Buffer.from( - stripHexEncoding( - // @ts-ignore - registry.contract.methods.setAddressFor('test2', accounts[2]).encodeABI() - ), - 'hex' - ), - }, - ] - }) - - describe('Checking governance thresholds', () => { - for (const contractName of Object.keys(constitution).filter((k) => k !== 'proxy')) { - it('should have correct thresholds for ' + contractName, async () => { - const artifactsInstance = ArtifactsSingleton.getInstance( - constitution[contractName].__contractPackage, - artifacts - ) - - const contract = await getDeployedProxiedContract( - contractName, - artifactsInstance - ) - - const selectors = getFunctionSelectorsForContract(contract, contractName, artifactsInstance) - - selectors.default = ['0x00000000'] - - const thresholds = { ...constitution.proxy, ...constitution[contractName] } - await Promise.all( - Object.keys(thresholds) - .filter((k) => k !== 
'__contractPackage') - .map((func) => - Promise.all( - selectors[func].map(async (selector) => { - assertEqualBN( - await governance.getConstitution(contract.address, selector), - toFixed(thresholds[func]), - 'Threshold set incorrectly for function ' + func - ) - }) - ) - ) - ) - }) - } - }) - - describe('When making a governance proposal', () => { - before(async () => { - await governance.propose( - proposalTransactions.map((x: any) => x.value), - proposalTransactions.map((x: any) => x.destination), - // @ts-ignore - Buffer.concat(proposalTransactions.map((x: any) => x.data)), - proposalTransactions.map((x: any) => x.data.length), - 'URL', - // @ts-ignore: TODO(mcortesi) fix typings for TransactionDetails - { value: web3.utils.toWei(config.governance.minDeposit.toString(), 'ether') } - ) - }) - - it('should increment the proposal count', async () => { - assert.equal((await governance.proposalCount()).toNumber(), proposalId) - }) - }) - - describe('When upvoting that proposal', () => { - before(async () => { - await governance.upvote(proposalId, 0, 0) - }) - - it('should increase the number of upvotes for the proposal', async () => { - assertEqualBN(await governance.getUpvotes(proposalId), value) - }) - }) - - describe('When approving that proposal', () => { - before(async () => { - await timeTravel(config.governance.dequeueFrequency, web3) - // @ts-ignore - const txData = governance.contract.methods.approve(proposalId, dequeuedIndex).encodeABI() - await multiSig.submitTransaction(governance.address, 0, txData, { - from: accounts[0], - }) - }) - - it('should set the proposal to approved', async () => { - assert.isTrue(await governance.isApproved(proposalId)) - }) - }) - - describe('When voting on that proposal', () => { - before(async () => { - await timeTravel(config.governance.approvalStageDuration, web3) - await governance.vote(proposalId, dequeuedIndex, VoteValue.Yes) - }) - - it('should increment the vote totals', async () => { - const response = await 
governance.getVoteTotals(proposalId) - assertEqualBN(response[0], value) - }) - }) - - describe('When executing that proposal', () => { - before(async () => { - await timeTravel(config.governance.referendumStageDuration, web3) - await governance.execute(proposalId, dequeuedIndex) - }) - - it('should execute the proposal', async () => { - assert.equal(await registry.getAddressForOrDie(web3.utils.soliditySha3('test1')), accounts[1]) - assert.equal(await registry.getAddressForOrDie(web3.utils.soliditySha3('test2')), accounts[2]) - }) - }) -}) - -Array.from([ - ['Exchange', 'StableToken'], // USD - ['ExchangeEUR', 'StableTokenEUR'], // EUR - ['ExchangeBRL', 'StableTokenBRL'], // BRL (cREAL) -]).forEach(([exchangeId, stableTokenId]) => - contract(`Integration: ${exchangeId} ${stableTokenId}`, (accounts: string[]) => { - const transferAmount = 10 - let exchange: ExchangeInstance - let multiSig: ReserveSpenderMultiSigInstance - let reserve: ReserveInstance - let goldToken: GoldTokenInstance - let stableToken: StableTokenInstance - let originalStable - let originalGold - let originalReserve - let finalStable: BigNumber - let finalGold: BigNumber - let finalReserve: BigNumber - - const decimals = 18 - - before(async () => { - exchange = await getDeployedProxiedContract( - exchangeId, - ArtifactsSingleton.getInstance(MENTO_PACKAGE) - ) - stableToken = await getDeployedProxiedContract( - stableTokenId, - ArtifactsSingleton.getInstance(MENTO_PACKAGE) - ) - multiSig = await getDeployedProxiedContract( - 'ReserveSpenderMultiSig', - ArtifactsSingleton.getInstance(MENTO_PACKAGE) - ) - reserve = await getDeployedProxiedContract( - 'Reserve', - ArtifactsSingleton.getInstance(MENTO_PACKAGE) - ) - goldToken = await getDeployedProxiedContract('GoldToken', artifacts) - }) - - describe('Selling', () => { - const sellAmount = new BigNumber('1000000000000000000000') - const minBuyAmount = 1 - - describe('When selling gold', () => { - before(async () => { - originalStable = await 
stableToken.balanceOf(accounts[0]) - originalGold = await goldToken.balanceOf(accounts[0]) - originalReserve = await goldToken.balanceOf(reserve.address) - await goldToken.approve(exchange.address, sellAmount) - await exchange.sell(sellAmount, minBuyAmount, true) - finalStable = await stableToken.balanceOf(accounts[0]) - finalGold = await goldToken.balanceOf(accounts[0]) - finalReserve = await goldToken.balanceOf(reserve.address) - }) - - it(`should increase user's stable`, async () => { - assert.isTrue(finalStable.gt(originalStable)) - }) - - it(`should reduce user's gold`, async () => { - if (await addressMinedLatestBlock(accounts[0])) { - const blockReward = new BigNumber(2).times(new BigNumber(10).pow(decimals)) - assert.isTrue(finalGold.lt(originalGold.plus(blockReward))) - } else { - assert.isTrue(finalGold.lt(originalGold)) - } - }) - - it(`should increase Reserve's gold`, async () => { - assert.isTrue(finalReserve.gt(originalReserve)) - }) - }) - - // Note that this test relies on having purchased stable token in the previous test. - describe('When selling stable token', () => { - before(async () => { - originalStable = await stableToken.balanceOf(accounts[0]) - originalGold = await goldToken.balanceOf(accounts[0]) - originalReserve = await goldToken.balanceOf(reserve.address) - await stableToken.approve(exchange.address, sellAmount) - // Cannot sell more than was purchased in the previous test. 
- await exchange.sell(sellAmount.div(20), minBuyAmount, false) - finalStable = await stableToken.balanceOf(accounts[0]) - finalGold = await goldToken.balanceOf(accounts[0]) - finalReserve = await goldToken.balanceOf(reserve.address) - }) - - it(`should reduce user's stable`, async () => { - assert.isTrue(finalStable.lt(originalStable)) - }) - - it(`should increase user's gold`, async () => { - assert.isTrue(finalGold.gt(originalGold)) - }) - - it(`should reduce Reserve's gold`, async () => { - assert.isTrue(finalReserve.lt(originalReserve)) - }) - }) - }) - - describe('Buying', () => { - const buyAmount = new BigNumber(10000000000000000000) - const maxSellAmount = new BigNumber('10000000000000000000000') - - describe('When buying stable token', () => { - before(async () => { - originalStable = await stableToken.balanceOf(accounts[0]) - originalGold = await goldToken.balanceOf(accounts[0]) - originalReserve = await goldToken.balanceOf(reserve.address) - await goldToken.approve(exchange.address, maxSellAmount) - await exchange.buy(buyAmount, maxSellAmount, false) - finalStable = await stableToken.balanceOf(accounts[0]) - finalGold = await goldToken.balanceOf(accounts[0]) - finalReserve = await goldToken.balanceOf(reserve.address) - }) - - it(`should increase user's stable`, async () => { - assert.isTrue(finalStable.gt(originalStable)) - }) - - it(`should reduce user's gold`, async () => { - if (await addressMinedLatestBlock(accounts[0])) { - const blockReward = new BigNumber(2).times(new BigNumber(10).pow(decimals)) - assert.isTrue(finalGold.lt(originalGold.plus(blockReward))) - } else { - assert.isTrue(finalGold.lt(originalGold)) - } - }) - - it(`should increase Reserve's gold`, async () => { - assert.isTrue(finalReserve.gt(originalReserve)) - }) - }) - - // Note that this test relies on having purchased cUSD in a previous test - describe('When buying celo', () => { - before(async () => { - originalStable = await stableToken.balanceOf(accounts[0]) - originalGold = 
await goldToken.balanceOf(accounts[0]) - originalReserve = await goldToken.balanceOf(reserve.address) - await stableToken.approve(exchange.address, maxSellAmount) - // Cannot sell more than was purchased in the previous test. - await exchange.buy(buyAmount, maxSellAmount, true) - finalStable = await stableToken.balanceOf(accounts[0]) - finalGold = await goldToken.balanceOf(accounts[0]) - finalReserve = await goldToken.balanceOf(reserve.address) - }) - - it(`should reduce user's stable`, async () => { - assert.isTrue(finalStable.lt(originalStable)) - }) - - it(`should increase user's gold`, async () => { - assert.isTrue(finalGold.gt(originalGold)) - }) - - it(`should reduce Reserve's gold`, async () => { - assert.isTrue(finalReserve.lt(originalReserve)) - }) - }) - }) - - describe('When transferring gold', () => { - const otherReserveAddress = '0x7457d5E02197480Db681D3fdF256c7acA21bDc12' - let originalOtherAccount - beforeEach(async () => { - originalReserve = await goldToken.balanceOf(reserve.address) - originalOtherAccount = await goldToken.balanceOf(otherReserveAddress) - }) - - it(`should transfer gold`, async () => { - // @ts-ignore - const txData = reserve.contract.methods - .transferGold(otherReserveAddress, transferAmount) - .encodeABI() - await multiSig.submitTransaction(reserve.address, 0, txData, { - from: accounts[0], - }) - assert.isTrue( - (await goldToken.balanceOf(reserve.address)).isEqualTo( - originalReserve.minus(transferAmount) - ) - ) - assert.isTrue( - (await goldToken.balanceOf(otherReserveAddress)).isEqualTo( - originalOtherAccount.plus(transferAmount) - ) - ) - }) - }) - }) -) - -contract('Integration: Adding StableToken', (accounts: string[]) => { - const Exchange: ExchangeContract = makeTruffleContractForMigration( - 'Exchange', - MENTO_PACKAGE, - web3 - ) - const StableToken: StableTokenContract = makeTruffleContractForMigration( - 'StableToken', - MENTO_PACKAGE, - web3 - ) - let exchangeAbc: ExchangeInstance - let freezer: 
FreezerInstance - let goldToken: GoldTokenInstance - let stableTokenAbc: StableTokenInstance - const sellAmount = web3.utils.toWei('0.1', 'ether') - const minBuyAmount = 1 - - // 0. Make ourselves the owner of the various contracts we will need to interact with, as - // passing a governance proposal for each one will be a pain in the butt. - before(async () => { - goldToken = await getDeployedProxiedContract('GoldToken', artifacts) - freezer = await getDeployedProxiedContract('Freezer', artifacts) - const contractsToOwn = ['Freezer', 'Registry', 'SortedOracles', 'FeeCurrencyWhitelist'] - await assumeOwnershipWithTruffle(contractsToOwn, accounts[0]) - await assumeOwnershipWithTruffle(['Reserve'], accounts[0], 0, MENTO_PACKAGE) - }) - - // 1. Mimic the state of the world post-contracts-release - // a) Deploy the contracts. For simplicity, omit proxies for now. - // b) Register the contracts - // c) Initialize the contracts - // d) Confirm mento is effectively frozen - describe('When the contracts have been deployed and initialized', () => { - before(async () => { - exchangeAbc = await Exchange.new(true) - stableTokenAbc = await StableToken.new(true) - - const registry: RegistryInstance = await getDeployedProxiedContract('Registry', artifacts) - await registry.setAddressFor('ExchangeABC', exchangeAbc.address) - await registry.setAddressFor('StableTokenABC', stableTokenAbc.address) - - await stableTokenAbc.initialize( - 'Celo Abc', // Name - 'cABC', // symbol - '18', // decimals - registry.address, - fixed1, // inflationRate - SECONDS_IN_A_WEEK, // inflationRatePeriod - [accounts[0]], // pre-mint account - ['1000000000000000000'], // pre-mint amount - 'ExchangeABC' // exchange contract key on the registry - ) - await exchangeAbc.initialize( - registry.address, - 'StableTokenABC', - '5000000000000000000000', // spread, matches mainnet for cUSD and cEUR - '1300000000000000000000', // reserveFraction, matches mainnet for cEUR - '300', // updateFrequency, matches mainnet 
for cUSD and cEUR - '1' // minimumReports, minimum possible to avoid having to mock multiple reports - ) - }) - - it(`should be impossible to sell CELO`, async () => { - await goldToken.approve(exchangeAbc.address, sellAmount) - await assertTransactionRevertWithReason(exchangeAbc.sell(sellAmount, minBuyAmount, true)) - }) - - it(`should be impossible to sell stable token`, async () => { - await stableTokenAbc.approve(exchangeAbc.address, sellAmount) - await assertTransactionRevertWithReason(exchangeAbc.sell(sellAmount, minBuyAmount, false)) - }) - }) - - // 2. Mimic the state of the world post-oracle-activation-proposal - // a) Activate the oracles and freeze the mento - // b) Make an oracle report - // c) Confirm mento is effectively frozen - describe('When the contracts have been frozen and an oracle report has been made', () => { - before(async () => { - const sortedOracles: SortedOraclesInstance = await getDeployedProxiedContract( - 'SortedOracles', - artifacts - ) - await sortedOracles.addOracle(stableTokenAbc.address, ensureLeading0x(accounts[0])) - await freezer.freeze(stableTokenAbc.address) - await freezer.freeze(exchangeAbc.address) - await sortedOracles.report(stableTokenAbc.address, toFixed(1), NULL_ADDRESS, NULL_ADDRESS) - }) - - it(`should be impossible to sell CELO`, async () => { - await goldToken.approve(exchangeAbc.address, sellAmount) - await assertTransactionRevertWithReason( - exchangeAbc.sell(sellAmount, minBuyAmount, true), - "can't call when contract is frozen" - ) - }) - - it(`should be impossible to sell stable token`, async () => { - await stableTokenAbc.approve(exchangeAbc.address, sellAmount) - await assertTransactionRevertWithReason( - exchangeAbc.sell(sellAmount, minBuyAmount, false), - "can't call when contract is frozen" - ) - }) - }) - - // 3. 
Mimic the state of the world post-mento-activation-proposal - // a) Add the stable token to the reserve - // b) Unfreeze the mento - // c) Confirm mento is functional - describe('When the contracts have been unfrozen and the mento has been activated', () => { - before(async () => { - const reserve: ReserveInstance = await getDeployedProxiedContract( - 'Reserve', - ArtifactsSingleton.getInstance(MENTO_PACKAGE) - ) - const feeCurrencyWhitelist: FeeCurrencyWhitelistInstance = await getDeployedProxiedContract( - 'FeeCurrencyWhitelist', - artifacts - ) - await reserve.addToken(stableTokenAbc.address) - await reserve.addExchangeSpender(exchangeAbc.address) - await freezer.unfreeze(stableTokenAbc.address) - await freezer.unfreeze(exchangeAbc.address) - - // activate stable during mento-activation proposal - await exchangeAbc.activateStable() - // Fee currency can't be tested here, but keep this line for reference - await feeCurrencyWhitelist.addToken(stableTokenAbc.address) - }) - - it(`should be possible to sell CELO`, async () => { - await goldToken.approve(exchangeAbc.address, sellAmount) - await exchangeAbc.sell(sellAmount, minBuyAmount, true) - }) - - it(`should be possible to sell stable token`, async () => { - await stableTokenAbc.approve(exchangeAbc.address, sellAmount) - await exchangeAbc.sell(sellAmount, minBuyAmount, false) - }) - }) -}) diff --git a/packages/protocol/test/common/migration.ts b/packages/protocol/test/common/migration.ts deleted file mode 100644 index 05b6eda1573..00000000000 --- a/packages/protocol/test/common/migration.ts +++ /dev/null @@ -1,53 +0,0 @@ -import { - assertContractsRegistered, - assertProxiesSet, - assertRegistryAddressesSet, -} from '@celo/protocol/lib/test-utils' -import { getDeployedProxiedContract } from '@celo/protocol/lib/web3-utils' -import { ContractPackage } from 'contractPackages' -import { ArtifactsSingleton } from '../../lib/artifactsSingleton' - -const getProxiedContract = async (contractName: string, 
contractPackage: ContractPackage) => { - const artifactsObject = ArtifactsSingleton.getInstance(contractPackage, artifacts) - /* eslint-disable-next-line */ - return await getDeployedProxiedContract(contractName, artifactsObject) -} - -const getContract = async ( - contractName: string, - type: string, - contractPackage: ContractPackage -) => { - // /* eslint-disable-next-line */ - const artifactsObject = ArtifactsSingleton.getInstance(contractPackage, artifacts) - - if (type === 'contract') { - /* eslint-disable-next-line */ - return await artifactsObject.require(contractName).deployed() - } - if (type === 'proxy') { - /* eslint-disable-next-line */ - return await artifactsObject.require(contractName + 'Proxy').deployed() - } -} - -contract('Migration', () => { - describe('Checking proxies', () => { - // https://github.com/celo-org/celo-monorepo/issues/10566 - it.skip('should have the proxy set up for all proxied contracts', async () => { - await assertProxiesSet(getContract) - }) - }) - - describe('Checking the registry', () => { - it('should have the correct entry in the registry for all contracts used by the registry', async () => { - await assertContractsRegistered(getProxiedContract) - }) - }) - - describe('Checking contracts that use the registry', () => { - it.skip('should have set the registry address properly in all contracts that use it', async () => { - await assertRegistryAddressesSet(getProxiedContract) - }) - }) -}) diff --git a/packages/protocol/test/common/recoverFunds.ts b/packages/protocol/test/common/recoverFunds.ts deleted file mode 100644 index 7278ab72d60..00000000000 --- a/packages/protocol/test/common/recoverFunds.ts +++ /dev/null @@ -1,76 +0,0 @@ -// Note: this test is testing the recover-funds script and not a particular smart contract. 
- -import { recoverFunds } from '@celo/protocol/lib/recover-funds' -import { CeloContractName } from '@celo/protocol/lib/registry-utils' -import { expectBigNumberInRange } from '@celo/protocol/lib/test-utils' -import { CeloUnreleasedTreasuryContract } from '@celo/protocol/types/08' -import { BigNumber } from 'bignumber.js' -import { - FreezerContract, - GetSetV0Instance, - GoldTokenContract, - ProxyInstance, - RegistryContract, -} from 'types' -import { SOLIDITY_08_PACKAGE } from '../../contractPackages' -import { ArtifactsSingleton } from '../../lib/artifactsSingleton' - -const GetSetV0: Truffle.Contract = artifacts.require('GetSetV0') -const Proxy: Truffle.Contract = artifacts.require('Proxy') - -contract('Proxy', (accounts: string[]) => { - let proxy: ProxyInstance - let getSet: GetSetV0Instance - - const owner = accounts[0] - - beforeEach(async () => { - proxy = await Proxy.new({ from: owner }) - getSet = await GetSetV0.new({ from: owner }) - }) - - describe('fallback', () => { - beforeEach(async () => { - await proxy._setImplementation(getSet.address) - }) - - it('recovers funds from an incorrectly initialized implementation', async () => { - const Freezer: FreezerContract = artifacts.require('Freezer') - const GoldToken: GoldTokenContract = artifacts.require('GoldToken') - const CeloUnreleasedTreasury: CeloUnreleasedTreasuryContract = - ArtifactsSingleton.getInstance(SOLIDITY_08_PACKAGE).require('CeloUnreleasedTreasury') // Added because the CeloToken `_transfer` prevents transfers to the celoUnreleasedTreasury. 
- // @ts-ignore - GoldToken.numberFormat = 'BigNumber' - const Registry: RegistryContract = artifacts.require('Registry') - - const freezer = await Freezer.new(true) - const goldToken = await GoldToken.new(true) - const celoUnreleasedTreasury = await CeloUnreleasedTreasury.new(true) - - const registry = await Registry.new(true) - await registry.setAddressFor(CeloContractName.Freezer, freezer.address) - await registry.setAddressFor( - CeloContractName.CeloUnreleasedTreasury, - celoUnreleasedTreasury.address - ) - await goldToken.initialize(registry.address) - - const amount = new BigNumber(10) - const initialBalance = new BigNumber(await goldToken.balanceOf(owner)) - await goldToken.transfer(proxy.address, amount) - - await proxy._setImplementation(getSet.address) - - const ownerBalance = await goldToken.balanceOf(owner) - - expectBigNumberInRange(ownerBalance, initialBalance.minus(amount)) - const proxyBalance = await web3.eth.getBalance(proxy.address) - assert(proxyBalance === amount.toString()) - - await recoverFunds(proxy.address, owner) - const ownerBalance2 = await goldToken.balanceOf(owner) - assert((await web3.eth.getBalance(proxy.address)) === '0') - expectBigNumberInRange(ownerBalance2, initialBalance) - }) - }) -}) diff --git a/packages/protocol/test/compatibility/verify-bytecode.ts b/packages/protocol/test/compatibility/verify-bytecode.ts deleted file mode 100644 index 942a612f303..00000000000 --- a/packages/protocol/test/compatibility/verify-bytecode.ts +++ /dev/null @@ -1,377 +0,0 @@ -import { - LibraryAddresses, - LibraryLinks, - LibraryPositions, - linkLibraries, -} from '@celo/protocol/lib/bytecode' -import { Artifact } from '@celo/protocol/lib/compatibility/internal' -import { verifyBytecodes } from '@celo/protocol/lib/compatibility/verify-bytecode' -import { assertThrowsAsync } from '@celo/protocol/lib/test-utils' -import { getTestArtifacts } from '@celo/protocol/test/compatibility/common' -import { NULL_ADDRESS } from '@celo/utils/lib/address' 
-import { assert } from 'chai' -import { RegistryInstance } from 'types' - -import truffleContract = require('@truffle/contract') - -const Registry = artifacts.require('Registry') -const Proxy = artifacts.require('Proxy') - -const makeTruffleContract = (artifact: Artifact) => { - const Contract = truffleContract({ - abi: artifact.abi, - unlinked_binary: artifact.bytecode, - }) - Contract.setProvider(web3.currentProvider) - Contract.setNetwork('development') - - return Contract -} - -const deployProxiedContract = async (Contract: any, from: string) => { - const proxy = await Proxy.new() - const contract = await Contract.new({ from }) - await proxy._setImplementation(contract.address) - return Contract.at(proxy.address) -} - -contract('', (accounts) => { - const buildArtifacts = getTestArtifacts('linked_libraries') - const upgradedLibBuildArtifacts = getTestArtifacts('linked_libraries_upgraded_lib') - const upgradedContractBuildArtifacts = getTestArtifacts('linked_libraries_upgraded_contract') - const artifact = buildArtifacts.getArtifactByName('TestContract') - - const TestContract = makeTruffleContract(buildArtifacts.getArtifactByName('TestContract')) - const LinkedLibrary1 = makeTruffleContract(buildArtifacts.getArtifactByName('LinkedLibrary1')) - const LinkedLibrary2 = makeTruffleContract(buildArtifacts.getArtifactByName('LinkedLibrary2')) - const LinkedLibrary3 = makeTruffleContract(buildArtifacts.getArtifactByName('LinkedLibrary3')) - - describe('LibraryPositions()', () => { - it('collects the right number of positions for each library', () => { - const positions = new LibraryPositions(artifact.deployedBytecode) - assert.equal(positions.positions['LinkedLibrary1'].length, 2) - assert.equal(positions.positions['LinkedLibrary2'].length, 2) - }) - }) - - describe('#LibraryAddresses.collect()', () => { - describe('when libraries are linked correctly', () => { - it('collects the correct addresses', () => { - const positions = new 
LibraryPositions(artifact.deployedBytecode) - const links: LibraryLinks = { - LinkedLibrary1: '0000000000000000000000000000000000000001', - LinkedLibrary2: '0000000000000000000000000000000000000002', - } - const linkedBytecode = linkLibraries(artifact.deployedBytecode, links) - const addresses = new LibraryAddresses() - addresses.collect(linkedBytecode, positions) - - assert.equal( - addresses.addresses['LinkedLibrary1'], - '0000000000000000000000000000000000000001' - ) - assert.equal( - addresses.addresses['LinkedLibrary2'], - '0000000000000000000000000000000000000002' - ) - }) - }) - - describe('when libraries are not linked correctly', () => { - it('detects incorrect linking', () => { - const positions = new LibraryPositions(artifact.deployedBytecode) - const links: LibraryLinks = { - LinkedLibrary1: '0000000000000000000000000000000000000001', - LinkedLibrary2: '0000000000000000000000000000000000000002', - } - const linkedBytecode = linkLibraries(artifact.deployedBytecode, links) - const incorrectBytecode = - linkedBytecode.slice(0, positions.positions['LinkedLibrary1'][0] - 1) + - '0000000000000000000000000000000000000003' + - linkedBytecode.slice( - positions.positions['LinkedLibrary1'][0] - 1 + 40, - linkedBytecode.length - ) - - assert.throws(() => { - new LibraryAddresses().collect(incorrectBytecode, positions) - }) - }) - }) - }) - - describe('on a test contract deployment', () => { - let registry: RegistryInstance - let library1 - let library2 - let library3 - let testContract - beforeEach(async () => { - registry = await Registry.new(true) - - library1 = await LinkedLibrary1.new({ from: accounts[0] }) - library3 = await LinkedLibrary3.new({ from: accounts[0] }) - LinkedLibrary2.link('LinkedLibrary3', library3.address) - library2 = await LinkedLibrary2.new({ from: accounts[0] }) - - TestContract.link('LinkedLibrary1', library1.address) - TestContract.link('LinkedLibrary2', library2.address) - testContract = await deployProxiedContract(TestContract, 
accounts[0]) - - await registry.setAddressFor('TestContract', testContract.address) - }) - - describe('verifyBytecodes', () => { - it(`doesn't throw on matching contracts`, async () => { - await verifyBytecodes(['TestContract'], [buildArtifacts], registry, [], Proxy, web3) - assert(true) - }) - - it(`throws when a contract's bytecodes don't match`, async () => { - const oldBytecode = artifact.deployedBytecode - artifact.deployedBytecode = '0x0' + oldBytecode.slice(3, artifact.deployedBytecode.length) - await assertThrowsAsync( - verifyBytecodes(['TestContract'], [buildArtifacts], registry, [], Proxy, web3) - ) - artifact.deployedBytecode = oldBytecode - }) - - it(`throws when a library's bytecodes don't match`, async () => { - const libraryArtifact = buildArtifacts.getArtifactByName('LinkedLibrary1') - const oldBytecode = libraryArtifact.deployedBytecode - libraryArtifact.deployedBytecode = - oldBytecode.slice(0, 44) + '00' + oldBytecode.slice(46, oldBytecode.length) - await assertThrowsAsync( - verifyBytecodes(['TestContract'], [buildArtifacts], registry, [], Proxy, web3) - ) - libraryArtifact.deployedBytecode = oldBytecode - }) - - describe(`when a proposal upgrades a library's implementation`, () => { - const LinkedLibrary3Upgraded = makeTruffleContract( - upgradedLibBuildArtifacts.getArtifactByName('LinkedLibrary3') - ) - beforeEach(async () => { - library3 = await LinkedLibrary3Upgraded.new({ from: accounts[0] }) - LinkedLibrary2.link('LinkedLibrary3', library3.address) - library2 = await LinkedLibrary2.new({ from: accounts[0] }) - TestContract.link('LinkedLibrary2', library2.address) - testContract = await TestContract.new({ from: accounts[0] }) - }) - - it(`doesn't throw on matching contracts`, async () => { - const proposal = [ - { - contract: 'TestContractProxy', - function: '_setImplementation', - args: [testContract.address], - value: '0', - }, - ] - - await verifyBytecodes( - ['TestContract'], - [upgradedLibBuildArtifacts], - registry, - proposal, - 
Proxy, - web3 - ) - assert(true) - }) - - it(`throws on different contracts`, async () => { - const proposal = [ - { - contract: 'TestContractProxy', - function: '_setImplementation', - args: [testContract.address], - value: '0', - }, - ] - - await assertThrowsAsync( - verifyBytecodes(['TestContract'], [buildArtifacts], registry, proposal, Proxy, web3) - ) - }) - - it(`throws when the proposed address is wrong`, async () => { - const proposal = [ - { - contract: 'TestContractProxy', - function: '_setImplementation', - args: [accounts[1]], - value: '0', - }, - ] - - await assertThrowsAsync( - verifyBytecodes(['TestContract'], [buildArtifacts], registry, proposal, Proxy, web3) - ) - }) - }) - - describe(`when a proposal upgrades a contract's implementation`, () => { - const TestContractUpgraded = makeTruffleContract( - upgradedContractBuildArtifacts.getArtifactByName('TestContract') - ) - beforeEach(async () => { - TestContractUpgraded.link('LinkedLibrary1', library1.address) - TestContractUpgraded.link('LinkedLibrary2', library2.address) - testContract = await TestContractUpgraded.new({ from: accounts[0] }) - }) - - it(`doesn't throw on matching contracts`, async () => { - const proposal = [ - { - contract: 'TestContractProxy', - function: '_setImplementation', - args: [testContract.address], - value: '0', - }, - ] - - await verifyBytecodes( - ['TestContract'], - [upgradedContractBuildArtifacts], - registry, - proposal, - Proxy, - web3 - ) - assert(true) - }) - - it(`throws on different contracts`, async () => { - const proposal = [ - { - contract: 'TestContractProxy', - function: '_setImplementation', - args: [testContract.address], - value: '0', - }, - ] - - await assertThrowsAsync( - verifyBytecodes(['TestContract'], [buildArtifacts], registry, proposal, Proxy, web3) - ) - }) - - it(`throws when the proposed address is wrong`, async () => { - const proposal = [ - { - contract: 'TestContractProxy', - function: '_setImplementation', - args: [accounts[1]], - value: 
'0', - }, - ] - - await assertThrowsAsync( - verifyBytecodes(['TestContract'], [buildArtifacts], registry, proposal, Proxy, web3) - ) - }) - - it(`throws when there is no proposal`, async () => { - const proposal = [] - - await assertThrowsAsync( - verifyBytecodes( - ['TestContract'], - [upgradedContractBuildArtifacts], - registry, - proposal, - Proxy, - web3 - ) - ) - }) - }) - - describe(`when a proposal changes a contract's proxy`, () => { - const TestContractUpgraded = makeTruffleContract( - upgradedContractBuildArtifacts.getArtifactByName('TestContract') - ) - beforeEach(async () => { - TestContractUpgraded.link('LinkedLibrary1', library1.address) - TestContractUpgraded.link('LinkedLibrary2', library2.address) - testContract = await deployProxiedContract(TestContractUpgraded, accounts[0]) - await registry.setAddressFor('TestContract', testContract.address) - }) - - it(`doesn't throw on matching contracts`, async () => { - const proposal = [ - { - contract: 'Registry', - function: 'setAddressFor', - args: ['TestContract', testContract.address], - value: '0', - }, - ] - - await verifyBytecodes( - ['TestContract'], - [upgradedContractBuildArtifacts], - registry, - proposal, - Proxy, - web3 - ) - assert(true) - }) - - it(`throws on different contracts`, async () => { - const proposal = [ - { - contract: 'Registry', - function: 'setAddressFor', - args: ['TestContract', testContract.address], - value: '0', - }, - ] - - await assertThrowsAsync( - verifyBytecodes(['TestContract'], [buildArtifacts], registry, proposal, Proxy, web3) - ) - }) - - it(`throws when the proposed address is wrong`, async () => { - const proposal = [ - { - contract: 'Registry', - function: 'setAddressFor', - args: ['TestContract', accounts[0]], - value: '0', - }, - ] - - await assertThrowsAsync( - verifyBytecodes(['TestContract'], [buildArtifacts], registry, proposal, Proxy, web3) - ) - }) - }) - - it('throws when a proposal contains more than proxy or registry repointing', async () => { - const 
proposal = [ - { - contract: 'GoldToken', - function: 'transfer', - args: [NULL_ADDRESS, '100000000'], - value: '0', - }, - ] - - await assertThrowsAsync( - verifyBytecodes( - ['TestContract'], - [upgradedContractBuildArtifacts], - registry, - proposal, - Proxy, - web3 - ) - ) - }) - }) - }) -}) diff --git a/packages/protocol/test/constants.ts b/packages/protocol/test/constants.ts deleted file mode 100644 index 6b1e89c70fc..00000000000 --- a/packages/protocol/test/constants.ts +++ /dev/null @@ -1,3 +0,0 @@ -export const SECONDS_IN_A_WEEK = 60 * 60 * 24 * 7 - -export const ZERO_ADDRESS = '0x0000000000000000000000000000000000000000' diff --git a/packages/protocol/test/customHooks.ts b/packages/protocol/test/customHooks.ts deleted file mode 100644 index 1da6ba454b3..00000000000 --- a/packages/protocol/test/customHooks.ts +++ /dev/null @@ -1,13 +0,0 @@ -import { retryAsync } from '@celo/utils/lib/async' - -// Handles flaky `error: Invalid JSON RPC response: ""` error that seems to be caused by port exhaustion in CI. -// See https://github.com/ethereum/web3.js/issues/3425 and https://github.com/ethereum/web3.js/issues/926.
-export const beforeEachWithRetries = ( - title: string, - numRetries: number, - sleepMs: number, - fn: () => any -) => - beforeEach(title, async () => { - await retryAsync(fn, numRetries, [], sleepMs) - }) diff --git a/packages/protocol/test/resources/compatibility/.gitignore b/packages/protocol/test/resources/compatibility/.gitignore deleted file mode 100644 index 1e012e20493..00000000000 --- a/packages/protocol/test/resources/compatibility/.gitignore +++ /dev/null @@ -1,2 +0,0 @@ -# Don't commit the compiled test contracts -build/* diff --git a/packages/protocol/test/resources/compatibility/contracts_added_methods_and_contracts/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_added_methods_and_contracts/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_added_methods_and_contracts/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_appended/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_appended/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_appended/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { 
- address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_appended_in_parent/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_appended_in_parent/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_appended_in_parent/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_in_library_struct_mapping/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_in_library_struct_mapping/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- 
a/packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_in_library_struct_mapping/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_in_struct/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_in_struct/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_in_struct/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_variable/Migrations.sol 
b/packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_variable/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_deprecated_prefixed_variable/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_inserted/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_inserted/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_inserted/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git 
a/packages/protocol/test/resources/compatibility/contracts_inserted_constant/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_inserted_constant/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_inserted_constant/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_inserted_in_library_struct/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_inserted_in_library_struct/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_inserted_in_library_struct/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = 
Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_inserted_in_library_struct_mapping/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_inserted_in_library_struct_mapping/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_inserted_in_library_struct_mapping/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_inserted_in_struct/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_inserted_in_struct/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_inserted_in_struct/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line 
func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_inserted_in_struct_mapping/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_inserted_in_struct_mapping/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_inserted_in_struct_mapping/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_contract/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_contract/Migrations.sol deleted file mode 100644 index b61987df672..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_contract/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.3; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) 
external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_lib/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_lib/Migrations.sol deleted file mode 100644 index b61987df672..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_linked_libraries_upgraded_lib/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.3; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_metadata_changed/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_metadata_changed/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_metadata_changed/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - 
constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_original/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_original/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_original/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_original_copy/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_original_copy/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_original_copy/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == 
owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_original_struct_in_mapping/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_original_struct_in_mapping/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_original_struct_in_mapping/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_removed/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_removed/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_removed/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable 
var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_removed_from_library_struct/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_removed_from_library_struct/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_removed_from_library_struct/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_removed_from_parent/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_removed_from_parent/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_removed_from_parent/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations 
{ - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_removed_from_struct/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_removed_from_struct/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_removed_from_struct/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_typechange/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_typechange/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_typechange/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma 
solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_typechange_in_library_struct/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_typechange_in_library_struct/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_typechange_in_library_struct/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_typechange_in_parent/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_typechange_in_parent/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- 
a/packages/protocol/test/resources/compatibility/contracts_typechange_in_parent/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_typechange_in_struct/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_typechange_in_struct/Migrations.sol deleted file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_typechange_in_struct/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/resources/compatibility/contracts_versioned/Migrations.sol b/packages/protocol/test/resources/compatibility/contracts_versioned/Migrations.sol deleted 
file mode 100644 index d1008cca7bc..00000000000 --- a/packages/protocol/test/resources/compatibility/contracts_versioned/Migrations.sol +++ /dev/null @@ -1,24 +0,0 @@ -pragma solidity ^0.5.13; - -contract Migrations { - address public owner; - uint256 public last_completed_migration; // solhint-disable var-name-mixedcase - - modifier restricted() { - if (msg.sender == owner) _; - } - - constructor() public { - owner = msg.sender; - } - - function setCompleted(uint256 completed) external restricted { - last_completed_migration = completed; // solhint-disable var-name-mixedcase - } - - // solhint-disable-next-line func-param-name-mixedcase - function upgrade(address new_address) external restricted { - Migrations upgraded = Migrations(new_address); - upgraded.setCompleted(last_completed_migration); - } -} diff --git a/packages/protocol/test/teardown.ts b/packages/protocol/test/teardown.ts deleted file mode 100644 index 0f6d61acea0..00000000000 --- a/packages/protocol/test/teardown.ts +++ /dev/null @@ -1,6 +0,0 @@ -// Global teardown for coverage -after(async () => { - if ((global as any).coverageSubprovider) { - await (global as any).coverageSubprovider.writeCoverageAsync() - } -}) diff --git a/packages/protocol/truffle-config-parent.js b/packages/protocol/truffle-config-parent.js index 3cb57cfa781..e606d6b11b6 100644 --- a/packages/protocol/truffle-config-parent.js +++ b/packages/protocol/truffle-config-parent.js @@ -1,37 +1,26 @@ /* tslint:disable: object-literal-sort-keys */ require('ts-node/register') -const ProviderEngine = require('web3-provider-engine') -const WebsocketSubprovider = require('web3-provider-engine/subproviders/websocket.js') -const { TruffleArtifactAdapter } = require('@0x/sol-trace') -const { CoverageSubprovider } = require('@0x/sol-coverage') var Web3 = require('web3') -var net = require('net') + +const HDWalletProvider = require('@truffle/hdwallet-provider') const argv = require('minimist')(process.argv.slice(2), { string: 
['truffle_override', 'network'], boolean: ['reset'], }) -const ALFAJORES_NETWORKID = 44787 -const BAKLAVA_NETWORKID = 62320 -const BAKLAVASTAGING_NETWORKID = 31420 -const CANNOLI_NETWORKID = 17323 +const CELOSEPOLIA_NETWORKID = 11142220 const OG_FROM = '0xfeE1a22F43BeeCB912B5a4912ba87527682ef0fC' const DEVELOPMENT_FROM = '0x5409ed021d9299bf6814279a6a1411a7e866a631' const INTEGRATION_FROM = '0x47e172F6CfB6c7D01C1574fa3E2Be7CC73269D95' const INTEGRATION_TESTING_FROM = '0x47e172F6CfB6c7D01C1574fa3E2Be7CC73269D95' -const ALFAJORESSTAGING_FROM = '0xf4314cb9046bece6aa54bb9533155434d0c76909' -const ALFAJORES_FROM = '0x456f41406B32c45D59E539e4BBA3D7898c3584dA' -const RC0_FROM = '0x469be98FE71AFf8F6e7f64F9b732e28A03596B5C' -const BAKLAVA_FROM = '0x0Cc59Ed03B3e763c02d54D695FFE353055f1502D' -const BAKLAVASTAGING_FROM = '0x4588ABb84e1BBEFc2BcF4b2296F785fB7AD9F285' -const STAGING_FROM = '0x4e3d385ecdee402da395a3b18575b05cc5e8ff21' -const CANNOLI_FROM = '0x8C174E896A85E487aa895865657b78Ea64879dC7' // validator zero +const CELOSEPOLIA_FROM = '0x33c35343B08ff91a9d7a07daa34eDce5248bE318' const gasLimit = 20000000 const hostAddress = process.env.CELO_NODE_ADDRESS || '127.0.0.1' const hostPort = parseInt(process.env.CELO_NODE_PORT || '8545') +const devPort = 8546 const defaultConfig = { host: hostAddress, @@ -40,26 +29,29 @@ const defaultConfig = { from: OG_FROM, gas: gasLimit, gasPrice: 100000000000, - // maxFeePerGas: 975000000, + maxFeePerGas: 975_000_000_000, } -const freeGasConfig = { ...defaultConfig, ...{ gasPrice: 0 } } +function readMnemonic(networkName) { + dotenv = require('dotenv').config({ + path: require('path').resolve(__dirname, `../../.env.mnemonic.${networkName.replace('-', '')}`), + }) -// ipcProvider returns a function to create an IPC provider when called. -// Use by adding `provider: ipcProvider(...)` to any of the configs below. 
-function ipcProvider(path) { - return () => new Web3.providers.IpcProvider(path, net) -} + const privateKey = process.env.DEPLOYER_PRIVATE_KEY + if (privateKey === undefined || privateKey === null || privateKey === '') { + console.log( + `No private key found in .env.mnemonic.${networkName.replace('-', '')}. Please run "yarn keys:decrypt" in root after escalating perms in Akeyless` + ) + process.exit(1) + } -// Here to avoid recreating it each time -let coverageProvider = null + return privateKey +} const fornoUrls = { - alfajores: 'https://alfajores-forno.celo-testnet.org', - baklava: 'https://baklava-forno.celo-testnet.org', + 'celo-sepolia': 'https://forno.celo-sepolia.celo-testnet.org', rc1: 'https://forno.celo.org', mainnet: 'https://forno.celo.org', - staging: 'https://staging-forno.celo-networks-dev.org', } const networks = { @@ -71,119 +63,47 @@ defaultBalance: 200000000, mnemonic: 'concert load couple harbor equip island argue ramp clarify fence smart topic', }, - rc0: { - host: hostAddress, - port: hostPort, - from: RC0_FROM, - network_id: 200312, - gasPrice: 100000000000, - }, rc1: { - host: '127.0.0.1', port: 8545, - from: '0xE23a4c6615669526Ab58E9c37088bee4eD2b2dEE', + from: '0xF3EB910DA09B8AF348E0E5B6636da442cFa79239', network_id: 42220, gas: gasLimit, - gasPrice: 10000000000, - }, - coverage: { - host: 'localhost', - network_id: '*', - gasPrice: 0, - gas: gasLimit, - from: DEVELOPMENT_FROM, - provider: function () { - if (coverageProvider == null) { - coverageProvider = new ProviderEngine() - - const projectRoot = '' - const artifactAdapter = new TruffleArtifactAdapter(projectRoot, SOLC_VERSION) - global.coverageSubprovider = new CoverageSubprovider(artifactAdapter, DEVELOPMENT_FROM, { - isVerbose: true, - ignoreFilesGlobs: [ - // Proxies - '**/*Proxy.sol', - - // Test contracts - '**/test/*.sol', - - // Interfaces - '**/interfaces/*.sol', - ], - }) - coverageProvider.addProvider(global.coverageSubprovider) - -
coverageProvider.addProvider( - new WebsocketSubprovider({ - rpcUrl: `http://localhost:${defaultConfig.port}`, - debug: false, - }) - ) - - coverageProvider.start((err) => { - if (err !== undefined) { - // eslint-disable-next-line: no-console - console.error(err) - process.exit(1) - } - }) - /** - * HACK: Truffle providers should have `send` function, while `ProviderEngine` creates providers with `sendAsync`, - * but it can be easily fixed by assigning `sendAsync` to `send`. - */ - coverageProvider.send = coverageProvider.sendAsync.bind(coverageProvider) - } - return coverageProvider - }, + gasPrice: 100000000000, + privateKeyAvailable: false, + proposer: '0xc11F5aC70B86517Dcc10f20d8B0D5e77EBb956Ce', + approver: '0x41822d8A191fcfB1cfcA5F7048818aCd8eE933d3', + voter: '0xb073014a4c60c9824B597375C5e2d49e765cf811', }, testnet_prod: defaultConfig, anvil: { ...defaultConfig, - network_id: 31337, + network_id: '*', // Accept any chain ID for anvil fork testing from: '0xf39fd6e51aad88f6f4ce6ab8827279cfffb92266', + port: devPort, // Use port 8546 for anvil (matches ANVIL_PORT in constants.sh) }, // New testnets integration: { ...defaultConfig, from: INTEGRATION_FROM, + port: devPort, }, testing: { ...defaultConfig, from: INTEGRATION_TESTING_FROM, network_id: 1101, + port: devPort, }, - alfajoresstaging: { - ...defaultConfig, - from: ALFAJORESSTAGING_FROM, - }, - - alfajores: { - ...defaultConfig, - network_id: ALFAJORES_NETWORKID, - from: ALFAJORES_FROM, - }, - - cannoli: { - ...defaultConfig, - network_id: CANNOLI_NETWORKID, - from: CANNOLI_FROM, - }, - - baklava: { + 'celo-sepolia': { ...defaultConfig, - from: BAKLAVA_FROM, - network_id: BAKLAVA_NETWORKID, - }, - baklavastaging: { - ...defaultConfig, - from: BAKLAVASTAGING_FROM, - network_id: BAKLAVASTAGING_NETWORKID, - }, - staging: { - ...defaultConfig, - from: STAGING_FROM, + network_id: CELOSEPOLIA_NETWORKID, + from: CELOSEPOLIA_FROM, + privateKeyAvailable: true, + proposer: 
'0x95a40aA01d2d72b4122C19c86160710D01224ada', + approver: '0x95a40aA01d2d72b4122C19c86160710D01224ada', + voter: '0x95a40aA01d2d72b4122C19c86160710D01224ada', }, } + // Equivalent networks.mainnet = networks.rc1 @@ -199,6 +119,7 @@ if (argv.truffle_override || !(argv.network in networks)) { } if (process.argv.includes('--forno')) { + console.log('Using forno as RPC') if (!fornoUrls[argv.network]) { console.log(`Forno URL for network ${argv.network} not known!`) process.exit(1) @@ -206,9 +127,20 @@ if (process.argv.includes('--forno')) { networks[argv.network].host = undefined networks[argv.network].port = undefined - networks[argv.network].provider = function () { - return new Web3.providers.HttpProvider(fornoUrls[argv.network]) + + if (networks[argv.network].privateKeyAvailable) { + console.log('Network is supposed to have a private key available, using HDWalletProvider') + networks[argv.network].provider = function () { + return new HDWalletProvider({ + privateKeys: [readMnemonic(argv.network)], + providerOrUrl: fornoUrls[argv.network], + }) + } + } else { + networks[argv.network].provider = function () { + return new Web3.providers.HttpProvider(fornoUrls[argv.network]) + } } } -module.exports = { networks: networks } +module.exports = { networks: networks, fornoUrls: fornoUrls } diff --git a/scripts/key_placer.sh b/scripts/key_placer.sh index d9460b2a1af..737c16d6d03 100755 --- a/scripts/key_placer.sh +++ b/scripts/key_placer.sh @@ -2,12 +2,14 @@ echo "Processing encrypted files v2" -# Set list of secret files to encrypt and decrypt. +# Set list of pairs (secret file:environment) to encrypt and decrypt. 
files=( ".env.mnemonic:celo-testnet" ".env.mnemonic.alfajores:celo-testnet" - ".env.mnemonic.baklava:celo-testnet" - ".env.mnemonic.rc1:celo-testnet-production" + ".env.mnemonic.mainnet:celo-testnet-production" + "secrets/.env.signers.v2:celo-testnet-production" + "secrets/.env.signers.v3:celo-testnet-production" + "secrets/.env.signers.succinct:celo-testnet-production" ) if [[ -z "$1" ]]; then diff --git a/secrets/.env.signers.succinct.enc b/secrets/.env.signers.succinct.enc new file mode 100644 index 00000000000..d286590cb5b Binary files /dev/null and b/secrets/.env.signers.succinct.enc differ diff --git a/secrets/.env.signers.v2.enc b/secrets/.env.signers.v2.enc new file mode 100644 index 00000000000..d6f24de84f2 Binary files /dev/null and b/secrets/.env.signers.v2.enc differ diff --git a/secrets/.env.signers.v3.enc b/secrets/.env.signers.v3.enc new file mode 100644 index 00000000000..1946fe3d72d Binary files /dev/null and b/secrets/.env.signers.v3.enc differ diff --git a/secrets/.gitignore b/secrets/.gitignore new file mode 100644 index 00000000000..3af558ee78c --- /dev/null +++ b/secrets/.gitignore @@ -0,0 +1,2 @@ +.env.signers* +!.env.signers*.enc diff --git a/yarn-audit-known-issues b/yarn-audit-known-issues deleted file mode 100644 index e69de29bb2d..00000000000 diff --git a/yarn.lock b/yarn.lock index 8071b357323..91596a81ef8 100644 --- a/yarn.lock +++ b/yarn.lock @@ -2,271 +2,6 @@ # yarn lockfile v1 -"@0x/assert@^3.0.35", "@0x/assert@^3.0.36": - version "3.0.36" - resolved "https://registry.yarnpkg.com/@0x/assert/-/assert-3.0.36.tgz#91f82973e11826c08011a5830508246262b43284" - integrity sha512-sUtrV5MhixXvWZpATjFqIDtgvvv64duSTuOyPdPJjB+/Lcl5jQhlSNuoN0X3XP0P79Sp+6tuez5MupgFGPA2QQ== - dependencies: - "@0x/json-schemas" "^6.4.6" - "@0x/typescript-typings" "^5.3.2" - "@0x/utils" "^7.0.0" - "@types/node" "12.12.54" - lodash "^4.17.21" - valid-url "^1.0.9" - -"@0x/dev-utils@^5.0.3": - version "5.0.3" - resolved 
"https://registry.yarnpkg.com/@0x/dev-utils/-/dev-utils-5.0.3.tgz#8de9b2f8241caa649ceffc6b891122aa923150a0" - integrity sha512-V58tT2aiiHNSQtEt2XQWZDsPMsQ4wPDnjZRUaW38W46QasIFfCbcpKmfCsAGJyFxM7t0m0JJJwmYTJwZhCGcoQ== - dependencies: - "@0x/subproviders" "^8.0.1" - "@0x/types" "^3.3.7" - "@0x/typescript-typings" "^5.3.2" - "@0x/utils" "^7.0.0" - "@0x/web3-wrapper" "^8.0.1" - "@types/node" "12.12.54" - "@types/web3-provider-engine" "^14.0.0" - chai "^4.0.1" - chai-as-promised "^7.1.0" - chai-bignumber "^3.0.0" - dirty-chai "^2.0.1" - ethereum-types "^3.7.1" - lodash "^4.17.21" - web3-provider-engine "16.0.4" - -"@0x/json-schemas@^6.4.6": - version "6.4.6" - resolved "https://registry.yarnpkg.com/@0x/json-schemas/-/json-schemas-6.4.6.tgz#3904b8219ad2fefdf47ddb94fb65c1b9fb82d1d7" - integrity sha512-TaqvhOkmLN/vkcpMUNVFZBTnWP05ZVo9iGAnP1CG/B8l4rvnUbLZvWx8KeDKs62I/5d7jdYISvXyOwP4EwrG4w== - dependencies: - "@0x/typescript-typings" "^5.3.2" - "@types/node" "12.12.54" - ajv "^6.12.5" - lodash.values "^4.3.0" - -"@0x/sol-compiler@^4.8.3", "@0x/sol-compiler@^4.8.5": - version "4.8.5" - resolved "https://registry.yarnpkg.com/@0x/sol-compiler/-/sol-compiler-4.8.5.tgz#49579cee7de838d8ebd1bfd094a0d30a9e96506c" - integrity sha512-hAc3ZjpD+/fgSt/UQaAim8d2fQL3kWpnP5+tSEVf3/xetDDp3BhTOMi+wKnVuYo9FzuTgHx5MFueWM+mojE41A== - dependencies: - "@0x/assert" "^3.0.36" - "@0x/json-schemas" "^6.4.6" - "@0x/sol-resolver" "^3.1.13" - "@0x/types" "^3.3.7" - "@0x/typescript-typings" "^5.3.2" - "@0x/utils" "^7.0.0" - "@0x/web3-wrapper" "^8.0.1" - "@types/node" "12.12.54" - "@types/yargs" "^11.0.0" - chalk "^2.3.0" - chokidar "^3.0.2" - ethereum-types "^3.7.1" - ethereumjs-util "^7.1.5" - lodash "^4.17.21" - mkdirp "^0.5.1" - pluralize "^7.0.0" - require-from-string "^2.0.1" - semver "5.5.0" - solc "^0.8" - source-map-support "^0.5.0" - strip-comments "^2.0.1" - web3-eth-abi "^1.0.0-beta.24" - yargs "^17.5.1" - -"@0x/sol-coverage@^4.0.47": - version "4.0.49" - resolved 
"https://registry.yarnpkg.com/@0x/sol-coverage/-/sol-coverage-4.0.49.tgz#15bb1210263a7dacc5a19f939892a9fb15bf2e5a" - integrity sha512-gphKAO63NBysXgFV2fqxHmnBnHnD/i6rarmEwMKMKjWJ1shqQsKgJL984jBtek0UPsR5X4QM5tcHNKuCSe2Tuw== - dependencies: - "@0x/sol-tracing-utils" "^7.3.5" - "@0x/subproviders" "^8.0.1" - "@0x/typescript-typings" "^5.3.2" - "@types/minimatch" "^3.0.3" - "@types/node" "12.12.54" - ethereum-types "^3.7.1" - lodash "^4.17.21" - minimatch "^3.0.4" - web3-provider-engine "16.0.4" - -"@0x/sol-profiler@^4.1.37": - version "4.1.39" - resolved "https://registry.yarnpkg.com/@0x/sol-profiler/-/sol-profiler-4.1.39.tgz#2a2f13056d8306bdea242df1d2566c0d80ac5e1e" - integrity sha512-yrEHRun6RvXsg1NWhx4iH/YBeJXZDTQRAyNGQai+P1hrcOE7yoZ7PKtMv+69r3vCH38xVcWzF6xQKa7p4BIvlA== - dependencies: - "@0x/sol-tracing-utils" "^7.3.5" - "@0x/subproviders" "^8.0.1" - "@0x/typescript-typings" "^5.3.2" - "@0x/utils" "^7.0.0" - "@types/node" "12.12.54" - ethereum-types "^3.7.1" - ethereumjs-util "^7.1.5" - lodash "^4.17.21" - web3-provider-engine "16.0.4" - -"@0x/sol-resolver@^3.1.13": - version "3.1.13" - resolved "https://registry.yarnpkg.com/@0x/sol-resolver/-/sol-resolver-3.1.13.tgz#d985be51cec600384c4df101ad33956b844e3ed7" - integrity sha512-nQHqW7sOsDEH4ejH9nu60sCgFXEW08LM0v+5DimA/R7MizOW4LAG7OoHM+Oq8uPcHbeU0peFEDOW0idBsIzZ6g== - dependencies: - "@0x/types" "^3.3.7" - "@0x/typescript-typings" "^5.3.2" - "@types/node" "12.12.54" - lodash "^4.17.21" - -"@0x/sol-trace@^3.0.47": - version "3.0.49" - resolved "https://registry.yarnpkg.com/@0x/sol-trace/-/sol-trace-3.0.49.tgz#10b2a4e079c187a11d3fca9d32031c10fb3d0a7e" - integrity sha512-mCNbUCX6Oh1Z1e1g4AsD7sSbgMN7IGQyR0w+4xQYApgjwtYOaRPZMiluMXqp9nNHX75LwCfVd7NEIJIUMVQf2A== - dependencies: - "@0x/sol-tracing-utils" "^7.3.5" - "@0x/subproviders" "^8.0.1" - "@0x/typescript-typings" "^5.3.2" - "@types/node" "12.12.54" - chalk "^2.3.0" - ethereum-types "^3.7.1" - ethereumjs-util "^7.1.5" - lodash "^4.17.21" - loglevel "^1.6.1" - 
web3-provider-engine "16.0.4" - -"@0x/sol-tracing-utils@^7.3.5": - version "7.3.5" - resolved "https://registry.yarnpkg.com/@0x/sol-tracing-utils/-/sol-tracing-utils-7.3.5.tgz#9fe3fb36462424a72437a0310d872611bc5f970a" - integrity sha512-KzLTcUcLiQD5N/NzkZnIwI0i4w775z4w/H0o2FeM3Gp/0BcBx2DZ+sqKVoCEUSussm+jx2v8MNJnM3wcdvvDlg== - dependencies: - "@0x/dev-utils" "^5.0.3" - "@0x/sol-compiler" "^4.8.5" - "@0x/sol-resolver" "^3.1.13" - "@0x/subproviders" "^8.0.1" - "@0x/typescript-typings" "^5.3.2" - "@0x/utils" "^7.0.0" - "@0x/web3-wrapper" "^8.0.1" - "@types/node" "12.12.54" - "@types/solidity-parser-antlr" "^0.2.3" - chalk "^2.3.0" - ethereum-types "^3.7.1" - ethereumjs-util "^7.1.5" - ethers "~4.0.4" - glob "^7.1.2" - istanbul "^0.4.5" - lodash "^4.17.21" - loglevel "^1.6.1" - mkdirp "^0.5.1" - rimraf "^2.6.2" - semaphore-async-await "^1.5.1" - solc "^0.8" - solidity-parser-antlr "^0.4.2" - -"@0x/subproviders@^7.0.1": - version "7.0.1" - resolved "https://registry.yarnpkg.com/@0x/subproviders/-/subproviders-7.0.1.tgz#3e74cbe61ae746bea67766821f226398978a0cc0" - integrity sha512-S5LrUg12szE8T3U+2ymcdiH4zOkSyGJIh3FRN9jXspwQCe6+fggEymt6n0SyK3p1QiHipVkCxhjTR/53+shwsg== - dependencies: - "@0x/assert" "^3.0.35" - "@0x/types" "^3.3.7" - "@0x/typescript-typings" "^5.3.2" - "@0x/utils" "^7.0.0" - "@0x/web3-wrapper" "^8.0.0" - "@ethereumjs/common" "^2.6.3" - "@ethereumjs/tx" "^3.5.1" - "@ledgerhq/hw-app-eth" "^5.53.0" - "@ledgerhq/hw-transport-webusb" "^5.53.1" - "@types/hdkey" "^0.7.0" - "@types/node" "12.12.54" - "@types/web3-provider-engine" "^14.0.0" - bip39 "^2.5.0" - bn.js "^4.11.8" - ethereum-types "^3.7.1" - ethereumjs-util "^7.1.5" - ganache "^7.4.0" - hdkey "^0.7.1" - json-rpc-error "2.0.0" - lodash "^4.17.21" - semaphore-async-await "^1.5.1" - web3-provider-engine "16.0.4" - optionalDependencies: - "@ledgerhq/hw-transport-node-hid" "^5.51.1" - -"@0x/subproviders@^8.0.1": - version "8.0.1" - resolved 
"https://registry.yarnpkg.com/@0x/subproviders/-/subproviders-8.0.1.tgz#b934547bfa1c8049030b243a134de9755792d53a" - integrity sha512-Lax7Msb1Ef9D6Dd7PQ19oPgjl5GIrKje7XsrO7YCfx5A0RM3Hr4nSQIxgg78jwvuulSFxQ5Sr8WiZ2hTHATtQg== - dependencies: - "@0x/assert" "^3.0.36" - "@0x/types" "^3.3.7" - "@0x/typescript-typings" "^5.3.2" - "@0x/utils" "^7.0.0" - "@0x/web3-wrapper" "^8.0.1" - "@ethereumjs/common" "^2.6.3" - "@ethereumjs/tx" "^3.5.1" - "@types/hdkey" "^0.7.0" - "@types/node" "12.12.54" - "@types/web3-provider-engine" "^14.0.0" - bip39 "2.5.0" - ethereum-types "^3.7.1" - ethereumjs-util "^7.1.5" - ganache "^7.4.0" - hdkey "2.1.0" - json-rpc-error "2.0.0" - lodash "^4.17.21" - web3-provider-engine "16.0.4" - -"@0x/types@^3.3.7": - version "3.3.7" - resolved "https://registry.yarnpkg.com/@0x/types/-/types-3.3.7.tgz#2a8556b3398b6d6fac942c63de23ab22836624ee" - integrity sha512-6lPXPnvKaIfAJ5hIgs81SytqNCPCLstQ/DA598iLpb90KKjjz8QsdrfII4JeKdrEREvLcWSH9SeH4sNPWyLhlA== - dependencies: - "@types/node" "12.12.54" - bignumber.js "~9.0.2" - ethereum-types "^3.7.1" - -"@0x/typescript-typings@^5.3.2": - version "5.3.2" - resolved "https://registry.yarnpkg.com/@0x/typescript-typings/-/typescript-typings-5.3.2.tgz#b2b2a46ebff7f9d885b1357feae615225cb0bb31" - integrity sha512-VIo8PS/IRXrI1aEzM8TenUMViX4MFMKBnIAwqC4K/ewVDSnKyYZSk8fzw0XZph6tN07obchPf+1sHIWYe8EUow== - dependencies: - "@types/bn.js" "^4.11.0" - "@types/node" "12.12.54" - "@types/react" "*" - bignumber.js "~9.0.2" - ethereum-types "^3.7.1" - popper.js "1.14.3" - -"@0x/utils@^7.0.0": - version "7.0.0" - resolved "https://registry.yarnpkg.com/@0x/utils/-/utils-7.0.0.tgz#322168b21cf11741003c9cc490b13adafbe57393" - integrity sha512-g+Bp0eHUGhnVGeVZqGn7UVtpzs/FuoXksiDaajfJrHFW0owwo5YwpwFIAVU7/ca0B6IKa74e71gskLwWGINEFg== - dependencies: - "@0x/types" "^3.3.7" - "@0x/typescript-typings" "^5.3.2" - "@types/mocha" "^5.2.7" - "@types/node" "12.12.54" - abortcontroller-polyfill "^1.1.9" - bignumber.js "~9.0.2" - chalk "^2.3.0" - 
detect-node "2.0.3" - ethereum-types "^3.7.1" - ethereumjs-util "^7.1.5" - ethers "~4.0.4" - isomorphic-fetch "^3.0.0" - js-sha3 "^0.7.0" - lodash "^4.17.21" - -"@0x/web3-wrapper@^8.0.0", "@0x/web3-wrapper@^8.0.1": - version "8.0.1" - resolved "https://registry.yarnpkg.com/@0x/web3-wrapper/-/web3-wrapper-8.0.1.tgz#3625e63737d005fe6a92b71d0e676b4d03e88c60" - integrity sha512-2rqugeCld5r/3yg+Un9sPCUNVeZW5J64Fm6i/W6qRE87X+spIJG48oJymTjSMDXw/w3FaP4nAvhSj2C5fvhN6w== - dependencies: - "@0x/assert" "^3.0.36" - "@0x/json-schemas" "^6.4.6" - "@0x/typescript-typings" "^5.3.2" - "@0x/utils" "^7.0.0" - "@types/node" "12.12.54" - ethereum-types "^3.7.1" - ethereumjs-util "^7.1.5" - ethers "~4.0.4" - lodash "^4.17.21" - "@aashutoshrathi/word-wrap@^1.2.3": version "1.2.6" resolved "https://registry.yarnpkg.com/@aashutoshrathi/word-wrap/-/word-wrap-1.2.6.tgz#bd9154aec9983f77b3a034ecaa015c2e4201f6cf" @@ -277,6 +12,11 @@ resolved "https://registry.yarnpkg.com/@adraffy/ens-normalize/-/ens-normalize-1.9.4.tgz#aae21cb858bbb0411949d5b7b3051f4209043f62" integrity sha512-UK0bHA7hh9cR39V+4gl2/NnBBjoXIxkuWAPCaY4X7fbH4L/azIi7ilWOCjMUYfpJgraLUAqkRi2BqrjME8Rynw== +"@adraffy/ens-normalize@^1.10.1": + version "1.11.1" + resolved "https://registry.yarnpkg.com/@adraffy/ens-normalize/-/ens-normalize-1.11.1.tgz#6c2d657d4b2dfb37f8ea811dcb3e60843d4ac24a" + integrity sha512-nhCBV3quEgesuf7c7KYfperqSS14T8bYuvJ8PcLJp6znkZpFc0AuW4qBtr8eKVyPPe/8RSr7sglCWPU5eaxwKQ== + "@ampproject/remapping@^2.2.0": version "2.2.1" resolved "https://registry.yarnpkg.com/@ampproject/remapping/-/remapping-2.2.1.tgz#99e8e11851128b8702cd57c33684f1d0f260b630" @@ -721,7 +461,7 @@ babel-plugin-polyfill-regenerator "^0.4.1" semver "^6.3.0" -"@babel/runtime@^7.0.0", "@babel/runtime@^7.12.1", "@babel/runtime@^7.15.4", "@babel/runtime@^7.3.1", "@babel/runtime@^7.4.4", "@babel/runtime@^7.5.5": +"@babel/runtime@^7.0.0", "@babel/runtime@^7.12.1", "@babel/runtime@^7.15.4", "@babel/runtime@^7.4.4", "@babel/runtime@^7.5.5": version "7.21.0" 
resolved "https://registry.yarnpkg.com/@babel/runtime/-/runtime-7.21.0.tgz#5b55c9d394e5fcf304909a8b00c07dc217b56673" integrity sha512-xwII0//EObnq89Ji5AKYQaRYiW/nZ3llSv29d49IuxPhKbtJoLP+9QUUZ4nVragQVtaVGeZrpB+ZtG/Pdy/POw== @@ -792,11 +532,6 @@ resolved "https://registry.yarnpkg.com/@bcoe/v8-coverage/-/v8-coverage-0.2.3.tgz#75a2e8b51cb758a7553d6804a5932d7aace75c39" integrity sha512-0hYQ8SB4Db5zvZB4axdMHGwEaQjkZzFjQiN9LVYvIFB2nSUHW9tYpxWriPrWDASIxiaXax83REcLxuSdnGPZtw== -"@celo/abis@10.0.0": - version "10.0.0" - resolved "https://registry.yarnpkg.com/@celo/abis/-/abis-10.0.0.tgz#2c1002f2e82b29ca22cec70c988bf90d26fadc68" - integrity sha512-DC8UAEz89+1yEQqKzkxOWLYGUv/XWaqPAOkU0lKuQzhvN25ndP3fitawQl53WYn8i3ZPfRWfLO4w75l//tDSbg== - "@celo/base@^6.0.0": version "6.0.0" resolved "https://registry.yarnpkg.com/@celo/base/-/base-6.0.0.tgz#23811b0bc8730a0d13fa90df55a7ec0e90d370cd" @@ -832,27 +567,6 @@ web3-eth "1.10.0" web3-eth-contract "1.10.0" -"@celo/contractkit@^7.0.0": - version "7.0.0" - resolved "https://registry.yarnpkg.com/@celo/contractkit/-/contractkit-7.0.0.tgz#6878ce9c2b0c381703cfc26195db0678f50b6854" - integrity sha512-Knmg2TeO7W363xMNgOC5ZEKWn8RpsO4ARsq8SDLRq2O4v84GnTMPCh43+sIwmckx/d3tklhXrcHq/oLWiNLXYg== - dependencies: - "@celo/abis" "10.0.0" - "@celo/base" "^6.0.0" - "@celo/connect" "^5.1.2" - "@celo/utils" "^6.0.0" - "@celo/wallet-local" "^5.1.2" - "@types/bn.js" "^5.1.0" - "@types/debug" "^4.1.5" - bignumber.js "^9.0.0" - cross-fetch "3.0.6" - debug "^4.1.1" - fp-ts "2.1.1" - io-ts "2.0.1" - semver "^7.3.5" - web3 "1.10.0" - web3-core-helpers "1.10.0" - "@celo/cryptographic-utils@^5.0.7": version "5.1.0" resolved "https://registry.yarnpkg.com/@celo/cryptographic-utils/-/cryptographic-utils-5.1.0.tgz#64adcf4fe852d209cc4443d5e1ad9897d03de611" @@ -869,61 +583,6 @@ "@types/bn.js" "^5.1.0" "@types/node" "^18.7.16" -"@celo/dev-utils@^0.0.3": - version "0.0.3" - resolved 
"https://registry.yarnpkg.com/@celo/dev-utils/-/dev-utils-0.0.3.tgz#b0ec2b9fe31e2e8937058c902c7d2f373d988bd5" - integrity sha512-9fTYTfpRcl8c+R6G/7Mk0pRnDUSyul7im+6lIIGZ7DuvSxoi/X6v8Zqopl6S+5FKBm81dXSk4DaRcPGOn5tEDQ== - dependencies: - bignumber.js "^9.0.0" - fs-extra "^8.1.0" - ganache "npm:@celo/ganache@7.8.0-unofficial.0" - targz "^1.0.1" - tmp "^0.1.0" - web3 "1.10.4" - web3-core-helpers "1.10.4" - -"@celo/explorer@^5.0.8": - version "5.0.8" - resolved "https://registry.yarnpkg.com/@celo/explorer/-/explorer-5.0.8.tgz#48e7ec3867f3a82318b66038c1e546ab279a710a" - integrity sha512-2Qadubg9Cs33Vp8y8m2dAj4iTnL5dKgu9u3p+uacrGjWG48tax1hMU+9hpmaR0/Hc/gONNjdK3WzuqfnH02rKQ== - dependencies: - "@celo/base" "^6.0.0" - "@celo/connect" "^5.1.2" - "@celo/contractkit" "^7.0.0" - "@celo/utils" "^6.0.0" - "@types/debug" "^4.1.5" - bignumber.js "9.0.0" - cross-fetch "3.0.6" - debug "^4.1.1" - -"@celo/governance@^5.0.9": - version "5.0.9" - resolved "https://registry.yarnpkg.com/@celo/governance/-/governance-5.0.9.tgz#d7b92598ddd3f249a5e0fd45a6d5565985b1238f" - integrity sha512-PEHK6te4zx5pIqBi8MHqNjJt8flqoO6y/OHHux7labJywIrkOHk8CmEesfYm1ByaREhp7L9Fo4RwE40bf8OY2Q== - dependencies: - "@celo/abis" "10.0.0" - "@celo/base" "^6.0.0" - "@celo/connect" "^5.1.2" - "@celo/contractkit" "^7.0.0" - "@celo/explorer" "^5.0.8" - "@celo/utils" "^6.0.0" - "@ethereumjs/util" "8.0.5" - "@types/debug" "^4.1.5" - "@types/inquirer" "^6.5.0" - bignumber.js "^9.0.0" - debug "^4.1.1" - ethereum-cryptography "1.2.0" - inquirer "^7.0.5" - -"@celo/network-utils@^5.0.5": - version "5.0.5" - resolved "https://registry.yarnpkg.com/@celo/network-utils/-/network-utils-5.0.5.tgz#5d096b435cd93ea1083951966b4f349bdb82b3d4" - integrity sha512-9+0Pr3OLSS3O1baxbUSW+HfCWcctPEfwuccvuVqdLiXqcuP5Usx8IDi3XAkEQcvGQ1RS097FqZqM6t8u+Npqgw== - dependencies: - "@types/debug" "^4.1.5" - cross-fetch "3.0.6" - debug "^4.1.1" - "@celo/typechain-target-web3-v1-celo@^1.0.0": version "1.0.0" resolved 
"https://registry.yarnpkg.com/@celo/typechain-target-web3-v1-celo/-/typechain-target-web3-v1-celo-1.0.0.tgz#9b22ae2624c06d9cddb1616fa232970b45056aa9" @@ -1280,14 +939,6 @@ crc-32 "^1.2.0" ethereumjs-util "^7.1.1" -"@ethereumjs/common@2.6.5", "@ethereumjs/common@^2.5.0", "@ethereumjs/common@^2.6.3", "@ethereumjs/common@^2.6.4": - version "2.6.5" - resolved "https://registry.yarnpkg.com/@ethereumjs/common/-/common-2.6.5.tgz#0a75a22a046272579d91919cb12d84f2756e8d30" - integrity sha512-lRyVQOeCDaIVtgfbowla32pzeDv2Obr8oR8Put5RdUBNRGr1VGPGQNGP6elWIpgK3YdpzqTOh4GyUGOureVeeA== - dependencies: - crc-32 "^1.2.0" - ethereumjs-util "^7.1.5" - "@ethereumjs/common@3.1.1", "@ethereumjs/common@^3.1.1": version "3.1.1" resolved "https://registry.yarnpkg.com/@ethereumjs/common/-/common-3.1.1.tgz#6f754c8933727ad781f63ca3929caab542fe184e" @@ -1296,6 +947,14 @@ "@ethereumjs/util" "^8.0.5" crc-32 "^1.2.0" +"@ethereumjs/common@^2.4.0", "@ethereumjs/common@^2.5.0", "@ethereumjs/common@^2.6.4": + version "2.6.5" + resolved "https://registry.yarnpkg.com/@ethereumjs/common/-/common-2.6.5.tgz#0a75a22a046272579d91919cb12d84f2756e8d30" + integrity sha512-lRyVQOeCDaIVtgfbowla32pzeDv2Obr8oR8Put5RdUBNRGr1VGPGQNGP6elWIpgK3YdpzqTOh4GyUGOureVeeA== + dependencies: + crc-32 "^1.2.0" + ethereumjs-util "^7.1.5" + "@ethereumjs/ethash@^2.0.4": version "2.0.4" resolved "https://registry.yarnpkg.com/@ethereumjs/ethash/-/ethash-2.0.4.tgz#1892e8e17a11b10efeee3075fb09cd3cbd12c33b" @@ -1368,14 +1027,6 @@ "@ethereumjs/common" "^2.5.0" ethereumjs-util "^7.1.2" -"@ethereumjs/tx@3.5.2", "@ethereumjs/tx@^3.3.0", "@ethereumjs/tx@^3.5.1": - version "3.5.2" - resolved "https://registry.yarnpkg.com/@ethereumjs/tx/-/tx-3.5.2.tgz#197b9b6299582ad84f9527ca961466fce2296c1c" - integrity sha512-gQDNJWKrSDGu2w7w0PzVXVBNMzb7wwdDOmOqczmhNjqFxFuIbhVJDwiGEnxFNC2/b8ifcZzY7MLcluizohRzNw== - dependencies: - "@ethereumjs/common" "^2.6.4" - ethereumjs-util "^7.1.5" - "@ethereumjs/tx@4.1.1", "@ethereumjs/tx@^4.1.1": version "4.1.1" 
resolved "https://registry.yarnpkg.com/@ethereumjs/tx/-/tx-4.1.1.tgz#d1b5bf2c4fd3618f2f333b66e262848530d4686a" @@ -1388,6 +1039,14 @@ "@ethersproject/providers" "^5.7.2" ethereum-cryptography "^1.1.2" +"@ethereumjs/tx@^3.3.0": + version "3.5.2" + resolved "https://registry.yarnpkg.com/@ethereumjs/tx/-/tx-3.5.2.tgz#197b9b6299582ad84f9527ca961466fce2296c1c" + integrity sha512-gQDNJWKrSDGu2w7w0PzVXVBNMzb7wwdDOmOqczmhNjqFxFuIbhVJDwiGEnxFNC2/b8ifcZzY7MLcluizohRzNw== + dependencies: + "@ethereumjs/common" "^2.6.4" + ethereumjs-util "^7.1.5" + "@ethereumjs/util@8.0.2": version "8.0.2" resolved "https://registry.yarnpkg.com/@ethereumjs/util/-/util-8.0.2.tgz#b7348fc7253649b0f00685a94546c6eee1fad819" @@ -1800,114 +1459,6 @@ resolved "https://registry.yarnpkg.com/@gar/promisify/-/promisify-1.1.3.tgz#555193ab2e3bb3b6adc3d551c9c030d9e860daf6" integrity sha512-k2Ty1JcVojjJFwrg/ThKi2ujJ7XNLYaFGNB/bWT9wGR+oSMJHMa5w+CUq6p/pVrKeNNgA7pCqEcjSnHVoqJQFw== -"@google-cloud/common@^0.32.0": - version "0.32.1" - resolved "https://registry.yarnpkg.com/@google-cloud/common/-/common-0.32.1.tgz#6a32c340172cea3db6674d0e0e34e78740a0073f" - integrity sha512-bLdPzFvvBMtVkwsoBtygE9oUm3yrNmPa71gvOgucYI/GqvNP2tb6RYsDHPq98kvignhcgHGDI5wyNgxaCo8bKQ== - dependencies: - "@google-cloud/projectify" "^0.3.3" - "@google-cloud/promisify" "^0.4.0" - "@types/request" "^2.48.1" - arrify "^2.0.0" - duplexify "^3.6.0" - ent "^2.2.0" - extend "^3.0.2" - google-auth-library "^3.1.1" - pify "^4.0.1" - retry-request "^4.0.0" - teeny-request "^3.11.3" - -"@google-cloud/monitoring@0.7.1": - version "0.7.1" - resolved "https://registry.yarnpkg.com/@google-cloud/monitoring/-/monitoring-0.7.1.tgz#9afd2c8b237f01675dcaea8c5c7b3de9b58e24ad" - integrity sha512-RsBM/3pvlKvJdSuiMKU0cJ7XKZiMsrXvdY2AexK75659cou4SxhbqM60iaU9lqAbIcMcuhH8rc1AG3cRsDhYhQ== - dependencies: - google-gax "^0.25.0" - lodash.merge "^4.6.0" - -"@google-cloud/paginator@^0.2.0": - version "0.2.0" - resolved 
"https://registry.yarnpkg.com/@google-cloud/paginator/-/paginator-0.2.0.tgz#eab2e6aa4b81df7418f6c51e2071f64dab2c2fa5" - integrity sha512-2ZSARojHDhkLvQ+CS32K+iUhBsWg3AEw+uxtqblA7xoCABDyhpj99FPp35xy6A+XlzMhOSrHHaxFE+t6ZTQq0w== - dependencies: - arrify "^1.0.1" - extend "^3.0.1" - split-array-stream "^2.0.0" - stream-events "^1.0.4" - -"@google-cloud/precise-date@^0.1.0": - version "0.1.0" - resolved "https://registry.yarnpkg.com/@google-cloud/precise-date/-/precise-date-0.1.0.tgz#02ccda04b4413fa64f098fc93db51e95af5c855a" - integrity sha512-nXt4AskYjmDLRIO+nquVVppjiLE5ficFRP3WF1JYtPnSRFRpuMusa1kysPsD/yOxt5NMmvlkUCkaFI4rHYeckQ== - -"@google-cloud/projectify@^0.3.0", "@google-cloud/projectify@^0.3.3": - version "0.3.3" - resolved "https://registry.yarnpkg.com/@google-cloud/projectify/-/projectify-0.3.3.tgz#bde9103d50b20a3ea3337df8c6783a766e70d41d" - integrity sha512-7522YHQ4IhaafgSunsFF15nG0TGVmxgXidy9cITMe+256RgqfcrfWphiMufW+Ou4kqagW/u3yxwbzVEW3dk2Uw== - -"@google-cloud/promisify@^0.4.0": - version "0.4.0" - resolved "https://registry.yarnpkg.com/@google-cloud/promisify/-/promisify-0.4.0.tgz#4fbfcf4d85bb6a2e4ccf05aa63d2b10d6c9aad9b" - integrity sha512-4yAHDC52TEMCNcMzVC8WlqnKKKq+Ssi2lXoUg9zWWkZ6U6tq9ZBRYLHHCRdfU+EU9YJsVmivwGcKYCjRGjnf4Q== - -"@google-cloud/pubsub@^0.28.1": - version "0.28.1" - resolved "https://registry.yarnpkg.com/@google-cloud/pubsub/-/pubsub-0.28.1.tgz#8d0605e155f5a8c36f7b51363c1e139f534b5fd8" - integrity sha512-ukvR2S6DgerEJ5T0e9G2XTyk83Ajjfhy2GdNHR3qOIkFZTn1VjqnMbGK8oWtnYm4+hZ9PHPiZY4LnxvapmwaRA== - dependencies: - "@google-cloud/paginator" "^0.2.0" - "@google-cloud/precise-date" "^0.1.0" - "@google-cloud/projectify" "^0.3.0" - "@google-cloud/promisify" "^0.4.0" - "@sindresorhus/is" "^0.15.0" - "@types/duplexify" "^3.6.0" - "@types/long" "^4.0.0" - "@types/p-defer" "^1.0.3" - arrify "^1.0.0" - async-each "^1.0.1" - extend "^3.0.1" - google-auth-library "^3.0.0" - google-gax "^0.25.0" - is-stream-ended "^0.1.4" - lodash.merge "^4.6.0" - 
lodash.snakecase "^4.1.1" - p-defer "^1.0.0" - protobufjs "^6.8.1" - -"@google-cloud/secret-manager@3.12.0": - version "3.12.0" - resolved "https://registry.yarnpkg.com/@google-cloud/secret-manager/-/secret-manager-3.12.0.tgz#ff975190e45da3aaa762905f2b7c679c06a0f4a3" - integrity sha512-nFNm5lYgH2RRAn1x8vPKt1c+MBMJtBXqkYB5Jpi68PjN7Emjcu7/kl+0/+FamFLp3qJUU7RKRCwETFNNZAOkuw== - dependencies: - google-gax "^2.30.0" - -"@google-cloud/storage@^2.4.3": - version "2.5.0" - resolved "https://registry.yarnpkg.com/@google-cloud/storage/-/storage-2.5.0.tgz#9dd3566d8155cf5ba0c212208f69f9ecd47fbd7e" - integrity sha512-q1mwB6RUebIahbA3eriRs8DbG2Ij81Ynb9k8hMqTPkmbd8/S6Z0d6hVvfPmnyvX9Ej13IcmEYIbymuq/RBLghA== - dependencies: - "@google-cloud/common" "^0.32.0" - "@google-cloud/paginator" "^0.2.0" - "@google-cloud/promisify" "^0.4.0" - arrify "^1.0.0" - async "^2.0.1" - compressible "^2.0.12" - concat-stream "^2.0.0" - date-and-time "^0.6.3" - duplexify "^3.5.0" - extend "^3.0.0" - gcs-resumable-upload "^1.0.0" - hash-stream-validation "^0.2.1" - mime "^2.2.0" - mime-types "^2.0.8" - onetime "^5.1.0" - pumpify "^1.5.1" - snakeize "^0.1.0" - stream-events "^1.0.1" - teeny-request "^3.11.3" - through2 "^3.0.0" - xdg-basedir "^3.0.0" - "@graphql-tools/batch-execute@8.5.1": version "8.5.1" resolved "https://registry.yarnpkg.com/@graphql-tools/batch-execute/-/batch-execute-8.5.1.tgz#fa3321d58c64041650be44250b1ebc3aab0ba7a9" @@ -1996,51 +1547,6 @@ resolved "https://registry.yarnpkg.com/@graphql-typed-document-node/core/-/core-3.2.0.tgz#5f3d96ec6b2354ad6d8a28bf216a1d97b5426861" integrity sha512-mB9oAsNCm9aM3/SOv4YtBMqZbYj10R7dkq8byBqxGY/ncFwhf2oQzMV+LCRlWoDSEBJ3COiR1yeDvMtsoOsuFQ== -"@grpc/grpc-js@^0.3.0": - version "0.3.6" - resolved "https://registry.yarnpkg.com/@grpc/grpc-js/-/grpc-js-0.3.6.tgz#d9b52043907170d38e06711d9477fde29ab46fa8" - integrity sha512-SmLNuPGlUur64bNS9aHZguqWDVQ8+Df1CGn+xsh7l6T2wiP5ArOMlywZ3TZo6z/rwKtGQgUJY9ZrPYUmHEXd/Q== - dependencies: - semver "^5.5.0" - 
-"@grpc/grpc-js@~1.6.0": - version "1.6.12" - resolved "https://registry.yarnpkg.com/@grpc/grpc-js/-/grpc-js-1.6.12.tgz#20f710d8a8c5c396b2ae9530ba6c06b984614fdf" - integrity sha512-JmvQ03OTSpVd9JTlj/K3IWHSz4Gk/JMLUTtW7Zb0KvO1LcOYGATh5cNuRYzCAeDR3O8wq+q8FZe97eO9MBrkUw== - dependencies: - "@grpc/proto-loader" "^0.7.0" - "@types/node" ">=12.12.47" - -"@grpc/proto-loader@^0.4.0": - version "0.4.0" - resolved "https://registry.yarnpkg.com/@grpc/proto-loader/-/proto-loader-0.4.0.tgz#a823a51eb2fde58369bef1deb5445fd808d70901" - integrity sha512-Jm6o+75uWT7E6+lt8edg4J1F/9+BedOjaMgwE14pxS/AO43/0ZqK+rCLVVrXLoExwSAZvgvOD2B0ivy3Spsspw== - dependencies: - lodash.camelcase "^4.3.0" - protobufjs "^6.8.6" - -"@grpc/proto-loader@^0.6.12": - version "0.6.13" - resolved "https://registry.yarnpkg.com/@grpc/proto-loader/-/proto-loader-0.6.13.tgz#008f989b72a40c60c96cd4088522f09b05ac66bc" - integrity sha512-FjxPYDRTn6Ec3V0arm1FtSpmP6V50wuph2yILpyvTKzjc76oDdoihXqM1DzOW5ubvCC8GivfCnNtfaRE8myJ7g== - dependencies: - "@types/long" "^4.0.1" - lodash.camelcase "^4.3.0" - long "^4.0.0" - protobufjs "^6.11.3" - yargs "^16.2.0" - -"@grpc/proto-loader@^0.7.0": - version "0.7.6" - resolved "https://registry.yarnpkg.com/@grpc/proto-loader/-/proto-loader-0.7.6.tgz#b71fdf92b184af184b668c4e9395a5ddc23d61de" - integrity sha512-QyAXR8Hyh7uMDmveWxDSUcJr9NAWaZ2I6IXgAYvQmfflwouTM+rArE2eEaCtLlRqO81j7pRLCt81IefUei6Zbw== - dependencies: - "@types/long" "^4.0.1" - lodash.camelcase "^4.3.0" - long "^4.0.0" - protobufjs "^7.0.0" - yargs "^16.2.0" - "@humanwhocodes/config-array@^0.11.13": version "0.11.14" resolved "https://registry.yarnpkg.com/@humanwhocodes/config-array/-/config-array-0.11.14.tgz#d78e481a039f7566ecc9660b4ea7fe6b1fec442b" @@ -2345,89 +1851,6 @@ "@jridgewell/resolve-uri" "3.1.0" "@jridgewell/sourcemap-codec" "1.4.14" -"@ledgerhq/cryptoassets@^5.53.0": - version "5.53.0" - resolved "https://registry.yarnpkg.com/@ledgerhq/cryptoassets/-/cryptoassets-5.53.0.tgz#11dcc93211960c6fd6620392e4dd91896aaabe58" 
- integrity sha512-M3ibc3LRuHid5UtL7FI3IC6nMEppvly98QHFoSa7lJU0HDzQxY6zHec/SPM4uuJUC8sXoGVAiRJDkgny54damw== - dependencies: - invariant "2" - -"@ledgerhq/devices@^5.51.1": - version "5.51.1" - resolved "https://registry.yarnpkg.com/@ledgerhq/devices/-/devices-5.51.1.tgz#d741a4a5d8f17c2f9d282fd27147e6fe1999edb7" - integrity sha512-4w+P0VkbjzEXC7kv8T1GJ/9AVaP9I6uasMZ/JcdwZBS3qwvKo5A5z9uGhP5c7TvItzcmPb44b5Mw2kT+WjUuAA== - dependencies: - "@ledgerhq/errors" "^5.50.0" - "@ledgerhq/logs" "^5.50.0" - rxjs "6" - semver "^7.3.5" - -"@ledgerhq/errors@^5.50.0": - version "5.50.0" - resolved "https://registry.yarnpkg.com/@ledgerhq/errors/-/errors-5.50.0.tgz#e3a6834cb8c19346efca214c1af84ed28e69dad9" - integrity sha512-gu6aJ/BHuRlpU7kgVpy2vcYk6atjB4iauP2ymF7Gk0ez0Y/6VSMVSJvubeEQN+IV60+OBK0JgeIZG7OiHaw8ow== - -"@ledgerhq/hw-app-eth@^5.53.0": - version "5.53.0" - resolved "https://registry.yarnpkg.com/@ledgerhq/hw-app-eth/-/hw-app-eth-5.53.0.tgz#5df2d7427db9f387099d0cc437e9730101d7c404" - integrity sha512-LKi/lDA9tW0GdoYP1ng0VY/PXNYjSrwZ1cj0R0MQ9z+knmFlPcVkGK2MEqE8W8cXrC0tjsUXITMcngvpk5yfKA== - dependencies: - "@ledgerhq/cryptoassets" "^5.53.0" - "@ledgerhq/errors" "^5.50.0" - "@ledgerhq/hw-transport" "^5.51.1" - "@ledgerhq/logs" "^5.50.0" - bignumber.js "^9.0.1" - ethers "^5.2.0" - -"@ledgerhq/hw-transport-node-hid-noevents@^5.51.1": - version "5.51.1" - resolved "https://registry.yarnpkg.com/@ledgerhq/hw-transport-node-hid-noevents/-/hw-transport-node-hid-noevents-5.51.1.tgz#71f37f812e448178ad0bcc2258982150d211c1ab" - integrity sha512-9wFf1L8ZQplF7XOY2sQGEeOhpmBRzrn+4X43kghZ7FBDoltrcK+s/D7S+7ffg3j2OySyP6vIIIgloXylao5Scg== - dependencies: - "@ledgerhq/devices" "^5.51.1" - "@ledgerhq/errors" "^5.50.0" - "@ledgerhq/hw-transport" "^5.51.1" - "@ledgerhq/logs" "^5.50.0" - node-hid "2.1.1" - -"@ledgerhq/hw-transport-node-hid@^5.51.1": - version "5.51.1" - resolved 
"https://registry.yarnpkg.com/@ledgerhq/hw-transport-node-hid/-/hw-transport-node-hid-5.51.1.tgz#fe8eb81e18929663540698c80905952cdbe542d5" - integrity sha512-Y2eVCCdhVs2Lfr7N2x2cNb+ogcZ24ZATO4QxaQ7LogjiPwYmzmvuXFn8zFjMSrKUCn9CtbptXcuiu0NkGsjWlw== - dependencies: - "@ledgerhq/devices" "^5.51.1" - "@ledgerhq/errors" "^5.50.0" - "@ledgerhq/hw-transport" "^5.51.1" - "@ledgerhq/hw-transport-node-hid-noevents" "^5.51.1" - "@ledgerhq/logs" "^5.50.0" - lodash "^4.17.21" - node-hid "2.1.1" - usb "^1.7.0" - -"@ledgerhq/hw-transport-webusb@^5.53.1": - version "5.53.1" - resolved "https://registry.yarnpkg.com/@ledgerhq/hw-transport-webusb/-/hw-transport-webusb-5.53.1.tgz#3df8c401417571e3bcacc378d8aca587214b05ae" - integrity sha512-A/f+xcrkIAZiJrvPpDvsrjxQX4cI2kbdiunQkwsYmOG3Bp4z89ZnsBiC7YBst4n2/g+QgTg0/KPVtODU5djooQ== - dependencies: - "@ledgerhq/devices" "^5.51.1" - "@ledgerhq/errors" "^5.50.0" - "@ledgerhq/hw-transport" "^5.51.1" - "@ledgerhq/logs" "^5.50.0" - -"@ledgerhq/hw-transport@^5.51.1": - version "5.51.1" - resolved "https://registry.yarnpkg.com/@ledgerhq/hw-transport/-/hw-transport-5.51.1.tgz#8dd14a8e58cbee4df0c29eaeef983a79f5f22578" - integrity sha512-6wDYdbWrw9VwHIcoDnqWBaDFyviyjZWv6H9vz9Vyhe4Qd7TIFmbTl/eWs6hZvtZBza9K8y7zD8ChHwRI4s9tSw== - dependencies: - "@ledgerhq/devices" "^5.51.1" - "@ledgerhq/errors" "^5.50.0" - events "^3.3.0" - -"@ledgerhq/logs@^5.50.0": - version "5.50.0" - resolved "https://registry.yarnpkg.com/@ledgerhq/logs/-/logs-5.50.0.tgz#29c6419e8379d496ab6d0426eadf3c4d100cd186" - integrity sha512-swKHYCOZUGyVt4ge0u8a7AwNcA//h4nx5wIi0sruGye1IJ5Cva0GyK9L2/WdX+kWVTKp92ZiEo1df31lrWGPgA== - "@lerna/add@5.6.2": version "5.6.2" resolved "https://registry.yarnpkg.com/@lerna/add/-/add-5.6.2.tgz#d0e25fd4900b6f8a9548f940cc016ce8a3e2d2ba" @@ -3112,6 +2535,17 @@ npmlog "^6.0.2" write-file-atomic "^4.0.1" +"@metamask/eth-sig-util@4.0.1": + version "4.0.1" + resolved 
"https://registry.yarnpkg.com/@metamask/eth-sig-util/-/eth-sig-util-4.0.1.tgz#3ad61f6ea9ad73ba5b19db780d40d9aae5157088" + integrity sha512-tghyZKLHZjcdlDqCA3gNZmLeR0XvOE9U1qoQO9ohyAZT6Pya+H9vkBPcsyXytmYLNgVoin7CKCmweo/R43V+tQ== + dependencies: + ethereumjs-abi "^0.6.8" + ethereumjs-util "^6.2.1" + ethjs-util "^0.1.6" + tweetnacl "^1.0.3" + tweetnacl-util "^0.15.1" + "@metamask/safe-event-emitter@^2.0.0": version "2.0.0" resolved "https://registry.yarnpkg.com/@metamask/safe-event-emitter/-/safe-event-emitter-2.0.0.tgz#af577b477c683fad17c619a78208cede06f9605c" @@ -3143,6 +2577,20 @@ dependencies: "@noble/hashes" "1.3.3" +"@noble/curves@1.8.2", "@noble/curves@~1.8.1": + version "1.8.2" + resolved "https://registry.yarnpkg.com/@noble/curves/-/curves-1.8.2.tgz#8f24c037795e22b90ae29e222a856294c1d9ffc7" + integrity sha512-vnI7V6lFNe0tLAuJMu+2sX+FcL14TaCWy1qiczg1VwRmPrpQCdq5ESXQMqUc2tluRNf6irBXrWbl1mGN8uaU/g== + dependencies: + "@noble/hashes" "1.7.2" + +"@noble/curves@^1.6.0", "@noble/curves@~1.9.0": + version "1.9.7" + resolved "https://registry.yarnpkg.com/@noble/curves/-/curves-1.9.7.tgz#79d04b4758a43e4bca2cbdc62e7771352fa6b951" + integrity sha512-gbKGcRUYIjA3/zCCNaWDciTMFI0dCkvou3TL8Zmy5Nc7sJ47a0jtOeZoTaMxkuqRo9cRhjOdZJXegxYE5FN/xw== + dependencies: + "@noble/hashes" "1.8.0" + "@noble/curves@~1.4.0": version "1.4.2" resolved "https://registry.yarnpkg.com/@noble/curves/-/curves-1.4.2.tgz#40309198c76ed71bc6dbf7ba24e81ceb4d0d1fe9" @@ -3150,6 +2598,11 @@ dependencies: "@noble/hashes" "1.4.0" +"@noble/hashes@1.1.2": + version "1.1.2" + resolved "https://registry.yarnpkg.com/@noble/hashes/-/hashes-1.1.2.tgz#e9e035b9b166ca0af657a7848eb2718f0f22f183" + integrity sha512-KYRCASVTv6aeUi1tsF8/vpyR7zpfs3FUzy2Jqm+MU+LmUKhQ0y2FpfwqkCcxSg2ua4GALJd8k2R76WxwZGbQpA== + "@noble/hashes@1.2.0", "@noble/hashes@~1.2.0": version "1.2.0" resolved "https://registry.yarnpkg.com/@noble/hashes/-/hashes-1.2.0.tgz#a3150eeb09cc7ab207ebf6d7b9ad311a9bdbed12" @@ -3175,6 +2628,26 @@ resolved 
"https://registry.yarnpkg.com/@noble/hashes/-/hashes-1.4.0.tgz#45814aa329f30e4fe0ba49426f49dfccdd066426" integrity sha512-V1JJ1WTRUqHHrOSh597hURcMqVKVGL/ea3kv0gSnEdsEZ0/+VyPghM1lMNGc00z7CIQorSvbKpuJkxvuHbvdbg== +"@noble/hashes@1.7.2", "@noble/hashes@~1.7.1": + version "1.7.2" + resolved "https://registry.yarnpkg.com/@noble/hashes/-/hashes-1.7.2.tgz#d53c65a21658fb02f3303e7ee3ba89d6754c64b4" + integrity sha512-biZ0NUSxyjLLqo6KxEJ1b+C2NAx0wtDoFvCaXHGgUkeHzf3Xc1xKumFKREuT7f7DARNZ/slvYUwFG6B0f2b6hQ== + +"@noble/hashes@1.8.0", "@noble/hashes@^1.2.0", "@noble/hashes@^1.5.0", "@noble/hashes@~1.8.0": + version "1.8.0" + resolved "https://registry.yarnpkg.com/@noble/hashes/-/hashes-1.8.0.tgz#cee43d801fcef9644b11b8194857695acd5f815a" + integrity sha512-jCs9ldd7NwzpgXDIf6P3+NrHh9/sD6CQdxHyjQI+h/6rDNo88ypBxxz45UDuZHz9r3tNz7N/VInSVoVdtXEI4A== + +"@noble/hashes@~1.1.1": + version "1.1.5" + resolved "https://registry.yarnpkg.com/@noble/hashes/-/hashes-1.1.5.tgz#1a0377f3b9020efe2fae03290bd2a12140c95c11" + integrity sha512-LTMZiiLc+V4v1Yi16TD6aX2gmtKszNye0pQgbaLqkvhIqP7nVsSaJsWloGQjJfJ8offaoP5GtX3yY5swbcJxxQ== + +"@noble/secp256k1@1.6.3", "@noble/secp256k1@~1.6.0": + version "1.6.3" + resolved "https://registry.yarnpkg.com/@noble/secp256k1/-/secp256k1-1.6.3.tgz#7eed12d9f4404b416999d0c87686836c4c5c9b94" + integrity sha512-T04e4iTurVy7I8Sw4+c5OSN9/RkPlo1uKxAomtxQNLq8j1uPAqnsqG1bqvY3Jv7c13gyr6dui0zmh/I3+f/JaQ== + "@noble/secp256k1@1.7.1", "@noble/secp256k1@~1.7.0": version "1.7.1" resolved "https://registry.yarnpkg.com/@noble/secp256k1/-/secp256k1-1.7.1.tgz#b251c70f824ce3ca7f8dc3df08d58f005cc0507c" @@ -3722,6 +3195,20 @@ resolved "https://registry.yarnpkg.com/@scure/base/-/base-1.1.7.tgz#fe973311a5c6267846aa131bc72e96c5d40d2b30" integrity sha512-PPNYBslrLNNUQ/Yad37MHYsNQtK67EhWb6WtSvNLLPo7SdVZgkUjD6Dg+5On7zNwmskf8OX7I7Nx5oN+MIWE0g== +"@scure/base@~1.2.2", "@scure/base@~1.2.4", "@scure/base@~1.2.5": + version "1.2.6" + resolved 
"https://registry.yarnpkg.com/@scure/base/-/base-1.2.6.tgz#ca917184b8231394dd8847509c67a0be522e59f6" + integrity sha512-g/nm5FgUa//MCj1gV09zTJTaM6KBAHqLN907YVQqf7zC49+DcO4B1so4ZX07Ef10Twr6nuqYEH9GEggFXA4Fmg== + +"@scure/bip32@1.1.0": + version "1.1.0" + resolved "https://registry.yarnpkg.com/@scure/bip32/-/bip32-1.1.0.tgz#dea45875e7fbc720c2b4560325f1cf5d2246d95b" + integrity sha512-ftTW3kKX54YXLCxH6BB7oEEoJfoE2pIgw7MINKAs5PsS6nqKPuKk1haTF/EuHmYqG330t5GSrdmtRuHaY1a62Q== + dependencies: + "@noble/hashes" "~1.1.1" + "@noble/secp256k1" "~1.6.0" + "@scure/base" "~1.1.0" + "@scure/bip32@1.1.5": version "1.1.5" resolved "https://registry.yarnpkg.com/@scure/bip32/-/bip32-1.1.5.tgz#d2ccae16dcc2e75bc1d75f5ef3c66a338d1ba300" @@ -3758,6 +3245,15 @@ "@noble/hashes" "~1.3.2" "@scure/base" "~1.1.4" +"@scure/bip32@1.6.2": + version "1.6.2" + resolved "https://registry.yarnpkg.com/@scure/bip32/-/bip32-1.6.2.tgz#093caa94961619927659ed0e711a6e4bf35bffd0" + integrity sha512-t96EPDMbtGgtb7onKKqxRLfE5g05k7uHnHRM2xdE6BP/ZmxaLtPek4J4KfVn/90IQNrU1IOAqMgiDtUdtbe3nw== + dependencies: + "@noble/curves" "~1.8.1" + "@noble/hashes" "~1.7.1" + "@scure/base" "~1.2.2" + "@scure/bip32@^1.3.3": version "1.4.0" resolved "https://registry.yarnpkg.com/@scure/bip32/-/bip32-1.4.0.tgz#4e1f1e196abedcef395b33b9674a042524e20d67" @@ -3767,6 +3263,23 @@ "@noble/hashes" "~1.4.0" "@scure/base" "~1.1.6" +"@scure/bip32@^1.5.0": + version "1.7.0" + resolved "https://registry.yarnpkg.com/@scure/bip32/-/bip32-1.7.0.tgz#b8683bab172369f988f1589640e53c4606984219" + integrity sha512-E4FFX/N3f4B80AKWp5dP6ow+flD1LQZo/w8UnLGYZO674jS6YnYeepycOOksv+vLPSpgN35wgKgy+ybfTb2SMw== + dependencies: + "@noble/curves" "~1.9.0" + "@noble/hashes" "~1.8.0" + "@scure/base" "~1.2.5" + +"@scure/bip39@1.1.0": + version "1.1.0" + resolved "https://registry.yarnpkg.com/@scure/bip39/-/bip39-1.1.0.tgz#92f11d095bae025f166bef3defcc5bf4945d419a" + integrity 
sha512-pwrPOS16VeTKg98dYXQyIjJEcWfz7/1YJIwxUEPFfQPtc86Ym/1sVgQ2RLoD43AazMk2l/unK4ITySSpW2+82w== + dependencies: + "@noble/hashes" "~1.1.1" + "@scure/base" "~1.1.0" + "@scure/bip39@1.1.1": version "1.1.1" resolved "https://registry.yarnpkg.com/@scure/bip39/-/bip39-1.1.1.tgz#b54557b2e86214319405db819c4b6a370cf340c5" @@ -3791,6 +3304,14 @@ "@noble/hashes" "~1.3.2" "@scure/base" "~1.1.4" +"@scure/bip39@1.5.4": + version "1.5.4" + resolved "https://registry.yarnpkg.com/@scure/bip39/-/bip39-1.5.4.tgz#07fd920423aa671be4540d59bdd344cc1461db51" + integrity sha512-TFM4ni0vKvCfBpohoh+/lY05i9gRbSwXWngAsF4CABQxoaOHijxuaZ2R6cStDQ5CHtHO9aGJTr4ksVJASRRyMA== + dependencies: + "@noble/hashes" "~1.7.1" + "@scure/base" "~1.2.4" + "@scure/bip39@^1.2.2": version "1.3.0" resolved "https://registry.yarnpkg.com/@scure/bip39/-/bip39-1.3.0.tgz#0f258c16823ddd00739461ac31398b4e7d6a18c3" @@ -3799,6 +3320,14 @@ "@noble/hashes" "~1.4.0" "@scure/base" "~1.1.6" +"@scure/bip39@^1.4.0": + version "1.6.0" + resolved "https://registry.yarnpkg.com/@scure/bip39/-/bip39-1.6.0.tgz#475970ace440d7be87a6086cbee77cb8f1a684f9" + integrity sha512-+lF0BbLiJNwVlev4eKelw1WWLaiKXw7sSl8T6FvBlWkdX+94aGJ4o8XjUdlyhTCjd8c+B3KT3JfS8P0bLRNU6A== + dependencies: + "@noble/hashes" "~1.8.0" + "@scure/base" "~1.2.5" + "@sinclair/typebox@^0.25.16": version "0.25.24" resolved "https://registry.yarnpkg.com/@sinclair/typebox/-/typebox-0.25.24.tgz#8c7688559979f7079aacaf31aa881c3aa410b718" @@ -3809,11 +3338,6 @@ resolved "https://registry.yarnpkg.com/@sindresorhus/is/-/is-0.14.0.tgz#9fb3a3cf3132328151f353de4632e01e52102bea" integrity sha512-9NET910DNaIPngYnLLPeg+Ogzqsi9uM4mSboU5y6p8S5DzMTVEsJZrawi+BoDNUVBa2DhJqQYUFvMDfgU062LQ== -"@sindresorhus/is@^0.15.0": - version "0.15.0" - resolved "https://registry.yarnpkg.com/@sindresorhus/is/-/is-0.15.0.tgz#96915baa05e6a6a1d137badf4984d3fc05820bb6" - integrity sha512-lu8BpxjAtRCAo5ifytTpCPCj99LF7o/2Myn+NXyNCBqvPYn7Pjd76AMmUB5l7XF1U6t0hcWrlEM5ESufW7wAeA== - "@sindresorhus/is@^4.0.0", 
"@sindresorhus/is@^4.6.0": version "4.6.0" resolved "https://registry.yarnpkg.com/@sindresorhus/is/-/is-4.6.0.tgz#3c7c9c46e678feefe7a2e5bb609d3dbd665ffb3f" @@ -3895,11 +3419,6 @@ resolved "https://registry.yarnpkg.com/@stablelib/wipe/-/wipe-0.5.0.tgz#a682d5f9448e950e099e537e6f72fc960275d151" integrity sha512-SifvRV0rTTFR1qEF6G1hondGZyrmiM1laR8PPrO6TZwQG03hJduVbUX8uQk+Q6FdkND2Z9B8uLPyUAquQIk3iA== -"@summa-tx/memview.sol@^1.1.0": - version "1.1.0" - resolved "https://registry.yarnpkg.com/@summa-tx/memview.sol/-/memview.sol-1.1.0.tgz#f54a09faef46ca52deb94e10d0daacd28efe5be6" - integrity sha512-nLEfdC0ayn+PND5WWGalv+IyQ0NYomz43s1IJWDjawUh4JvW03LeP7ZjuMJPLdyJqYX3p8x1LwYt8Klde7cluw== - "@szmarczak/http-timer@^1.1.2": version "1.1.2" resolved "https://registry.yarnpkg.com/@szmarczak/http-timer/-/http-timer-1.1.2.tgz#b1665e2c461a2cd92f4c1bbf50d5454de0d4b421" @@ -3953,15 +3472,6 @@ fast-check "3.1.1" web3-utils "1.10.0" -"@truffle/artifactor@4.0.180": - version "4.0.180" - resolved "https://registry.yarnpkg.com/@truffle/artifactor/-/artifactor-4.0.180.tgz#5dded58d923d5c2e6442d09d4d3d3d9b10646c05" - integrity sha512-XWY1w5KMYsYQ9XN+QbR5W95HXLKumxdi3so70SRUkmmEx9+vTlvhzJ/aeyCtuOzRO6WjdZ6pOaUxMNcW5BqQDg== - dependencies: - "@truffle/contract-schema" "^3.4.11" - fs-extra "^9.1.0" - lodash "^4.17.21" - "@truffle/artifactor@^4.0.36": version "4.0.188" resolved "https://registry.yarnpkg.com/@truffle/artifactor/-/artifactor-4.0.188.tgz#2197300fd61a4196f2be9599fc0bb82e5d0b1817" @@ -4455,6 +3965,33 @@ glob "^7.1.2" web3-utils "1.2.1" +"@truffle/hdwallet-provider@^2.1.15": + version "2.1.15" + resolved "https://registry.yarnpkg.com/@truffle/hdwallet-provider/-/hdwallet-provider-2.1.15.tgz#fbf8e19d112db81b109ebc06ac6d9d42124b512c" + integrity sha512-I5cSS+5LygA3WFzru9aC5+yDXVowEEbLCx0ckl/RqJ2/SCiYXkzYlR5/DjjDJuCtYhivhrn2RP9AheeFlRF+qw== + dependencies: + "@ethereumjs/common" "^2.4.0" + "@ethereumjs/tx" "^3.3.0" + "@metamask/eth-sig-util" "4.0.1" + "@truffle/hdwallet" "^0.1.4" + 
"@types/ethereum-protocol" "^1.0.0" + "@types/web3" "1.0.20" + "@types/web3-provider-engine" "^14.0.0" + ethereum-cryptography "1.1.2" + ethereum-protocol "^1.0.1" + ethereumjs-util "^7.1.5" + web3 "1.10.0" + web3-provider-engine "16.0.3" + +"@truffle/hdwallet@^0.1.4": + version "0.1.4" + resolved "https://registry.yarnpkg.com/@truffle/hdwallet/-/hdwallet-0.1.4.tgz#eeb21163d9e295692a0ba2fa848cc7b5a29b0ded" + integrity sha512-D3SN0iw3sMWUXjWAedP6RJtopo9qQXYi80inzbtcsoso4VhxFxCwFvCErCl4b27AEJ9pkAtgnxEFRaSKdMmi1Q== + dependencies: + ethereum-cryptography "1.1.2" + keccak "3.0.2" + secp256k1 "4.0.3" + "@truffle/interface-adapter@^0.5.26", "@truffle/interface-adapter@^0.5.32": version "0.5.32" resolved "https://registry.yarnpkg.com/@truffle/interface-adapter/-/interface-adapter-0.5.32.tgz#2ae896ea85a9d267abcd5d0139afc0f06ebc5745" @@ -4545,25 +4082,6 @@ dependencies: "@truffle/config" "^1.3.61" -"@truffle/resolver@9.0.53", "@truffle/resolver@^9.0.35": - version "9.0.53" - resolved "https://registry.yarnpkg.com/@truffle/resolver/-/resolver-9.0.53.tgz#1864463c17b4aa54136ae015a40d19d15a33c97b" - integrity sha512-jYqHIucs6yMCOpKFwnvcW6cfpn/WEWJQ8FN0EUhf0r0HMz9TjG9HnabBZSvfMBFPAmKklGR/GI0GESWf3alpXQ== - dependencies: - "@ganache/console.log" "0.3.0" - "@truffle/compile-solidity" "^6.0.79" - "@truffle/contract" "^4.6.31" - "@truffle/contract-sources" "^0.2.1" - "@truffle/expect" "^0.1.7" - "@truffle/provisioner" "^0.2.84" - abi-to-sol "^0.7.0" - debug "^4.3.1" - detect-installed "^2.0.4" - fs-extra "^9.1.0" - get-installed-path "^4.0.8" - glob "^7.1.6" - web3-utils "1.10.0" - "@truffle/resolver@^5.0.18": version "5.1.12" resolved "https://registry.yarnpkg.com/@truffle/resolver/-/resolver-5.1.12.tgz#69c220973bb3326c3513a8e3e5a524f3aaa30f62" @@ -4579,6 +4097,25 @@ source-map-support "^0.5.16" supports-color "^7.1.0" +"@truffle/resolver@^9.0.35": + version "9.0.53" + resolved 
"https://registry.yarnpkg.com/@truffle/resolver/-/resolver-9.0.53.tgz#1864463c17b4aa54136ae015a40d19d15a33c97b" + integrity sha512-jYqHIucs6yMCOpKFwnvcW6cfpn/WEWJQ8FN0EUhf0r0HMz9TjG9HnabBZSvfMBFPAmKklGR/GI0GESWf3alpXQ== + dependencies: + "@ganache/console.log" "0.3.0" + "@truffle/compile-solidity" "^6.0.79" + "@truffle/contract" "^4.6.31" + "@truffle/contract-sources" "^0.2.1" + "@truffle/expect" "^0.1.7" + "@truffle/provisioner" "^0.2.84" + abi-to-sol "^0.7.0" + debug "^4.3.1" + detect-installed "^2.0.4" + fs-extra "^9.1.0" + get-installed-path "^4.0.8" + glob "^7.1.6" + web3-utils "1.10.0" + "@truffle/source-map-utils@^1.3.111": version "1.3.111" resolved "https://registry.yarnpkg.com/@truffle/source-map-utils/-/source-map-utils-1.3.111.tgz#8ebc52f6a7f7f77f4ff302141c11e09dd7f8a220" @@ -4722,7 +4259,7 @@ dependencies: "@babel/types" "^7.3.0" -"@types/bn.js@*", "@types/bn.js@4.11.6", "@types/bn.js@^4.11.0", "@types/bn.js@^4.11.3", "@types/bn.js@^4.11.4", "@types/bn.js@^5.1.0", "@types/bn.js@^5.1.1": +"@types/bn.js@*", "@types/bn.js@4.11.6", "@types/bn.js@^4.11.3", "@types/bn.js@^4.11.4", "@types/bn.js@^5.1.0", "@types/bn.js@^5.1.1": version "4.11.6" resolved "https://registry.yarnpkg.com/@types/bn.js/-/bn.js-4.11.6.tgz#c306c70d9358aaea33cd4eda092a742b9505967c" integrity sha512-pqr857jrp2kPuO9uRjZ3PwnJTjoQy+fcdxvBTvHm6dkmEL9q+hDD/2j/0ELOBPtPnS8LjCX0gI9nbl8lVkadpg== @@ -4737,21 +4274,6 @@ "@types/connect" "*" "@types/node" "*" -"@types/bunyan@1.8.8": - version "1.8.8" - resolved "https://registry.yarnpkg.com/@types/bunyan/-/bunyan-1.8.8.tgz#8d6d33f090f37c07e2a80af30ae728450a101008" - integrity sha512-Cblq+Yydg3u+sGiz2mjHjC5MPmdjY+No4qvHrF+BUhblsmSfMvsHLbOG62tPbonsqBj6sbWv1LHcsoe5Jw+/Ow== - dependencies: - "@types/node" "*" - -"@types/bytebuffer@^5.0.40": - version "5.0.44" - resolved "https://registry.yarnpkg.com/@types/bytebuffer/-/bytebuffer-5.0.44.tgz#553015fb34db1fc3eb3f7b232bff91c006c251a1" - integrity 
sha512-k1qonHga/SfQT02NF633i+7tIfKd+cfC/8pjnedcfuXJNMWooss/FkCgRMSnLf2WorLjbuH4bfgAZEbtyHBDoQ== - dependencies: - "@types/long" "^3.0.0" - "@types/node" "*" - "@types/cacheable-request@^6.0.1", "@types/cacheable-request@^6.0.2": version "6.0.3" resolved "https://registry.yarnpkg.com/@types/cacheable-request/-/cacheable-request-6.0.3.tgz#a430b3260466ca7b5ca5bfd735693b36e7a9d183" @@ -4762,11 +4284,6 @@ "@types/node" "*" "@types/responselike" "^1.0.0" -"@types/caseless@*": - version "0.12.2" - resolved "https://registry.yarnpkg.com/@types/caseless/-/caseless-0.12.2.tgz#f65d3d6389e01eeb458bd54dc8f52b95a9463bc8" - integrity sha512-6ckxMjBBD8URvjB6J3NcnuAn5Pkl7t3TizAg+xdlzzQGSPSmBcXf8KoIH0ua/i+tio+ZRUHEXp0HEmvaR4kt0w== - "@types/cbor@^2.0.0": version "2.0.0" resolved "https://registry.yarnpkg.com/@types/cbor/-/cbor-2.0.0.tgz#c627afc2ee22f23f2337fecb34628a4f97c6afbb" @@ -4812,20 +4329,6 @@ dependencies: "@types/ms" "*" -"@types/dotenv@^8.2.0": - version "8.2.0" - resolved "https://registry.yarnpkg.com/@types/dotenv/-/dotenv-8.2.0.tgz#5cd64710c3c98e82d9d15844375a33bf1b45d053" - integrity sha512-ylSC9GhfRH7m1EUXBXofhgx4lUWmFeQDINW5oLuS+gxWdfUeW4zJdeVTYVkexEW+e2VUvlZR2kGnGGipAWR7kw== - dependencies: - dotenv "*" - -"@types/duplexify@^3.6.0": - version "3.6.1" - resolved "https://registry.yarnpkg.com/@types/duplexify/-/duplexify-3.6.1.tgz#5685721cf7dc4a21b6f0e8a8efbec6b4d2fbafad" - integrity sha512-n0zoEj/fMdMOvqbHxmqnza/kXyoGgJmEpsXjpP+gEqE1Ye4yNqc7xWipKnUoMpWhMuzJQSfK2gMrwlElly7OGQ== - dependencies: - "@types/node" "*" - "@types/elliptic@^6.4.9": version "6.4.14" resolved "https://registry.yarnpkg.com/@types/elliptic/-/elliptic-6.4.14.tgz#7bbaad60567a588c1f08b10893453e6b9b4de48e" @@ -4840,6 +4343,13 @@ dependencies: bignumber.js "7.2.1" +"@types/ethereum-protocol@^1.0.0": + version "1.0.5" + resolved "https://registry.yarnpkg.com/@types/ethereum-protocol/-/ethereum-protocol-1.0.5.tgz#6ad4c2c722d440d1f59e0d7e44a0fbb5fad2c41b" + integrity 
sha512-4wr+t2rYbwMmDrT447SGzE/43Z0EN++zyHCBoruIx32fzXQDxVa1rnQbYwPO8sLP2OugE/L8KaAIJC5kieUuBg== + dependencies: + bignumber.js "7.2.1" + "@types/express-serve-static-core@4.17.31": version "4.17.31" resolved "https://registry.yarnpkg.com/@types/express-serve-static-core/-/express-serve-static-core-4.17.31.tgz#a1139efeab4e7323834bb0226e62ac019f474b2f" @@ -4883,13 +4393,6 @@ dependencies: "@types/node" "*" -"@types/hdkey@^0.7.0": - version "0.7.1" - resolved "https://registry.yarnpkg.com/@types/hdkey/-/hdkey-0.7.1.tgz#9bc63ebbe96b107b277b65ea7a95442a677d0d61" - integrity sha512-4Kkr06hq+R8a9EzVNqXGOY2x1xA7dhY6qlp6OvaZ+IJy1BCca1Cv126RD9X7CMJoXoLo8WvAizy8gQHpqW6K0Q== - dependencies: - "@types/node" "*" - "@types/http-cache-semantics@*": version "4.0.1" resolved "https://registry.yarnpkg.com/@types/http-cache-semantics/-/http-cache-semantics-4.0.1.tgz#0ea7b61496902b95890dc4c3a116b60cb8dae812" @@ -4900,14 +4403,6 @@ resolved "https://registry.yarnpkg.com/@types/http-cache-semantics/-/http-cache-semantics-4.0.4.tgz#b979ebad3919799c979b17c72621c0bc0a31c6c4" integrity sha512-1m0bIFVc7eJWyve9S0RnuRgcQqF/Xd5QsUZAZeQFr1Q3/p9JWoQQEqmVy+DPTNpGXwhgIetAoYF8JSc33q29QA== -"@types/inquirer@^6.5.0": - version "6.5.0" - resolved "https://registry.yarnpkg.com/@types/inquirer/-/inquirer-6.5.0.tgz#b83b0bf30b88b8be7246d40e51d32fe9d10e09be" - integrity sha512-rjaYQ9b9y/VFGOpqBEXRavc3jh0a+e6evAbI31tMda8VlPaSy0AZJfXsvmIe3wklc7W6C3zCSfleuMXR7NOyXw== - dependencies: - "@types/through" "*" - rxjs "^6.4.0" - "@types/isomorphic-fetch@0.0.31": version "0.0.31" resolved "https://registry.yarnpkg.com/@types/isomorphic-fetch/-/isomorphic-fetch-0.0.31.tgz#ec120166ce22f0b134e8770f40c97cd076068fae" @@ -4970,12 +4465,7 @@ resolved "https://registry.yarnpkg.com/@types/lodash/-/lodash-4.14.202.tgz#f09dbd2fb082d507178b2f2a5c7e74bd72ff98f8" integrity sha512-OvlIYQK9tNneDlS0VN54LLd5uiPCBOp7gS5Z0f1mjoJYBrtStzgmJBxONW3U6OZqdtNzZPmn9BS/7WI7BFFcFQ== -"@types/long@^3.0.0": - version "3.0.32" - resolved 
"https://registry.yarnpkg.com/@types/long/-/long-3.0.32.tgz#f4e5af31e9e9b196d8e5fca8a5e2e20aa3d60b69" - integrity sha512-ZXyOOm83p7X8p3s0IYM3VeueNmHpkk/yMlP8CLeOnEcu6hIwPH7YjZBvhQkR0ZFS2DqZAxKtJ/M5fcuv3OU5BA== - -"@types/long@^4.0.0", "@types/long@^4.0.1": +"@types/long@^4.0.0": version "4.0.2" resolved "https://registry.yarnpkg.com/@types/long/-/long-4.0.2.tgz#b74129719fc8d11c01868010082d483b7545591a" integrity sha512-MqTGEo5bj5t157U6fA/BiDynNkn0YknVdh48CMPkTSpFTVmvao5UQmm7uEF6xBEo7qIMAlY/JSleYaE6VOdpaA== @@ -4985,13 +4475,6 @@ resolved "https://registry.yarnpkg.com/@types/lru-cache/-/lru-cache-5.1.1.tgz#c48c2e27b65d2a153b19bfc1a317e30872e01eef" integrity sha512-ssE3Vlrys7sdIzs5LOxCzTVMsU7i9oa/IaW92wF32JFb3CVczqOkru2xspuKczHEbG3nvmPY7IFqVmGGHdNbYw== -"@types/mathjs@^4.4.1": - version "4.4.5" - resolved "https://registry.yarnpkg.com/@types/mathjs/-/mathjs-4.4.5.tgz#b28d46919c68b93bcabf0551729624b302af9b4b" - integrity sha512-Z9XyD6ORkE/dTwCGQ4htXaB8D/OkcRhQy4EtEvJ6lRmbgcYoS2q3CAMoB3s2mgQZmyQy85gtJcfbCz85LV1w+Q== - dependencies: - decimal.js "^10.0.0" - "@types/mime@*": version "3.0.1" resolved "https://registry.yarnpkg.com/@types/mime/-/mime-3.0.1.tgz#5f8f2bca0a5863cb69bc0b0acd88c96cb1d4ae10" @@ -5019,12 +4502,7 @@ dependencies: "@types/node" "*" -"@types/mocha@^10.0.1": - version "10.0.1" - resolved "https://registry.yarnpkg.com/@types/mocha/-/mocha-10.0.1.tgz#2f4f65bb08bc368ac39c96da7b2f09140b26851b" - integrity sha512-/fvYntiO1GeICvqbQ3doGDIP97vWmvFt83GKguJ6prmQM2iXZfFcq6YE8KteFyRtX2/h5Hf91BYvPodJKFYv5Q== - -"@types/mocha@^5.2.5", "@types/mocha@^5.2.7": +"@types/mocha@^5.2.5": version "5.2.7" resolved "https://registry.yarnpkg.com/@types/mocha/-/mocha-5.2.7.tgz#315d570ccb56c53452ff8638738df60726d5b6ea" integrity sha512-NYrtPht0wGzhwe9+/idPaBB+TqkY9AhTvOLMkThm0IoEfLaiVQZwBwyJ5puCkO3AUCWrmcoePjp2mbFocKy4SQ== @@ -5039,29 +4517,11 @@ resolved "https://registry.yarnpkg.com/@types/ms/-/ms-0.7.31.tgz#31b7ca6407128a3d2bbc27fe2d21b345397f6197" integrity 
sha512-iiUgKzV9AuaEkZqkOLDIvlQiL6ltuZd9tGcW3gwpnX8JbuiuhFlEGmmFXEXkN50Cvq7Os88IY2v0dkDqXYWVgA== -"@types/node-fetch@^2.5.7": - version "2.6.3" - resolved "https://registry.yarnpkg.com/@types/node-fetch/-/node-fetch-2.6.3.tgz#175d977f5e24d93ad0f57602693c435c57ad7e80" - integrity sha512-ETTL1mOEdq/sxUtgtOhKjyB2Irra4cjxksvcMUR5Zr4n+PxVhsCD9WS46oPbHL3et9Zde7CNRr+WUNlcHvsX+w== - dependencies: - "@types/node" "*" - form-data "^3.0.0" - -"@types/node@*", "@types/node@>=12.12.47", "@types/node@>=13.7.0", "@types/node@^18.7.16": +"@types/node@*", "@types/node@^18.7.16": version "18.15.13" resolved "https://registry.yarnpkg.com/@types/node/-/node-18.15.13.tgz#f64277c341150c979e42b00e4ac289290c9df469" integrity sha512-N+0kuo9KgrUQ1Sn/ifDXsvg0TTleP7rIy4zOBGECxAljqvqfqpTfzx0Q1NUedOixRMBfe2Whhb056a42cWs26Q== -"@types/node@11.11.6": - version "11.11.6" - resolved "https://registry.yarnpkg.com/@types/node/-/node-11.11.6.tgz#df929d1bb2eee5afdda598a41930fe50b43eaa6a" - integrity sha512-Exw4yUWMBXM3X+8oqzJNRqZSwUAaS4+7NdvHqQuFi/d+synz++xmX3QIf+BFqneW8N31R8Ky+sikfZUXq07ggQ== - -"@types/node@12.12.54": - version "12.12.54" - resolved "https://registry.yarnpkg.com/@types/node/-/node-12.12.54.tgz#a4b58d8df3a4677b6c08bfbc94b7ad7a7a5f82d1" - integrity sha512-ge4xZ3vSBornVYlDnk7yZ0gK6ChHf/CHB7Gl1I0Jhah8DDnEQqBzgohYG4FX4p81TNirSETOiSyn+y1r9/IR6w== - "@types/node@18.7.16": version "18.7.16" resolved "https://registry.yarnpkg.com/@types/node/-/node-18.7.16.tgz#0eb3cce1e37c79619943d2fd903919fc30850601" @@ -5092,11 +4552,6 @@ resolved "https://registry.yarnpkg.com/@types/normalize-package-data/-/normalize-package-data-2.4.1.tgz#d3357479a0fdfdd5907fe67e17e0a85c906e1301" integrity sha512-Gj7cI7z+98M282Tqmp2K5EIsoouUEzbBJhQQzDE3jSIRk6r9gsz0oUokqIUR4u1R3dMHo0pDHM7sNOHyhulypw== -"@types/p-defer@^1.0.3": - version "1.0.3" - resolved "https://registry.yarnpkg.com/@types/p-defer/-/p-defer-1.0.3.tgz#786ce79c86f779fcd9e9bec4f1fbd1167aeac064" - integrity 
sha512-0CK39nXek0mSZL/lnGYjhcR1QLAxg9N0/5S1BvU+MQwjlP4Jd2ebbEkJ/bEUqYMAvKLMZcGd4sJE13dnUKlDnQ== - "@types/parse-json@^4.0.0": version "4.0.0" resolved "https://registry.yarnpkg.com/@types/parse-json/-/parse-json-4.0.0.tgz#2f8bb441434d163b35fb8ffdccd7138927ffb8c0" @@ -5109,15 +4564,6 @@ dependencies: "@types/node" "*" -"@types/pg@^7.14.3": - version "7.14.11" - resolved "https://registry.yarnpkg.com/@types/pg/-/pg-7.14.11.tgz#daf5555504a1f7af4263df265d91f140fece52e3" - integrity sha512-EnZkZ1OMw9DvNfQkn2MTJrwKmhJYDEs5ujWrPfvseWNoI95N8B4HzU/Ltrq5ZfYxDX/Zg8mTzwr6UAyTjjFvXA== - dependencies: - "@types/node" "*" - pg-protocol "^1.2.0" - pg-types "^2.2.0" - "@types/prettier@^1.13.2": version "1.19.1" resolved "https://registry.yarnpkg.com/@types/prettier/-/prettier-1.19.1.tgz#33509849f8e679e4add158959fdb086440e9553f" @@ -5141,16 +4587,6 @@ "@types/node" "*" "@types/revalidator" "*" -"@types/prompts@^1.1.1": - version "1.2.0" - resolved "https://registry.yarnpkg.com/@types/prompts/-/prompts-1.2.0.tgz#891e73f735ad5e82e8adae3a99424128e105fb62" - integrity sha512-7JXpT2rSd4hqd2oBWU1wfEW6x6gX+qPH+gLzGEx+My3wcb67K9Rc02xNQRVn67phusmXm5Yqn4oTP2OW1G5zdQ== - -"@types/prop-types@*": - version "15.7.5" - resolved "https://registry.yarnpkg.com/@types/prop-types/-/prop-types-15.7.5.tgz#5f19d2b85a98e9558036f6a3cacc8819420f05cf" - integrity sha512-JCB8C6SnDoQf0cNycqd/35A7MjcnK+ZTqE7judS6o7utxUCg6imJg3QK2qzHKszlTjcj2cn+NwMB2i96ubpj7w== - "@types/qs@*", "@types/qs@^6.2.31": version "6.9.7" resolved "https://registry.yarnpkg.com/@types/qs/-/qs-6.9.7.tgz#63bb7d067db107cc1e457c303bc25d511febf6cb" @@ -5161,15 +4597,6 @@ resolved "https://registry.yarnpkg.com/@types/range-parser/-/range-parser-1.2.4.tgz#cd667bcfdd025213aafb7ca5915a932590acdcdc" integrity sha512-EEhsLsD6UsDM1yFhAvy0Cjr6VwmpMWqFBCb9w07wVugF7w9nfajxLuVmngTIpgS6svCnm6Vaw+MZhoDCKnOfsw== -"@types/react@*": - version "18.0.38" - resolved 
"https://registry.yarnpkg.com/@types/react/-/react-18.0.38.tgz#02a23bef8848b360a0d1dceef4432c15c21c600c" - integrity sha512-ExsidLLSzYj4cvaQjGnQCk4HFfVT9+EZ9XZsQ8Hsrcn8QNgXtpZ3m9vSIC2MWtx7jHictK6wYhQgGh6ic58oOw== - dependencies: - "@types/prop-types" "*" - "@types/scheduler" "*" - csstype "^3.0.2" - "@types/readable-stream@^2.3.13": version "2.3.15" resolved "https://registry.yarnpkg.com/@types/readable-stream/-/readable-stream-2.3.15.tgz#3d79c9ceb1b6a57d5f6e6976f489b9b5384321ae" @@ -5178,16 +4605,6 @@ "@types/node" "*" safe-buffer "~5.1.1" -"@types/request@^2.48.1": - version "2.48.8" - resolved "https://registry.yarnpkg.com/@types/request/-/request-2.48.8.tgz#0b90fde3b655ab50976cb8c5ac00faca22f5a82c" - integrity sha512-whjk1EDJPcAR2kYHRbFl/lKeeKYTi05A15K9bnLInCVroNDCtXce57xKdI0/rQaA3K+6q0eFyUBPmqfSndUZdQ== - dependencies: - "@types/caseless" "*" - "@types/node" "*" - "@types/tough-cookie" "*" - form-data "^2.5.0" - "@types/resolve@^0.0.8": version "0.0.8" resolved "https://registry.yarnpkg.com/@types/resolve/-/resolve-0.0.8.tgz#f26074d238e02659e323ce1a13d041eee280e194" @@ -5207,11 +4624,6 @@ resolved "https://registry.yarnpkg.com/@types/revalidator/-/revalidator-0.3.8.tgz#86e0b03b49736000ad42ce6b002725e74c6805ff" integrity sha512-q6KSi3PklLGQ0CesZ/XuLwly4DXXlnJuucYOG9lrBqrP8rKiuPZThav2h2+pFjaheNpnT0qKK3i304QWIePeJw== -"@types/scheduler@*": - version "0.16.3" - resolved "https://registry.yarnpkg.com/@types/scheduler/-/scheduler-0.16.3.tgz#cef09e3ec9af1d63d2a6cc5b383a737e24e6dcf5" - integrity sha512-5cJ8CB4yAx7BH1oMvdU0Jh9lrEXyPkar6F9G/ERswkCuvP4KQZfZkSjcMbAICCpQTN4OuZn8tz0HiKv9TGZgrQ== - "@types/secp256k1@^4.0.1": version "4.0.3" resolved "https://registry.yarnpkg.com/@types/secp256k1/-/secp256k1-4.0.3.tgz#1b8e55d8e00f08ee7220b4d59a6abe89c37a901c" @@ -5250,59 +4662,20 @@ "@types/mime" "*" "@types/node" "*" -"@types/solidity-parser-antlr@^0.2.3": - version "0.2.3" - resolved 
"https://registry.yarnpkg.com/@types/solidity-parser-antlr/-/solidity-parser-antlr-0.2.3.tgz#bb2d9c6511bf483afe4fc3e2714da8a924e59e3f" - integrity sha512-FoSyZT+1TTaofbEtGW1oC9wHND1YshvVeHerME/Jh6gIdHbBAWFW8A97YYqO/dpHcFjIwEPEepX0Efl2ckJgwA== - "@types/stack-utils@^2.0.0": version "2.0.1" resolved "https://registry.yarnpkg.com/@types/stack-utils/-/stack-utils-2.0.1.tgz#20f18294f797f2209b5f65c8e3b5c8e8261d127c" integrity sha512-Hl219/BT5fLAaz6NDkSuhzasy49dwQS/DSdu4MdggFB8zcXv7vflBI3xp7FEmkmdDkBUI2bPUNeMttp2knYdxw== -"@types/string-hash@^1.1.1": - version "1.1.3" - resolved "https://registry.yarnpkg.com/@types/string-hash/-/string-hash-1.1.3.tgz#8d9a73cf25574d45daf11e3ae2bf6b50e69aa212" - integrity sha512-p6skq756fJWiA59g2Uss+cMl6tpoDGuCBuxG0SI1t0NwJmYOU66LAMS6QiCgu7cUh3/hYCaMl5phcCW1JP5wOA== - -"@types/tar-fs@*": - version "2.0.4" - resolved "https://registry.yarnpkg.com/@types/tar-fs/-/tar-fs-2.0.4.tgz#7c7502d281d436db0ad0f78282acef71da02a292" - integrity sha512-ipPec0CjTmVDWE+QKr9cTmIIoTl7dFG/yARCM5MqK8i6CNLIG1P8x4kwDsOQY1ChZOZjH0wO9nvfgBvWl4R3kA== - dependencies: - "@types/node" "*" - "@types/tar-stream" "*" - -"@types/tar-stream@*": - version "2.2.2" - resolved "https://registry.yarnpkg.com/@types/tar-stream/-/tar-stream-2.2.2.tgz#be9d0be9404166e4b114151f93e8442e6ab6fb1d" - integrity sha512-1AX+Yt3icFuU6kxwmPakaiGrJUwG44MpuiqPg4dSolRFk6jmvs4b3IbUol9wKDLIgU76gevn3EwE8y/DkSJCZQ== - dependencies: - "@types/node" "*" - -"@types/targz@^1.0.0": - version "1.0.4" - resolved "https://registry.yarnpkg.com/@types/targz/-/targz-1.0.4.tgz#bf78d46b564ac1a2527532c915892a96ddf1ed01" - integrity sha512-4i2weIjweWsnrvutLH7dM/+FPVSFSqxb+XKWo61tAiHxyYYHveImqys5JijMboKJz+jhFu24SlFrdVAB0xAMIw== - dependencies: - "@types/tar-fs" "*" - -"@types/through@*": - version "0.0.30" - resolved "https://registry.yarnpkg.com/@types/through/-/through-0.0.30.tgz#e0e42ce77e897bd6aead6f6ea62aeb135b8a3895" - integrity 
sha512-FvnCJljyxhPM3gkRgWmxmDZyAQSiBQQWLI0A0VFL0K7W1oRUrPJSqNO0NvTnLkBcotdlp3lKvaT0JrnyRDkzOg== - dependencies: - "@types/node" "*" - "@types/tmp@^0.1.0": version "0.1.0" resolved "https://registry.yarnpkg.com/@types/tmp/-/tmp-0.1.0.tgz#19cf73a7bcf641965485119726397a096f0049bd" integrity sha512-6IwZ9HzWbCq6XoQWhxLpDjuADodH/MKXRUIDFudvgjcVdjFknvmR+DNsoUeer4XPrEnrZs04Jj+kfV9pFsrhmA== -"@types/tough-cookie@*": - version "4.0.2" - resolved "https://registry.yarnpkg.com/@types/tough-cookie/-/tough-cookie-4.0.2.tgz#6286b4c7228d58ab7866d19716f3696e03a09397" - integrity sha512-Q5vtl1W5ue16D+nIaW8JWebSSraJVlK+EthKn7e7UcD4KWsaSJ8BqGPXNaPghgtcn/fhvrN17Tv8ksUsQpiplw== +"@types/underscore@*": + version "1.13.0" + resolved "https://registry.yarnpkg.com/@types/underscore/-/underscore-1.13.0.tgz#dd8c034a92e5b8e24650c31af43d807c5340cee4" + integrity sha512-L6LBgy1f0EFQZ+7uSA57+n2g/s4Qs5r06Vwrwn0/nuK1de+adz00NWaztRQ30aEqw5qOaWbPI8u2cGQ52lj6VA== "@types/utf8@^2.1.6": version "2.1.6" @@ -5316,6 +4689,14 @@ dependencies: "@types/ethereum-protocol" "*" +"@types/web3@1.0.20": + version "1.0.20" + resolved "https://registry.yarnpkg.com/@types/web3/-/web3-1.0.20.tgz#234dd1f976702c0daaff147c80f24a5582e09d0e" + integrity sha512-KTDlFuYjzCUlBDGt35Ir5QRtyV9klF84MMKUsEJK10sTWga/71V+8VYLT7yysjuBjaOx2uFYtIWNGoz3yrNDlg== + dependencies: + "@types/bn.js" "*" + "@types/underscore" "*" + "@types/web3@^1.0.18": version "1.2.2" resolved "https://registry.yarnpkg.com/@types/web3/-/web3-1.2.2.tgz#d95a101547ce625c5ebd0470baa5dbd4b9f3c015" @@ -5328,11 +4709,6 @@ resolved "https://registry.yarnpkg.com/@types/yargs-parser/-/yargs-parser-21.0.0.tgz#0c60e537fa790f5f9472ed2776c2b71ec117351b" integrity sha512-iO9ZQHkZxHn4mSakYV0vFHAVDyEOIJQrV2uZ06HxEPcx+mt8swXoZHIbaaJ2crJYFfErySgktuTZ3BeLz+XmFA== -"@types/yargs@^11.0.0": - version "11.1.8" - resolved "https://registry.yarnpkg.com/@types/yargs/-/yargs-11.1.8.tgz#b730ecb2bde209d12194cdf8bf9f12c4bd21965a" - integrity 
sha512-49Pmk3GBUOrs/ZKJodGMJeEeiulv2VdfAYpGgkTCSXpNWx7KCX36+PbrkItwzrjTDHO2QoEZDpbhFoMN1lxe9A== - "@types/yargs@^13.0.2": version "13.0.12" resolved "https://registry.yarnpkg.com/@types/yargs/-/yargs-13.0.12.tgz#d895a88c703b78af0465a9de88aa92c61430b092" @@ -5499,11 +4875,6 @@ abbrev@1, abbrev@^1.0.0: resolved "https://registry.yarnpkg.com/abbrev/-/abbrev-1.1.1.tgz#f8f2c887ad10bf67f634f005b6987fed3179aac8" integrity sha512-nne9/IiQ/hzIhY6pdDnbBtz7DjPTKrY00P/zvPSm5pOFkl6xuGrGnXn/VtTNNfNtAfZ9/1RtehkszU9qcTii0Q== -abbrev@1.0.x: - version "1.0.9" - resolved "https://registry.yarnpkg.com/abbrev/-/abbrev-1.0.9.tgz#91b4792588a7738c25f35dd6f63752a2f8776135" - integrity sha512-LEyx4aLEC3x6T0UguF6YILf+ntvmOaWsVfENmIW0E9H09vKlLDGelMjjSm0jkDHALj8A8quZ/HapKNigzwge+Q== - abi-to-sol@^0.7.0: version "0.7.1" resolved "https://registry.yarnpkg.com/abi-to-sol/-/abi-to-sol-0.7.1.tgz#76d55c35ab2932fda26c224e4194987517324a19" @@ -5530,6 +4901,16 @@ abitype@0.9.8: resolved "https://registry.yarnpkg.com/abitype/-/abitype-0.9.8.tgz#1f120b6b717459deafd213dfbf3a3dd1bf10ae8c" integrity sha512-puLifILdm+8sjyss4S+fsUN09obiT1g2YW6CtcQF+QDzxR0euzgEB29MZujC6zMk2a6SVmtttq1fc6+YFA7WYQ== +abitype@1.0.8: + version "1.0.8" + resolved "https://registry.yarnpkg.com/abitype/-/abitype-1.0.8.tgz#3554f28b2e9d6e9f35eb59878193eabd1b9f46ba" + integrity sha512-ZeiI6h3GnW06uYDLx0etQtX/p8E24UaHHBj57RSjK7YBFe7iuVn07EDpOeP451D06sF27VOz9JJPlIKJmXgkEg== + +abitype@^1.0.6: + version "1.2.3" + resolved "https://registry.yarnpkg.com/abitype/-/abitype-1.2.3.tgz#bec3e09dea97d99ef6c719140bee663a329ad1f4" + integrity sha512-Ofer5QUnuUdTFsBRwARMoWKOH1ND5ehwYhJ3OJ/BQO+StkwQjHw0XyVh4vDttzHB7QOFhPHa/o413PJ82gU/Tg== + abort-controller@3.0.0, abort-controller@^3.0.0: version "3.0.0" resolved "https://registry.yarnpkg.com/abort-controller/-/abort-controller-3.0.0.tgz#eaf54d53b62bae4138e809ca225c8439a6efb392" @@ -5537,14 +4918,7 @@ abort-controller@3.0.0, abort-controller@^3.0.0: dependencies: event-target-shim "^5.0.0" 
-abort-controller@^2.0.2: - version "2.0.3" - resolved "https://registry.yarnpkg.com/abort-controller/-/abort-controller-2.0.3.tgz#b174827a732efadff81227ed4b8d1cc569baf20a" - integrity sha512-EPSq5wr2aFyAZ1PejJB32IX9Qd4Nwus+adnp7STYFM5/23nLPBazqZ1oor6ZqbH+4otaaGXTlC8RN5hq3C8w9Q== - dependencies: - event-target-shim "^5.0.0" - -abortcontroller-polyfill@^1.1.9, abortcontroller-polyfill@^1.7.3, abortcontroller-polyfill@^1.7.5: +abortcontroller-polyfill@^1.7.3: version "1.7.5" resolved "https://registry.yarnpkg.com/abortcontroller-polyfill/-/abortcontroller-polyfill-1.7.5.tgz#6738495f4e901fbb57b6c0611d0c75f76c485bed" integrity sha512-JMJ5soJWP18htbbxJjG7bG6yuI6pRhgJ0scHHTfkUjf6wjP912xZWvM+A4sJK3gqd9E8fcPbDnOefbA9Th/FIQ== @@ -5672,13 +5046,6 @@ agent-base@6, agent-base@^6.0.2: dependencies: debug "4" -agent-base@^4.3.0: - version "4.3.0" - resolved "https://registry.yarnpkg.com/agent-base/-/agent-base-4.3.0.tgz#8165f01c436009bccad0b1d122f05ed770efc6ee" - integrity sha512-salcGninV0nPrwpGNn4VTXBb1SOuXQBiqbrNXoeizJsHrsL6ERFM2Ne3JUSBWRE6aeNJI2ROP/WEEIDUiDe3cg== - dependencies: - es6-promisify "^5.0.0" - agentkeepalive@^4.2.1: version "4.3.0" resolved "https://registry.yarnpkg.com/agentkeepalive/-/agentkeepalive-4.3.0.tgz#bb999ff07412653c1803b3ced35e50729830a255" @@ -5733,18 +5100,6 @@ ajv@^8.0.1: require-from-string "^2.0.2" uri-js "^4.4.1" -algebrite@^0.2.23: - version "0.2.23" - resolved "https://registry.yarnpkg.com/algebrite/-/algebrite-0.2.23.tgz#e0490d7c6ea7ddb0dc46fc98d1026415dfe2d853" - integrity sha512-9eLHJVW3QqIZMq679gqbqEl/DXux4KvSWFyvLJ75ovl92RG3lm5mkL5UmjKcUveUH5WF1/TImP8UqiWbRZJTKg== - dependencies: - big-integer "^1.6.15" - -amdefine@>=0.0.4: - version "1.0.1" - resolved "https://registry.yarnpkg.com/amdefine/-/amdefine-1.0.1.tgz#4a5282ac164729e93619bcfd3ad151f817ce91f5" - integrity sha512-S2Hw0TtNkMJhIabBwIojKL9YHO5T0n5eNqWJ7Lrlel/zDbftQpxpapi8tZs3X1HWa+u+QeydGmzzNU0m09+Rcg== - ansi-colors@4.1.1: version "4.1.1" resolved 
"https://registry.yarnpkg.com/ansi-colors/-/ansi-colors-4.1.1.tgz#cbb9ae256bf750af1eab344f229aa27fe94ba348" @@ -5838,7 +5193,7 @@ antlr4ts@^0.5.0-alpha.4: resolved "https://registry.yarnpkg.com/antlr4ts/-/antlr4ts-0.5.0-alpha.4.tgz#71702865a87478ed0b40c0709f422cf14d51652a" integrity sha512-WPQDt1B74OfPv/IMS2ekXAKkTZIHl88uMetg6q3OTqgFxZ/dxDXI0EWLyZid/1Pe6hTftyg5N7gel5wNAGxXyQ== -any-promise@1.3.0, any-promise@^1.0.0: +any-promise@1.3.0: version "1.3.0" resolved "https://registry.yarnpkg.com/any-promise/-/any-promise-1.3.0.tgz#abc6afeedcea52e809cdc0376aed3ce39635d17f" integrity sha512-7UvmKalWRt1wgjL1RrGxoSJW/0QZFIegpeGvZG9kjp8vrRu55XTHbwnqq2GpXm9uLbcuhxm3IqX9OB4MZR1b2A== @@ -5956,11 +5311,6 @@ app-module-path@^2.2.0: resolved "https://registry.yarnpkg.com/app-module-path/-/app-module-path-2.2.0.tgz#641aa55dfb7d6a6f0a8141c4b9c0aa50b6c24dd5" integrity sha512-gkco+qxENJV+8vFcDiiFhuoSvRXb2a/QPqpSoWhVz829VNJfOTnELbBmPmNKFxf3xdNnw4DWCkzkDaavcX/1YQ== -aproba@^1.0.3: - version "1.2.0" - resolved "https://registry.yarnpkg.com/aproba/-/aproba-1.2.0.tgz#6802e6264efd18c790a1b0d517f0f2627bf2c94a" - integrity sha512-Y9J6ZjXtoYh8RnXVCMOU/ttDmk1aBjunq9vO0ta5x85WDQiQfUF9sIPBITdbiiIVcBo03Hi3jMxigBtsddlXRw== - "aproba@^1.0.3 || ^2.0.0", aproba@^2.0.0: version "2.0.0" resolved "https://registry.yarnpkg.com/aproba/-/aproba-2.0.0.tgz#52520b8ae5b569215b354efc0caa3fe1e45a8adc" @@ -5974,14 +5324,6 @@ are-we-there-yet@^3.0.0: delegates "^1.0.0" readable-stream "^3.6.0" -are-we-there-yet@~1.1.2: - version "1.1.7" - resolved "https://registry.yarnpkg.com/are-we-there-yet/-/are-we-there-yet-1.1.7.tgz#b15474a932adab4ff8a50d9adfa7e4e926f21146" - integrity sha512-nxwy40TuMiUGqMyRHgCSWZ9FM4VAoRP4xUYSTv5ImRog+h9yISPbVH7H8fASCIzYn9wlEv4zvFL7uKDMCFQm3g== - dependencies: - delegates "^1.0.0" - readable-stream "^2.0.6" - arg@^4.1.0: version "4.1.3" resolved "https://registry.yarnpkg.com/arg/-/arg-4.1.3.tgz#269fc7ad5b8e42cb63c896d5666017261c144089" @@ -6164,12 +5506,12 @@ 
arraybuffer.prototype.slice@^1.0.2: is-array-buffer "^3.0.2" is-shared-array-buffer "^1.0.2" -arrify@^1.0.0, arrify@^1.0.1: +arrify@^1.0.1: version "1.0.1" resolved "https://registry.yarnpkg.com/arrify/-/arrify-1.0.1.tgz#898508da2226f380df904728456849c1501a4b0d" integrity sha512-3CYzex9M9FGQjCGMGyi6/31c8GJbgb0qGyrx5HWxPd0aCwh4cB2YjMb2Xf9UuoogrMrlO9cTqnB5rI5GHZTcUA== -arrify@^2.0.0, arrify@^2.0.1: +arrify@^2.0.1: version "2.0.1" resolved "https://registry.yarnpkg.com/arrify/-/arrify-2.0.1.tgz#c9655e9331e0abcd588d2a7cad7e9956f66701fa" integrity sha512-3duEwti880xqi4eAMN8AyR4a0ByT90zoYdLlevfrvU43vb0YZwZVfxOgxWrLXXXpyugL0hNZc9G6BiB5B3nUug== @@ -6179,14 +5521,6 @@ asap@^2.0.0, asap@~2.0.6: resolved "https://registry.yarnpkg.com/asap/-/asap-2.0.6.tgz#e50347611d7e690943208bbdafebcbc2fb866d46" integrity sha512-BSHWgDSAiKs50o2Re8ppvp3seVHXSRM44cdSsT9FfNEUUZLOGWVCsiWaRPWM1Znn+mqZ1OfVZ3z3DWEzSp7hRA== -ascli@~1: - version "1.0.1" - resolved "https://registry.yarnpkg.com/ascli/-/ascli-1.0.1.tgz#bcfa5974a62f18e81cabaeb49732ab4a88f906bc" - integrity sha512-JGQaNxpaCJz9Bd1JvVaFIHuWn9S+l3xhN17R0V/vmUDiGE0QngNMXhjlqpwqV+91plWz9Fg+Lt28Lj7p5vjs8A== - dependencies: - colour "~0.7.1" - optjs "~3.2.2" - asn1.js@^5.2.0: version "5.4.1" resolved "https://registry.yarnpkg.com/asn1.js/-/asn1.js-5.4.1.tgz#11a980b84ebb91781ce35b0fdc2ee294e3783f07" @@ -6229,11 +5563,6 @@ astral-regex@^2.0.0: resolved "https://registry.yarnpkg.com/astral-regex/-/astral-regex-2.0.0.tgz#483143c567aeed4785759c0865786dc77d7d2e31" integrity sha512-Z7tMw1ytTXt5jqMcOP+OQteU1VuNK9Y02uuJtKQ1Sv69jXQKKg5cibLwGJow8yzZP+eAc18EmLGPal0bp36rvQ== -async-each@^1.0.1: - version "1.0.6" - resolved "https://registry.yarnpkg.com/async-each/-/async-each-1.0.6.tgz#52f1d9403818c179b7561e11a5d1b77eb2160e77" - integrity sha512-c646jH1avxr+aVpndVMeAfYw7wAa6idufrlN3LPA4PmKS0QEGp6PIC9nwz0WQkkvBGAMEki3pFdtxaF39J9vvg== - async-eventemitter@0.2.4, async-eventemitter@^0.2.2: version "0.2.4" resolved 
"https://registry.yarnpkg.com/async-eventemitter/-/async-eventemitter-0.2.4.tgz#f5e7c8ca7d3e46aab9ec40a292baf686a0bafaca" @@ -6260,7 +5589,7 @@ async-retry@^1.2.1: dependencies: retry "0.13.1" -async@1.x, async@^1.3.0, async@^1.4.2: +async@^1.3.0, async@^1.4.2: version "1.5.2" resolved "https://registry.yarnpkg.com/async/-/async-1.5.2.tgz#ec6a61ae56480c0c3cb241c95618e20892f9672a" integrity sha512-nSVgobk4rv61R9PUSDtYt7mPVB2olxNR5RWJcAsH676/ef11bUZwvu7+RGYrYauVdDPcO519v68wRhXQtxsV9w== @@ -6488,7 +5817,7 @@ base-x@^3.0.2, base-x@^3.0.8: dependencies: safe-buffer "^5.0.1" -base64-js@^1.3.0, base64-js@^1.3.1: +base64-js@^1.3.1: version "1.5.1" resolved "https://registry.yarnpkg.com/base64-js/-/base64-js-1.5.1.tgz#1b1b440160a5bf7ad40b650f095963481903930a" integrity sha512-AKpaYlHn8t4SVbOHCy+b5+KKgvR4vrsD8vbvrbiQJps7fKDTkjkDry6ji0rUJjC0kzbNePLwzxq8iypo41qeWA== @@ -6528,7 +5857,7 @@ big-integer@1.6.36: resolved "https://registry.yarnpkg.com/big-integer/-/big-integer-1.6.36.tgz#78631076265d4ae3555c04f85e7d9d2f3a071a36" integrity sha512-t70bfa7HYEA1D9idDbmuv7YbsbVkQ+Hp+8KFSul4aE5e/i1bjCNIRYJZlA8Q8p0r9T8cF/RVvwUgRA//FydEyg== -big-integer@^1.6.15, big-integer@^1.6.44: +big-integer@^1.6.44: version "1.6.51" resolved "https://registry.yarnpkg.com/big-integer/-/big-integer-1.6.51.tgz#0df92a5d9880560d3ff2d5fd20245c889d130686" integrity sha512-GPEid2Y9QU1Exl1rpO9B2IPJGHPSupF5GnVIP0blYvNOMer2bTvSWs1jGOUg04hTmu67nmLsQ9TBo1puaotBHg== @@ -6538,17 +5867,12 @@ big.js@^6.0.3: resolved "https://registry.yarnpkg.com/big.js/-/big.js-6.2.1.tgz#7205ce763efb17c2e41f26f121c420c6a7c2744f" integrity sha512-bCtHMwL9LeDIozFn+oNhhFoq+yQ3BNdnsLSASUxLciOb1vgvpHsIO1dsENiGMgbb4SkP5TrzWzRiLddn8ahVOQ== -bigi@^1.1.0: - version "1.4.2" - resolved "https://registry.yarnpkg.com/bigi/-/bigi-1.4.2.tgz#9c665a95f88b8b08fc05cfd731f561859d725825" - integrity sha512-ddkU+dFIuEIW8lE7ZwdIAf2UPoM90eaprg5m3YXAVVTmKlqV/9BX4A2M8BOK2yOq6/VgZFVhK6QAxJebhlbhzw== - bigint-crypto-utils@^3.0.23: version "3.2.2" resolved 
"https://registry.yarnpkg.com/bigint-crypto-utils/-/bigint-crypto-utils-3.2.2.tgz#e30a49ec38357c6981cd3da5aaa6480b1f752ee4" integrity sha512-U1RbE3aX9ayCUVcIPHuPDPKcK3SFOXf93J1UK/iHlJuQB7bhagPIX06/CLpLEsDThJ7KA4Dhrnzynl+d2weTiw== -bignumber.js@7.2.1, bignumber.js@9.0.0, bignumber.js@9.1.0, bignumber.js@^7.2.0, bignumber.js@^7.2.1, bignumber.js@^9.0.0, bignumber.js@^9.0.1, "bignumber.js@git+https://github.com/debris/bignumber.js#master", "bignumber.js@git+https://github.com/debris/bignumber.js.git#94d7146671b9719e00a09c29b01a691bc85048c2", bignumber.js@~9.0.2: +bignumber.js@7.2.1, bignumber.js@9.0.0, bignumber.js@9.1.0, bignumber.js@^7.2.0, bignumber.js@^7.2.1, bignumber.js@^9.0.0, bignumber.js@^9.0.1, "bignumber.js@git+https://github.com/debris/bignumber.js#master", "bignumber.js@git+https://github.com/debris/bignumber.js.git#94d7146671b9719e00a09c29b01a691bc85048c2": version "9.0.0" resolved "https://registry.yarnpkg.com/bignumber.js/-/bignumber.js-9.0.0.tgz#805880f84a329b5eac6e7cb6f8274b6d82bdf075" integrity sha512-t/OYhhJ2SD+YGBQcjY8GzzDHEk9f3nerxjtfa6tlMXfe7frs/WozhvCNoGvpM0P3bNf3Gq5ZRMlGr5f3r4/N8A== @@ -6570,40 +5894,18 @@ binary-extensions@^2.0.0: resolved "https://registry.yarnpkg.com/binary-extensions/-/binary-extensions-2.2.0.tgz#75f502eeaf9ffde42fc98829645be4ea76bd9e2d" integrity sha512-jDctJ/IVQbZoJykoeHbhXpOlNBqGNcwXJKJog42E5HDPUwQTSdjCHdihjj0DlnheQ7blbT6dHOafNAiS8ooQKA== -bindings@^1.2.1, bindings@^1.5.0: +bindings@^1.2.1: version "1.5.0" resolved "https://registry.yarnpkg.com/bindings/-/bindings-1.5.0.tgz#10353c9e945334bc0511a6d90b38fbc7c9c504df" integrity sha512-p2q/t/mhvuOj/UeLlV6566GD/guowlr0hHxClI0W9m7MWYkL1F0hLo+0Aexs9HSPCtR1SXQ0TD3MMKrXZajbiQ== dependencies: file-uri-to-path "1.0.0" -bip32@3.1.0: +bip39@^2.2.0, "bip39@https://github.com/bitcoinjs/bip39#a7ecbfe2e60d0214ce17163d610cad9f7b23140c": version "3.1.0" - resolved "https://registry.yarnpkg.com/bip32/-/bip32-3.1.0.tgz#ce90e020d0e6b41e891a0122ff053efabcce1ccc" - integrity 
sha512-eoeajYEzJ4d6yyVtby8C+XkCeKItiC4Mx56a0M9VaqTMC73SWOm4xVZG7SaR8e/yp4eSyky2XcBpH3DApPdu7Q== - dependencies: - bs58check "^2.1.1" - create-hash "^1.2.0" - create-hmac "^1.1.7" - ripemd160 "^2.0.2" - typeforce "^1.11.5" - wif "^2.0.6" - -bip39@2.5.0, bip39@^2.2.0, bip39@^2.5.0, "bip39@https://github.com/bitcoinjs/bip39#d8ea080a18b40f301d4e2219a2991cd2417e83c2": - version "3.0.3" - resolved "https://github.com/bitcoinjs/bip39#d8ea080a18b40f301d4e2219a2991cd2417e83c2" - dependencies: - "@types/node" "11.11.6" - create-hash "^1.1.0" - pbkdf2 "^3.0.9" - randombytes "^2.0.1" - -bip66@^1.1.5: - version "1.1.5" - resolved "https://registry.yarnpkg.com/bip66/-/bip66-1.1.5.tgz#01fa8748785ca70955d5011217d1b3139969ca22" - integrity sha512-nemMHz95EmS38a26XbbdxIYj5csHd3RMP3H5bwQknX0WYHF01qhpufP42mLOwVICuH2JmhIhXiWs89MfUGL7Xw== + resolved "https://github.com/bitcoinjs/bip39#a7ecbfe2e60d0214ce17163d610cad9f7b23140c" dependencies: - safe-buffer "^5.0.1" + "@noble/hashes" "^1.2.0" bl@^1.0.0: version "1.2.3" @@ -6636,11 +5938,6 @@ blakejs@^1.1.0: resolved "https://registry.yarnpkg.com/blakejs/-/blakejs-1.2.1.tgz#5057e4206eadb4a97f7c0b6e197a505042fc3814" integrity sha512-QXUSXI3QVc/gJME0dBpXrag1kbzOqCjCX8/b54ntNyW6sjtoqxqRk3LTmXzaJoh71zMsDCjM+47jS7XiwN/+fQ== -"blind-threshold-bls@npm:@celo/blind-threshold-bls@1.0.0-beta": - version "1.0.0-beta" - resolved "https://registry.yarnpkg.com/@celo/blind-threshold-bls/-/blind-threshold-bls-1.0.0-beta.tgz#6c46e55c3720d99929d6d34dd3770b1623a09900" - integrity sha512-sk9XLvbv0M0TJKJPHPc8FkIRTfP/PiPHeyKXPBTMZBW8URL4pRix9IfcT98zT5sA7hvMDJwgw3p3tM/L6Z1iGw== - bluebird@^2.9.33: version "2.11.0" resolved "https://registry.yarnpkg.com/bluebird/-/bluebird-2.11.0.tgz#534b9033c022c9579c56ba3b3e5a5caafbb650e1" @@ -6768,7 +6065,7 @@ browser-stdout@1.3.1: resolved "https://registry.yarnpkg.com/browser-stdout/-/browser-stdout-1.3.1.tgz#baa559ee14ced73452229bad7326467c61fabd60" integrity 
sha512-qhAVI1+Av2X7qelOfAIYwXONood6XlZE/fXaBSmW/T5SzLAmCgzi+eiWE7fUvbHaeNBQH13UftjpXxsfLkMpgw== -browserify-aes@^1.0.0, browserify-aes@^1.0.4, browserify-aes@^1.0.6, browserify-aes@^1.2.0: +browserify-aes@^1.0.0, browserify-aes@^1.0.4, browserify-aes@^1.2.0: version "1.2.0" resolved "https://registry.yarnpkg.com/browserify-aes/-/browserify-aes-1.2.0.tgz#326734642f403dabc3003209853bb70ad428ef48" integrity sha512-+7CHXqGuspUn/Sl5aO7Ea0xWGAtETPXNSAjHo48JfLdPWcMng33Xe4znFvQweqc/uzk5zSOI3H52CYnjCfb5hA== @@ -6839,11 +6136,6 @@ bs-logger@0.x: dependencies: fast-json-stable-stringify "2.x" -bs58@^2.0.1: - version "2.0.1" - resolved "https://registry.yarnpkg.com/bs58/-/bs58-2.0.1.tgz#55908d58f1982aba2008fa1bed8f91998a29bf8d" - integrity sha512-77ld2g7Hn1GyIUpuUVfbZdhO1q9R9gv/GYam4HAeAW/tzhQDrbJ2ZttN1tIe4hmKrWFE+oUtAhBNx/EA5SVdTg== - bs58@^4.0.0, bs58@^4.0.1: version "4.0.1" resolved "https://registry.yarnpkg.com/bs58/-/bs58-4.0.1.tgz#be161e76c354f6f788ae4071f63f34e8c4f0a42a" @@ -6851,7 +6143,7 @@ bs58@^4.0.0, bs58@^4.0.1: dependencies: base-x "^3.0.2" -bs58check@<3.0.0, bs58check@^2.1.1, bs58check@^2.1.2: +bs58check@^2.1.2: version "2.1.2" resolved "https://registry.yarnpkg.com/bs58check/-/bs58check-2.1.2.tgz#53b018291228d82a5aa08e7d796fdafda54aebfc" integrity sha512-0TS1jicxdU09dwJMNZtVAfzPi6Q6QeN0pM1Fkzrjn+XYHvzMKPU3pHVpva+769iNVSfIYWf7LJ6WR+BuuMf8cA== @@ -6910,11 +6202,6 @@ buffer-to-arraybuffer@^0.0.5: resolved "https://registry.yarnpkg.com/buffer-to-arraybuffer/-/buffer-to-arraybuffer-0.0.5.tgz#6064a40fa76eb43c723aba9ef8f6e1216d10511a" integrity sha512-3dthu5CYiVB1DEJp61FtApNnNndTckcqe4pFcLdvHtrpG+kcyekCJKg4MRiDcFW7A6AODnXB9U4dwQiCW5kzJQ== -buffer-writer@2.0.0: - version "2.0.0" - resolved "https://registry.yarnpkg.com/buffer-writer/-/buffer-writer-2.0.0.tgz#ce7eb81a38f7829db09c873f2fbb792c0c98ec04" - integrity sha512-a7ZpuTZU1TRtnwyCNW3I5dc0wWNC3VR9S++Ewyk2HHZdrO3CQJqSpd+95Us590V6AL7JqUAH2IwZ/398PmNFgw== - buffer-xor@^1.0.3: version "1.0.3" resolved 
"https://registry.yarnpkg.com/buffer-xor/-/buffer-xor-1.0.3.tgz#26e61ed1422fb70dd42e6e36729ed51d855fe8d9" @@ -6981,49 +6268,11 @@ bundle-require@^3.1.2: dependencies: load-tsconfig "^0.2.0" -bunyan-debug-stream@2.0.0: - version "2.0.0" - resolved "https://registry.yarnpkg.com/bunyan-debug-stream/-/bunyan-debug-stream-2.0.0.tgz#b9593e38753f594e3f9db3eb2fdebdc2af147a9f" - integrity sha512-Ovl43CJ7nUwalLzdXc6E1nGIy6ift9Z/QpYXUtsjpDAg35ZFKXifKNZyfpMGuN3N7ijLLqbnxPsMMHsXDdXa9A== - dependencies: - colors "^1.0.3" - exception-formatter "^1.0.4" - -bunyan-debug-stream@^2.0.0: - version "2.0.1" - resolved "https://registry.yarnpkg.com/bunyan-debug-stream/-/bunyan-debug-stream-2.0.1.tgz#9bd7c7e30c7b2cf711317e9d37529b0464c3b164" - integrity sha512-MCEoqggU7NMt7f2O+PU8VkqfSkoQoa4lmN/OWhaRfqFRBF1Se2TOXQyLF6NxC+EtfrdthnquQe8jOe83fpEoGA== - dependencies: - colors "1.4.0" - exception-formatter "^1.0.4" - -bunyan-gke-stackdriver@0.1.2: - version "0.1.2" - resolved "https://registry.yarnpkg.com/bunyan-gke-stackdriver/-/bunyan-gke-stackdriver-0.1.2.tgz#a47e3724bbb324b1ec0b7dc4350c4d7073aae66d" - integrity sha512-eY5OLgAXvOvOq2YpxI0HlV5HjAcLm36Ln3PxxsztO+2GrFSgU3oXoic2LCif/heBKoyOZdMyXKWF5dvswSOS6w== - -bunyan@1.8.12: - version "1.8.12" - resolved "https://registry.yarnpkg.com/bunyan/-/bunyan-1.8.12.tgz#f150f0f6748abdd72aeae84f04403be2ef113797" - integrity sha512-dmDUbGHeGcvCDLRFOscZkwx1ZO/aFz3bJOCi5nCgzdhFGPxwK+y5AcDBnqagNGlJZ7lje/l6JUEz9mQcutttdg== - optionalDependencies: - dtrace-provider "~0.8" - moment "^2.10.6" - mv "~2" - safe-json-stringify "~1" - byte-size@^7.0.0: version "7.0.1" resolved "https://registry.yarnpkg.com/byte-size/-/byte-size-7.0.1.tgz#b1daf3386de7ab9d706b941a748dbfc71130dee3" integrity sha512-crQdqyCwhokxwV1UyDzLZanhkugAgft7vt0qbbdt60C6Zf3CAiGmtUCylbtYwrU6loOUw3euGrNtW1J651ot1A== -bytebuffer@~5: - version "5.0.1" - resolved "https://registry.yarnpkg.com/bytebuffer/-/bytebuffer-5.0.1.tgz#582eea4b1a873b6d020a48d58df85f0bba6cfddd" - integrity 
sha512-IuzSdmADppkZ6DlpycMkm8l9zeEq16fWtLvunEwFiYciR/BHo4E8/xs5piFquG+Za8OWmMqHF8zuRviz2LHvRQ== - dependencies: - long "~3" - bytes@3.1.2: version "3.1.2" resolved "https://registry.yarnpkg.com/bytes/-/bytes-3.1.2.tgz#8b0beeb98605adf1b128fa4386403c009e0221a5" @@ -7159,11 +6408,6 @@ camelcase-keys@^6.2.2: map-obj "^4.0.0" quick-lru "^4.0.1" -camelcase@^2.0.1: - version "2.1.1" - resolved "https://registry.yarnpkg.com/camelcase/-/camelcase-2.1.1.tgz#7c1d16d679a1bbe59ca02cacecfb011e201f5a1f" - integrity sha512-DLIsRzJVBQu72meAKPkWQOLcujdXT32hwdfnkI1frSiSRMK1MofjKHf+MEx0SB6fjEFXL8fBDv1dKymBlOp4Qw== - camelcase@^3.0.0: version "3.0.0" resolved "https://registry.yarnpkg.com/camelcase/-/camelcase-3.0.0.tgz#32fc4b9fcdaf845fcdf7e73bb97cac2261f0ab0a" @@ -7241,24 +6485,19 @@ cbor@^5.2.0: bignumber.js "^9.0.1" nofilter "^1.0.4" -chai-as-promised@^7.1.0, chai-as-promised@^7.1.1: +chai-as-promised@^7.1.1: version "7.1.1" resolved "https://registry.yarnpkg.com/chai-as-promised/-/chai-as-promised-7.1.1.tgz#08645d825deb8696ee61725dbf590c012eb00ca0" integrity sha512-azL6xMoi+uxu6z4rhWQ1jbdUhOMhis2PvscD/xjLqNMkv3BPPp2JyyuTHOrf9BOosGpNQ11v6BKv/g57RXbiaA== dependencies: check-error "^1.0.2" -chai-bignumber@^3.0.0: - version "3.1.0" - resolved "https://registry.yarnpkg.com/chai-bignumber/-/chai-bignumber-3.1.0.tgz#e196456c760df21f0e124f6df922289ea15a7e4c" - integrity sha512-omxEc80jAU+pZwRmoWr3aEzeLad4JW3iBhLRQlgISvghBdIxrMT7mVAGsDz4WSyCkKowENshH2j9OABAhld7QQ== - chai-subset@^1.6.0: version "1.6.0" resolved "https://registry.yarnpkg.com/chai-subset/-/chai-subset-1.6.0.tgz#a5d0ca14e329a79596ed70058b6646bd6988cfe9" integrity sha512-K3d+KmqdS5XKW5DWPd5sgNffL3uxdDe+6GdnJh3AYPhwnBGRY5urfvfcbRtWIvvpz+KxkL9FeBB6MZewLUNwug== -chai@^4.0.1, chai@^4.3.6, chai@^4.3.7: +chai@^4.3.6: version "4.3.7" resolved "https://registry.yarnpkg.com/chai/-/chai-4.3.7.tgz#ec63f6df01829088e8bf55fca839bcd464a8ec51" integrity 
sha512-HLnAzZ2iupm25PlN0xFreAlBA5zaBSv3og0DdeGA4Ar6h6rJ3A0rolRUKJhSF2V10GZKDgWF/VmAEsNWjCRB+A== @@ -7402,7 +6641,7 @@ chokidar@3.3.1: optionalDependencies: fsevents "~2.1.2" -chokidar@3.5.3, chokidar@^3.0.2, chokidar@^3.5.3: +chokidar@3.5.3, chokidar@^3.5.3: version "3.5.3" resolved "https://registry.yarnpkg.com/chokidar/-/chokidar-3.5.3.tgz#1cf37c8707b932bd1af1ae22c0432e2acd1903bd" integrity sha512-Dr3sfKRP6oTcjf2JmUmFJfeVMvXBdegxB0iVQ5eb2V10uFJUCAS8OByZdVAyVb8xXNz3GjjTgj9kLWsZTqE6kw== @@ -7417,7 +6656,7 @@ chokidar@3.5.3, chokidar@^3.0.2, chokidar@^3.5.3: optionalDependencies: fsevents "~2.3.2" -chownr@^1.0.1, chownr@^1.1.1, chownr@^1.1.4: +chownr@^1.1.4: version "1.1.4" resolved "https://registry.yarnpkg.com/chownr/-/chownr-1.1.4.tgz#6fc9d7b42d32a583596337666e7d08084da2cc6b" integrity sha512-jJ0bqzaylmJtVnNgzTeSOs8DPavpbYgEr/b0YL8/2GO3xJEhInFmhKMUnEJQjZumK7KXGFhUy89PrsJWlakBVg== @@ -7563,7 +6802,7 @@ cli-width@^3.0.0: resolved "https://registry.yarnpkg.com/cli-width/-/cli-width-3.0.0.tgz#a2f48437a2caa9a22436e794bf071ec9e61cedf6" integrity sha512-FxqpkPPwu1HjuN93Omfm4h8uIanXofW0RxVEW3k5RKx+mJJYSthzNhp32Kzxxy3YAEZ/Dc/EWN1vZRY0+kOhbw== -cliui@^3.0.3, cliui@^3.2.0: +cliui@^3.2.0: version "3.2.0" resolved "https://registry.yarnpkg.com/cliui/-/cliui-3.2.0.tgz#120601537a916d29940f934da3b48d585a39213d" integrity sha512-0yayqDxWQbqk3ojkYqUKqaAQ6AfNKeKWRNA8kR0WXzAsdHpP4BIaOmMAG87JGuO6qcobyW4GjxHd9PmhEd+T9w== @@ -7661,14 +6900,6 @@ code-point-at@^1.0.0: resolved "https://registry.yarnpkg.com/code-point-at/-/code-point-at-1.1.0.tgz#0d070b4d043a5bea33a2f1a40e2edb3d9a4ccf77" integrity sha512-RpAVKQA5T63xEj6/giIbUEtZwJ4UFIc3ZtvEkiaUERylqe8xb5IvqcgOurZLahv93CLKfxcw5YI+DZcUBRyLXA== -coinstring@^2.0.0: - version "2.3.0" - resolved "https://registry.yarnpkg.com/coinstring/-/coinstring-2.3.0.tgz#cdb63363a961502404a25afb82c2e26d5ff627a4" - integrity sha512-2xMhQ++4ETUPiy2oqOlfydsuQArNLB6TExNF33Jmv+IgpmV8Hf6v6yICQAwH4uEHTnkJ3DscSyeKFrg37ljIOw== - dependencies: - bs58 "^2.0.1" - 
create-hash "^1.1.1" - collect-v8-coverage@^1.0.0: version "1.0.1" resolved "https://registry.yarnpkg.com/collect-v8-coverage/-/collect-v8-coverage-1.0.1.tgz#cc2c8e94fc18bbdffe64d6534570c8a673b27f59" @@ -7708,16 +6939,11 @@ colors@1.0.x: resolved "https://registry.yarnpkg.com/colors/-/colors-1.0.3.tgz#0433f44d809680fdeb60ed260f1b0c262e82a40b" integrity sha512-pFGrxThWcWQ2MsAz6RtgeWe4NK2kUE1WfsrvvlctdII745EW9I0yflqhe7++M5LEc7bV2c/9/5zc8sFcpL0Drw== -colors@1.4.0, colors@^1.0.3, colors@^1.1.2: +colors@1.4.0, colors@^1.1.2: version "1.4.0" resolved "https://registry.yarnpkg.com/colors/-/colors-1.4.0.tgz#c50491479d4c1bdaed2c9ced32cf7c7dc2360f78" integrity sha512-a+UqTh4kgZg/SlGvfbzDHpgRu7AAQOmmqRHJnxhRZICKFUT91brVhNNt58CMWU9PsBbv3PDCZUHbVxuDiH2mtA== -colour@~0.7.1: - version "0.7.1" - resolved "https://registry.yarnpkg.com/colour/-/colour-0.7.1.tgz#9cb169917ec5d12c0736d3e8685746df1cadf778" - integrity sha512-Rel466v0EnmKPcsxHo91L4kgPs/6XF7Pu2LJNszq9lXYwi5CFWEeIiRaTX5ym7PPMdj4udDHkLSVC1//JVkZQg== - columnify@^1.6.0: version "1.6.0" resolved "https://registry.yarnpkg.com/columnify/-/columnify-1.6.0.tgz#6989531713c9008bb29735e61e37acf5bd553cf3" @@ -7790,28 +7016,11 @@ compare-func@^2.0.0: array-ify "^1.0.0" dot-prop "^5.1.0" -compare-versions@^6.0.0: - version "6.1.0" - resolved "https://registry.yarnpkg.com/compare-versions/-/compare-versions-6.1.0.tgz#3f2131e3ae93577df111dba133e6db876ffe127a" - integrity sha512-LNZQXhqUvqUTotpZ00qLSaify3b4VFD588aRr8MKFw4CMUr98ytzCW5wDH5qx/DEY5kCDXcbcRuCqL0szEf2tg== - -complex.js@2.0.11: - version "2.0.11" - resolved "https://registry.yarnpkg.com/complex.js/-/complex.js-2.0.11.tgz#09a873fbf15ffd8c18c9c2201ccef425c32b8bf1" - integrity sha512-6IArJLApNtdg1P1dFtn3dnyzoZBEF0MwMnrfF1exSBRpZYoy4yieMkpZhQDC0uwctw48vii0CFVyHfpgZ/DfGw== - component-emitter@^1.2.0: version "1.3.0" resolved "https://registry.yarnpkg.com/component-emitter/-/component-emitter-1.3.0.tgz#16e4070fba8ae29b679f2215853ee181ab2eabc0" integrity 
sha512-Rd3se6QB+sO1TwqZjscQrurpEPIfO0/yYnSin6Q/rD3mOutHvUrCAhJub3r90uNb+SESBuE0QYoB90YdfatsRg== -compressible@^2.0.12: - version "2.0.18" - resolved "https://registry.yarnpkg.com/compressible/-/compressible-2.0.18.tgz#af53cca6b070d4c3c0750fbd77286a6d7cc46fba" - integrity sha512-AF3r7P5dWxL8MxyITRMlORQNaOA2IkAFaTr4k7BUumjPtRpGDTZpl0Pb1XCO6JeDCBdp126Cgs9sMxqSjgYyRg== - dependencies: - mime-db ">= 1.43.0 < 2" - concat-map@0.0.1: version "0.0.1" resolved "https://registry.yarnpkg.com/concat-map/-/concat-map-0.0.1.tgz#d8a96bd77fd68df7793a73036a3ba0d5405d477b" @@ -7873,7 +7082,7 @@ configstore@^4.0.0: write-file-atomic "^2.0.0" xdg-basedir "^3.0.0" -console-control-strings@^1.0.0, console-control-strings@^1.1.0, console-control-strings@~1.1.0: +console-control-strings@^1.1.0: version "1.1.0" resolved "https://registry.yarnpkg.com/console-control-strings/-/console-control-strings-1.1.0.tgz#3d7cf4464db6446ea644bf4b39507f9851008e8e" integrity sha512-ty/fTekppD2fIwRvnZAVdeOiGd1c7YXEixbgJTNzqcxJWKQnjJ/V1bNEEE6hygpM3WjwHFUVK6HTjWSzV4a8sQ== @@ -8092,7 +7301,7 @@ create-ecdh@^4.0.0: bn.js "^4.1.0" elliptic "^6.5.3" -create-hash@^1.1.0, create-hash@^1.1.1, create-hash@^1.1.2, create-hash@^1.2.0: +create-hash@^1.1.0, create-hash@^1.1.2, create-hash@^1.2.0: version "1.2.0" resolved "https://registry.yarnpkg.com/create-hash/-/create-hash-1.2.0.tgz#889078af11a63756bcfb59bd221996be3a9ef196" integrity sha512-z00bCGNHDG8mHAkP7CtT1qVu+bFQUPjYq/4Iv3C3kWjTFV10zIjfSoeqXo9Asws8gwSHDGj/hl2u4OGIjapeCg== @@ -8120,19 +7329,13 @@ create-require@^1.1.0: resolved "https://registry.yarnpkg.com/create-require/-/create-require-1.1.1.tgz#c1d7e8f1e5f6cfc9ff65f9cd352d37348756c333" integrity sha512-dcKFX3jn0MpIaXjisoRvexIJVEKzaq7z2rZKxf+MSr9TkdmHmsU4m2lcLojrj/FHl8mk5VxMmYA+ftRkP/3oKQ== -cross-env@^5.1.6: - version "5.2.1" - resolved "https://registry.yarnpkg.com/cross-env/-/cross-env-5.2.1.tgz#b2c76c1ca7add66dc874d11798466094f551b34d" - integrity 
sha512-1yHhtcfAd1r4nwQgknowuUNfIT9E8dOMMspC36g45dN+iD1blloi7xp8X/xAIDnjHWyt1uQ8PHk2fkNaym7soQ== - dependencies: - cross-spawn "^6.0.5" - -cross-fetch@3.0.6: - version "3.0.6" - resolved "https://registry.yarnpkg.com/cross-fetch/-/cross-fetch-3.0.6.tgz#3a4040bc8941e653e0e9cf17f29ebcd177d3365c" - integrity sha512-KBPUbqgFjzWlVcURG+Svp9TlhA5uliYtiNx/0r8nv0pdypeQCRJ9IaSIc3q/x3q8t3F75cHuwxVql1HFGHCNJQ== +cross-fetch@^2.1.0: + version "2.2.6" + resolved "https://registry.yarnpkg.com/cross-fetch/-/cross-fetch-2.2.6.tgz#2ef0bb39a24ac034787965c457368a28730e220a" + integrity sha512-9JZz+vXCmfKUZ68zAptS7k4Nu8e2qcibe7WVZYps7sAgk5R8GYTc+T1WR0v1rlP9HxgARmOX1UTIJZFytajpNA== dependencies: - node-fetch "2.6.1" + node-fetch "^2.6.7" + whatwg-fetch "^2.0.4" cross-fetch@^3.1.4: version "3.1.5" @@ -8141,13 +7344,6 @@ cross-fetch@^3.1.4: dependencies: node-fetch "2.6.7" -cross-fetch@^4.0.0: - version "4.0.0" - resolved "https://registry.yarnpkg.com/cross-fetch/-/cross-fetch-4.0.0.tgz#f037aef1580bb3a1a35164ea2a848ba81b445983" - integrity sha512-e4a5N8lVvuLgAWgnCrLr2PP0YyDOTHa9H/Rj54dirp61qXnNq46m82bRhNqIA5VccJtWBvPTFRV3TtvHUKPB1g== - dependencies: - node-fetch "^2.6.12" - cross-spawn@^6.0.0, cross-spawn@^6.0.5: version "6.0.5" resolved "https://registry.yarnpkg.com/cross-spawn/-/cross-spawn-6.0.5.tgz#4a5ec7c64dfae22c3a14124dbacdee846d80cbc4" @@ -8234,26 +7430,6 @@ cssfilter@0.0.10: resolved "https://registry.yarnpkg.com/cssfilter/-/cssfilter-0.0.10.tgz#c6d2672632a2e5c83e013e6864a42ce8defd20ae" integrity sha512-FAaLDaplstoRsDR8XGYH51znUN0UY7nMc6Z9/fvE8EXGwvJE9hu7W2vHwx1+bd6gCYnln9nLbzxFTrcO9YQDZw== -csstype@^3.0.2: - version "3.1.2" - resolved "https://registry.yarnpkg.com/csstype/-/csstype-3.1.2.tgz#1d4bf9d572f11c14031f0436e1c10bc1f571f50b" - integrity sha512-I7K1Uu0MBPzaFKg4nI5Q7Vs2t+3gWWW648spaF+Rg7pI9ds18Ugn+lvg4SHczUdKlHI5LWBXyqfS8+DufyBsgQ== - -csv-parser@^2.0.0: - version "2.3.5" - resolved 
"https://registry.yarnpkg.com/csv-parser/-/csv-parser-2.3.5.tgz#6b3bf0907684914ff2c5abfbadab111a69eae5db" - integrity sha512-LCHolC4AlNwL+5EuD5LH2VVNKpD8QixZW2zzK1XmrVYUaslFY4c5BooERHOCIubG9iv/DAyFjs4x0HvWNZuyWg== - dependencies: - minimist "^1.2.0" - through2 "^3.0.1" - -csv-stringify@^4.3.1: - version "4.3.1" - resolved "https://registry.yarnpkg.com/csv-stringify/-/csv-stringify-4.3.1.tgz#7bee36f746ef555dd481a735a9e2938965f8478b" - integrity sha512-VRjPYIUzex5kfbsOY7LaJcNE2qMWGQQAanb3/Vv85WbOgA+dAfDNfwntRvv335icJgGYrnTX403WxJxRVpLDFA== - dependencies: - lodash.get "~4.4.2" - cycle@1.0.x: version "1.0.3" resolved "https://registry.yarnpkg.com/cycle/-/cycle-1.0.3.tgz#21e80b2be8580f98b468f379430662b046c34ad2" @@ -8289,11 +7465,6 @@ dataloader@2.1.0: resolved "https://registry.yarnpkg.com/dataloader/-/dataloader-2.1.0.tgz#c69c538235e85e7ac6c6c444bae8ecabf5de9df7" integrity sha512-qTcEYLen3r7ojZNgVUaRggOI+KM7jrKxXeSHhogh/TWxYMeONEMqY+hmkobiYQozsGIyg9OYVzO4ZIfoB4I0pQ== -date-and-time@^0.6.3: - version "0.6.3" - resolved "https://registry.yarnpkg.com/date-and-time/-/date-and-time-0.6.3.tgz#2daee52df67c28bd93bce862756ac86b68cf4237" - integrity sha512-lcWy3AXDRJOD7MplwZMmNSRM//kZtJaLz4n6D1P5z9wEmZGBKhJRBIr1Xs9KNQJmdXPblvgffynYji4iylUTcA== - dateformat@^3.0.0: version "3.0.3" resolved "https://registry.yarnpkg.com/dateformat/-/dateformat-3.0.3.tgz#a6e37499a4d9a9cf85ef5872044d62901c9889ae" @@ -8334,7 +7505,7 @@ debug@4, debug@4.3.4, debug@^4.0.1, debug@^4.1.0, debug@^4.1.1, debug@^4.3.1, de dependencies: ms "2.1.2" -debug@^3.1.0, debug@^3.2.6, debug@^3.2.7: +debug@^3.1.0, debug@^3.2.7: version "3.2.7" resolved "https://registry.yarnpkg.com/debug/-/debug-3.2.7.tgz#72580b7e9145fb39b6676f9c5e5fb100b934179a" integrity sha512-CFjzYYAi4ThfiQvizrFQevTTXHtnCqWfe7x1AhgEscTz6ZbLbfoLRLPugTQyBth6f8ZERVUSyWHFD/7Wu4t1XQ== @@ -8364,16 +7535,6 @@ decamelize@^4.0.0: resolved "https://registry.yarnpkg.com/decamelize/-/decamelize-4.0.0.tgz#aa472d7bf660eb15f3494efd531cab7f2a709837" integrity 
sha512-9iE1PgSik9HeIIw2JO94IidnE3eBoQrFJ3w7sFuzSX4DpmZ3v5sZpUiV5Swcf6mQEF+Y0ru8Neo+p+nyh2J+hQ== -decimal.js@10.2.0: - version "10.2.0" - resolved "https://registry.yarnpkg.com/decimal.js/-/decimal.js-10.2.0.tgz#39466113a9e036111d02f82489b5fd6b0b5ed231" - integrity sha512-vDPw+rDgn3bZe1+F/pyEwb1oMG2XTlRVgAa6B4KccTEpYgF8w6eQllVbQcfIJnZyvzFtFpxnpGtx8dd7DJp/Rw== - -decimal.js@^10.0.0: - version "10.4.3" - resolved "https://registry.yarnpkg.com/decimal.js/-/decimal.js-10.4.3.tgz#1044092884d245d1b7f65725fa4ad4c6f781cc23" - integrity sha512-VBBaLc1MgL5XpzgIP7ny5Z6Nx3UrRkIViUkPUdtl9aya5amy3De1gsUUSB1g3+3sExYNjCAsAznmukyxCb1GRA== - decode-uri-component@^0.2.0: version "0.2.2" resolved "https://registry.yarnpkg.com/decode-uri-component/-/decode-uri-component-0.2.2.tgz#e69dbe25d37941171dd540e024c444cd5188e1e9" @@ -8386,13 +7547,6 @@ decompress-response@^3.2.0, decompress-response@^3.3.0: dependencies: mimic-response "^1.0.0" -decompress-response@^4.2.0: - version "4.2.1" - resolved "https://registry.yarnpkg.com/decompress-response/-/decompress-response-4.2.1.tgz#414023cc7a302da25ce2ec82d0d5238ccafd8986" - integrity sha512-jOSne2qbyE+/r8G1VU+G/82LBs2Fs4LAsTiLSHOCOMZQl2OKZ6i8i4IyHemTe+/yIXOtTcRQMzPcgyhoFlqPkw== - dependencies: - mimic-response "^2.0.0" - decompress-response@^6.0.0: version "6.0.0" resolved "https://registry.yarnpkg.com/decompress-response/-/decompress-response-6.0.0.tgz#ca387612ddb7e104bd16d85aab00d5ecf09c66fc" @@ -8631,21 +7785,11 @@ detect-installed@^2.0.4: dependencies: get-installed-path "^2.0.3" -detect-libc@^1.0.2, detect-libc@^1.0.3: - version "1.0.3" - resolved "https://registry.yarnpkg.com/detect-libc/-/detect-libc-1.0.3.tgz#fa137c4bd698edf55cd5cd02ac559f91a4c4ba9b" - integrity sha512-pGjwhsmsp4kL2RTz08wcOlGN83otlqHeD/Z5T8GXZB+/YcpQ/dgo+lbU8ZsGxV0HIvqqxo9l7mqYwyYMD9bKDg== - detect-newline@^3.0.0: version "3.1.0" resolved "https://registry.yarnpkg.com/detect-newline/-/detect-newline-3.1.0.tgz#576f5dfc63ae1a192ff192d8ad3af6308991b651" integrity 
sha512-TLz+x/vEXm/Y7P7wn1EJFNLxYpUD4TgMosxY6fAVJUnJMbupHBOncxyWUG9OpTaH9EBD7uFI5LfEgmMOc54DsA== -detect-node@2.0.3: - version "2.0.3" - resolved "https://registry.yarnpkg.com/detect-node/-/detect-node-2.0.3.tgz#a2033c09cc8e158d37748fbde7507832bd6ce127" - integrity sha512-64uDTOK+fKEa6XoSbkkDoeAX8Ep1XhwxwZtL1aw1En5p5UOK/ekJoFqd5BB1o+uOvF1iHVv6qDUxdOQ/VgWEQg== - detect-package-manager@^2.0.1: version "2.0.1" resolved "https://registry.yarnpkg.com/detect-package-manager/-/detect-package-manager-2.0.1.tgz#6b182e3ae5e1826752bfef1de9a7b828cffa50d8" @@ -8697,11 +7841,6 @@ dir-glob@^3.0.1: dependencies: path-type "^4.0.0" -dirty-chai@^2.0.1: - version "2.0.1" - resolved "https://registry.yarnpkg.com/dirty-chai/-/dirty-chai-2.0.1.tgz#6b2162ef17f7943589da840abc96e75bda01aff3" - integrity sha512-ys79pWKvDMowIDEPC6Fig8d5THiC0DJ2gmTeGzVAoEH18J8OzLud0Jh7I9IWg3NSk8x2UocznUuFmfHCXYZx9w== - doctrine@0.7.2: version "0.7.2" resolved "https://registry.yarnpkg.com/doctrine/-/doctrine-0.7.2.tgz#7cb860359ba3be90e040b26b729ce4bfa654c523" @@ -8805,20 +7944,15 @@ dotenv-expand@^10.0.0: resolved "https://registry.yarnpkg.com/dotenv-expand/-/dotenv-expand-10.0.0.tgz#12605d00fb0af6d0a592e6558585784032e4ef37" integrity sha512-GopVGCpVS1UKH75VKHGuQFqS1Gusej0z4FyQkPdwjil2gNIv+LNsqBlboOzpJFZKVT95GkCyWJbBSdFEFUWI2A== -dotenv@*, dotenv@^16.0.3: +dotenv@^16.0.3: version "16.0.3" resolved "https://registry.yarnpkg.com/dotenv/-/dotenv-16.0.3.tgz#115aec42bac5053db3c456db30cc243a5a836a07" integrity sha512-7GO6HghkA5fYG9TYnNxi14/7K9f5occMlp3zXAuSxn7CKCxt9xbNWG7yF8hTCSUchlfWSe3uLmlPfigevRItzQ== -dotenv@8.2.0: - version "8.2.0" - resolved "https://registry.yarnpkg.com/dotenv/-/dotenv-8.2.0.tgz#97e619259ada750eea3e4ea3e26bceea5424b16a" - integrity sha512-8sJ78ElpbDJBHNeBzUbUVLsqKdccaa/BXF1uPTw3GrvQTBgrQrtObr2mUrE38vzYd8cEv+m/JBfDLioYcfXoaw== - -dotenv@^8.2.0: - version "8.6.0" - resolved "https://registry.yarnpkg.com/dotenv/-/dotenv-8.6.0.tgz#061af664d19f7f4d8fc6e4ff9b584ce237adcb8b" - integrity 
sha512-IrPdXQsk2BbzvCBGBOTmmSH5SodmqZNt4ERAZDmW4CT+tL8VtvinqywuANaFu4bOMWki16nqf0e4oC0QIaDr/g== +dotenv@^16.5.0: + version "16.5.0" + resolved "https://registry.yarnpkg.com/dotenv/-/dotenv-16.5.0.tgz#092b49f25f808f020050051d1ff258e404c78692" + integrity sha512-m/C+AwOAr9/W1UOIZUo232ejMNnJAJtYQjUbHoNTBNTJSvqzzDh7vnrei3o3r3m9blf6ZoDkvcw0VmozNRFJxg== dotenv@~10.0.0: version "10.0.0" @@ -8837,26 +7971,10 @@ double-ended-queue@2.1.0-0: resolved "https://registry.yarnpkg.com/double-ended-queue/-/double-ended-queue-2.1.0-0.tgz#103d3527fd31528f40188130c841efdd78264e5c" integrity sha512-+BNfZ+deCo8hMNpDqDnvT+c0XpJ5cUa6mqYq89bho2Ifze4URTqRkcwR399hWoTrTkbZ/XJYDgP6rc7pRgffEQ== -drbg.js@^1.0.1: - version "1.0.1" - resolved "https://registry.yarnpkg.com/drbg.js/-/drbg.js-1.0.1.tgz#3e36b6c42b37043823cdbc332d58f31e2445480b" - integrity sha512-F4wZ06PvqxYLFEZKkFxTDcns9oFNk34hvmJSEwdzsxVQ8YI5YaxtACgQatkYgv2VI2CFkUd2Y+xosPQnHv809g== - dependencies: - browserify-aes "^1.0.6" - create-hash "^1.1.2" - create-hmac "^1.1.4" - "ds-test@github:dapphub/ds-test": version "1.0.0" resolved "https://codeload.github.com/dapphub/ds-test/tar.gz/e282159d5170298eb2455a6c05280ab5a73a4ef0" -dtrace-provider@~0.8: - version "0.8.8" - resolved "https://registry.yarnpkg.com/dtrace-provider/-/dtrace-provider-0.8.8.tgz#2996d5490c37e1347be263b423ed7b297fb0d97e" - integrity sha512-b7Z7cNtHPhH9EJhNNbbeqTcXB8LGFFZhq1PGgEvpeHlzd36bhbdTWoE/Ba/YguqpBSlAPKnARWhVlhunCMwfxg== - dependencies: - nan "^2.14.0" - duplexer3@^0.1.4: version "0.1.5" resolved "https://registry.yarnpkg.com/duplexer3/-/duplexer3-0.1.5.tgz#0b5e4d7bad5de8901ea4440624c8e1d20099217e" @@ -8867,26 +7985,6 @@ duplexer@^0.1.1: resolved "https://registry.yarnpkg.com/duplexer/-/duplexer-0.1.1.tgz#ace6ff808c1ce66b57d1ebf97977acb02334cfc1" integrity sha1-rOb/gIwc5mtX0ev5eXessCM0z8E= -duplexify@^3.5.0, duplexify@^3.6.0: - version "3.7.1" - resolved "https://registry.yarnpkg.com/duplexify/-/duplexify-3.7.1.tgz#2a4df5317f6ccfd91f86d6fd25d8d8a103b88309" - 
integrity sha512-07z8uv2wMyS51kKhD1KsdXJg5WQ6t93RneqRxUHnskXVtlYYkLqM0gqStQZ3pj073g687jPCHrqNfCzawLYh5g== - dependencies: - end-of-stream "^1.0.0" - inherits "^2.0.1" - readable-stream "^2.0.0" - stream-shift "^1.0.0" - -duplexify@^4.0.0: - version "4.1.2" - resolved "https://registry.yarnpkg.com/duplexify/-/duplexify-4.1.2.tgz#18b4f8d28289132fa0b9573c898d9f903f81c7b0" - integrity sha512-fz3OjcNCHmRP12MJoZMPglx8m4rrFP8rovnk4vT8Fs+aonZoCwGg10dSsQsfP/E62eZcPTMSMP6686fu9Qlqtw== - dependencies: - end-of-stream "^1.4.1" - inherits "^2.0.3" - readable-stream "^3.1.1" - stream-shift "^1.0.0" - eastasianwidth@^0.2.0: version "0.2.0" resolved "https://registry.yarnpkg.com/eastasianwidth/-/eastasianwidth-0.2.0.tgz#696ce2ec0aa0e6ea93a397ffcf24aa7840c827cb" @@ -8900,21 +7998,13 @@ ecc-jsbn@~0.1.1: jsbn "~0.1.0" safer-buffer "^2.1.0" -ecdsa-sig-formatter@1.0.11, ecdsa-sig-formatter@^1.0.11: +ecdsa-sig-formatter@1.0.11: version "1.0.11" resolved "https://registry.yarnpkg.com/ecdsa-sig-formatter/-/ecdsa-sig-formatter-1.0.11.tgz#ae0f0fa2d85045ef14a817daa3ce9acd0489e5bf" integrity sha512-nagl3RYrbNv6kQkeJIpt6NJZy8twLB/2vtz6yN9Z4vRKHN4/QZJIEbqohALSgwKdnksuY3k5Addp5lg8sVoVcQ== dependencies: safe-buffer "^5.0.1" -ecurve@^1.0.6: - version "1.0.6" - resolved "https://registry.yarnpkg.com/ecurve/-/ecurve-1.0.6.tgz#dfdabbb7149f8d8b78816be5a7d5b83fcf6de797" - integrity sha512-/BzEjNfiSuB7jIWKcS/z8FK9jNjmEWvUV2YZ4RLSmcDtP7Lq0m6FvDuSnJpBlDpGRpfRQeTLGLBI8H+kEv0r+w== - dependencies: - bigi "^1.1.0" - safe-buffer "^5.0.1" - ee-first@1.1.1: version "1.1.1" resolved "https://registry.yarnpkg.com/ee-first/-/ee-first-1.1.1.tgz#590c61156b0ae2f4f0255732a158b266bc56b21d" @@ -9044,11 +8134,6 @@ enquirer@~2.3.6: dependencies: ansi-colors "^4.1.1" -ent@^2.2.0: - version "2.2.0" - resolved "https://registry.yarnpkg.com/ent/-/ent-2.2.0.tgz#e964219325a21d05f44466a2f686ed6ce5f5dd1d" - integrity sha512-GHrMyVZQWvTIdDtpiEXdHZnFQKzeO09apj8Cbl4pKWy4i0Oprcq17usfDt5aO63swf0JOeMWjWQE/LzgSRuWpA== - entities@^4.2.0, 
entities@^4.4.0: version "4.5.0" resolved "https://registry.yarnpkg.com/entities/-/entities-4.5.0.tgz#5d268ea5e7113ec74c4d033b79ea5a35a488fb48" @@ -9231,18 +8316,11 @@ es6-iterator@^2.0.3: es5-ext "^0.10.35" es6-symbol "^3.1.1" -es6-promise@^4.0.3, es6-promise@^4.2.8: +es6-promise@^4.2.8: version "4.2.8" resolved "https://registry.yarnpkg.com/es6-promise/-/es6-promise-4.2.8.tgz#4eb21594c972bc40553d276e510539143db53e0a" integrity sha512-HJDGx5daxeIvxdBxvG2cb9g4tEvwIk3i8+nhX0yGrYmZUzbkdg8QbDevheDB8gd0//uPj4c1EQua8Q+MViT0/w== -es6-promisify@^5.0.0: - version "5.0.0" - resolved "https://registry.yarnpkg.com/es6-promisify/-/es6-promisify-5.0.0.tgz#5109d62f3e56ea967c4b63505aef08291c8a5203" - integrity sha512-C+d6UdsYDk0lMebHNR4S2NybQMMngAOnOwYBQjTOiv0MkoJMP0Myw2mgpDLBcpfCmRLxyFqYhS/CfOENq4SJhQ== - dependencies: - es6-promise "^4.0.3" - es6-symbol@^3.1.1, es6-symbol@^3.1.3: version "3.1.3" resolved "https://registry.yarnpkg.com/es6-symbol/-/es6-symbol-3.1.3.tgz#bad5d3c1bcdac28269f4cb331e431c78ac705d18" @@ -9289,11 +8367,6 @@ escape-html@~1.0.3: resolved "https://registry.yarnpkg.com/escape-html/-/escape-html-1.0.3.tgz#0258eae4d3d0c0974de1c169188ef0051d1d1988" integrity sha512-NiSupZ4OeuGwr68lGIeym/ksIZMJodUGOSCZ/FSnTxcrekbvqrgdUxlJOMpijaKZVjAJrWrGs/6Jy8OMuyj9ow== -escape-latex@1.2.0: - version "1.2.0" - resolved "https://registry.yarnpkg.com/escape-latex/-/escape-latex-1.2.0.tgz#07c03818cf7dac250cce517f4fda1b001ef2bca1" - integrity sha512-nV5aVWW1K0wEiUIEdZ4erkGGH8mDxGyxSeqPzRNtWP7ataw+/olFObw7hujFWlVjNsaDFw5VZ5NzVSIqRgfTiw== - escape-string-regexp@1.0.5, escape-string-regexp@^1.0.5: version "1.0.5" resolved "https://registry.yarnpkg.com/escape-string-regexp/-/escape-string-regexp-1.0.5.tgz#1b61c0562190a8dff6ae3bb2cf0200ca130b86d4" @@ -9309,18 +8382,6 @@ escape-string-regexp@^2.0.0: resolved "https://registry.yarnpkg.com/escape-string-regexp/-/escape-string-regexp-2.0.0.tgz#a30304e99daa32e23b2fd20f51babd07cffca344" integrity 
sha512-UpzcLCXolUWcNu5HtVMHYdXJjArjsF9C0aNnquZYY4uW/Vu0miy5YoWvbV345HauVvcAUnpRuhMMcqTcGOY2+w== -escodegen@1.8.x: - version "1.8.1" - resolved "https://registry.yarnpkg.com/escodegen/-/escodegen-1.8.1.tgz#5a5b53af4693110bebb0867aa3430dd3b70a1018" - integrity sha512-yhi5S+mNTOuRvyW4gWlg5W1byMaQGWWSYHXsuFZ7GBo7tpyOwi2EdzMP/QWxh9hwkD2m+wDVHJsxhRIj+v/b/A== - dependencies: - esprima "^2.7.1" - estraverse "^1.9.1" - esutils "^2.0.2" - optionator "^0.8.1" - optionalDependencies: - source-map "~0.2.0" - eslint-import-resolver-node@^0.3.9: version "0.3.9" resolved "https://registry.yarnpkg.com/eslint-import-resolver-node/-/eslint-import-resolver-node-0.3.9.tgz#d4eaac52b8a2e7c3cd1903eb00f7e053356118ac" @@ -9497,11 +8558,6 @@ espree@^9.6.0, espree@^9.6.1: acorn-jsx "^5.3.2" eslint-visitor-keys "^3.4.1" -esprima@2.7.x, esprima@^2.7.1: - version "2.7.3" - resolved "https://registry.yarnpkg.com/esprima/-/esprima-2.7.3.tgz#96e3b70d5779f6ad49cd032673d1c312767ba581" - integrity sha512-OarPfz0lFCiW4/AV2Oy1Rp9qu0iusTKqykwTspGCZtPxmF81JR4MmIebvF1F9+UOKth2ZubLQ4XGGaU+hSn99A== - esprima@^4.0.0: version "4.0.1" resolved "https://registry.yarnpkg.com/esprima/-/esprima-4.0.1.tgz#13b04cdb3e6c5d19df91ab6987a8695619b0aa71" @@ -9521,11 +8577,6 @@ esrecurse@^4.1.0, esrecurse@^4.3.0: dependencies: estraverse "^5.2.0" -estraverse@^1.9.1: - version "1.9.3" - resolved "https://registry.yarnpkg.com/estraverse/-/estraverse-1.9.3.tgz#af67f2dc922582415950926091a4005d29c9bb44" - integrity sha512-25w1fMXQrGdoquWnScXZGckOv+Wes+JDnuN/+7ex3SauFRS72r2lFDec0EKPt2YD1wUJ/IrfEex+9yp4hfSOJA== - estraverse@^4.1.1: version "4.3.0" resolved "https://registry.yarnpkg.com/estraverse/-/estraverse-4.3.0.tgz#398ad3f3c5a24948be7725e83d11a7de28cdbd1d" @@ -9725,6 +8776,16 @@ ethereum-common@^0.0.18: resolved "https://registry.yarnpkg.com/ethereum-common/-/ethereum-common-0.0.18.tgz#2fdc3576f232903358976eb39da783213ff9523f" integrity 
sha512-EoltVQTRNg2Uy4o84qpa2aXymXDJhxm7eos/ACOg0DG4baAbMjhbdAEsx9GeE8sC3XCxnYvrrzZDH8D8MtA2iQ== +ethereum-cryptography@1.1.2: + version "1.1.2" + resolved "https://registry.yarnpkg.com/ethereum-cryptography/-/ethereum-cryptography-1.1.2.tgz#74f2ac0f0f5fe79f012c889b3b8446a9a6264e6d" + integrity sha512-XDSJlg4BD+hq9N2FjvotwUET9Tfxpxc3kWGE2AqUG5vcbeunnbImVk3cj6e/xT3phdW21mE8R5IugU4fspQDcQ== + dependencies: + "@noble/hashes" "1.1.2" + "@noble/secp256k1" "1.6.3" + "@scure/bip32" "1.1.0" + "@scure/bip39" "1.1.0" + ethereum-cryptography@1.2.0, ethereum-cryptography@^1.0.3, ethereum-cryptography@^1.1.2: version "1.2.0" resolved "https://registry.yarnpkg.com/ethereum-cryptography/-/ethereum-cryptography-1.2.0.tgz#5ccfa183e85fdaf9f9b299a79430c044268c9b3a" @@ -9776,23 +8837,12 @@ ethereum-cryptography@^2.1.2: "@scure/bip32" "1.3.1" "@scure/bip39" "1.2.1" -ethereum-types@^3.7.1: - version "3.7.1" - resolved "https://registry.yarnpkg.com/ethereum-types/-/ethereum-types-3.7.1.tgz#8fa75e5d9f5da3c85535ea0d4bcd2614b1d650a8" - integrity sha512-EBQwTGnGZQ9oHK7Za3DFEOxiElksRCoZECkk418vHiE2d59lLSejDZ1hzRVphtFjAu5YqONz4/XuAYdMBg+gWA== - dependencies: - "@types/node" "12.12.54" - bignumber.js "~9.0.2" +ethereum-protocol@^1.0.1: + version "1.0.1" + resolved "https://registry.yarnpkg.com/ethereum-protocol/-/ethereum-protocol-1.0.1.tgz#b7d68142f4105e0ae7b5e178cf42f8d4dc4b93cf" + integrity sha512-3KLX1mHuEsBW0dKG+c6EOJS1NBNqdCICvZW9sInmZTt5aY0oxmHVggYRE0lJu1tcnMD1K+AKHdLi6U43Awm1Vg== -ethereumjs-abi@^0.6.8: - version "0.6.8" - resolved "https://registry.yarnpkg.com/ethereumjs-abi/-/ethereumjs-abi-0.6.8.tgz#71bc152db099f70e62f108b7cdfca1b362c6fcae" - integrity sha512-Tx0r/iXI6r+lRsdvkFDlut0N08jWMnKRZ6Gkq+Nmw75lZe4e6o3EkSnkaBP5NF6+m5PTGAr9JP43N3LyeoglsA== - dependencies: - bn.js "^4.11.8" - ethereumjs-util "^6.0.0" - -"ethereumjs-abi@git+https://github.com/ethereumjs/ethereumjs-abi.git": +ethereumjs-abi@^0.6.8, "ethereumjs-abi@git+https://github.com/ethereumjs/ethereumjs-abi.git": version 
"0.6.8" resolved "git+https://github.com/ethereumjs/ethereumjs-abi.git#ee3994657fa7a427238e6ba92a84d0b529bbcde0" dependencies: @@ -9889,7 +8939,7 @@ ethereumjs-util@^5.0.0, ethereumjs-util@^5.0.1, ethereumjs-util@^5.1.1, ethereum rlp "^2.0.0" safe-buffer "^5.1.1" -ethereumjs-util@^6.0.0, ethereumjs-util@^6.1.0, ethereumjs-util@^6.2.0: +ethereumjs-util@^6.0.0, ethereumjs-util@^6.1.0, ethereumjs-util@^6.2.0, ethereumjs-util@^6.2.1: version "6.2.1" resolved "https://registry.yarnpkg.com/ethereumjs-util/-/ethereumjs-util-6.2.1.tgz#fcb4e4dd5ceacb9d2305426ab1a5cd93e3163b69" integrity sha512-W2Ktez4L01Vexijrm5EB6w7dg4n/TgpoYU4avuT5T3Vmnw/eCRtiBrJfQYS/DCSvDIOLn2k57GcHdeBcgVxAqw== @@ -9951,7 +9001,7 @@ ethereumjs-vm@^2.0.2, ethereumjs-vm@^2.3.4: rustbn.js "~0.2.0" safe-buffer "^5.1.1" -ethereumjs-wallet@^0.6.0, ethereumjs-wallet@^0.6.3: +ethereumjs-wallet@^0.6.0: version "0.6.5" resolved "https://registry.yarnpkg.com/ethereumjs-wallet/-/ethereumjs-wallet-0.6.5.tgz#685e9091645cee230ad125c007658833991ed474" integrity sha512-MDwjwB9VQVnpp/Dc1XzA6J1a3wgHQ4hSvA1uWNatdpOrtCbPVuQSKSyRnjLvS0a+KKMw2pvQ9Ybqpb3+eW8oNA== @@ -9982,7 +9032,7 @@ ethers@4.0.0-beta.3: uuid "2.0.1" xmlhttprequest "1.8.0" -ethers@^4.0.20, ethers@^4.0.32, ethers@^4.0.40, ethers@~4.0.4: +ethers@^4.0.20, ethers@^4.0.32, ethers@^4.0.40: version "4.0.49" resolved "https://registry.yarnpkg.com/ethers/-/ethers-4.0.49.tgz#0eb0e9161a0c8b4761be547396bbe2fb121a8894" integrity sha512-kPltTvWiyu+OktYy1IStSO16i2e7cS9D9OxZ81q2UUaiNPVrm/RTcbxamCXF9VUSKzJIdJV68EAIhTEVBalRWg== @@ -9997,7 +9047,7 @@ ethers@^4.0.20, ethers@^4.0.32, ethers@^4.0.40, ethers@~4.0.4: uuid "2.0.1" xmlhttprequest "1.8.0" -ethers@^5.0.13, ethers@^5.2.0, ethers@^5.7.1, ethers@^5.7.2: +ethers@^5.0.13, ethers@^5.7.1, ethers@^5.7.2: version "5.7.2" resolved "https://registry.yarnpkg.com/ethers/-/ethers-5.7.2.tgz#3a7deeabbb8c030d4126b24f84e525466145872e" integrity sha512-wswUsmWo1aOK8rR7DIKiWSw9DbLWe6x98Jrn8wcTflTVvaXhAMaB5zGAXy0GYQEQp9iO1iSHWVyARQm11zUtyg== 
@@ -10033,7 +9083,7 @@ ethers@^5.0.13, ethers@^5.2.0, ethers@^5.7.1, ethers@^5.7.2: "@ethersproject/web" "5.7.1" "@ethersproject/wordlists" "5.7.0" -ethjs-unit@0.1.6, ethjs-unit@^0.1.6: +ethjs-unit@0.1.6: version "0.1.6" resolved "https://registry.yarnpkg.com/ethjs-unit/-/ethjs-unit-0.1.6.tgz#c665921e476e87bce2a9d588a6fe0405b2c41699" integrity sha512-/Sn9Y0oKl0uqQuvgFk/zQgR7aw1g36qX/jzSQ5lSwlO0GigPymk4eGQfeNTD03w1dPOqfz8V77Cy43jH56pagw== @@ -10041,7 +9091,7 @@ ethjs-unit@0.1.6, ethjs-unit@^0.1.6: bn.js "4.11.6" number-to-bn "1.7.0" -ethjs-util@0.1.6, ethjs-util@^0.1.3: +ethjs-util@0.1.6, ethjs-util@^0.1.3, ethjs-util@^0.1.6: version "0.1.6" resolved "https://registry.yarnpkg.com/ethjs-util/-/ethjs-util-0.1.6.tgz#f308b62f185f9fe6237132fb2a9818866a5cd536" integrity sha512-CUnVOQq7gSpDHZVVrQW8ExxUETWrnrvXYvYz55wOU8Uj4VCgw56XC2B/fVqQN+f7gmrnRHSLVnFAwsCuNwji8w== @@ -10054,12 +9104,7 @@ event-target-shim@^5.0.0: resolved "https://registry.yarnpkg.com/event-target-shim/-/event-target-shim-5.0.1.tgz#5d4d3ebdf9583d63a5333ce2deb7480ab2b05789" integrity sha512-i/2XbnSz/uxRCU6+NdVJgKWDTM427+MqYbkQzD321DuCQJUqOuJKIA0IM2+W2xtYHdKOmZ4dR6fExsd4SXL+WQ== -eventemitter3@3.1.0: - version "3.1.0" - resolved "https://registry.yarnpkg.com/eventemitter3/-/eventemitter3-3.1.0.tgz#090b4d6cdbd645ed10bf750d4b5407942d7ba163" - integrity sha512-ivIvhpq/Y0uSjcHDcOIccjmYjGLcP09MFGE7ysAwkAvkXfpZlC985pH2/ui64DKazbTW/4kN3yqozUxlXzI6cA== - -eventemitter3@3.1.2, eventemitter3@^3.1.0: +eventemitter3@3.1.2: version "3.1.2" resolved "https://registry.yarnpkg.com/eventemitter3/-/eventemitter3-3.1.2.tgz#2d3d48f9c346698fce83a85d7d664e98535df6e7" integrity sha512-tvtQIeLVHjDkJYnzf2dgVMxfuSGJeM/7UCG17TT4EumTfNtF+0nebF/4zWOIkCreAbtNqhGEboB6BWrwqNaw4Q== @@ -10069,12 +9114,17 @@ eventemitter3@4.0.4: resolved "https://registry.yarnpkg.com/eventemitter3/-/eventemitter3-4.0.4.tgz#b5463ace635a083d018bdc7c917b4c5f10a85384" integrity 
sha512-rlaVLnVxtxvoyLsQQFBx53YmXHDxRIzzTLbdfxqi4yocpSjAxXwkU0cScM5JgSKMqEhrZpnvQ2D9gjylR0AimQ== +eventemitter3@5.0.1: + version "5.0.1" + resolved "https://registry.yarnpkg.com/eventemitter3/-/eventemitter3-5.0.1.tgz#53f5ffd0a492ac800721bb42c66b841de96423c4" + integrity sha512-GWkBvjiSZK87ELrYOSESUYeVIc9mvLLf/nXalMOS5dYrgZq9o5OVkbZAVM06CVxYsCwH9BDZFPlQTlPA1j4ahA== + eventemitter3@^4.0.4: version "4.0.7" resolved "https://registry.yarnpkg.com/eventemitter3/-/eventemitter3-4.0.7.tgz#2de9b68f6528d5644ef5c59526a1b4a07306169f" integrity sha512-8guHBZCwKnFhYdHr2ysuRWErTwhoN2X8XELRlrRwpmfeY2jjuUN4taQMsULKUVo1K4DvZl+0pgfyoysHxvmvEw== -events@^3.0.0, events@^3.3.0: +events@^3.0.0: version "3.3.0" resolved "https://registry.yarnpkg.com/events/-/events-3.3.0.tgz#31a95ad0a924e2d2c419a813aeb2c4e878ea7400" integrity sha512-mQw+2fkQbALzQ7V0MY0IqdnXNOeTtP4r0lN9z7AAawCXgqea7bDii20AYrIBrFd/Hx0M2Ocz6S111CaFkUcb0Q== @@ -10087,13 +9137,6 @@ evp_bytestokey@^1.0.0, evp_bytestokey@^1.0.3: md5.js "^1.3.4" safe-buffer "^5.1.1" -exception-formatter@^1.0.4: - version "1.0.7" - resolved "https://registry.yarnpkg.com/exception-formatter/-/exception-formatter-1.0.7.tgz#3291616b86fceabefa97aee6a4708032c6e3b96d" - integrity sha512-zV45vEsjytJrwfGq6X9qd1Ll56cW4NC2mhCO6lqwMk4ZpA1fZ6C3UiaQM/X7if+7wZFmCgss3ahp9B/uVFuLRw== - dependencies: - colors "^1.0.3" - execa@^1.0.0: version "1.0.0" resolved "https://registry.yarnpkg.com/execa/-/execa-1.0.0.tgz#c6236a5bb4df6d6f15e88e7f017798216749ddd8" @@ -10171,11 +9214,6 @@ expand-range@^1.8.1: dependencies: fill-range "^2.1.0" -expand-template@^2.0.3: - version "2.0.3" - resolved "https://registry.yarnpkg.com/expand-template/-/expand-template-2.0.3.tgz#6e14b3fcee0f3a6340ecb57d2e8918692052a47c" - integrity sha512-XYfuKMvj4O35f/pOXLObndIRvyQ+/+6AhODh+OKWj9S9498pHHn/IMszH+gt0fBCRWMNfk1ZSp5x3AifmnI2vg== - expand-tilde@^1.2.2: version "1.2.2" resolved "https://registry.yarnpkg.com/expand-tilde/-/expand-tilde-1.2.2.tgz#0b81eba897e5a3d31d1c3d102f8f01441e559449" @@ 
-10259,7 +9297,7 @@ extend-shallow@^2.0.0: dependencies: is-extendable "^0.1.0" -extend@^3.0.0, extend@^3.0.1, extend@^3.0.2, extend@~3.0.2: +extend@~3.0.2: version "3.0.2" resolved "https://registry.yarnpkg.com/extend/-/extend-3.0.2.tgz#f8b1136b4071fbd8eb140aff858b1019ec2915fa" integrity sha512-fjquC59cD7CyW6urNXK0FBufkZcoiGG80wTuPujX590cB5Ttln20E2UB4S/WARVqhXffZl2LNgS+gQdPIIim/g== @@ -10367,11 +9405,6 @@ fast-safe-stringify@^2.0.6: resolved "https://registry.yarnpkg.com/fast-safe-stringify/-/fast-safe-stringify-2.1.1.tgz#c406a83b6e70d9e35ce3b30a81141df30aeba884" integrity sha512-W+KJc2dmILlPplD/H4K9l9LcAHAfPtP6BY84uVLXQ6Evcz9Lcg33Y2z1IVblT6xdY54PXYVHEv+0Wpq8Io6zkA== -fast-text-encoding@^1.0.0, fast-text-encoding@^1.0.3: - version "1.0.6" - resolved "https://registry.yarnpkg.com/fast-text-encoding/-/fast-text-encoding-1.0.6.tgz#0aa25f7f638222e3396d72bf936afcf1d42d6867" - integrity sha512-VhXlQgj9ioXCqGstD37E/HBeqEGV/qOD/kmbVG8h5xKBYvM1L3lR1Zn4555cQ8GkYbJa8aJSipLPndE1k6zK2w== - fastq@^1.6.0: version "1.15.0" resolved "https://registry.yarnpkg.com/fastq/-/fastq-1.15.0.tgz#d04d07c6a2a68fe4599fea8d2e103a937fae6b3a" @@ -10704,7 +9737,7 @@ form-data-encoder@^2.1.2: resolved "https://registry.yarnpkg.com/form-data-encoder/-/form-data-encoder-2.1.4.tgz#261ea35d2a70d48d30ec7a9603130fa5515e9cd5" integrity sha512-yDYSgNMraqvnxiEXO4hi88+YZxaHC6QKzb5N84iRCTDeRO7ZALpir/lVmf/uXUhnwUr2O4HU8s/n6x+yNjQkHw== -form-data@^2.2.0, form-data@^2.5.0: +form-data@^2.2.0: version "2.5.1" resolved "https://registry.yarnpkg.com/form-data/-/form-data-2.5.1.tgz#f2cbec57b5e59e23716e128fe44d4e5dd23895f4" integrity sha512-m21N3WOmEEURgk6B9GLOE4RuWOFf28Lhh9qGYeNlGq4VDXUlJy2th2slBNU8Gp8EzloYZOibZJ7t5ecIrFSjVA== @@ -10757,11 +9790,6 @@ fp-ts@2.1.1: resolved "https://registry.yarnpkg.com/fp-ts/-/fp-ts-2.1.1.tgz#c910544499d7c959351bb4260ee7c44a544084c1" integrity sha512-YcWhMdDCFCja0MmaDroTgNu+NWWrrnUEn92nvDgrtVy9Z71YFnhNVIghoHPt8gs82ijoMzFGeWKvArbyICiJgw== -fraction.js@4.0.12: - version "4.0.12" - 
resolved "https://registry.yarnpkg.com/fraction.js/-/fraction.js-4.0.12.tgz#0526d47c65a5fb4854df78bc77f7bec708d7b8c3" - integrity sha512-8Z1K0VTG4hzYY7kA/1sj4/r1/RWLBD3xwReT/RCrUCbzPszjNQCCsy3ktkU/eaEqX3MYa4pY37a52eiBlPMlhA== - fresh@0.5.2: version "0.5.2" resolved "https://registry.yarnpkg.com/fresh/-/fresh-0.5.2.tgz#3d8cadd90d976569fa835ab1f8e4b23a105605a7" @@ -10926,10 +9954,10 @@ functions-have-names@^1.2.2, functions-have-names@^1.2.3: resolved "https://registry.yarnpkg.com/functions-have-names/-/functions-have-names-1.2.3.tgz#0404fe4ee2ba2f607f0e0ec3c80bae994133b834" integrity sha512-xckBUXyTIqT97tq2x2AMb+g163b5JFysYk0x4qxNFwbfQkmNZoiRHb6sPzI9/QV33WeuvVYBUIiD4NzNIyqaRQ== -ganache@7.8.0, ganache@^7.4.0, "ganache@npm:@celo/ganache@7.8.0-unofficial.0": - version "7.8.0-unofficial.0" - resolved "https://registry.yarnpkg.com/@celo/ganache/-/ganache-7.8.0-unofficial.0.tgz#7930a68ea8df36c7862425a164b44e3e89cb3d0f" - integrity sha512-csobquRjvgG/0mzw6NRB4ga+idPq7yQKLISNbxn2JSS/nCkGCTKn4yvTJqhmZ1dzwL2U3qHmGer2qXlxFQgbdg== +ganache@7.8.0: + version "7.8.0" + resolved "https://registry.yarnpkg.com/ganache/-/ganache-7.8.0.tgz#02154384f246b66e98974cbcbb18e8372df3c2e0" + integrity sha512-IrUYvsaE/m2/NaVIZ7D/gCnsmyU/buechnH6MhUipzG1qJcZIwIp/DoP/LZUcHyhy0Bv0NKZD2pGOjpRhn7l7A== dependencies: "@trufflesuite/bigint-buffer" "1.1.10" "@trufflesuite/uws-js-unofficial" "20.10.0-unofficial.2" @@ -10961,74 +9989,6 @@ gauge@^4.0.3: strip-ansi "^6.0.1" wide-align "^1.1.5" -gauge@~2.7.3: - version "2.7.4" - resolved "https://registry.yarnpkg.com/gauge/-/gauge-2.7.4.tgz#2c03405c7538c39d7eb37b317022e325fb018bf7" - integrity sha512-14x4kjc6lkD3ltw589k0NrPD6cCNTD6CWoVUNpB85+DrtONoZn+Rug6xZU5RvSC4+TZPxA5AnBibQYAvZn41Hg== - dependencies: - aproba "^1.0.3" - console-control-strings "^1.0.0" - has-unicode "^2.0.0" - object-assign "^4.1.0" - signal-exit "^3.0.0" - string-width "^1.0.1" - strip-ansi "^3.0.1" - wide-align "^1.1.0" - -gaxios@^1.0.2, gaxios@^1.0.4, gaxios@^1.2.1, gaxios@^1.2.2, 
gaxios@^1.5.0: - version "1.8.4" - resolved "https://registry.yarnpkg.com/gaxios/-/gaxios-1.8.4.tgz#e08c34fe93c0a9b67a52b7b9e7a64e6435f9a339" - integrity sha512-BoENMnu1Gav18HcpV9IleMPZ9exM+AvUjrAOV4Mzs/vfz2Lu/ABv451iEXByKiMPn2M140uul1txXCg83sAENw== - dependencies: - abort-controller "^3.0.0" - extend "^3.0.2" - https-proxy-agent "^2.2.1" - node-fetch "^2.3.0" - -gaxios@^4.0.0: - version "4.3.3" - resolved "https://registry.yarnpkg.com/gaxios/-/gaxios-4.3.3.tgz#d44bdefe52d34b6435cc41214fdb160b64abfc22" - integrity sha512-gSaYYIO1Y3wUtdfHmjDUZ8LWaxJQpiavzbF5Kq53akSzvmVg0RfyOcFDbO1KJ/KCGRFz2qG+lS81F0nkr7cRJA== - dependencies: - abort-controller "^3.0.0" - extend "^3.0.2" - https-proxy-agent "^5.0.0" - is-stream "^2.0.0" - node-fetch "^2.6.7" - -gcp-metadata@^1.0.0: - version "1.0.0" - resolved "https://registry.yarnpkg.com/gcp-metadata/-/gcp-metadata-1.0.0.tgz#5212440229fa099fc2f7c2a5cdcb95575e9b2ca6" - integrity sha512-Q6HrgfrCQeEircnNP3rCcEgiDv7eF9+1B+1MMgpE190+/+0mjQR8PxeOaRgxZWmdDAF9EIryHB9g1moPiw1SbQ== - dependencies: - gaxios "^1.0.2" - json-bigint "^0.3.0" - -gcp-metadata@^4.2.0: - version "4.3.1" - resolved "https://registry.yarnpkg.com/gcp-metadata/-/gcp-metadata-4.3.1.tgz#fb205fe6a90fef2fd9c85e6ba06e5559ee1eefa9" - integrity sha512-x850LS5N7V1F3UcV7PoupzGsyD6iVwTVvsh3tbXfkctZnBnjW5yu5z1/3k3SehF7TyoTIe78rJs02GMMy+LF+A== - dependencies: - gaxios "^4.0.0" - json-bigint "^1.0.0" - -gcs-resumable-upload@^1.0.0: - version "1.1.0" - resolved "https://registry.yarnpkg.com/gcs-resumable-upload/-/gcs-resumable-upload-1.1.0.tgz#2b06f5876dcf60f18a309343f79ed951aff01399" - integrity sha512-uBz7uHqp44xjSDzG3kLbOYZDjxxR/UAGbB47A0cC907W6yd2LkcyFDTHg+bjivkHMwiJlKv4guVWcjPCk2zScg== - dependencies: - abort-controller "^2.0.2" - configstore "^4.0.0" - gaxios "^1.5.0" - google-auth-library "^3.0.0" - pumpify "^1.5.1" - stream-events "^1.0.4" - -generate-password@^1.5.1: - version "1.7.0" - resolved 
"https://registry.yarnpkg.com/generate-password/-/generate-password-1.7.0.tgz#00ba4eb1e71f89a72307b0d6604ee0d4e7f5770c" - integrity sha512-WPCtlfy0jexf7W5IbwxGUgpIDvsZIohbI2DAq2Q6TSlKKis+G4GT9sxvPxrZUGL8kP6WUXMWNqYnxY6DDKAdFA== - gensync@^1.0.0-beta.2: version "1.0.0-beta.2" resolved "https://registry.yarnpkg.com/gensync/-/gensync-1.0.0-beta.2.tgz#32a6ee76c3d7f52d46b2b1ae5d93fea8580a25e0" @@ -11220,11 +10180,6 @@ gitconfiglocal@^1.0.0: dependencies: ini "^1.3.2" -github-from-package@0.0.0: - version "0.0.0" - resolved "https://registry.yarnpkg.com/github-from-package/-/github-from-package-0.0.0.tgz#97fb5d96bfde8973313f20e8288ef9a167fa64ce" - integrity sha512-SyHy3T1v2NUXn29OsWdxmK6RwHD+vkj3v8en8AOBZ1wBQ/hCAQ5bAQTD02kW4W9tUp/3Qh6J8r9EvntiyCmOOw== - glob-base@^0.3.0: version "0.3.0" resolved "https://registry.yarnpkg.com/glob-base/-/glob-base-0.3.0.tgz#dbb164f6221b1c0b1ccf82aea328b497df0ea3c4" @@ -11366,29 +10321,7 @@ glob@^10.3.7: minipass "^5.0.0 || ^6.0.2 || ^7.0.0" path-scurry "^1.10.1" -glob@^5.0.15: - version "5.0.15" - resolved "https://registry.yarnpkg.com/glob/-/glob-5.0.15.tgz#1bc936b9e02f4a603fcc222ecf7633d30b8b93b1" - integrity sha512-c9IPMazfRITpmAAKi22dK1VKxGDX9ehhqfABDriL/lzO92xcUKEJPQHrVA/2YHSNFB4iFlykVmWvwo48nr3OxA== - dependencies: - inflight "^1.0.4" - inherits "2" - minimatch "2 || 3" - once "^1.3.0" - path-is-absolute "^1.0.0" - -glob@^6.0.1: - version "6.0.4" - resolved "https://registry.yarnpkg.com/glob/-/glob-6.0.4.tgz#0f08860f6a155127b2fadd4f9ce24b1aab6e4d22" - integrity sha512-MKZeRNyYZAVVVG1oZeLaWie1uweH40m9AZwIwxyPbTSX4hHrVYSzLg0Ro5Z5R7XKkIX+Cc6oD1rqeDJnwsB8/A== - dependencies: - inflight "^1.0.4" - inherits "2" - minimatch "2 || 3" - once "^1.3.0" - path-is-absolute "^1.0.0" - -glob@^7.0.5, glob@^7.1.1, glob@^7.1.2, glob@^7.1.3, glob@^7.1.4, glob@^7.1.6, glob@~7.2.3: +glob@^7.1.1, glob@^7.1.2, glob@^7.1.3, glob@^7.1.4, glob@^7.1.6, glob@~7.2.3: version "7.2.3" resolved 
"https://registry.yarnpkg.com/glob/-/glob-7.2.3.tgz#b8df0fb802bbfa8e89bd1d938b4e16578ed44f2b" integrity sha512-nFR0zLpU2YCaRxwoCJvL6UvCH2JFyFVIvwTLsIf21AuHlMskA1hhTdk+LlYJtOlYt9v6dvszD2BGRqBL+iQK9Q== @@ -11499,119 +10432,6 @@ globby@^13.1.3: merge2 "^1.4.1" slash "^4.0.0" -google-auth-library@^3.0.0, google-auth-library@^3.1.1: - version "3.1.2" - resolved "https://registry.yarnpkg.com/google-auth-library/-/google-auth-library-3.1.2.tgz#ff2f88cd5cd2118a57bd3d5ad3c093c8837fc350" - integrity sha512-cDQMzTotwyWMrg5jRO7q0A4TL/3GWBgO7I7q5xGKNiiFf9SmGY/OJ1YsLMgI2MVHHsEGyrqYnbnmV1AE+Z6DnQ== - dependencies: - base64-js "^1.3.0" - fast-text-encoding "^1.0.0" - gaxios "^1.2.1" - gcp-metadata "^1.0.0" - gtoken "^2.3.2" - https-proxy-agent "^2.2.1" - jws "^3.1.5" - lru-cache "^5.0.0" - semver "^5.5.0" - -google-auth-library@^7.14.0: - version "7.14.1" - resolved "https://registry.yarnpkg.com/google-auth-library/-/google-auth-library-7.14.1.tgz#e3483034162f24cc71b95c8a55a210008826213c" - integrity sha512-5Rk7iLNDFhFeBYc3s8l1CqzbEBcdhwR193RlD4vSNFajIcINKI8W8P0JLmBpwymHqqWbX34pJDQu39cSy/6RsA== - dependencies: - arrify "^2.0.0" - base64-js "^1.3.0" - ecdsa-sig-formatter "^1.0.11" - fast-text-encoding "^1.0.0" - gaxios "^4.0.0" - gcp-metadata "^4.2.0" - gtoken "^5.0.4" - jws "^4.0.0" - lru-cache "^6.0.0" - -google-gax@^0.25.0: - version "0.25.6" - resolved "https://registry.yarnpkg.com/google-gax/-/google-gax-0.25.6.tgz#5ea5c743933ba957da63951bc828aef91fb69340" - integrity sha512-+CVtOSLQt42mwVvJJirhBiAvWsp8zKeb9zW5Wy3wyvb3VG9OugHzZpwvYO9D4yNPPspe7L9CpIs80I5nUJlS8w== - dependencies: - "@grpc/grpc-js" "^0.3.0" - "@grpc/proto-loader" "^0.4.0" - duplexify "^3.6.0" - google-auth-library "^3.0.0" - google-proto-files "^0.20.0" - grpc "^1.16.0" - grpc-gcp "^0.1.1" - is-stream-ended "^0.1.4" - lodash.at "^4.6.0" - lodash.has "^4.5.2" - protobufjs "^6.8.8" - retry-request "^4.0.0" - semver "^6.0.0" - walkdir "^0.3.2" - -google-gax@^2.30.0: - version "2.30.5" - resolved 
"https://registry.yarnpkg.com/google-gax/-/google-gax-2.30.5.tgz#e836f984f3228900a8336f608c83d75f9cb73eff" - integrity sha512-Jey13YrAN2hfpozHzbtrwEfEHdStJh1GwaQ2+Akh1k0Tv/EuNVSuBtHZoKSBm5wBMvNsxTsEIZ/152NrYyZgxQ== - dependencies: - "@grpc/grpc-js" "~1.6.0" - "@grpc/proto-loader" "^0.6.12" - "@types/long" "^4.0.0" - abort-controller "^3.0.0" - duplexify "^4.0.0" - fast-text-encoding "^1.0.3" - google-auth-library "^7.14.0" - is-stream-ended "^0.1.4" - node-fetch "^2.6.1" - object-hash "^3.0.0" - proto3-json-serializer "^0.1.8" - protobufjs "6.11.3" - retry-request "^4.0.0" - -google-p12-pem@^1.0.0: - version "1.0.5" - resolved "https://registry.yarnpkg.com/google-p12-pem/-/google-p12-pem-1.0.5.tgz#0b4721cdfc818759d884f0c62803518decdaf0d0" - integrity sha512-50rTrqYPTPPwlu9TNl/HkJbBENEpbRzTOVLFJ4YWM86njZgXHFy+FP+tLRSd9m132Li9Dqi27Z3KIWDEv5y+EA== - dependencies: - node-forge "^0.10.0" - pify "^4.0.0" - -google-p12-pem@^3.1.3: - version "3.1.4" - resolved "https://registry.yarnpkg.com/google-p12-pem/-/google-p12-pem-3.1.4.tgz#123f7b40da204de4ed1fbf2fd5be12c047fc8b3b" - integrity sha512-HHuHmkLgwjdmVRngf5+gSmpkyaRI6QmOg77J8tkNBHhNEI62sGHyw4/+UkgyZEI7h84NbWprXDJ+sa3xOYFvTg== - dependencies: - node-forge "^1.3.1" - -google-proto-files@^0.20.0: - version "0.20.0" - resolved "https://registry.yarnpkg.com/google-proto-files/-/google-proto-files-0.20.0.tgz#dfcd1635a0c3f00f49ca057462cf369108ff4b5e" - integrity sha512-ORU+XhOeDv/UPtnCYLkO1ItmfhRCRPR3ZoeVQ7GfVzEs7PVitPIhsYlY5ZzG8XXnsdmtK27ENurfQ1jhAWpZHg== - dependencies: - "@google-cloud/promisify" "^0.4.0" - protobufjs "^6.8.0" - walkdir "^0.3.0" - -googleapis-common@^0.7.0: - version "0.7.2" - resolved "https://registry.yarnpkg.com/googleapis-common/-/googleapis-common-0.7.2.tgz#a694f55d979cb7c2eac21a0e0439af12f9b418ba" - integrity sha512-9DEJIiO4nS7nw0VE1YVkEfXEj8x8MxsuB+yZIpOBULFSN9OIKcUU8UuKgSZFU4lJmRioMfngktrbkMwWJcUhQg== - dependencies: - gaxios "^1.2.2" - google-auth-library "^3.0.0" - pify "^4.0.0" - qs "^6.5.2" - 
url-template "^2.0.8" - uuid "^3.2.1" - -googleapis@^39.2.0: - version "39.2.0" - resolved "https://registry.yarnpkg.com/googleapis/-/googleapis-39.2.0.tgz#5c81f721e9da2e80cb0b25821ed60d3bc200c3da" - integrity sha512-66X8TG1B33zAt177sG1CoKoYHPP/B66tEpnnSANGCqotMuY5gqSQO8G/0gqHZR2jRgc5CHSSNOJCnpI0SuDxMQ== - dependencies: - google-auth-library "^3.0.0" - googleapis-common "^0.7.0" - gopd@^1.0.1: version "1.0.1" resolved "https://registry.yarnpkg.com/gopd/-/gopd-1.0.1.tgz#29ff76de69dac7489b7c0918a5788e56477c332c" @@ -11736,13 +10556,6 @@ graphql-tag@^2.11.0, graphql-tag@^2.12.6: dependencies: tslib "^2.1.0" -graphql@^14.1.1: - version "14.7.0" - resolved "https://registry.yarnpkg.com/graphql/-/graphql-14.7.0.tgz#7fa79a80a69be4a31c27dda824dc04dac2035a72" - integrity sha512-l0xWZpoPKpppFzMfvVyFmp9vLN7w/ZZJPefUicMCepfJeQ8sMcztloGYY9DfjVPo6tIUDzU5Hw3MUbIjj9AVVA== - dependencies: - iterall "^1.2.2" - graphql@^15.3.0: version "15.8.0" resolved "https://registry.yarnpkg.com/graphql/-/graphql-15.8.0.tgz#33410e96b012fa3bdb1091cc99a94769db212b38" @@ -11753,47 +10566,7 @@ growl@1.10.5: resolved "https://registry.yarnpkg.com/growl/-/growl-1.10.5.tgz#f2735dc2283674fa67478b10181059355c369e5e" integrity sha512-qBr4OuELkhPenW6goKVXiv47US3clb3/IbuWF9KNKEijAy9oeHxU9IgzjvJhHkUzhaj7rOUD7+YGWqUjLp5oSA== -grpc-gcp@^0.1.1: - version "0.1.1" - resolved "https://registry.yarnpkg.com/grpc-gcp/-/grpc-gcp-0.1.1.tgz#a11be8a7e7a6edf5f636b44a6a24fb4cc028f71f" - integrity sha512-MAt0Ae9QuL2Lbbt2d+kDta5AxqRD1JVXtBcJuQKp9GeFL5TxPw/hxIyDNyivPjKEXjbG3cBGwSE3CXq6a3KHEQ== - dependencies: - grpc "^1.16.0" - protobufjs "^6.8.8" - -grpc@^1.16.0: - version "1.24.2" - resolved "https://registry.yarnpkg.com/grpc/-/grpc-1.24.2.tgz#76d047bfa7b05b607cbbe3abb99065dcefe0c099" - integrity sha512-EG3WH6AWMVvAiV15d+lr+K77HJ/KV/3FvMpjKjulXHbTwgDZkhkcWbwhxFAoTdxTkQvy0WFcO3Nog50QBbHZWw== - dependencies: - "@types/bytebuffer" "^5.0.40" - lodash.camelcase "^4.3.0" - lodash.clone "^4.5.0" - nan "^2.13.2" - node-pre-gyp 
"^0.14.0" - protobufjs "^5.0.3" - -gtoken@^2.3.2: - version "2.3.3" - resolved "https://registry.yarnpkg.com/gtoken/-/gtoken-2.3.3.tgz#8a7fe155c5ce0c4b71c886cfb282a9060d94a641" - integrity sha512-EaB49bu/TCoNeQjhCYKI/CurooBKkGxIqFHsWABW0b25fobBYVTMe84A8EBVVZhl8emiUdNypil9huMOTmyAnw== - dependencies: - gaxios "^1.0.4" - google-p12-pem "^1.0.0" - jws "^3.1.5" - mime "^2.2.0" - pify "^4.0.0" - -gtoken@^5.0.4: - version "5.3.2" - resolved "https://registry.yarnpkg.com/gtoken/-/gtoken-5.3.2.tgz#deb7dc876abe002178e0515e383382ea9446d58f" - integrity sha512-gkvEKREW7dXWF8NV8pVrKfW7WqReAmjjkMBh6lNCCGOM4ucS0r0YyXXl0r/9Yj8wcW/32ISkfc8h5mPTDbtifQ== - dependencies: - gaxios "^4.0.0" - google-p12-pem "^3.1.3" - jws "^4.0.0" - -handlebars@^4.0.1, handlebars@^4.7.7: +handlebars@^4.7.7: version "4.7.7" resolved "https://registry.yarnpkg.com/handlebars/-/handlebars-4.7.7.tgz#9ce33416aad02dbd6c8fafa8240d5d98004945a1" integrity sha512-aAcXm5OAfE/8IXkcZvCepKU3VzW1/39Fb5ZuqMtgI/hT8X2YgoMvBY5dLhq/cpOvw7Lk1nK/UF71aLG/ZnVYRA== @@ -11828,11 +10601,6 @@ has-bigints@^1.0.1, has-bigints@^1.0.2: resolved "https://registry.yarnpkg.com/has-bigints/-/has-bigints-1.0.2.tgz#0871bd3e3d51626f6ca0966668ba35d5602d6eaa" integrity sha512-tSvCKtBr9lkF0Ex0aQiP9N+OpV4zi2r/Nee5VkRDbaqv35RLYMzbwQfFSZZH0kR+Rd6302UJZ2p/bJCEoR3VoQ== -has-flag@^1.0.0: - version "1.0.0" - resolved "https://registry.yarnpkg.com/has-flag/-/has-flag-1.0.0.tgz#9d9e793165ce017a00f00418c43f942a7b1d11fa" - integrity sha512-DyYHfIYwAJmjAjSSPKANxI8bFY9YtFrgkAfinBojQ8YJTOuOuav64tMUJv584SES4xl74PmuaevIyaLESHdTAA== - has-flag@^3.0.0: version "3.0.0" resolved "https://registry.yarnpkg.com/has-flag/-/has-flag-3.0.0.tgz#b5d454dc2199ae225699f3467e5a07f3b955bafd" @@ -11886,7 +10654,7 @@ has-tostringtag@^1.0.0: dependencies: has-symbols "^1.0.2" -has-unicode@^2.0.0, has-unicode@^2.0.1: +has-unicode@^2.0.1: version "2.0.1" resolved "https://registry.yarnpkg.com/has-unicode/-/has-unicode-2.0.1.tgz#e0e6fe6a28cf51138855e086d1691e771de2a8b9" integrity 
sha512-8Rf9Y83NBReMnx0gFzA8JImQACstCYWUplepDa9xprwwtmgEZUF0h/i5xSA625zB/I37EtrswSST6OXxwaaIJQ== @@ -11907,11 +10675,6 @@ hash-base@^3.0.0: readable-stream "^3.6.0" safe-buffer "^5.2.0" -hash-stream-validation@^0.2.1: - version "0.2.4" - resolved "https://registry.yarnpkg.com/hash-stream-validation/-/hash-stream-validation-0.2.4.tgz#ee68b41bf822f7f44db1142ec28ba9ee7ccb7512" - integrity sha512-Gjzu0Xn7IagXVkSu9cSFuK1fqzwtLwFhNhVL8IFJijRNMgUttFbBSIAzKuSIrsFMO1+g1RlsoN49zPIbwPDMGQ== - hash.js@1.1.3: version "1.1.3" resolved "https://registry.yarnpkg.com/hash.js/-/hash.js-1.1.3.tgz#340dedbe6290187151c1ea1d777a3448935df846" @@ -11935,24 +10698,6 @@ hasown@^2.0.0: dependencies: function-bind "^1.1.2" -hdkey@2.1.0: - version "2.1.0" - resolved "https://registry.yarnpkg.com/hdkey/-/hdkey-2.1.0.tgz#755b30b73f54e93c31919c1b2f19205a8e57cb92" - integrity sha512-i9Wzi0Dy49bNS4tXXeGeu0vIcn86xXdPQUpEYg+SO1YiO8HtomjmmRMaRyqL0r59QfcD4PfVbSF3qmsWFwAemA== - dependencies: - bs58check "^2.1.2" - ripemd160 "^2.0.2" - safe-buffer "^5.1.1" - secp256k1 "^4.0.0" - -hdkey@^0.7.1: - version "0.7.1" - resolved "https://registry.yarnpkg.com/hdkey/-/hdkey-0.7.1.tgz#caee4be81aa77921e909b8d228dd0f29acaee632" - integrity sha512-ADjIY5Bqdvp3Sh+SLSS1W3/gTJnlDwwM3UsM/5sHPojc4pLf6X3MfMMiTa96MgtADNhTPa+E+SAKMtqdv1zUfw== - dependencies: - coinstring "^2.0.0" - secp256k1 "^3.0.1" - he@1.1.1: version "1.1.1" resolved "https://registry.yarnpkg.com/he/-/he-1.1.1.tgz#93410fd21b009735151f8868c2f271f3427e23fd" @@ -12129,14 +10874,6 @@ http2-wrapper@^2.1.10: quick-lru "^5.1.1" resolve-alpn "^1.2.0" -https-proxy-agent@^2.2.1: - version "2.2.4" - resolved "https://registry.yarnpkg.com/https-proxy-agent/-/https-proxy-agent-2.2.4.tgz#4ee7a737abd92678a293d9b34a1af4d0d08c787b" - integrity sha512-OmvfoQ53WLjtA9HeYP9RNrWMJzzAz1JGaSFr1nijg0PVR1JaD/xbJq1mdEIIlxGpXp9eSe/O2LgU9DJmTPd0Eg== - dependencies: - agent-base "^4.3.0" - debug "^3.1.0" - https-proxy-agent@^5.0.0: version "5.0.1" resolved 
"https://registry.yarnpkg.com/https-proxy-agent/-/https-proxy-agent-5.0.1.tgz#c59ef224a04fe8b754f3db0063a25ea30d0005d6" @@ -12172,7 +10909,7 @@ husky@^8.0.0: resolved "https://registry.yarnpkg.com/husky/-/husky-8.0.3.tgz#4936d7212e46d1dea28fef29bb3a108872cd9184" integrity sha512-+dQSyqPh4x1hlO1swXBiNb2HzTDN1I2IGLQx1GrBuiqFJfoMrnZWwVmatvSiO+Iz8fBUnf+lekwNo4c2LlXItg== -iconv-lite@0.4.24, iconv-lite@^0.4.24, iconv-lite@^0.4.4: +iconv-lite@0.4.24, iconv-lite@^0.4.24: version "0.4.24" resolved "https://registry.yarnpkg.com/iconv-lite/-/iconv-lite-0.4.24.tgz#2022b4b25fbddc21d2f524974a474aafe733908b" integrity sha512-v3MXnZAcvnywkTUEZomIActle7RXXeedOR31wwl7VlyoXO4Qi9arvSenNQWne1TcRwhCL1HwLI21bEqdpj8/rA== @@ -12198,13 +10935,6 @@ ieee754@^1.1.13, ieee754@^1.2.1: resolved "https://registry.yarnpkg.com/ieee754/-/ieee754-1.2.1.tgz#8eb7a10a63fff25d15a57b001586d177d1b0d352" integrity sha512-dcyqhDvX1C46lXZcVqCpK+FtMRQVdIMN6/Df5js2zouUsqG7I6sFxitIC+7KYK29KdXOLHdu9zL4sFnoVQnqaA== -ignore-walk@^3.0.1: - version "3.0.4" - resolved "https://registry.yarnpkg.com/ignore-walk/-/ignore-walk-3.0.4.tgz#c9a09f69b7c7b479a5d74ac1a3c0d4236d2a6335" - integrity sha512-PY6Ii8o1jMRA1z4F2hRkH/xN59ox43DavKvD3oDpfurRlOJyAHpifIwpbdv1n4jt4ov0jSpw3kQ4GhJnpBL6WQ== - dependencies: - minimatch "^3.0.4" - ignore-walk@^5.0.1: version "5.0.1" resolved "https://registry.yarnpkg.com/ignore-walk/-/ignore-walk-5.0.1.tgz#5f199e23e1288f518d90358d461387788a154776" @@ -12313,25 +11043,6 @@ inquirer@^6.2.2: strip-ansi "^5.1.0" through "^2.3.6" -inquirer@^7.0.5: - version "7.3.3" - resolved "https://registry.yarnpkg.com/inquirer/-/inquirer-7.3.3.tgz#04d176b2af04afc157a83fd7c100e98ee0aad003" - integrity sha512-JG3eIAj5V9CwcGvuOmoo6LB9kbAYT8HXffUl6memuszlwDC/qvFAJw49XJ5NROSFNPxp3iQg1GqkFhaY/CR0IA== - dependencies: - ansi-escapes "^4.2.1" - chalk "^4.1.0" - cli-cursor "^3.1.0" - cli-width "^3.0.0" - external-editor "^3.0.3" - figures "^3.0.0" - lodash "^4.17.19" - mute-stream "0.0.8" - run-async "^2.4.0" - rxjs "^6.6.0" 
- string-width "^4.1.0" - strip-ansi "^6.0.0" - through "^2.3.6" - inquirer@^8.2.4: version "8.2.5" resolved "https://registry.yarnpkg.com/inquirer/-/inquirer-8.2.5.tgz#d8654a7542c35a9b9e069d27e2df4858784d54f8" @@ -12362,13 +11073,6 @@ internal-slot@^1.0.4, internal-slot@^1.0.5: has "^1.0.3" side-channel "^1.0.4" -invariant@2: - version "2.2.4" - resolved "https://registry.yarnpkg.com/invariant/-/invariant-2.2.4.tgz#610f3c92c9359ce1db616e538008d23ff35158e6" - integrity sha512-phJfQVBuaJM5raOpJjSfkiD6BpbCE4Ns//LaXl6wGYtUBY83nWS6Rf9tXm2e8VaK60JEjYldbPif/A2B1C2gNA== - dependencies: - loose-envify "^1.0.0" - invert-kv@^1.0.0: version "1.0.0" resolved "https://registry.yarnpkg.com/invert-kv/-/invert-kv-1.0.0.tgz#104a8e4aaca6d3d8cd157a8ef8bfab2d7a3ffdb6" @@ -12748,11 +11452,6 @@ is-ssh@^1.4.0: dependencies: protocols "^2.0.1" -is-stream-ended@^0.1.4: - version "0.1.4" - resolved "https://registry.yarnpkg.com/is-stream-ended/-/is-stream-ended-0.1.4.tgz#f50224e95e06bce0e356d440a4827cd35b267eda" - integrity sha512-xj0XPvmr7bQFTvirqnFr50o0hQIh6ZItDqloxt5aJrR4NQsYeSsyFQERYGCAzfindAcnKjINnwEEgLx4IqVzQw== - is-stream@^1.0.0, is-stream@^1.0.1, is-stream@^1.1.0: version "1.1.0" resolved "https://registry.yarnpkg.com/is-stream/-/is-stream-1.1.0.tgz#12d4a3dd4e68e0b79ceb8dbc84173ae80d91ca44" @@ -12915,14 +11614,6 @@ isomorphic-fetch@^2.2.0, isomorphic-fetch@^2.2.1: node-fetch "^1.0.1" whatwg-fetch ">=0.10.0" -isomorphic-fetch@^3.0.0: - version "3.0.0" - resolved "https://registry.yarnpkg.com/isomorphic-fetch/-/isomorphic-fetch-3.0.0.tgz#0267b005049046d2421207215d45d6a262b8b8b4" - integrity sha512-qvUtwJ3j6qwsF3jLxkZ72qCgjMysPzDfeV240JHiGZsANBYd+EEuu35v7dfrJ9Up0Ak07D7GGSkGhCHTqg/5wA== - dependencies: - node-fetch "^2.6.1" - whatwg-fetch "^3.4.1" - isomorphic-ws@^4.0.1: version "4.0.1" resolved "https://registry.yarnpkg.com/isomorphic-ws/-/isomorphic-ws-4.0.1.tgz#55fd4cd6c5e6491e76dc125938dd863f5cd4f2dc" @@ -12933,6 +11624,11 @@ isows@1.0.3: resolved 
"https://registry.yarnpkg.com/isows/-/isows-1.0.3.tgz#93c1cf0575daf56e7120bab5c8c448b0809d0d74" integrity sha512-2cKei4vlmg2cxEjm3wVSqn8pcoRF/LX/wpifuuNquFO4SQmPwarClT+SUCA2lt+l581tTeZIPIZuIDo2jWN1fg== +isows@1.0.7: + version "1.0.7" + resolved "https://registry.yarnpkg.com/isows/-/isows-1.0.7.tgz#1c06400b7eed216fbba3bcbd68f12490fc342915" + integrity sha512-I1fSfDCZL5P0v33sVqeTDSpcstAg/N+wF5HS033mogOVIp4B+oHC7oOCsA3axAbBSGTJ8QubbNmnIRN/h8U7hg== + isstream@0.1.x, isstream@~0.1.2: version "0.1.2" resolved "https://registry.yarnpkg.com/isstream/-/isstream-0.1.2.tgz#47e63f7af55afa6f92e1500e690eb8b8529c099a" @@ -12980,26 +11676,6 @@ istanbul-reports@^3.1.3: html-escaper "^2.0.0" istanbul-lib-report "^3.0.0" -istanbul@^0.4.5: - version "0.4.5" - resolved "https://registry.yarnpkg.com/istanbul/-/istanbul-0.4.5.tgz#65c7d73d4c4da84d4f3ac310b918fb0b8033733b" - integrity sha512-nMtdn4hvK0HjUlzr1DrKSUY8ychprt8dzHOgY2KXsIhHu5PuQQEOTM27gV9Xblyon7aUH/TSFIjRHEODF/FRPg== - dependencies: - abbrev "1.0.x" - async "1.x" - escodegen "1.8.x" - esprima "2.7.x" - glob "^5.0.15" - handlebars "^4.0.1" - js-yaml "3.x" - mkdirp "0.5.x" - nopt "3.x" - once "1.x" - resolve "1.1.x" - supports-color "^3.1.0" - which "^1.1.1" - wordwrap "^1.0.0" - isurl@^1.0.0-alpha5: version "1.0.0" resolved "https://registry.yarnpkg.com/isurl/-/isurl-1.0.0.tgz#b27f4f49f3cdaa3ea44a0a5b7f3462e6edc39d67" @@ -13015,11 +11691,6 @@ iter-tools@^7.0.2: dependencies: "@babel/runtime" "^7.12.1" -iterall@^1.2.2: - version "1.3.0" - resolved "https://registry.yarnpkg.com/iterall/-/iterall-1.3.0.tgz#afcb08492e2915cbd8a0884eb93a8c94d0d72fea" - integrity sha512-QZ9qOMdF+QLHxy1QIpUHUU1D5pS2CG2P69LF6L6CPjPYA/XMOmKV3PZpawHoAjHNyB0swdVTRxdYT4tbBbxqwg== - iterate-iterator@^1.0.1: version "1.0.2" resolved "https://registry.yarnpkg.com/iterate-iterator/-/iterate-iterator-1.0.2.tgz#551b804c9eaa15b847ea6a7cdc2f5bf1ec150f91" @@ -13033,21 +11704,6 @@ iterate-value@^1.0.0: es-get-iterator "^1.0.2" iterate-iterator "^1.0.1" -j6@^1.0.2: - 
version "1.0.2" - resolved "https://registry.yarnpkg.com/j6/-/j6-1.0.2.tgz#48088acb1c66f610baf2e50b4028d0932f35ed89" - integrity sha512-pKADWYWaJu2traMItqvGo9XUF7/FKUO0SaI7b7ddvsqKhHIsw6Y/CTUvhc0MBSt9M3Q8/ywvQc0+KYqxLJcMnw== - dependencies: - algebrite "^0.2.23" - jStat "^1.5.2" - lodash "^4.16.4" - numeric "^1.2.6" - -jStat@^1.5.2: - version "1.8.6" - resolved "https://registry.yarnpkg.com/jStat/-/jStat-1.8.6.tgz#ab4d465b21f583d37a72ab2f97a300492da7575d" - integrity sha512-Oh/ePZVSoFigme69pHTQudcGh64cpNH9Lz3hBZcRJWLrDqpw7JfuYU9F3dj9py3tBYmHz7og7ZT8hXTNbYq9Rw== - jackspeak@^2.3.5: version "2.3.6" resolved "https://registry.yarnpkg.com/jackspeak/-/jackspeak-2.3.6.tgz#647ecc472238aee4b06ac0e461acc21a8c505ca8" @@ -13067,11 +11723,6 @@ jake@^10.8.5: filelist "^1.0.1" minimatch "^3.0.4" -javascript-natural-sort@0.7.1: - version "0.7.1" - resolved "https://registry.yarnpkg.com/javascript-natural-sort/-/javascript-natural-sort-0.7.1.tgz#f9e2303d4507f6d74355a73664d1440fb5a0ef59" - integrity sha512-nO6jcEfZWQXDhOiBtG2KvKyEptz7RVbpGP4vTD2hLBdmNQSsCiicO2Ioinv6UI4y9ukqnBpy+XZ9H6uLNgJTlw== - jest-changed-files@^29.5.0: version "29.5.0" resolved "https://registry.yarnpkg.com/jest-changed-files/-/jest-changed-files-29.5.0.tgz#e88786dca8bf2aa899ec4af7644e16d9dcf9b23e" @@ -13448,11 +12099,6 @@ js-sha3@0.8.0, js-sha3@^0.8.0: resolved "https://registry.yarnpkg.com/js-sha3/-/js-sha3-0.8.0.tgz#b9b7a5da73afad7dedd0f8c463954cbde6818840" integrity sha512-gF1cRrHhIzNfToc802P800N8PpXS+evLLXfsVpowqmAFR9uwbi89WvXg2QspOmXL8QL86J4T1EpFu+yUkwJY3Q== -js-sha3@^0.7.0: - version "0.7.0" - resolved "https://registry.yarnpkg.com/js-sha3/-/js-sha3-0.7.0.tgz#0a5c57b36f79882573b2d84051f8bb85dd1bd63a" - integrity sha512-Wpks3yBDm0UcL5qlVhwW9Jr9n9i4FfeWBFOOXP5puDS/SiudJGhw7DPyBqn3487qD4F0lsC0q3zxink37f7zeA== - "js-tokens@^3.0.0 || ^4.0.0", js-tokens@^4.0.0: version "4.0.0" resolved "https://registry.yarnpkg.com/js-tokens/-/js-tokens-4.0.0.tgz#19203fb59991df98e3a287050d4647cdeaf32499" @@ -13466,14 +12112,6 
@@ js-yaml@3.13.1: argparse "^1.0.7" esprima "^4.0.0" -js-yaml@3.x, js-yaml@^3.10.0, js-yaml@^3.13.0, js-yaml@^3.13.1: - version "3.14.1" - resolved "https://registry.yarnpkg.com/js-yaml/-/js-yaml-3.14.1.tgz#dae812fdb3825fa306609a8717383c50c36a0537" - integrity sha512-okMH7OXXJ7YrN9Ok3/SXrnu4iX9yOk+25nqX4imS2npuvTYDmo/QEZoqwZkYaIDk3jVvBOTOIEgEhaLOynBS9g== - dependencies: - argparse "^1.0.7" - esprima "^4.0.0" - js-yaml@4.1.0, js-yaml@^4.1.0: version "4.1.0" resolved "https://registry.yarnpkg.com/js-yaml/-/js-yaml-4.1.0.tgz#c1fb65f8f5017901cdd2c951864ba18458a10602" @@ -13481,6 +12119,14 @@ js-yaml@4.1.0, js-yaml@^4.1.0: dependencies: argparse "^2.0.1" +js-yaml@^3.10.0, js-yaml@^3.13.0, js-yaml@^3.13.1: + version "3.14.1" + resolved "https://registry.yarnpkg.com/js-yaml/-/js-yaml-3.14.1.tgz#dae812fdb3825fa306609a8717383c50c36a0537" + integrity sha512-okMH7OXXJ7YrN9Ok3/SXrnu4iX9yOk+25nqX4imS2npuvTYDmo/QEZoqwZkYaIDk3jVvBOTOIEgEhaLOynBS9g== + dependencies: + argparse "^1.0.7" + esprima "^4.0.0" + jsbn@~0.1.0: version "0.1.1" resolved "https://registry.yarnpkg.com/jsbn/-/jsbn-0.1.1.tgz#a5e654c2e5a2deb5f201d96cefbca80c0ef2f513" @@ -13491,20 +12137,6 @@ jsesc@^2.5.1: resolved "https://registry.yarnpkg.com/jsesc/-/jsesc-2.5.2.tgz#80564d2e483dacf6e8ef209650a67df3f0c283a4" integrity sha512-OYu7XEzjkCQ3C5Ps3QIZsQfNpqoJyZZA99wd9aWd05NCtC5pWOkShK2mkL6HXQR6/Cy2lbNdPlZBpuQHXE63gA== -json-bigint@^0.3.0: - version "0.3.1" - resolved "https://registry.yarnpkg.com/json-bigint/-/json-bigint-0.3.1.tgz#0c1729d679f580d550899d6a2226c228564afe60" - integrity sha512-DGWnSzmusIreWlEupsUelHrhwmPPE+FiQvg+drKfk2p+bdEYa5mp4PJ8JsCWqae0M2jQNb0HPvnwvf1qOTThzQ== - dependencies: - bignumber.js "^9.0.0" - -json-bigint@^1.0.0: - version "1.0.0" - resolved "https://registry.yarnpkg.com/json-bigint/-/json-bigint-1.0.0.tgz#ae547823ac0cad8398667f8cd9ef4730f5b01ff1" - integrity sha512-SiPv/8VpZuWbvLSMtTDU8hEfrZWg/mH/nV/b4o0CYbSxu1UIQPLdwKOCIyLQX+VIPO5vrLX3i8qtqFyhdPSUSQ== - dependencies: - bignumber.js 
"^9.0.0" - json-buffer@3.0.0: version "3.0.0" resolved "https://registry.yarnpkg.com/json-buffer/-/json-buffer-3.0.0.tgz#5b1f397afc75d677bde8bcfc0e47e1f9a3d9a898" @@ -13548,13 +12180,6 @@ json-rpc-engine@^6.1.0: "@metamask/safe-event-emitter" "^2.0.0" eth-rpc-errors "^4.0.2" -json-rpc-error@2.0.0: - version "2.0.0" - resolved "https://registry.yarnpkg.com/json-rpc-error/-/json-rpc-error-2.0.0.tgz#a7af9c202838b5e905c7250e547f1aff77258a02" - integrity sha512-EwUeWP+KgAZ/xqFpaP6YDAXMtCJi+o/QQpCQFIYyxr01AdADi2y413eM8hSqJcoQym9WMePAJWoaODEJufC4Ug== - dependencies: - inherits "^2.0.1" - json-rpc-random-id@^1.0.0, json-rpc-random-id@^1.0.1: version "1.0.1" resolved "https://registry.yarnpkg.com/json-rpc-random-id/-/json-rpc-random-id-1.0.1.tgz#ba49d96aded1444dbb8da3d203748acbbcdec8c8" @@ -13717,16 +12342,7 @@ jwa@^1.4.1: ecdsa-sig-formatter "1.0.11" safe-buffer "^5.0.1" -jwa@^2.0.0: - version "2.0.0" - resolved "https://registry.yarnpkg.com/jwa/-/jwa-2.0.0.tgz#a7e9c3f29dae94027ebcaf49975c9345593410fc" - integrity sha512-jrZ2Qx916EA+fq9cEAeCROWPTfCwi1IVHqT2tapuqLEVVDKFDENFw1oL+MwrTvH6msKxsd1YTDVw6uKEcsrLEA== - dependencies: - buffer-equal-constant-time "1.0.1" - ecdsa-sig-formatter "1.0.11" - safe-buffer "^5.0.1" - -jws@^3.1.5, jws@^3.2.2: +jws@^3.2.2: version "3.2.2" resolved "https://registry.yarnpkg.com/jws/-/jws-3.2.2.tgz#001099f3639468c9414000e99995fa52fb478304" integrity sha512-YHlZCB6lMTllWDtSPHz/ZXTsi8S00usEV6v1tjq8tOUZzw7DpSDWVXjXDre6ed1w/pd495ODpHZYSdkRTsa0HA== @@ -13734,14 +12350,6 @@ jws@^3.1.5, jws@^3.2.2: jwa "^1.4.1" safe-buffer "^5.0.1" -jws@^4.0.0: - version "4.0.0" - resolved "https://registry.yarnpkg.com/jws/-/jws-4.0.0.tgz#2d4e8cf6a318ffaa12615e9dec7e86e6c97310f4" - integrity sha512-KDncfTmOZoOMTFG4mBlG0qUIOlc03fmzH+ru6RgYVZhPkyiy/92Owlt/8UEN+a4TXR1FQetfIpJE8ApdvdVxTg== - dependencies: - jwa "^2.0.0" - safe-buffer "^5.0.1" - keccak@3.0.1: version "3.0.1" resolved 
"https://registry.yarnpkg.com/keccak/-/keccak-3.0.1.tgz#ae30a0e94dbe43414f741375cff6d64c8bea0bff" @@ -13830,11 +12438,6 @@ klaw@^1.0.0: optionalDependencies: graceful-fs "^4.1.9" -kleur@^2.0.1: - version "2.0.2" - resolved "https://registry.yarnpkg.com/kleur/-/kleur-2.0.2.tgz#b704f4944d95e255d038f0cb05fb8a602c55a300" - integrity sha512-77XF9iTllATmG9lSlIv0qdQ2BQ/h9t0bJllHlbvsQ0zUWfU7Yi0S8L5JXzPZgkefIiajLmBJJ4BsMJmqcf7oxQ== - kleur@^3.0.3: version "3.0.3" resolved "https://registry.yarnpkg.com/kleur/-/kleur-3.0.3.tgz#a79c9ecc86ee1ce3fa6206d1216c501f147fc07e" @@ -14275,36 +12878,11 @@ lodash.assign@^4.0.3, lodash.assign@^4.0.6: resolved "https://registry.yarnpkg.com/lodash.assign/-/lodash.assign-4.2.0.tgz#0d99f3ccd7a6d261d19bdaeb9245005d285808e7" integrity sha512-hFuH8TY+Yji7Eja3mGiuAxBqLagejScbG8GbG0j6o9vzn0YL14My+ktnqtZgFTosKymC9/44wP6s7xyuLfnClw== -lodash.at@^4.6.0: - version "4.6.0" - resolved "https://registry.yarnpkg.com/lodash.at/-/lodash.at-4.6.0.tgz#93cdce664f0a1994ea33dd7cd40e23afd11b0ff8" - integrity sha512-GOTh0SEp+Yosnlpjic+8cl2WM9MykorogkGA9xyIFkkObQ3H3kNZqZ+ohuq4K3FrSVo7hMcZBMataJemrxC3BA== - -lodash.camelcase@^4.3.0: - version "4.3.0" - resolved "https://registry.yarnpkg.com/lodash.camelcase/-/lodash.camelcase-4.3.0.tgz#b28aa6288a2b9fc651035c7711f65ab6190331a6" - integrity sha512-TwuEnCnxbc3rAvhf/LbG7tJUDzhqXyFnv3dtzLOPgCG/hODL7WFnsbwktkD7yUV0RrreP/l1PALq/YSg6VvjlA== - -lodash.clone@^4.5.0: - version "4.5.0" - resolved "https://registry.yarnpkg.com/lodash.clone/-/lodash.clone-4.5.0.tgz#195870450f5a13192478df4bc3d23d2dea1907b6" - integrity sha512-GhrVeweiTD6uTmmn5hV/lzgCQhccwReIVRLHp7LT4SopOjqEZ5BbX8b5WWEtAKasjmy8hR7ZPwsYlxRCku5odg== - lodash.debounce@^4.0.8: version "4.0.8" resolved "https://registry.yarnpkg.com/lodash.debounce/-/lodash.debounce-4.0.8.tgz#82d79bff30a67c4005ffd5e2515300ad9ca4d7af" integrity sha512-FT1yDzDYEoYWhnSGnpE/4Kj1fLZkDFyqRb7fNt6FdYOSxlUWAtp42Eh6Wb0rGIv/m9Bgo7x4GhQbm5Ys4SG5ow== -lodash.get@~4.4.2: - version "4.4.2" - resolved 
"https://registry.yarnpkg.com/lodash.get/-/lodash.get-4.4.2.tgz#2d177f652fa31e939b4438d5341499dfa3825e99" - integrity sha512-z+Uw/vLuy6gQe8cfaFWD7p0wVv8fJl3mbzXh33RS+0oW2wvUqiRXiQ69gLWSLpgB5/6sU+r6BlQR0MBILadqTQ== - -lodash.has@^4.5.2: - version "4.5.2" - resolved "https://registry.yarnpkg.com/lodash.has/-/lodash.has-4.5.2.tgz#d19f4dc1095058cccbe2b0cdf4ee0fe4aa37c862" - integrity sha512-rnYUdIo6xRCJnQmbVFEwcxF144erlD+M3YcJUVesflU9paQaE8p+fJDcIQrlMYbxoANFL+AB9hZrzSBBk5PL+g== - lodash.includes@^4.3.0: version "4.3.0" resolved "https://registry.yarnpkg.com/lodash.includes/-/lodash.includes-4.3.0.tgz#60bb98a87cb923c68ca1e51325483314849f553f" @@ -14345,7 +12923,7 @@ lodash.memoize@4.x: resolved "https://registry.yarnpkg.com/lodash.memoize/-/lodash.memoize-4.1.2.tgz#bcc6c49a42a2840ed997f323eada5ecd182e0bfe" integrity sha512-t7j+NzmgnQzTAYXcsHYLgimltOV1MXHtlOWf6GjL9Kj8GK5FInw5JotxvbOs+IvV1/Dzo04/fCGfLVs7aXb4Ag== -lodash.merge@^4.6.0, lodash.merge@^4.6.2: +lodash.merge@^4.6.2: version "4.6.2" resolved "https://registry.yarnpkg.com/lodash.merge/-/lodash.merge-4.6.2.tgz#558aa53b43b661e1925a0afdfa36a9a1085fe57a" integrity sha512-0KpjqXRVvrYyCsX1swR/XTK0va6VQkQM6MNo7PqW77ByjAhoARA8EfrP1N4+KlKj8YS0ZUCtRT/YUuhyYDujIQ== @@ -14355,11 +12933,6 @@ lodash.once@^4.0.0: resolved "https://registry.yarnpkg.com/lodash.once/-/lodash.once-4.1.1.tgz#0dd3971213c7c56df880977d504c88fb471a97ac" integrity sha512-Sb487aTOCr9drQVL8pIxOzVhafOjZN9UU54hiN8PU3uAiSV7lx1yYNpbNmex2PK6dSJoNTSJUUswT651yww3Mg== -lodash.snakecase@^4.1.1: - version "4.1.1" - resolved "https://registry.yarnpkg.com/lodash.snakecase/-/lodash.snakecase-4.1.1.tgz#39d714a35357147837aefd64b5dcbb16becd8f8d" - integrity sha512-QZ1d4xoBHYUeuouhEq3lk3Uq7ldgyFXGBhg04+oRLnIz8o9T65Eh+8YdroUwn846zchkA9yDsDl5CVVaV2nqYw== - lodash.sortby@^4.7.0: version "4.7.0" resolved "https://registry.yarnpkg.com/lodash.sortby/-/lodash.sortby-4.7.0.tgz#edd14c824e2cc9c1e0b0a1b42bb5210516a42438" @@ -14370,12 +12943,7 @@ lodash.truncate@^4.4.2: resolved 
"https://registry.yarnpkg.com/lodash.truncate/-/lodash.truncate-4.4.2.tgz#5a350da0b1113b837ecfffd5812cbe58d6eae193" integrity sha512-jttmRe7bRse52OsWIMDLaXxWqRAmtIUccAQ3garviCqJjafXOfNMO0yMfNpdD6zbGaTU0P5Nz7e7gAT6cKmJRw== -lodash.values@^4.3.0: - version "4.3.0" - resolved "https://registry.yarnpkg.com/lodash.values/-/lodash.values-4.3.0.tgz#a3a6c2b0ebecc5c2cba1c17e6e620fe81b53d347" - integrity sha512-r0RwvdCv8id9TUblb/O7rYPwVy6lerCbcawrfdo9iC/1t1wsNMJknO79WNBgwkH0hIeJ08jmvvESbFpNb4jH0Q== - -lodash@^4.16.4, lodash@^4.17.11, lodash@^4.17.12, lodash@^4.17.14, lodash@^4.17.15, lodash@^4.17.19, lodash@^4.17.21, lodash@^4.2.1: +lodash@^4.17.11, lodash@^4.17.12, lodash@^4.17.14, lodash@^4.17.15, lodash@^4.17.19, lodash@^4.17.21, lodash@^4.2.1: version "4.17.21" resolved "https://registry.yarnpkg.com/lodash/-/lodash-4.17.21.tgz#679591c564c3bffaae8454cf0b3df370c3d6911c" integrity sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg== @@ -14410,7 +12978,7 @@ log-symbols@^5.1.0: chalk "^5.0.0" is-unicode-supported "^1.1.0" -loglevel@^1.6.1, loglevel@^1.6.8: +loglevel@^1.6.8: version "1.8.1" resolved "https://registry.yarnpkg.com/loglevel/-/loglevel-1.8.1.tgz#5c621f83d5b48c54ae93b6156353f555963377b4" integrity sha512-tCRIJM51SHjAayKwC+QAg8hT8vg6z7GSgLJKGvzuPb1Wc+hLzqtuVLxp6/HzSPOozuK+8ErAhy7U/sVzw8Dgfg== @@ -14420,17 +12988,7 @@ long@^4.0.0: resolved "https://registry.yarnpkg.com/long/-/long-4.0.0.tgz#9a7b71cfb7d361a194ea555241c92f7468d5bf28" integrity sha512-XsP+KhQif4bjX1kbuSiySJFNAehNxgLb6hPRGJ9QsUr8ajHkuXGdrHmFUTUUXhDwVX2R5bY4JNZEwbUiMhV+MA== -long@^5.0.0: - version "5.2.3" - resolved "https://registry.yarnpkg.com/long/-/long-5.2.3.tgz#a3ba97f3877cf1d778eccbcb048525ebb77499e1" - integrity sha512-lcHwpNoggQTObv5apGNCTdJrO69eHOZMi4BNC+rTLER8iHAqGrUVeLh/irVIM7zTw2bOXA8T6uNPeujwOLg/2Q== - -long@~3: - version "3.2.0" - resolved "https://registry.yarnpkg.com/long/-/long-3.2.0.tgz#d821b7138ca1cb581c172990ef14db200b5c474b" - integrity 
sha512-ZYvPPOMqUwPoDsbJaR10iQJYnMuZhRTvHYl62ErLIEX7RgFlziSBUUvrt3OVfc47QlHHpzPZYP17g3Fv7oeJkg== - -loose-envify@^1.0.0, loose-envify@^1.1.0: +loose-envify@^1.1.0: version "1.4.0" resolved "https://registry.yarnpkg.com/loose-envify/-/loose-envify-1.4.0.tgz#71ee51fa7be4caec1a63839f7e682d8132d30caf" integrity sha512-lyuxPGr/Wfhrlem2CL/UcnUc1zcqKAImBDzukY7Y5F/yQiNdko6+fRLevlw1HgMySw7f611UIY408EtxRSoK3Q== @@ -14483,7 +13041,7 @@ lowercase-keys@^3.0.0: resolved "https://registry.yarnpkg.com/lru-cache/-/lru-cache-7.13.1.tgz#267a81fbd0881327c46a81c5922606a2cfe336c4" integrity sha512-CHqbAq7NFlW3RSnoWXLJBxCWaZVBrfa9UEHId2M3AW8iEBurbqduNexEUCGc3SHc6iCYXNJCDi903LajSVAEPQ== -lru-cache@^5.0.0, lru-cache@^5.1.1: +lru-cache@^5.1.1: version "5.1.1" resolved "https://registry.yarnpkg.com/lru-cache/-/lru-cache-5.1.1.tgz#1da27e6710271947695daf6848e847f01d84b920" integrity sha512-KpNARQA3Iwv+jTA0utUVVbrh+Jlrr1Fv0e56GGzAFOXN7dk/FviaDW8LHmK52DlcH4WP2n6gI8vN1aesBFgo9w== @@ -14602,20 +13160,6 @@ math-random@^1.0.1: resolved "https://registry.yarnpkg.com/math-random/-/math-random-1.0.4.tgz#5dd6943c938548267016d4e34f057583080c514c" integrity sha512-rUxjysqif/BZQH2yhd5Aaq7vXMSx9NdEsQcyA07uEzIvxgI7zIr33gGsh+RU0/XjmQpCW7RsVof1vlkvQVCK5A== -mathjs@^5.0.4: - version "5.10.3" - resolved "https://registry.yarnpkg.com/mathjs/-/mathjs-5.10.3.tgz#e998885f932ea8886db8b40f7f5b199f89b427f1" - integrity sha512-ySjg30BC3dYjQm73ILZtwcWzFJde0VU6otkXW/57IjjuYRa3Qaf0Kb8pydEuBZYtqW2OxreAtsricrAmOj3jIw== - dependencies: - complex.js "2.0.11" - decimal.js "10.2.0" - escape-latex "1.2.0" - fraction.js "4.0.12" - javascript-natural-sort "0.7.1" - seed-random "2.2.0" - tiny-emitter "2.1.0" - typed-function "1.1.0" - mcl-wasm@^0.7.1: version "0.7.9" resolved "https://registry.yarnpkg.com/mcl-wasm/-/mcl-wasm-0.7.9.tgz#c1588ce90042a8700c3b60e40efb339fc07ab87f" @@ -14814,12 +13358,12 @@ miller-rabin@^4.0.0: bn.js "^4.0.0" brorand "^1.0.1" -mime-db@1.52.0, "mime-db@>= 1.43.0 < 2": +mime-db@1.52.0: version "1.52.0" 
resolved "https://registry.yarnpkg.com/mime-db/-/mime-db-1.52.0.tgz#bbabcdc02859f4987301c856e3387ce5ec43bf70" integrity sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg== -mime-types@^2.0.8, mime-types@^2.1.12, mime-types@^2.1.16, mime-types@~2.1.19, mime-types@~2.1.24, mime-types@~2.1.34: +mime-types@^2.1.12, mime-types@^2.1.16, mime-types@~2.1.19, mime-types@~2.1.24, mime-types@~2.1.34: version "2.1.35" resolved "https://registry.yarnpkg.com/mime-types/-/mime-types-2.1.35.tgz#381a871b62a734450660ae3deee44813f70d959a" integrity sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw== @@ -14831,11 +13375,6 @@ mime@1.6.0: resolved "https://registry.yarnpkg.com/mime/-/mime-1.6.0.tgz#32cd9e5c64553bd58d19a568af452acff04981b1" integrity sha512-x0Vn8spI+wuJ1O6S7gnbaQg8Pxh4NNHb7KSINmEWKiPE4RKOplvijn+NkmYmmRgP68mc70j2EbeTFRsrswaQeg== -mime@^2.2.0: - version "2.6.0" - resolved "https://registry.yarnpkg.com/mime/-/mime-2.6.0.tgz#a2a682a95cd4d0cb1d6257e28f83da7e35800367" - integrity sha512-USPkMeET31rOMiarsBNIHZKLGgvKc/LrjofAnBlOttf5ajRvqiRA8QsenbcooctK6d6Ts6aqZXBA+XbkKthiQg== - mimic-fn@^1.0.0: version "1.2.0" resolved "https://registry.yarnpkg.com/mimic-fn/-/mimic-fn-1.2.0.tgz#820c86a39334640e99516928bd03fca88057d022" @@ -14861,11 +13400,6 @@ mimic-response@^1.0.0, mimic-response@^1.0.1: resolved "https://registry.yarnpkg.com/mimic-response/-/mimic-response-1.0.1.tgz#4923538878eef42063cb8a3e3b0798781487ab1b" integrity sha512-j5EctnkH7amfV/q5Hgmoal1g2QHFJRraOtmx0JpIqkxhBhI/lJSl1nMpQ45hVarwNETOoWEimndZ4QK0RHxuxQ== -mimic-response@^2.0.0: - version "2.1.0" - resolved "https://registry.yarnpkg.com/mimic-response/-/mimic-response-2.1.0.tgz#d13763d35f613d09ec37ebb30bac0469c0ee8f43" - integrity sha512-wXqjST+SLt7R009ySCglWBCFpjUygmCIfD790/kVbiGmUgfYGuB14PiTd5DwVxSV4NcYHjzMkoj5LjQZwTQLEA== - mimic-response@^3.1.0: version "3.1.0" resolved 
"https://registry.yarnpkg.com/mimic-response/-/mimic-response-3.1.0.tgz#2d1d59af9c1b129815accc2c46a022a5ce1fa3c9" @@ -14898,13 +13432,6 @@ minimalistic-crypto-utils@^1.0.1: resolved "https://registry.yarnpkg.com/minimalistic-crypto-utils/-/minimalistic-crypto-utils-1.0.1.tgz#f6c00c1c0b082246e5c4d99dfb8c7c083b2b582a" integrity sha512-JIYlbt6g8i5jKfJ3xz7rF0LXmv2TkDxBLUkiBeZ7bAx4GnnNMr8xFpGnOxn6GhTEHx3SjRrZEoU+j04prX1ktg== -"minimatch@2 || 3", minimatch@^3.0.4, minimatch@^3.0.5, minimatch@^3.1.1, minimatch@^3.1.2: - version "3.1.2" - resolved "https://registry.yarnpkg.com/minimatch/-/minimatch-3.1.2.tgz#19cd194bfd3e428f049a70817c038d89ab4be35b" - integrity sha512-J7p63hRiAjw1NDEww1W7i37+ByIrOWO5XQQAzZ3VOcL0PNybwpfmV/N05zFAzwQ9USyEcX6t3UO+K5aqBQOIHw== - dependencies: - brace-expansion "^1.1.7" - minimatch@3.0.4: version "3.0.4" resolved "https://registry.yarnpkg.com/minimatch/-/minimatch-3.0.4.tgz#5166e286457f03306064be5497e8dbb0c3d32083" @@ -14933,6 +13460,13 @@ minimatch@9.0.3, minimatch@^9.0.1: dependencies: brace-expansion "^2.0.1" +minimatch@^3.0.4, minimatch@^3.0.5, minimatch@^3.1.1, minimatch@^3.1.2: + version "3.1.2" + resolved "https://registry.yarnpkg.com/minimatch/-/minimatch-3.1.2.tgz#19cd194bfd3e428f049a70817c038d89ab4be35b" + integrity sha512-J7p63hRiAjw1NDEww1W7i37+ByIrOWO5XQQAzZ3VOcL0PNybwpfmV/N05zFAzwQ9USyEcX6t3UO+K5aqBQOIHw== + dependencies: + brace-expansion "^1.1.7" + minimatch@^5.0.1: version "5.1.6" resolved "https://registry.yarnpkg.com/minimatch/-/minimatch-5.1.6.tgz#1cfcb8cf5522ea69952cd2af95ae09477f122a96" @@ -14954,7 +13488,7 @@ minimist@0.0.8: resolved "https://registry.yarnpkg.com/minimist/-/minimist-0.0.8.tgz#857fcabfc3397d2625b8228262e86aa7a011b05d" integrity sha512-miQKw5Hv4NS1Psg2517mV4e4dYNaO3++hjAvLOAzKqZ61rH8NS1SK+vbfBWZ5PY/Me/bEWhUwqMghEW5Fb9T7Q== -minimist@^1.2.0, minimist@^1.2.3, minimist@^1.2.5, minimist@^1.2.6, minimist@~1.2.7: +minimist@^1.2.0, minimist@^1.2.5, minimist@^1.2.6, minimist@~1.2.7: version "1.2.8" resolved 
"https://registry.yarnpkg.com/minimist/-/minimist-1.2.8.tgz#c1a464e7693302e082a075cee0c057741ac4772c" integrity sha512-2yyAR8qBkN3YuheJanUpWC5U3bb5osDywNB8RzDVlDwDHbocAJveqqj1u8+SVD7jkWT4yvsHCpWqqWqAxb0zCA== @@ -15054,11 +13588,6 @@ mixin-object@^2.0.0: for-in "^0.1.3" is-extendable "^0.1.1" -mkdirp-classic@^0.5.2, mkdirp-classic@^0.5.3: - version "0.5.3" - resolved "https://registry.yarnpkg.com/mkdirp-classic/-/mkdirp-classic-0.5.3.tgz#fa10c9115cc6d8865be221ba47ee9bed78601113" - integrity sha512-gKLcREMhtuZRwRAfqP3RFW+TK4JqApVBtOIftVgjuABpAtpxhPGaDcfvbhNvD0B8iD1oUr/txX35NjcaY6Ns/A== - mkdirp-infer-owner@^2.0.0: version "2.0.0" resolved "https://registry.yarnpkg.com/mkdirp-infer-owner/-/mkdirp-infer-owner-2.0.0.tgz#55d3b368e7d89065c38f32fd38e638f0ab61d316" @@ -15087,7 +13616,7 @@ mkdirp@0.5.1: dependencies: minimist "0.0.8" -mkdirp@0.5.x, mkdirp@^0.5.1, mkdirp@^0.5.5, mkdirp@~0.5.1: +mkdirp@^0.5.1, mkdirp@^0.5.5: version "0.5.6" resolved "https://registry.yarnpkg.com/mkdirp/-/mkdirp-0.5.6.tgz#7def03d2432dcae4ba1d611445c48396062255f6" integrity sha512-FP+p8RB8OWpF3YZBCrP5gtADmtXApB5AMLn+vdyA+PyxCjrCs00mjyUozssO33cwDeT3wNGdLxJ5M//YqtHAJw== @@ -15221,11 +13750,6 @@ module-not-found-error@^1.0.1: resolved "https://registry.yarnpkg.com/module-not-found-error/-/module-not-found-error-1.0.1.tgz#cf8b4ff4f29640674d6cdd02b0e3bc523c2bbdc0" integrity sha512-pEk4ECWQXV6z2zjhRZUongnLJNUeGQJ3w6OQ5ctGwD+i5o93qjRQUk2Rt6VdNeu3sEP0AB4LcfvdebpxBRVr4g== -moment@^2.10.6: - version "2.29.4" - resolved "https://registry.yarnpkg.com/moment/-/moment-2.29.4.tgz#3dbe052889fe7c1b2ed966fcb3a77328964ef108" - integrity sha512-5LC9SOxjSc2HF6vO2CyuTDNivEdoz2IvyJJGj6X8DJ0eFyfszE0QiEd+iXmBvUP3WHxSjFH/vIsA0EN00cgr8w== - mri@^1.1.4: version "1.2.0" resolved "https://registry.yarnpkg.com/mri/-/mri-1.2.0.tgz#6721480fec2a11a4889861115a48b6cbe7cc8f0b" @@ -15325,15 +13849,6 @@ mute-stream@0.0.8, mute-stream@~0.0.4: resolved 
"https://registry.yarnpkg.com/mute-stream/-/mute-stream-0.0.8.tgz#1630c42b2251ff81e2a283de96a5497ea92e5e0d" integrity sha512-nnbWWOkoWyUsTjKrhgD0dcz22mdkSnpYqbEjIm2nhwhuxlSkpywJmBo8h0ZqJdkp73mb90SssHkN4rsRaBAfAA== -mv@~2: - version "2.1.1" - resolved "https://registry.yarnpkg.com/mv/-/mv-2.1.1.tgz#ae6ce0d6f6d5e0a4f7d893798d03c1ea9559b6a2" - integrity sha512-at/ZndSy3xEGJ8i0ygALh8ru9qy7gWW1cmkaqBN29JmMlIvM//MEO9y1sk/avxuwnPcfhkejkLsuPxH81BrkSg== - dependencies: - mkdirp "~0.5.1" - ncp "~2.0.0" - rimraf "~2.4.0" - mythxjs@^1.3.11: version "1.3.13" resolved "https://registry.yarnpkg.com/mythxjs/-/mythxjs-1.3.13.tgz#b27ec4170307a3f3bd3943e7e8628c2eb070d09d" @@ -15343,15 +13858,6 @@ mythxjs@^1.3.11: chai-as-promised "^7.1.1" jsonwebtoken "^8.5.1" -mz@^2.7.0: - version "2.7.0" - resolved "https://registry.yarnpkg.com/mz/-/mz-2.7.0.tgz#95008057a56cafadc2bc63dde7f9ff6955948e32" - integrity sha512-z81GNO7nnYMEhrGh9LeymoE4+Yr0Wn5McHIZMK5cfQCl+NDX08sCZgUc9/6MHni9IWuFLm1Z3HTCXu2z9fN62Q== - dependencies: - any-promise "^1.0.0" - object-assign "^4.0.1" - thenify-all "^1.0.0" - nan@^2.13.2, nan@^2.14.0, nan@^2.2.1: version "2.17.0" resolved "https://registry.yarnpkg.com/nan/-/nan-2.17.0.tgz#c0150a2368a182f033e9aa5195ec76ea41a199cb" @@ -15372,11 +13878,6 @@ nanoid@3.3.3: resolved "https://registry.yarnpkg.com/nanoid/-/nanoid-3.3.3.tgz#fd8e8b7aa761fe807dba2d1b98fb7241bb724a25" integrity sha512-p1sjXuopFs0xg+fPASzQ28agW1oHD7xDsd9Xkf3T15H3c/cifrFHVwrh74PdoklAPi+i7MdRsE47vm2r6JoB+w== -napi-build-utils@^1.0.1: - version "1.0.2" - resolved "https://registry.yarnpkg.com/napi-build-utils/-/napi-build-utils-1.0.2.tgz#b1fddc0b2c46e380a0b7a76f984dd47c41a13806" - integrity sha512-ONmRUqK7zj7DWX0D9ADe03wbwOBZxNAfF20PlGfCWQcD3+/MakShIHrMqx9YwPTfxDdF1zLeL+RGZiR9kGMLdg== - napi-macros@^2.2.2: version "2.2.2" resolved "https://registry.yarnpkg.com/napi-macros/-/napi-macros-2.2.2.tgz#817fef20c3e0e40a963fbf7b37d1600bd0201044" @@ -15392,20 +13893,6 @@ natural-compare@^1.4.0: resolved 
"https://registry.yarnpkg.com/natural-compare/-/natural-compare-1.4.0.tgz#4abebfeed7541f2c27acfb29bdbbd15c8d5ba4f7" integrity sha512-OWND8ei3VtNC9h7V60qff3SVobHr996CTwgxubgyQYEpg290h9J0buyECNNJexkFm5sOajh5G116RYA1c8ZMSw== -ncp@~2.0.0: - version "2.0.0" - resolved "https://registry.yarnpkg.com/ncp/-/ncp-2.0.0.tgz#195a21d6c46e361d2fb1281ba38b91e9df7bdbb3" - integrity sha512-zIdGUrPRFTUELUvr3Gmc7KZ2Sw/h1PiVM0Af/oHB6zgnV1ikqSfRk+TOufi79aHYCW3NiOXmr1BP5nWbzojLaA== - -needle@^2.2.1: - version "2.9.1" - resolved "https://registry.yarnpkg.com/needle/-/needle-2.9.1.tgz#22d1dffbe3490c2b83e301f7709b6736cd8f2684" - integrity sha512-6R9fqJ5Zcmf+uYaFgdIHmLwNldn5HbK8L5ybn7Uz+ylX/rnOsSp1AHcvQSrCaFN+qNM1wpymHqD7mVasEOlHGQ== - dependencies: - debug "^3.2.6" - iconv-lite "^0.4.4" - sax "^1.2.4" - negotiator@0.6.3, negotiator@^0.6.3: version "0.6.3" resolved "https://registry.yarnpkg.com/negotiator/-/negotiator-0.6.3.tgz#58e323a72fedc0d6f9cd4d31fe49f51479590ccd" @@ -15448,13 +13935,6 @@ no-case@^3.0.4: lower-case "^2.0.2" tslib "^2.0.3" -node-abi@^2.21.0: - version "2.30.1" - resolved "https://registry.yarnpkg.com/node-abi/-/node-abi-2.30.1.tgz#c437d4b1fe0e285aaf290d45b45d4d7afedac4cf" - integrity sha512-/2D0wOQPgaUWzVSVgRMx+trKJRC2UG4SUc4oCJoXx9Uxjtp0Vy3/kt7zcbxHF8+Z/pK3UloLWzBISg72brfy1w== - dependencies: - semver "^5.4.1" - node-abort-controller@^3.0.1: version "3.1.1" resolved "https://registry.yarnpkg.com/node-abort-controller/-/node-abort-controller-3.1.1.tgz#a94377e964a9a37ac3976d848cb5c765833b8548" @@ -15465,26 +13945,16 @@ node-addon-api@^2.0.0: resolved "https://registry.yarnpkg.com/node-addon-api/-/node-addon-api-2.0.2.tgz#432cfa82962ce494b132e9d72a15b29f71ff5d32" integrity sha512-Ntyt4AIXyaLIuMHF6IOoTakB3K+RWxwtsHNRxllEoA6vPwP9o4866g6YWDLUdnucilZhmkxiHwHr11gAENw+QA== -node-addon-api@^3.0.2, node-addon-api@^3.2.1: +node-addon-api@^3.2.1: version "3.2.1" resolved "https://registry.yarnpkg.com/node-addon-api/-/node-addon-api-3.2.1.tgz#81325e0a2117789c0128dab65e7e38f07ceba161" 
integrity sha512-mmcei9JghVNDYydghQmeDX8KoAm0FAiYyIcUt/N4nhyAipB17pllZQDOJD2fotxABnt4Mdz+dKTO7eftLg4d0A== -node-addon-api@^4.2.0: - version "4.3.0" - resolved "https://registry.yarnpkg.com/node-addon-api/-/node-addon-api-4.3.0.tgz#52a1a0b475193e0928e98e0426a0d1254782b77f" - integrity sha512-73sE9+3UaLYYFmDsFZnqCInzPyh3MqIwZO9cw58yIqAZhONrrabrYyYe3TuIqtIiOuTXVhsGau8hcrhhwSsDIQ== - node-domexception@^1.0.0: version "1.0.0" resolved "https://registry.yarnpkg.com/node-domexception/-/node-domexception-1.0.0.tgz#6888db46a1f71c0b76b3f7555016b63fe64766e5" integrity sha512-/jKZoMpw0F8GRwl4/eLROPA3cfcXtLApP0QzLmUT/HuPCZWyB7IY9ZrMeKw2O/nFIqPQB3PVM9aYm0F312AXDQ== -node-fetch@2.6.1: - version "2.6.1" - resolved "https://registry.yarnpkg.com/node-fetch/-/node-fetch-2.6.1.tgz#045bd323631f76ed2e2b55573394416b639a0052" - integrity sha512-V4aYg89jEoVRxRb2fJdAg8FHvI7cEyYdVAh94HH0UIK8oJxUfkjlDQN9RbMx+bEjP7+ggMiFRprSti032Oipxw== - node-fetch@2.6.7: version "2.6.7" resolved "https://registry.yarnpkg.com/node-fetch/-/node-fetch-2.6.7.tgz#24de9fba827e3b4ae44dc8b20256a379160052ad" @@ -15500,20 +13970,13 @@ node-fetch@^1.0.1: encoding "^0.1.11" is-stream "^1.0.1" -node-fetch@^2.2.0, node-fetch@^2.3.0, node-fetch@^2.6.0, node-fetch@^2.6.1, node-fetch@^2.6.7, node-fetch@^2.6.9: +node-fetch@^2.6.0, node-fetch@^2.6.1, node-fetch@^2.6.7, node-fetch@^2.6.9: version "2.6.9" resolved "https://registry.yarnpkg.com/node-fetch/-/node-fetch-2.6.9.tgz#7c7f744b5cc6eb5fd404e0c7a9fec630a55657e6" integrity sha512-DJm/CJkZkRjKKj4Zi4BsKVZh3ValV5IR5s7LVZnW+6YMh0W1BfNA8XSs6DLMGYlId5F3KnA70uu2qepcR08Qqg== dependencies: whatwg-url "^5.0.0" -node-fetch@^2.6.12: - version "2.7.0" - resolved "https://registry.yarnpkg.com/node-fetch/-/node-fetch-2.7.0.tgz#d0f0fa6e3e2dc1d27efcd8ad99d550bda94d187d" - integrity sha512-c4FRfUm/dbcWZ7U+1Wq0AwCyFL+3nt2bEw05wfxSz+DWpWsitgmSgYmy2dQdWyKC1694ELPqMs/YzUSNozLt8A== - dependencies: - whatwg-url "^5.0.0" - node-fetch@^3.3.0: version "3.3.2" resolved 
"https://registry.yarnpkg.com/node-fetch/-/node-fetch-3.3.2.tgz#d1e889bacdf733b4ff3b2b243eb7a12866a0b78b" @@ -15523,16 +13986,6 @@ node-fetch@^3.3.0: fetch-blob "^3.1.4" formdata-polyfill "^4.0.10" -node-forge@^0.10.0: - version "0.10.0" - resolved "https://registry.yarnpkg.com/node-forge/-/node-forge-0.10.0.tgz#32dea2afb3e9926f02ee5ce8794902691a676bf3" - integrity sha512-PPmu8eEeG9saEUvI97fm4OYxXVB6bFvyNTyiUOBichBpFG8A1Ljw3bY62+5oOjDEMHRnd0Y7HQ+x7uzxOzC6JA== - -node-forge@^1.3.1: - version "1.3.1" - resolved "https://registry.yarnpkg.com/node-forge/-/node-forge-1.3.1.tgz#be8da2af243b2417d5f646a770663a92b7e9ded3" - integrity sha512-dPEtOeMvF9VMcYV/1Wb8CPoVAXtp6MKMlcbAt4ddqmGqUJ6fQZFXkNZNkNlfevtNkGtaSoXf/vNNNSvgrdXwtA== - node-gyp-build@4.3.0: version "4.3.0" resolved "https://registry.yarnpkg.com/node-gyp-build/-/node-gyp-build-4.3.0.tgz#9f256b03e5826150be39c764bf51e993946d71a3" @@ -15569,15 +14022,6 @@ node-gyp@^9.0.0: tar "^6.1.2" which "^2.0.2" -node-hid@2.1.1: - version "2.1.1" - resolved "https://registry.yarnpkg.com/node-hid/-/node-hid-2.1.1.tgz#f83c8aa0bb4e6758b5f7383542477da93f67359d" - integrity sha512-Skzhqow7hyLZU93eIPthM9yjot9lszg9xrKxESleEs05V2NcbUptZc5HFqzjOkSmL0sFlZFr3kmvaYebx06wrw== - dependencies: - bindings "^1.5.0" - node-addon-api "^3.0.2" - prebuild-install "^6.0.0" - node-int64@^0.4.0: version "0.4.0" resolved "https://registry.yarnpkg.com/node-int64/-/node-int64-0.4.0.tgz#87a9065cdb355d3182d8f94ce11188b825c68a3b" @@ -15590,22 +14034,6 @@ node-interval-tree@^1.3.3: dependencies: shallowequal "^1.0.2" -node-pre-gyp@^0.14.0: - version "0.14.0" - resolved "https://registry.yarnpkg.com/node-pre-gyp/-/node-pre-gyp-0.14.0.tgz#9a0596533b877289bcad4e143982ca3d904ddc83" - integrity sha512-+CvDC7ZttU/sSt9rFjix/P05iS43qHCOOGzcr3Ry99bXG7VX953+vFyEuph/tfqoYu8dttBkE86JSKBO2OzcxA== - dependencies: - detect-libc "^1.0.2" - mkdirp "^0.5.1" - needle "^2.2.1" - nopt "^4.0.1" - npm-packlist "^1.1.6" - npmlog "^4.0.2" - rc "^1.2.7" - rimraf "^2.6.1" - semver 
"^5.3.0" - tar "^4.4.2" - node-releases@^2.0.8: version "2.0.10" resolved "https://registry.yarnpkg.com/node-releases/-/node-releases-2.0.10.tgz#c311ebae3b6a148c89b1813fd7c4d3c024ef537f" @@ -15621,21 +14049,6 @@ noncharacters@^1.1.0: resolved "https://registry.yarnpkg.com/noncharacters/-/noncharacters-1.1.0.tgz#af33df30fd50ed3c53cd202258f25ada90b540d2" integrity sha512-U69XzMNq7UQXR27xT17tkQsHPsLc+5W9yfXvYzVCwFxghVf+7VttxFnCKFMxM/cHD+/QIyU009263hxIIurj4g== -nopt@3.x: - version "3.0.6" - resolved "https://registry.yarnpkg.com/nopt/-/nopt-3.0.6.tgz#c6465dbf08abcd4db359317f79ac68a646b28ff9" - integrity sha512-4GUt3kSEYmk4ITxzB/b9vaIDfUVWN/Ml1Fwl11IlnIG2iaJ9O6WXZ9SrYM9NLI8OCBieN2Y8SWC2oJV0RQ7qYg== - dependencies: - abbrev "1" - -nopt@^4.0.1: - version "4.0.3" - resolved "https://registry.yarnpkg.com/nopt/-/nopt-4.0.3.tgz#a375cad9d02fd921278d954c2254d5aa57e15e48" - integrity sha512-CvaGwVMztSMJLOeXPrez7fyfObdZqNUK1cPAEzLHrTybIua9pMdmmPR5YwtfNftIOMv3DPUhFaxsZMNTQO20Kg== - dependencies: - abbrev "1" - osenv "^0.1.4" - nopt@^5.0.0: version "5.0.0" resolved "https://registry.yarnpkg.com/nopt/-/nopt-5.0.0.tgz#530942bb58a512fccafe53fe210f13a25355dc88" @@ -15707,7 +14120,7 @@ normalize-url@^8.0.0: resolved "https://registry.yarnpkg.com/normalize-url/-/normalize-url-8.0.1.tgz#9b7d96af9836577c58f5883e939365fa15623a4a" integrity sha512-IO9QvjUMWxPQQhs60oOu10CRkWCiZzSUkzbXGGV9pviYl1fXYcvkzQ5jV9z8Y6un8ARoVRl4EtC6v6jNqbaJ/w== -npm-bundled@^1.0.1, npm-bundled@^1.1.1: +npm-bundled@^1.1.1: version "1.1.2" resolved "https://registry.yarnpkg.com/npm-bundled/-/npm-bundled-1.1.2.tgz#944c78789bd739035b70baa2ca5cc32b8d860bc1" integrity sha512-x5DHup0SuyQcmL3s7Rx/YQ8sbw/Hzg0rj48eN0dV7hf5cmQq5PXIeioroH3raV1QC1yh3uTYuMThvEQF3iKgGQ== @@ -15757,15 +14170,6 @@ npm-package-arg@^9.0.0, npm-package-arg@^9.0.1: semver "^7.3.5" validate-npm-package-name "^4.0.0" -npm-packlist@^1.1.6: - version "1.4.8" - resolved 
"https://registry.yarnpkg.com/npm-packlist/-/npm-packlist-1.4.8.tgz#56ee6cc135b9f98ad3d51c1c95da22bbb9b2ef3e" - integrity sha512-5+AZgwru5IevF5ZdnFglB5wNlHG1AOOuw28WhUq8/8emhBmLv6jX5by4WJCh7lW0uSYZYS6DXqIsyZVIXRZU9A== - dependencies: - ignore-walk "^3.0.1" - npm-bundled "^1.0.1" - npm-normalize-package-bin "^1.0.1" - npm-packlist@^5.1.0, npm-packlist@^5.1.1: version "5.1.3" resolved "https://registry.yarnpkg.com/npm-packlist/-/npm-packlist-5.1.3.tgz#69d253e6fd664b9058b85005905012e00e69274b" @@ -15827,16 +14231,6 @@ npm-run-path@^5.1.0: dependencies: path-key "^4.0.0" -npmlog@^4.0.1, npmlog@^4.0.2: - version "4.1.2" - resolved "https://registry.yarnpkg.com/npmlog/-/npmlog-4.1.2.tgz#08a7f2a8bf734604779a9efa4ad5cc717abb954b" - integrity sha512-2uUqazuKlTaSI/dC8AzicUck7+IrEaOnN/e0jd3Xtt1KcGpwx30v50mL7oPyr/h9bL3E4aZccVwpwP+5W9Vjkg== - dependencies: - are-we-there-yet "~1.1.2" - console-control-strings "~1.1.0" - gauge "~2.7.3" - set-blocking "~2.0.0" - npmlog@^6.0.0, npmlog@^6.0.2: version "6.0.2" resolved "https://registry.yarnpkg.com/npmlog/-/npmlog-6.0.2.tgz#c8166017a42f2dea92d6453168dd865186a70830" @@ -15867,11 +14261,6 @@ number-to-bn@1.7.0: bn.js "4.11.6" strip-hex-prefix "1.0.0" -numeric@^1.2.6: - version "1.2.6" - resolved "https://registry.yarnpkg.com/numeric/-/numeric-1.2.6.tgz#765b02bef97988fcf880d4eb3f36b80fa31335aa" - integrity sha512-avBiDAP8siMa7AfJgYyuxw1oyII4z2sswS23+O+ZfV28KrtNzy0wxUFwi4f3RyM4eeeXNs1CThxR7pb5QQcMiw== - nx@15.9.2, "nx@>=14.8.1 < 16": version "15.9.2" resolved "https://registry.yarnpkg.com/nx/-/nx-15.9.2.tgz#d7ace1e5ae64a47f1b553dc5da08dbdd858bde96" @@ -15938,11 +14327,6 @@ object-assign@^4, object-assign@^4.0.1, object-assign@^4.1.0, object-assign@^4.1 resolved "https://registry.yarnpkg.com/object-assign/-/object-assign-4.1.1.tgz#2109adc7965887cfc05cbbd442cac8bfbb360863" integrity sha512-rJgTQnkUnH1sFw8yT6VSU3zD3sWmu6sZhIseY8VX+GRu3P6F7Fu+JNDoXfklElbLJSnc3FUQHVe4cU5hj+BcUg== -object-hash@^3.0.0: - version "3.0.0" - resolved 
"https://registry.yarnpkg.com/object-hash/-/object-hash-3.0.0.tgz#73f97f753e7baffc0e2cc9d6e079079744ac82e9" - integrity sha512-RSn9F68PjH9HqtltsSnqYC1XXoWe9Bju5+213R98cNGttag9q9yAOTzdbsqvIa7aNm5WffBZFpWYr2aWrklWAw== - object-inspect@^1.12.3, object-inspect@^1.9.0, object-inspect@~1.12.3: version "1.12.3" resolved "https://registry.yarnpkg.com/object-inspect/-/object-inspect-1.12.3.tgz#ba62dffd67ee256c8c086dfae69e016cd1f198b9" @@ -16083,7 +14467,7 @@ on-finished@2.4.1: dependencies: ee-first "1.1.1" -once@1.x, once@^1.3.0, once@^1.3.1, once@^1.4.0: +once@^1.3.0, once@^1.3.1, once@^1.4.0: version "1.4.0" resolved "https://registry.yarnpkg.com/once/-/once-1.4.0.tgz#583b1aa775961d4b113ac17d9c50baef9dd76bd1" integrity sha512-lNaJgI+2Q5URQBkccEKHTQOPaXdUxnZZElQTZY0MFUAuaEqe1E+Nyvgdz/aIyNi6Z9MzO5dv1H8n58/GELp3+w== @@ -16125,7 +14509,7 @@ openzeppelin-solidity@^2.5.0: resolved "https://registry.yarnpkg.com/openzeppelin-solidity/-/openzeppelin-solidity-2.5.1.tgz#1cdcce30c4c6a7b6625dab62ccd0440a813ab597" integrity sha512-oCGtQPLOou4su76IMr4XXJavy9a8OZmAXeUZ8diOdFznlL/mlkIlYr7wajqCzH4S47nlKPS7m0+a2nilCTpVPQ== -optionator@^0.8.1, optionator@^0.8.2: +optionator@^0.8.2: version "0.8.3" resolved "https://registry.yarnpkg.com/optionator/-/optionator-0.8.3.tgz#84fa1d036fe9d3c7e21d99884b601167ec8fb495" integrity sha512-+IW9pACdk3XWmmTXG8m3upGUJst5XRGzxMRjXzAuJ1XnIFNvfhjjIuYkDvysnPQ7qzqVzLt78BCruntqRhWQbA== @@ -16149,11 +14533,6 @@ optionator@^0.9.3: prelude-ls "^1.2.1" type-check "^0.4.0" -optjs@~3.2.2: - version "3.2.2" - resolved "https://registry.yarnpkg.com/optjs/-/optjs-3.2.2.tgz#69a6ce89c442a44403141ad2f9b370bd5bb6f4ee" - integrity sha512-f8lTJm4LKirX+45xsFhuRNjA4f46QVLQKfGoNH7e2AEWS+24eM4XNH4pQ8Tw2LISCIvbST/wNcLdtgvgcqVaxA== - ora@^3.4.0: version "3.4.0" resolved "https://registry.yarnpkg.com/ora/-/ora-3.4.0.tgz#bf0752491059a3ef3ed4c85097531de9fdbcd318" @@ -16201,7 +14580,7 @@ original-require@^1.0.1: resolved 
"https://registry.yarnpkg.com/original-require/-/original-require-1.0.1.tgz#0f130471584cd33511c5ec38c8d59213f9ac5e20" integrity sha512-5vdKMbE58WaE61uVD+PKyh8xdM398UnjPBLotW2sjG5MzHARwta/+NtMBCBA0t2WQblGYBvq5vsiZpWokwno+A== -os-homedir@^1.0.0, os-homedir@^1.0.1: +os-homedir@^1.0.1: version "1.0.2" resolved "https://registry.yarnpkg.com/os-homedir/-/os-homedir-1.0.2.tgz#ffbc4988336e0e833de0c168c7ef152121aa7fb3" integrity sha512-B5JU3cabzk8c67mRRd3ECmROafjYMXbuzlwtqdM8IbS8ktlTix8aFGb2bAGKrSRIlnfKwovGUUr72JUPyOb6kQ== @@ -16222,18 +14601,23 @@ os-locale@^3.1.0: lcid "^2.0.0" mem "^4.0.0" -os-tmpdir@^1.0.0, os-tmpdir@~1.0.2: +os-tmpdir@~1.0.2: version "1.0.2" resolved "https://registry.yarnpkg.com/os-tmpdir/-/os-tmpdir-1.0.2.tgz#bbe67406c79aa85c5cfec766fe5734555dfa1274" integrity sha512-D2FR03Vir7FIu45XBY20mTb+/ZSWB00sjU9jdQXt83gDrI4Ztz5Fs7/yy74g2N5SVQY4xY1qDr4rNddwYRVX0g== -osenv@^0.1.4: - version "0.1.5" - resolved "https://registry.yarnpkg.com/osenv/-/osenv-0.1.5.tgz#85cdfafaeb28e8677f416e287592b5f3f49ea410" - integrity sha512-0CWcCECdMVc2Rw3U5w9ZjqX6ga6ubk1xDVKxtBQPK7wis/0F2r9T6k4ydGYhecl7YUBxBVxhL5oisPsNxAPe2g== +ox@0.6.9: + version "0.6.9" + resolved "https://registry.yarnpkg.com/ox/-/ox-0.6.9.tgz#da1ee04fa10de30c8d04c15bfb80fe58b1f554bd" + integrity sha512-wi5ShvzE4eOcTwQVsIPdFr+8ycyX+5le/96iAJutaZAvCes1J0+RvpEPg5QDPDiaR0XQQAvZVl7AwqQcINuUug== dependencies: - os-homedir "^1.0.0" - os-tmpdir "^1.0.0" + "@adraffy/ens-normalize" "^1.10.1" + "@noble/curves" "^1.6.0" + "@noble/hashes" "^1.5.0" + "@scure/bip32" "^1.5.0" + "@scure/bip39" "^1.4.0" + abitype "^1.0.6" + eventemitter3 "5.0.1" p-cancelable@^0.3.0: version "0.3.0" @@ -16409,11 +14793,6 @@ package-json@^8.1.0: registry-url "^6.0.0" semver "^7.3.7" -packet-reader@1.0.0: - version "1.0.0" - resolved "https://registry.yarnpkg.com/packet-reader/-/packet-reader-1.0.0.tgz#9238e5480dedabacfe1fe3f2771063f164157d74" - integrity 
sha512-HAKu/fG3HpHFO0AA8WE8q2g+gBJaZ9MG7fcKk+IJPLTGAD6Psw4443l+9DGRbOIh3/aXr7Phy0TjilYivJo5XQ== - pacote@^13.0.3, pacote@^13.6.1: version "13.6.2" resolved "https://registry.yarnpkg.com/pacote/-/pacote-13.6.2.tgz#0d444ba3618ab3e5cd330b451c22967bbd0ca48a" @@ -16711,7 +15090,7 @@ pathval@^1.1.1: resolved "https://registry.yarnpkg.com/pathval/-/pathval-1.1.1.tgz#8534e77a77ce7ac5a2512ea21e0fdb8fcf6c3d8d" integrity sha512-Dp6zGqpTdETdR63lehJYPeIOqpiNBNtc7BpWSLrOje7UaIsE5aY92r/AunQA7rsXvet3lrJ3JnZX29UPTKXyKQ== -pbkdf2@^3.0.17, pbkdf2@^3.0.3, pbkdf2@^3.0.9: +pbkdf2@^3.0.17, pbkdf2@^3.0.3: version "3.1.2" resolved "https://registry.yarnpkg.com/pbkdf2/-/pbkdf2-3.1.2.tgz#dd822aa0887580e52f1a039dc3eda108efae3075" integrity sha512-iuh7L6jA7JEGu2WxDwtQP1ddOpaJNC4KlDEFfdQajSGgGPNi4OyDc2R7QnbY2bR9QjBVGwgvTdNJZoE7RaxUMA== @@ -16732,63 +15111,6 @@ performance-now@^2.1.0: resolved "https://registry.yarnpkg.com/performance-now/-/performance-now-2.1.0.tgz#6309f4e0e5fa913ec1c69307ae364b4b377c9e7b" integrity sha512-7EAHlyLHI56VEIdK57uwHdHKIaAGbnXPiw0yWbarQZOKaKpvUIgW0jWRVLiatnM+XXlSwsanIBH/hzGMJulMow== -pg-connection-string@0.1.3: - version "0.1.3" - resolved "https://registry.yarnpkg.com/pg-connection-string/-/pg-connection-string-0.1.3.tgz#da1847b20940e42ee1492beaf65d49d91b245df7" - integrity sha512-i0NV/CrSkFTaiOQs9AGy3tq0dkSjtTd4d7DfsjeDVZAA4aIHInwfFEmriNYGGJUfZ5x6IAC/QddoUpUJjQAi0w== - -pg-int8@1.0.1: - version "1.0.1" - resolved "https://registry.yarnpkg.com/pg-int8/-/pg-int8-1.0.1.tgz#943bd463bf5b71b4170115f80f8efc9a0c0eb78c" - integrity sha512-WCtabS6t3c8SkpDBUlb1kjOs7l66xsGdKpIPZsg4wR+B3+u9UAum2odSsF9tnvxg80h4ZxLWMy4pRjOsFIqQpw== - -pg-packet-stream@^1.1.0: - version "1.1.0" - resolved "https://registry.yarnpkg.com/pg-packet-stream/-/pg-packet-stream-1.1.0.tgz#e45c3ae678b901a2873af1e17b92d787962ef914" - integrity sha512-kRBH0tDIW/8lfnnOyTwKD23ygJ/kexQVXZs7gEyBljw4FYqimZFxnMMx50ndZ8In77QgfGuItS5LLclC2TtjYg== - -pg-pool@^2.0.10: - version "2.0.10" - resolved 
"https://registry.yarnpkg.com/pg-pool/-/pg-pool-2.0.10.tgz#842ee23b04e86824ce9d786430f8365082d81c4a" - integrity sha512-qdwzY92bHf3nwzIUcj+zJ0Qo5lpG/YxchahxIN8+ZVmXqkahKXsnl2aiJPHLYN9o5mB/leG+Xh6XKxtP7e0sjg== - -pg-protocol@^1.2.0: - version "1.6.0" - resolved "https://registry.yarnpkg.com/pg-protocol/-/pg-protocol-1.6.0.tgz#4c91613c0315349363af2084608db843502f8833" - integrity sha512-M+PDm637OY5WM307051+bsDia5Xej6d9IR4GwJse1qA1DIhiKlksvrneZOYQq42OM+spubpcNYEo2FcKQrDk+Q== - -pg-types@^2.1.0, pg-types@^2.2.0: - version "2.2.0" - resolved "https://registry.yarnpkg.com/pg-types/-/pg-types-2.2.0.tgz#2d0250d636454f7cfa3b6ae0382fdfa8063254a3" - integrity sha512-qTAAlrEsl8s4OiEQY69wDvcMIdQN6wdz5ojQiOy6YRMuynxenON0O5oCpJI6lshc6scgAY8qvJ2On/p+CXY0GA== - dependencies: - pg-int8 "1.0.1" - postgres-array "~2.0.0" - postgres-bytea "~1.0.0" - postgres-date "~1.0.4" - postgres-interval "^1.1.0" - -pg@^7.18.0: - version "7.18.2" - resolved "https://registry.yarnpkg.com/pg/-/pg-7.18.2.tgz#4e219f05a00aff4db6aab1ba02f28ffa4513b0bb" - integrity sha512-Mvt0dGYMwvEADNKy5PMQGlzPudKcKKzJds/VbOeZJpb6f/pI3mmoXX0JksPgI3l3JPP/2Apq7F36O63J7mgveA== - dependencies: - buffer-writer "2.0.0" - packet-reader "1.0.0" - pg-connection-string "0.1.3" - pg-packet-stream "^1.1.0" - pg-pool "^2.0.10" - pg-types "^2.1.0" - pgpass "1.x" - semver "4.3.2" - -pgpass@1.x: - version "1.0.5" - resolved "https://registry.yarnpkg.com/pgpass/-/pgpass-1.0.5.tgz#9b873e4a564bb10fa7a7dbd55312728d422a223d" - integrity sha512-FdW9r/jQZhSeohs1Z3sI1yxFQNFvMcnmfuj4WBMUTxOrAyLMaTcE1aAMBiTlbMNaXvBCQuVi0R7hd8udDSP7ug== - dependencies: - split2 "^4.1.0" - picocolors@^1.0.0: version "1.0.0" resolved "https://registry.yarnpkg.com/picocolors/-/picocolors-1.0.0.tgz#cb5bdc74ff3f51892236eaf79d68bc44564ab81c" @@ -16809,7 +15131,7 @@ pify@^3.0.0: resolved "https://registry.yarnpkg.com/pify/-/pify-3.0.0.tgz#e5a4acd2c101fdf3d9a4d07f0dbc4db49dd28176" integrity 
sha512-C3FsVNH1udSEX48gGX1xfvwTWfsYWj5U+8/uK15BGzIGrKoUpghX8hWZwa/OFnakBiiVNmBvemTJR5mcy7iPcg== -pify@^4.0.0, pify@^4.0.1: +pify@^4.0.1: version "4.0.1" resolved "https://registry.yarnpkg.com/pify/-/pify-4.0.1.tgz#4b2cd25c50d598735c50292224fd8c6df41e3231" integrity sha512-uB80kBFb/tfd68bVleG9T5GGsGPjJrLAUpR5PZIrhBnIaRTQRjqdJSsIKkOP6OAIFbj7GOrcudc5pNjZ+geV2g== @@ -16857,43 +15179,11 @@ pkg-up@^3.1.0: dependencies: find-up "^3.0.0" -pluralize@^7.0.0: - version "7.0.0" - resolved "https://registry.yarnpkg.com/pluralize/-/pluralize-7.0.0.tgz#298b89df8b93b0221dbf421ad2b1b1ea23fc6777" - integrity sha512-ARhBOdzS3e41FbkW/XWrTEtukqqLoK5+Z/4UeDaLuSW+39JPeFgs4gCGqsrJHVZX0fUrx//4OF0K1CUGwlIFow== - pluralize@^8.0.0: version "8.0.0" resolved "https://registry.yarnpkg.com/pluralize/-/pluralize-8.0.0.tgz#1a6fa16a38d12a1901e0320fa017051c539ce3b1" integrity sha512-Nc3IT5yHzflTfbjgqWcCPpo7DaKy4FnpB0l/zCAW0Tc7jxAiuqSxHasntB3D7887LSrA93kDJ9IXovxJYxyLCA== -popper.js@1.14.3: - version "1.14.3" - resolved "https://registry.yarnpkg.com/popper.js/-/popper.js-1.14.3.tgz#1438f98d046acf7b4d78cd502bf418ac64d4f095" - integrity sha512-3lmujhsHXzb83+sI0PzfrE3O1XHZG8m8MXNMTupvA6LrM1/nnsiqYaacYc/RIente9VqnTDPztGEM8uvPAMGyg== - -postgres-array@~2.0.0: - version "2.0.0" - resolved "https://registry.yarnpkg.com/postgres-array/-/postgres-array-2.0.0.tgz#48f8fce054fbc69671999329b8834b772652d82e" - integrity sha512-VpZrUqU5A69eQyW2c5CA1jtLecCsN2U/bD6VilrFDWq5+5UIEVO7nazS3TEcHf1zuPYO/sqGvUvW62g86RXZuA== - -postgres-bytea@~1.0.0: - version "1.0.0" - resolved "https://registry.yarnpkg.com/postgres-bytea/-/postgres-bytea-1.0.0.tgz#027b533c0aa890e26d172d47cf9ccecc521acd35" - integrity sha512-xy3pmLuQqRBZBXDULy7KbaitYqLcmxigw14Q5sj8QBVLqEwXfeybIKVWiqAXTlcvdvb0+xkOtDbfQMOf4lST1w== - -postgres-date@~1.0.4: - version "1.0.7" - resolved "https://registry.yarnpkg.com/postgres-date/-/postgres-date-1.0.7.tgz#51bc086006005e5061c591cee727f2531bf641a8" - integrity 
sha512-suDmjLVQg78nMK2UZ454hAG+OAW+HQPZ6n++TNDUX+L0+uUlLywnoxJKDou51Zm+zTCjrCl0Nq6J9C5hP9vK/Q== - -postgres-interval@^1.1.0: - version "1.2.0" - resolved "https://registry.yarnpkg.com/postgres-interval/-/postgres-interval-1.2.0.tgz#b460c82cb1587507788819a06aa0fffdb3544695" - integrity sha512-9ZhXKM/rw350N1ovuWHbGxnGh/SNJ4cnxHiM0rxE4VN41wsg8P8zWn9hv/buK00RP4WvlOyr/RBDiptyxVbkZQ== - dependencies: - xtend "^4.0.0" - pouchdb-abstract-mapreduce@7.3.1: version "7.3.1" resolved "https://registry.yarnpkg.com/pouchdb-abstract-mapreduce/-/pouchdb-abstract-mapreduce-7.3.1.tgz#96ff4a0f41cbe273f3f52fde003b719005a2093c" @@ -17080,25 +15370,6 @@ pouchdb@7.3.0: uuid "8.3.2" vuvuzela "1.0.3" -prebuild-install@^6.0.0: - version "6.1.4" - resolved "https://registry.yarnpkg.com/prebuild-install/-/prebuild-install-6.1.4.tgz#ae3c0142ad611d58570b89af4986088a4937e00f" - integrity sha512-Z4vpywnK1lBg+zdPCVCsKq0xO66eEV9rWo2zrROGGiRS4JtueBOdlB1FnY8lcy7JsUud/Q3ijUxyWN26Ika0vQ== - dependencies: - detect-libc "^1.0.3" - expand-template "^2.0.3" - github-from-package "0.0.0" - minimist "^1.2.3" - mkdirp-classic "^0.5.3" - napi-build-utils "^1.0.1" - node-abi "^2.21.0" - npmlog "^4.0.1" - pump "^3.0.0" - rc "^1.2.7" - simple-get "^3.0.3" - tar-fs "^2.0.0" - tunnel-agent "^0.6.0" - precond@0.2: version "0.2.3" resolved "https://registry.yarnpkg.com/precond/-/precond-0.2.3.tgz#aa9591bcaa24923f1e0f4849d240f47efc1075ac" @@ -17259,14 +15530,6 @@ prompt@1.2.0: revalidator "0.1.x" winston "2.x" -prompts@1.2.0: - version "1.2.0" - resolved "https://registry.yarnpkg.com/prompts/-/prompts-1.2.0.tgz#598f7722032fb6c399beb24533129d00604c7007" - integrity sha512-g+I6Cer6EefDTawQhGHpdX98nhD7KQrRqyRgKCb+Sc+GG4P64EWRe5DZE402ZNkwrItf97Asf0L1z0g3waOgAA== - dependencies: - kleur "^2.0.1" - sisteransi "^1.0.0" - prompts@^2.0.1: version "2.4.2" resolved "https://registry.yarnpkg.com/prompts/-/prompts-2.4.2.tgz#7b57e73b3a48029ad10ebd44f74b01722a4cb069" @@ -17287,60 +15550,6 @@ proto-list@~1.2.1: resolved 
"https://registry.yarnpkg.com/proto-list/-/proto-list-1.2.4.tgz#212d5bfe1318306a420f6402b8e26ff39647a849" integrity sha512-vtK/94akxsTMhe0/cbfpR+syPuszcuwhqVjJq26CuNDgFGj682oRBXOP5MJpv2r7JtE8MsiepGIqvvOTBwn2vA== -proto3-json-serializer@^0.1.8: - version "0.1.9" - resolved "https://registry.yarnpkg.com/proto3-json-serializer/-/proto3-json-serializer-0.1.9.tgz#705ddb41b009dd3e6fcd8123edd72926abf65a34" - integrity sha512-A60IisqvnuI45qNRygJjrnNjX2TMdQGMY+57tR3nul3ZgO2zXkR9OGR8AXxJhkqx84g0FTnrfi3D5fWMSdANdQ== - dependencies: - protobufjs "^6.11.2" - -protobufjs@6.11.3, protobufjs@^6.11.2, protobufjs@^6.11.3, protobufjs@^6.8.0, protobufjs@^6.8.1, protobufjs@^6.8.6, protobufjs@^6.8.8: - version "6.11.3" - resolved "https://registry.yarnpkg.com/protobufjs/-/protobufjs-6.11.3.tgz#637a527205a35caa4f3e2a9a4a13ddffe0e7af74" - integrity sha512-xL96WDdCZYdU7Slin569tFX712BxsxslWwAfAhCYjQKGTq7dAU91Lomy6nLLhh/dyGhk/YH4TwTSRxTzhuHyZg== - dependencies: - "@protobufjs/aspromise" "^1.1.2" - "@protobufjs/base64" "^1.1.2" - "@protobufjs/codegen" "^2.0.4" - "@protobufjs/eventemitter" "^1.1.0" - "@protobufjs/fetch" "^1.1.0" - "@protobufjs/float" "^1.0.2" - "@protobufjs/inquire" "^1.1.0" - "@protobufjs/path" "^1.1.2" - "@protobufjs/pool" "^1.1.0" - "@protobufjs/utf8" "^1.1.0" - "@types/long" "^4.0.1" - "@types/node" ">=13.7.0" - long "^4.0.0" - -protobufjs@^5.0.3: - version "5.0.3" - resolved "https://registry.yarnpkg.com/protobufjs/-/protobufjs-5.0.3.tgz#e4dfe9fb67c90b2630d15868249bcc4961467a17" - integrity sha512-55Kcx1MhPZX0zTbVosMQEO5R6/rikNXd9b6RQK4KSPcrSIIwoXTtebIczUrXlwaSrbz4x8XUVThGPob1n8I4QA== - dependencies: - ascli "~1" - bytebuffer "~5" - glob "^7.0.5" - yargs "^3.10.0" - -protobufjs@^7.0.0: - version "7.2.3" - resolved "https://registry.yarnpkg.com/protobufjs/-/protobufjs-7.2.3.tgz#01af019e40d9c6133c49acbb3ff9e30f4f0f70b2" - integrity sha512-TtpvOqwB5Gdz/PQmOjgsrGH1nHjAQVCN7JG4A6r1sXRWESL5rNMAiRcBQlCAdKxZcAbstExQePYG8xof/JVRgg== - dependencies: - "@protobufjs/aspromise" 
"^1.1.2" - "@protobufjs/base64" "^1.1.2" - "@protobufjs/codegen" "^2.0.4" - "@protobufjs/eventemitter" "^1.1.0" - "@protobufjs/fetch" "^1.1.0" - "@protobufjs/float" "^1.0.2" - "@protobufjs/inquire" "^1.1.0" - "@protobufjs/path" "^1.1.2" - "@protobufjs/pool" "^1.1.0" - "@protobufjs/utf8" "^1.1.0" - "@types/node" ">=13.7.0" - long "^5.0.0" - protocols@^2.0.0, protocols@^2.0.1: version "2.0.1" resolved "https://registry.yarnpkg.com/protocols/-/protocols-2.0.1.tgz#8f155da3fc0f32644e83c5782c8e8212ccf70a86" @@ -17390,22 +15599,6 @@ public-encrypt@^4.0.0: randombytes "^2.0.1" safe-buffer "^5.1.2" -pump@^1.0.0: - version "1.0.3" - resolved "https://registry.yarnpkg.com/pump/-/pump-1.0.3.tgz#5dfe8311c33bbf6fc18261f9f34702c47c08a954" - integrity sha512-8k0JupWme55+9tCVE+FS5ULT3K6AbgqrGa58lTT49RpyfwwcGedHqaC5LlQNdEAumn/wFsu6aPwkuPMioy8kqw== - dependencies: - end-of-stream "^1.1.0" - once "^1.3.1" - -pump@^2.0.0: - version "2.0.1" - resolved "https://registry.yarnpkg.com/pump/-/pump-2.0.1.tgz#12399add6e4cf7526d973cbc8b5ce2e2908b3909" - integrity sha512-ruPMNRkN3MHP1cWJc9OWr+T/xDP0jhXYCLfJcBuX54hhfIBnaQmAUMfDcG4DM5UMWByBbJY69QSphm3jtDKIkA== - dependencies: - end-of-stream "^1.1.0" - once "^1.3.1" - pump@^3.0.0: version "3.0.0" resolved "https://registry.yarnpkg.com/pump/-/pump-3.0.0.tgz#b4a2116815bde2f4e1ea602354e8c75565107a64" @@ -17414,15 +15607,6 @@ pump@^3.0.0: end-of-stream "^1.1.0" once "^1.3.1" -pumpify@^1.5.1: - version "1.5.1" - resolved "https://registry.yarnpkg.com/pumpify/-/pumpify-1.5.1.tgz#36513be246ab27570b1a374a5ce278bfd74370ce" - integrity sha512-oClZI37HvuUJJxSKKrC17bZ9Cu0ZYhEAGPsPUy9KlMUmv9dKX2o77RUmq7f3XjIxbwyGwYzbzQ1L2Ks8sIradQ== - dependencies: - duplexify "^3.6.0" - inherits "^2.0.3" - pump "^2.0.0" - punycode@2.1.0: version "2.1.0" resolved "https://registry.yarnpkg.com/punycode/-/punycode-2.1.0.tgz#5f863edc89b96db09074bad7947bf09056ca4e7d" @@ -17462,13 +15646,6 @@ qs@^6.4.0: dependencies: side-channel "^1.0.4" -qs@^6.5.2: - version "6.11.2" - resolved 
"https://registry.yarnpkg.com/qs/-/qs-6.11.2.tgz#64bea51f12c1f5da1bc01496f48ffcff7c69d7d9" - integrity sha512-tDNIz22aBzCDxLtVH++VnTfzxlfeK5CbqohpSqpJgj1Wg/cQbStNAz3NuqCs5vV+pjBsK4x4pN9HlVh7rcYRiA== - dependencies: - side-channel "^1.0.4" - qs@~6.5.2: version "6.5.3" resolved "https://registry.yarnpkg.com/qs/-/qs-6.5.3.tgz#3aeeffc91967ef6e35c0e488ef46fb296ab76aad" @@ -17488,7 +15665,7 @@ querystring@^0.2.1: resolved "https://registry.yarnpkg.com/querystring/-/querystring-0.2.1.tgz#40d77615bb09d16902a85c3e38aa8b5ed761c2dd" integrity sha512-wkvS7mL/JMugcup3/rMitHmd9ecIGd2lhFhK9N3UUQ450h66d1r3Y9nvXzQAW1Lq+wyx61k/1pfKS5KuKiyEbg== -querystringify@^2.0.0, querystringify@^2.1.1: +querystringify@^2.1.1: version "2.2.0" resolved "https://registry.yarnpkg.com/querystringify/-/querystringify-2.2.0.tgz#3345941b4153cb9d082d8eee4cda2016a9aef7f6" integrity sha512-FIqgj2EUvTa7R50u0rGsyTftzjYmv/a3hO345bZNrqabNqjtgiDMgmo4mkUjd+nzU5oF3dClKqFIPUKybUyqoQ== @@ -17562,7 +15739,7 @@ raw-body@2.5.2: iconv-lite "0.4.24" unpipe "1.0.0" -rc@1.2.8, rc@^1.1.2, rc@^1.2.7: +rc@1.2.8, rc@^1.1.2: version "1.2.8" resolved "https://registry.yarnpkg.com/rc/-/rc-1.2.8.tgz#cd924bf5200a075b83c188cd6b9e211b7fc0d3ed" integrity sha512-y3bGgqKj3QBdxLbLkomlohkvsA8gdAiUQlSBJnBhfn+BPxg4bc62d8TcBW15wavDfgexCgccckhcZvywyQYPOw== @@ -17582,13 +15759,6 @@ read-cmd-shim@^3.0.0: resolved "https://registry.yarnpkg.com/read-cmd-shim/-/read-cmd-shim-3.0.1.tgz#868c235ec59d1de2db69e11aec885bc095aea087" integrity sha512-kEmDUoYf/CDy8yZbLTmhB1X9kkjf9Q80PCNsDMb7ufrGd6zZSQA1+UyjrO+pZm5K/S4OXCWJeiIt1JA8kAsa6g== -read-last-lines@^1.7.2: - version "1.8.0" - resolved "https://registry.yarnpkg.com/read-last-lines/-/read-last-lines-1.8.0.tgz#4f94d4345ece7b8083ebb71c5fcdf60bd7afb9cc" - integrity sha512-oPL0cnZkhsO2xF7DBrdzVhXSNajPP5TzzCim/2IAjeGb17ArLLTRriI/ceV6Rook3L27mvbrOvLlf9xYYnaftQ== - dependencies: - mz "^2.7.0" - read-package-json-fast@^2.0.2, read-package-json-fast@^2.0.3: version "2.0.3" resolved 
"https://registry.yarnpkg.com/read-package-json-fast/-/read-package-json-fast-2.0.3.tgz#323ca529630da82cb34b36cc0b996693c98c2b83" @@ -17686,7 +15856,7 @@ readable-stream@1.1.14, readable-stream@^1.0.33: string_decoder "^1.1.1" util-deprecate "^1.0.1" -readable-stream@^2.0.0, readable-stream@^2.0.6, readable-stream@^2.2.2, readable-stream@^2.2.9, readable-stream@^2.3.0, readable-stream@^2.3.5, readable-stream@^2.3.6, readable-stream@~2.3.6: +readable-stream@^2.0.0, readable-stream@^2.2.2, readable-stream@^2.2.9, readable-stream@^2.3.0, readable-stream@^2.3.5, readable-stream@^2.3.6, readable-stream@~2.3.6: version "2.3.8" resolved "https://registry.yarnpkg.com/readable-stream/-/readable-stream-2.3.8.tgz#91125e8042bba1b9887f49345f6277027ce8be9b" integrity sha512-8p0AUk4XODgIewSi0l8Epjs+EVnWiK7NoDIEGU0HhE7+ZyY8D1IMY7odu5lRrFXGg71L15KG8QrPmum45RTtdA== @@ -17923,7 +16093,7 @@ require-from-string@^1.1.0: resolved "https://registry.yarnpkg.com/require-from-string/-/require-from-string-1.2.1.tgz#529c9ccef27380adfec9a2f965b649bbee636418" integrity sha512-H7AkJWMobeskkttHyhTVtS0fxpFLjxhbfMa6Bk3wimP7sdPRGL3EyCg3sAQenFfAe+xQ+oAc85Nmtvq0ROM83Q== -require-from-string@^2.0.0, require-from-string@^2.0.1, require-from-string@^2.0.2: +require-from-string@^2.0.0, require-from-string@^2.0.2: version "2.0.2" resolved "https://registry.yarnpkg.com/require-from-string/-/require-from-string-2.0.2.tgz#89a7fdd938261267318eafe14f9c32e598c36909" integrity sha512-Xf0nWe6RseziFMu+Ap9biiUbmplq6S9/p+7w7YXP/JBHhrUDDUhwa+vANyubuqfZWTveU//DYVGsDG7RKL/vEw== @@ -18005,11 +16175,6 @@ resolve.exports@^2.0.0: resolved "https://registry.yarnpkg.com/resolve.exports/-/resolve.exports-2.0.2.tgz#f8c934b8e6a13f539e38b7098e2e36134f01e800" integrity sha512-X2UW6Nw3n/aMgDVy+0rSqgHlv39WZAlZrXCdnbyEiKm17DSqHX4MmQMaST3FbeWR5FTuRcUwYAziZajji0Y7mg== -resolve@1.1.x: - version "1.1.7" - resolved "https://registry.yarnpkg.com/resolve/-/resolve-1.1.7.tgz#203114d82ad2c5ed9e8e0411b3932875e889e97b" - integrity 
sha512-9znBF0vBcaSN3W2j7wKvdERPwqTxSpCq+if5C0WoTCyV9n24rua28jeuQ2pL/HOf+yUe/Mef+H/5p60K0Id3bg== - resolve@^1.10.0, resolve@^1.11.1, resolve@^1.14.2, resolve@^1.20.0, resolve@^1.8.1, resolve@~1.22.1: version "1.22.2" resolved "https://registry.yarnpkg.com/resolve/-/resolve-1.22.2.tgz#0ed0943d4e301867955766c9f3e1ae6d01c6845f" @@ -18080,14 +16245,6 @@ resumer@~0.0.0: dependencies: through "~2.3.4" -retry-request@^4.0.0: - version "4.2.2" - resolved "https://registry.yarnpkg.com/retry-request/-/retry-request-4.2.2.tgz#b7d82210b6d2651ed249ba3497f07ea602f1a903" - integrity sha512-xA93uxUD/rogV7BV59agW/JHPGXeREMWiZc9jhcwY4YdZ7QOtC7qbomYg0n4wyk2lJhggjvKvhNX8wln/Aldhg== - dependencies: - debug "^4.1.1" - extend "^3.0.2" - retry@0.13.1: version "0.13.1" resolved "https://registry.yarnpkg.com/retry/-/retry-0.13.1.tgz#185b1587acf67919d63b357349e03537b2484658" @@ -18115,7 +16272,7 @@ rimraf@2.6.3: dependencies: glob "^7.1.3" -rimraf@^2.2.8, rimraf@^2.6.1, rimraf@^2.6.2, rimraf@^2.6.3: +rimraf@^2.2.8, rimraf@^2.6.3: version "2.7.1" resolved "https://registry.yarnpkg.com/rimraf/-/rimraf-2.7.1.tgz#35797f13a7fdadc566142c29d4f07ccad483e3ec" integrity sha512-uWjbaKIK3T1OSVptzX7Nl6PvQ3qAGtKEtVRjRuazjfL3Bx5eI409VZSqgND+4UNnmzLVdPj9FqFJNPqBZFve4w== @@ -18136,13 +16293,6 @@ rimraf@^5.0.5: dependencies: glob "^10.3.7" -rimraf@~2.4.0: - version "2.4.5" - resolved "https://registry.yarnpkg.com/rimraf/-/rimraf-2.4.5.tgz#ee710ce5d93a8fdb856fb5ea8ff0e2d75934b2da" - integrity sha512-J5xnxTyqaiw06JjMftq7L9ouA448dw/E7dKghkP9WpKNuwmARNNg+Gk8/u5ryb9N/Yo2+z3MCwuqFK/+qPOPfQ== - dependencies: - glob "^6.0.1" - ripemd160-min@0.0.6: version "0.0.6" resolved "https://registry.yarnpkg.com/ripemd160-min/-/ripemd160-min-0.0.6.tgz#a904b77658114474d02503e819dcc55853b67e62" @@ -18187,7 +16337,7 @@ rustbn.js@~0.2.0: resolved "https://registry.yarnpkg.com/rustbn.js/-/rustbn.js-0.2.0.tgz#8082cb886e707155fd1cb6f23bd591ab8d55d0ca" integrity 
sha512-4VlvkRUuCJvr2J6Y0ImW7NvTCriMi7ErOAqWk1y69vAdoNIzCF3yPmgeNzx+RQTLEDFq5sHfscn1MwHxP9hNfA== -rxjs@6, rxjs@^6.4.0, rxjs@^6.6.0: +rxjs@^6.4.0: version "6.6.7" resolved "https://registry.yarnpkg.com/rxjs/-/rxjs-6.6.7.tgz#90ac018acabf491bf65044235d5863c4dab804c9" integrity sha512-hTdwr+7yYNIT5n4AMYp85KA6yw2Va0FLa3Rguvbpa4W3I5xynaBZo41cM3XM+4Q6fRMj3sBYIR1VAmZMXYJvRQ== @@ -18238,11 +16388,6 @@ safe-event-emitter@^1.0.1: dependencies: events "^3.0.0" -safe-json-stringify@~1: - version "1.2.0" - resolved "https://registry.yarnpkg.com/safe-json-stringify/-/safe-json-stringify-1.2.0.tgz#356e44bc98f1f93ce45df14bcd7c01cda86e0afd" - integrity sha512-gH8eh2nZudPQO6TytOvbxnuhYBOvDBBLW52tz5q6X58lJcd/tkmqFR+5Z9adS8aJtURSXWThWy/xJtJwixErvg== - safe-regex-test@^1.0.0: version "1.0.0" resolved "https://registry.yarnpkg.com/safe-regex-test/-/safe-regex-test-1.0.0.tgz#793b874d524eb3640d1873aad03596db2d4f2295" @@ -18257,11 +16402,6 @@ safe-regex-test@^1.0.0: resolved "https://registry.yarnpkg.com/safer-buffer/-/safer-buffer-2.1.2.tgz#44fa161b0187b9549dd84bb91802f9bd8385cd6a" integrity sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg== -sax@^1.2.4: - version "1.2.4" - resolved "https://registry.yarnpkg.com/sax/-/sax-1.2.4.tgz#2816234e2378bddc4e5354fab5caa895df7100d9" - integrity sha512-NqVDv9TpANUjFm0N8uM5GxL36UgKi9/atZw+x7YFnQ8ckwFGKrl4xX4yWtrey3UJm5nP1kUbnYgLopqWNSRhWw== - scrypt-js@2.0.3: version "2.0.3" resolved "https://registry.yarnpkg.com/scrypt-js/-/scrypt-js-2.0.3.tgz#bb0040be03043da9a012a2cea9fc9f852cfc87d4" @@ -18296,7 +16436,7 @@ scryptsy@^2.1.0: resolved "https://registry.yarnpkg.com/scryptsy/-/scryptsy-2.1.0.tgz#8d1e8d0c025b58fdd25b6fa9a0dc905ee8faa790" integrity sha512-1CdSqHQowJBnMAFyPEBRfqag/YP9OF394FV+4YREIJX4ljD7OxvQRDayyoyyCk+senRjSkP6VnUNQmVQqB6g7w== -secp256k1@4.0.3, secp256k1@^4.0.0, secp256k1@^4.0.1: +secp256k1@4.0.3, secp256k1@^4.0.1: version "4.0.3" resolved 
"https://registry.yarnpkg.com/secp256k1/-/secp256k1-4.0.3.tgz#c4559ecd1b8d3c1827ed2d1b94190d69ce267303" integrity sha512-NLZVf+ROMxwtEj3Xa562qgv2BK5e2WNmXPiOdVIPLgs6lyTzMvBq0aWTYMI5XCP9jZMVKOcqZLw/Wc4vDkuxhA== @@ -18305,25 +16445,6 @@ secp256k1@4.0.3, secp256k1@^4.0.0, secp256k1@^4.0.1: node-addon-api "^2.0.0" node-gyp-build "^4.2.0" -secp256k1@^3.0.1: - version "3.8.0" - resolved "https://registry.yarnpkg.com/secp256k1/-/secp256k1-3.8.0.tgz#28f59f4b01dbee9575f56a47034b7d2e3b3b352d" - integrity sha512-k5ke5avRZbtl9Tqx/SA7CbY3NF6Ro+Sj9cZxezFzuBlLDmyqPiL8hJJ+EmzD8Ig4LUDByHJ3/iPOVoRixs/hmw== - dependencies: - bindings "^1.5.0" - bip66 "^1.1.5" - bn.js "^4.11.8" - create-hash "^1.2.0" - drbg.js "^1.0.1" - elliptic "^6.5.2" - nan "^2.14.0" - safe-buffer "^5.1.2" - -seed-random@2.2.0: - version "2.2.0" - resolved "https://registry.yarnpkg.com/seed-random/-/seed-random-2.2.0.tgz#2a9b19e250a817099231a5b99a4daf80b7fbed54" - integrity sha512-34EQV6AAHQGhoc0tn/96a9Fsi6v2xdqe/dMUwljGRaFOzR3EgRmECvD0O8vi8X+/uQ50LGHfkNu/Eue5TPKZkQ== - seedrandom@3.0.5: version "3.0.5" resolved "https://registry.yarnpkg.com/seedrandom/-/seedrandom-3.0.5.tgz#54edc85c95222525b0c7a6f6b3543d8e0b3aa0a7" @@ -18346,21 +16467,11 @@ semaphore@>=1.0.1, semaphore@^1.0.3, semaphore@^1.1.0: resolved "https://registry.yarnpkg.com/semaphore/-/semaphore-1.1.0.tgz#aaad8b86b20fe8e9b32b16dc2ee682a8cd26a8aa" integrity sha512-O4OZEaNtkMd/K0i6js9SL+gqy0ZCBMgUvlSqHKi4IBdjhe7wB8pwztUk1BbZ1fmrvpwFrPbHzqd2w5pTcJH6LA== -"semver@2 || 3 || 4 || 5", semver@^5.3.0, semver@^5.4.1, semver@^5.5.0, semver@^5.5.1, semver@^5.6.0: +"semver@2 || 3 || 4 || 5", semver@^5.3.0, semver@^5.5.0, semver@^5.5.1, semver@^5.6.0: version "5.7.1" resolved "https://registry.yarnpkg.com/semver/-/semver-5.7.1.tgz#a954f931aeba508d307bbf069eff0c01c96116f7" integrity sha512-sauaDf/PZdVgrLTNYHRtpXa1iRiKcaebiKQ1BJdpQlWH2lCvexQdX55snPFyK7QzpudqbCI0qXFfOasHdyNDGQ== -semver@4.3.2: - version "4.3.2" - resolved 
"https://registry.yarnpkg.com/semver/-/semver-4.3.2.tgz#c7a07158a80bedd052355b770d82d6640f803be7" - integrity sha512-VyFUffiBx8hABJ9HYSTXLRwyZtdDHMzMtFmID1aiNAD2BZppBmJm0Hqw3p2jkgxP9BNt1pQ9RnC49P0EcXf6cA== - -semver@5.5.0: - version "5.5.0" - resolved "https://registry.yarnpkg.com/semver/-/semver-5.5.0.tgz#dc4bbc7a6ca9d916dee5d43516f0092b58f7b8ab" - integrity sha512-4SJ3dm0WAwWy/NVeioZh5AntkdJoWKxHxcmyP622fOkgHa4z3R0TdBJICINyaSDE6uNwVc8gZr+ZinwZAH4xIA== - semver@7.3.4: version "7.3.4" resolved "https://registry.yarnpkg.com/semver/-/semver-7.3.4.tgz#27aaa7d2e4ca76452f98d3add093a72c943edc97" @@ -18485,7 +16596,7 @@ servify@^0.1.12: request "^2.79.0" xhr "^2.3.3" -set-blocking@^2.0.0, set-blocking@~2.0.0: +set-blocking@^2.0.0: version "2.0.0" resolved "https://registry.yarnpkg.com/set-blocking/-/set-blocking-2.0.0.tgz#045f9782d011ae9a6803ddd382b24392b3d890f7" integrity sha512-KiKBS8AnWGEyLzofFfmvKwpdPzqiy16LvQfK3yv/fVH7Bj13/wl3JSR1J+rfgRE9q7xUJK4qvgS8raSOeLUehw== @@ -18630,16 +16741,7 @@ simple-get@^2.7.0: once "^1.3.1" simple-concat "^1.0.0" -simple-get@^3.0.3: - version "3.1.1" - resolved "https://registry.yarnpkg.com/simple-get/-/simple-get-3.1.1.tgz#cc7ba77cfbe761036fbfce3d021af25fc5584d55" - integrity sha512-CQ5LTKGfCpvE1K0n2us+kuMPbk/q0EKl82s4aheV9oXjFEz6W/Y7oQFVJuU6QG77hRT4Ghb5RURteF5vnWjupA== - dependencies: - decompress-response "^4.2.0" - once "^1.3.1" - simple-concat "^1.0.0" - -sisteransi@^1.0.0, sisteransi@^1.0.5: +sisteransi@^1.0.5: version "1.0.5" resolved "https://registry.yarnpkg.com/sisteransi/-/sisteransi-1.0.5.tgz#134d681297756437cc05ca01370d3a7a571075ed" integrity sha512-bLGGlR1QxBcynn2d5YmDX4MGjlZvy2MRBDRNHLJ8VI6l6+9FUiyTFNJ0IveOSP0bcXgVDPRcfGqA0pjaqUpfVg== @@ -18654,11 +16756,6 @@ slash@^4.0.0: resolved "https://registry.yarnpkg.com/slash/-/slash-4.0.0.tgz#2422372176c4c6c5addb5e2ada885af984b396a7" integrity sha512-3dOsAHXXUkQTpOYcoAxLIorMTp4gIQr5IW3iVb7A7lFIp0VHhnynm9izx6TssdrIcVIESAlVjtnO2K8bg+Coew== -sleep-promise@^8.0.1: - version "8.0.1" - 
resolved "https://registry.yarnpkg.com/sleep-promise/-/sleep-promise-8.0.1.tgz#8d795a27ea23953df6b52b91081e5e22665993c5" - integrity sha512-nfwyX+G1dsx2R1DMMKWLpNxuHMOCL7JIRBUw0fl7Z4nZ1YZK0apZuGY8MDexn0HDZzgbERgj/CrNtsYpo/B7eA== - sleep@6.1.0: version "6.1.0" resolved "https://registry.yarnpkg.com/sleep/-/sleep-6.1.0.tgz#5507b520556a82ffb983d39123c5459470fa2a9e" @@ -18704,11 +16801,6 @@ snake-case@^3.0.4: dot-case "^3.0.4" tslib "^2.0.3" -snakeize@^0.1.0: - version "0.1.0" - resolved "https://registry.yarnpkg.com/snakeize/-/snakeize-0.1.0.tgz#10c088d8b58eb076b3229bb5a04e232ce126422d" - integrity sha512-ot3bb6pQt6IVq5G/JQ640ceSYTPtriVrwNyfoUw1LmQQGzPMAGxE5F+ded2UwSUCyf2PW1fFAYUnVEX21PWbpQ== - socks-proxy-agent@^7.0.0: version "7.0.0" resolved "https://registry.yarnpkg.com/socks-proxy-agent/-/socks-proxy-agent-7.0.0.tgz#dc069ecf34436621acb41e3efa66ca1b5fed15b6" @@ -18792,19 +16884,6 @@ solc@^0.6.0: semver "^5.5.0" tmp "0.0.33" -solc@^0.8: - version "0.8.19" - resolved "https://registry.yarnpkg.com/solc/-/solc-0.8.19.tgz#cac6541106ae3cff101c740042c7742aa56a2ed3" - integrity sha512-yqurS3wzC4LdEvmMobODXqprV4MYJcVtinuxgrp61ac8K2zz40vXA0eSAskSHPgv8dQo7Nux39i3QBsHx4pqyA== - dependencies: - command-exists "^1.2.8" - commander "^8.1.0" - follow-redirects "^1.12.1" - js-sha3 "0.8.0" - memorystream "^0.3.1" - semver "^5.5.0" - tmp "0.0.33" - solhint@^4.5.4: version "4.5.4" resolved "https://registry.yarnpkg.com/solhint/-/solhint-4.5.4.tgz#171cf33f46c36b8499efe60c0e425f6883a54e50" @@ -18851,11 +16930,6 @@ solidity-comments-extractor@^0.0.8: resolved "https://registry.yarnpkg.com/solidity-comments-extractor/-/solidity-comments-extractor-0.0.8.tgz#f6e148ab0c49f30c1abcbecb8b8df01ed8e879f8" integrity sha512-htM7Vn6LhHreR+EglVMd2s+sZhcXAirB1Zlyrv5zBuTxieCvjfnRpd7iZk75m/u6NOlEyQ94C6TWbBn2cY7w8g== -solidity-parser-antlr@^0.4.2: - version "0.4.11" - resolved "https://registry.yarnpkg.com/solidity-parser-antlr/-/solidity-parser-antlr-0.4.11.tgz#af43e1f13b3b88309a875455f5d6e565b05ee5f1" 
- integrity sha512-4jtxasNGmyC0midtjH/lTFPZYvTTUMy6agYcF+HoMnzW8+cqo3piFrINb4ZCzpPW+7tTVFCGa5ubP34zOzeuMg== - sort-keys@^2.0.0: version "2.0.0" resolved "https://registry.yarnpkg.com/sort-keys/-/sort-keys-2.0.0.tgz#658535584861ec97d730d6cf41822e1f56684128" @@ -18878,7 +16952,7 @@ source-map-support@0.5.13: buffer-from "^1.0.0" source-map "^0.6.0" -source-map-support@^0.5.0, source-map-support@^0.5.16, source-map-support@^0.5.17, source-map-support@^0.5.19: +source-map-support@^0.5.16, source-map-support@^0.5.17, source-map-support@^0.5.19: version "0.5.21" resolved "https://registry.yarnpkg.com/source-map-support/-/source-map-support-0.5.21.tgz#04fe7c7f9e1ed2d662233c28cb2b35b9f63f6e4f" integrity sha512-uBHU3L3czsIyYXKX88fdrGovxdSCoTGDRZ6SYXtSRxLZUzHg5P/66Ht6uoUlHu9EZod+inXhKo3qQgwXUT/y1w== @@ -18891,13 +16965,6 @@ source-map@^0.6.0, source-map@^0.6.1: resolved "https://registry.yarnpkg.com/source-map/-/source-map-0.6.1.tgz#74722af32e9614e9c287a8d0bbde48b5e2f1a263" integrity sha512-UjgapumWlbMhkBgzT7Ykc5YXUT46F0iKu8SGXq0bcwP5dz/h0Plj6enJqjz1Zbq2l5WaqYnrVbwWOWMyF3F47g== -source-map@~0.2.0: - version "0.2.0" - resolved "https://registry.yarnpkg.com/source-map/-/source-map-0.2.0.tgz#dab73fbcfc2ba819b4de03bd6f6eaa48164b3f9d" - integrity sha512-CBdZ2oa/BHhS4xj5DlhjWNHcan57/5YuvfdLf17iVmIpd9KRm+DFLmC6nBNj+6Ua7Kt3TmOjDpQT1aTYOQtoUA== - dependencies: - amdefine ">=0.0.4" - spark-md5@3.0.2: version "3.0.2" resolved "https://registry.yarnpkg.com/spark-md5/-/spark-md5-3.0.2.tgz#7952c4a30784347abcee73268e473b9c0167e3fc" @@ -18938,13 +17005,6 @@ spinnies@^0.4.2: cli-cursor "^3.0.0" strip-ansi "^5.2.0" -split-array-stream@^2.0.0: - version "2.0.0" - resolved "https://registry.yarnpkg.com/split-array-stream/-/split-array-stream-2.0.0.tgz#85a4f8bfe14421d7bca7f33a6d176d0c076a53b1" - integrity sha512-hmMswlVY91WvGMxs0k8MRgq8zb2mSen4FmDNc5AFiTWtrBpdZN6nwD6kROVe4vNL+ywrvbCKsWVCnEd4riELIg== - dependencies: - is-stream-ended "^0.1.4" - split2@^3.0.0: version "3.2.2" resolved 
"https://registry.yarnpkg.com/split2/-/split2-3.2.2.tgz#bf2cf2a37d838312c249c89206fd7a17dd12365f" @@ -18952,11 +17012,6 @@ split2@^3.0.0: dependencies: readable-stream "^3.0.0" -split2@^4.1.0: - version "4.2.0" - resolved "https://registry.yarnpkg.com/split2/-/split2-4.2.0.tgz#c9c5920904d148bab0b9f67145f245a86aadbfa4" - integrity sha512-UcjcJOWknrNkF6PLX83qcHM6KHgVKNkV62Y8a5uYDVv9ydGQVwAHMKqHdJje1VTWpljG0WYpCDhrCdAOYH4TWg== - split@^1.0.0: version "1.0.1" resolved "https://registry.yarnpkg.com/split/-/split-1.0.1.tgz#605bd9be303aa59fb35f9229fbea0ddec9ea07d9" @@ -19037,28 +17092,11 @@ stop-iteration-iterator@^1.0.0: dependencies: internal-slot "^1.0.4" -stream-events@^1.0.1, stream-events@^1.0.4: - version "1.0.5" - resolved "https://registry.yarnpkg.com/stream-events/-/stream-events-1.0.5.tgz#bbc898ec4df33a4902d892333d47da9bf1c406d5" - integrity sha512-E1GUzBSgvct8Jsb3v2X15pjzN1tYebtbLaMg+eBOUOAxgbLoSbT2NS91ckc5lJD1KfLjId+jXJRgo0qnV5Nerg== - dependencies: - stubs "^3.0.0" - -stream-shift@^1.0.0: - version "1.0.1" - resolved "https://registry.yarnpkg.com/stream-shift/-/stream-shift-1.0.1.tgz#d7088281559ab2778424279b0877da3c392d5a3d" - integrity sha512-AiisoFqQ0vbGcZgQPY1cdP2I76glaVA/RauYR4G4thNFgkTqr90yXTo4LYX60Jl+sIlPNHHdGSwo01AvbKUSVQ== - strict-uri-encode@^1.0.0: version "1.1.0" resolved "https://registry.yarnpkg.com/strict-uri-encode/-/strict-uri-encode-1.1.0.tgz#279b225df1d582b1f54e65addd4352e18faa0713" integrity sha512-R3f198pcvnB+5IpnBlRkphuE9n46WyVl8I39W/ZUTZLz4nqSP/oLYUrcnJrw462Ds8he4YKMov2efsTIw1BDGQ== -string-hash@^1.1.3: - version "1.1.3" - resolved "https://registry.yarnpkg.com/string-hash/-/string-hash-1.1.3.tgz#e8aafc0ac1855b4666929ed7dd1275df5d6c811b" - integrity sha512-kJUvRUFK49aub+a7T1nNE66EJbZBMnBgoC1UbCZ5n6bsZKBRga4KgBRTMn/pFkeCZSYtNeSyMxPDM0AXWELk2A== - string-length@^4.0.1: version "4.0.2" resolved "https://registry.yarnpkg.com/string-length/-/string-length-4.0.2.tgz#a8a8dc7bd5c1a82b9b3c8b87e125f66871b6e57a" @@ -19067,7 +17105,7 @@ 
string-length@^4.0.1: char-regex "^1.0.2" strip-ansi "^6.0.0" -"string-width-cjs@npm:string-width@^4.2.0", "string-width@^1.0.2 || 2 || 3 || 4", string-width@^4.1.0, string-width@^4.2.0, string-width@^4.2.3: +"string-width-cjs@npm:string-width@^4.2.0": version "4.2.3" resolved "https://registry.yarnpkg.com/string-width/-/string-width-4.2.3.tgz#269c7117d27b05ad2e536830a8ec895ef9c6d010" integrity sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g== @@ -19093,6 +17131,15 @@ string-width@^1.0.1: is-fullwidth-code-point "^2.0.0" strip-ansi "^4.0.0" +"string-width@^1.0.2 || 2 || 3 || 4", string-width@^4.1.0, string-width@^4.2.0, string-width@^4.2.3: + version "4.2.3" + resolved "https://registry.yarnpkg.com/string-width/-/string-width-4.2.3.tgz#269c7117d27b05ad2e536830a8ec895ef9c6d010" + integrity sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g== + dependencies: + emoji-regex "^8.0.0" + is-fullwidth-code-point "^3.0.0" + strip-ansi "^6.0.1" + string-width@^3.0.0, string-width@^3.1.0: version "3.1.0" resolved "https://registry.yarnpkg.com/string-width/-/string-width-3.1.0.tgz#22767be21b62af1081574306f69ac51b62203961" @@ -19184,7 +17231,7 @@ string_decoder@~1.1.1: dependencies: safe-buffer "~5.1.0" -"strip-ansi-cjs@npm:strip-ansi@^6.0.1", strip-ansi@^6.0.0, strip-ansi@^6.0.1: +"strip-ansi-cjs@npm:strip-ansi@^6.0.1": version "6.0.1" resolved "https://registry.yarnpkg.com/strip-ansi/-/strip-ansi-6.0.1.tgz#9e26c63d30f53443e9489495b2105d37b67a85d9" integrity sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A== @@ -19212,6 +17259,13 @@ strip-ansi@^5.0.0, strip-ansi@^5.1.0, strip-ansi@^5.2.0: dependencies: ansi-regex "^4.1.0" +strip-ansi@^6.0.0, strip-ansi@^6.0.1: + version "6.0.1" + resolved "https://registry.yarnpkg.com/strip-ansi/-/strip-ansi-6.0.1.tgz#9e26c63d30f53443e9489495b2105d37b67a85d9" + integrity 
sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A== + dependencies: + ansi-regex "^5.0.1" + strip-ansi@^7.0.1: version "7.1.0" resolved "https://registry.yarnpkg.com/strip-ansi/-/strip-ansi-7.1.0.tgz#d5b6568ca689d8561370b0707685d22434faff45" @@ -19236,11 +17290,6 @@ strip-bom@^4.0.0: resolved "https://registry.yarnpkg.com/strip-bom/-/strip-bom-4.0.0.tgz#9c3505c1db45bcedca3d9cf7a16f5c5aa3901878" integrity sha512-3xurFv5tEgii33Zi8Jtp55wEIILR9eh34FAW00PZf+JnSsTmV/ioewSgQl97JHvgjoRGwPShsWm+IdrxB35d0w== -strip-comments@^2.0.1: - version "2.0.1" - resolved "https://registry.yarnpkg.com/strip-comments/-/strip-comments-2.0.1.tgz#4ad11c3fbcac177a67a40ac224ca339ca1c1ba9b" - integrity sha512-ZprKx+bBLXv067WTCALv8SSz5l2+XhpYCsVtSqlMnkAXMWDq+/ekVbl1ghqP9rUHTzv6sm/DwCOiYutU/yp1fw== - strip-dirs@^2.0.0: version "2.1.0" resolved "https://registry.yarnpkg.com/strip-dirs/-/strip-dirs-2.1.0.tgz#4987736264fc344cf20f6c34aca9d13d1d4ed6c5" @@ -19306,11 +17355,6 @@ strong-log-transformer@^2.1.0: minimist "^1.2.0" through "^2.3.4" -stubs@^3.0.0: - version "3.0.0" - resolved "https://registry.yarnpkg.com/stubs/-/stubs-3.0.0.tgz#e8d2ba1fa9c90570303c030b6900f7d5f89abe5b" - integrity sha512-PdHt7hHUJKxvTCgbKX9C1V/ftOcjJQgz8BZwNfV5c4B6dcGqlpelTbJ999jBGZ2jYiPAwcX5dP6oBwVlBlUbxw== - sublevel-pouchdb@7.3.1: version "7.3.1" resolved "https://registry.yarnpkg.com/sublevel-pouchdb/-/sublevel-pouchdb-7.3.1.tgz#c1cc03af45081345c7c82821d6dcaa74564ae2ef" @@ -19342,13 +17386,6 @@ supports-color@8.1.1, supports-color@^8.0.0: dependencies: has-flag "^4.0.0" -supports-color@^3.1.0: - version "3.2.3" - resolved "https://registry.yarnpkg.com/supports-color/-/supports-color-3.2.3.tgz#65ac0504b3954171d8a64946b2ae3cbb8a5f54f6" - integrity sha512-Jds2VIYDrlp5ui7t8abHN2bjAu4LV/q4N2KivFPpGH0lrka0BMq/33AmECUXlKPcHigkNaqfXRENFju+rlcy+A== - dependencies: - has-flag "^1.0.0" - supports-color@^5.3.0: version "5.5.0" resolved 
"https://registry.yarnpkg.com/supports-color/-/supports-color-5.5.0.tgz#e2e69a44ac8772f78a1ec0b35b689df6530efc8f" @@ -19474,27 +17511,7 @@ tape@^4.4.0: string.prototype.trim "~1.2.7" through "~2.3.8" -tar-fs@^1.8.1: - version "1.16.3" - resolved "https://registry.yarnpkg.com/tar-fs/-/tar-fs-1.16.3.tgz#966a628841da2c4010406a82167cbd5e0c72d509" - integrity sha512-NvCeXpYx7OsmOh8zIOP/ebG55zZmxLE0etfWRbWok+q2Qo8x/vOR/IJT1taADXPe+jsiu9axDb3X4B+iIgNlKw== - dependencies: - chownr "^1.0.1" - mkdirp "^0.5.1" - pump "^1.0.0" - tar-stream "^1.1.2" - -tar-fs@^2.0.0: - version "2.1.1" - resolved "https://registry.yarnpkg.com/tar-fs/-/tar-fs-2.1.1.tgz#489a15ab85f1f0befabb370b7de4f9eb5cbe8784" - integrity sha512-V0r2Y9scmbDRLCNex/+hYzvp/zyYjvFbHPNgVTKfQvVrb6guiE/fxP+XblDNR011utopbkex2nM4dHNV6GDsng== - dependencies: - chownr "^1.1.1" - mkdirp-classic "^0.5.2" - pump "^3.0.0" - tar-stream "^2.1.4" - -tar-stream@^1.1.2, tar-stream@^1.5.2: +tar-stream@^1.5.2: version "1.6.2" resolved "https://registry.yarnpkg.com/tar-stream/-/tar-stream-1.6.2.tgz#8ea55dab37972253d9a9af90fdcd559ae435c555" integrity sha512-rzS0heiNf8Xn7/mpdSVVSMAWAoy9bfb1WOTYC78Z0UQKeKa/CWS8FOq0lKGNa8DWKAn9gxjCvMLYc5PGXYlK2A== @@ -19507,7 +17524,7 @@ tar-stream@^1.1.2, tar-stream@^1.5.2: to-buffer "^1.1.1" xtend "^4.0.0" -tar-stream@^2.1.4, tar-stream@~2.2.0: +tar-stream@~2.2.0: version "2.2.0" resolved "https://registry.yarnpkg.com/tar-stream/-/tar-stream-2.2.0.tgz#acad84c284136b060dc3faa64474aa9aebd77287" integrity sha512-ujeqbceABgwMZxEJnk2HDY2DlnUZ+9oEcb1KzTVfYHio0UE6dG71n60d8D2I4qNvleWrrXpmjpt7vZeF1LnMZQ== @@ -19518,7 +17535,7 @@ tar-stream@^2.1.4, tar-stream@~2.2.0: inherits "^2.0.3" readable-stream "^3.1.1" -tar@^4.0.2, tar@^4.4.2: +tar@^4.0.2: version "4.4.19" resolved "https://registry.yarnpkg.com/tar/-/tar-4.4.19.tgz#2e4d7263df26f2b914dee10c825ab132123742f3" integrity sha512-a20gEsvHnWe0ygBY8JbxoM4w3SJdhc7ZAuxkLqh+nvNQN2IOt0B5lLgM490X5Hl8FF0dl0tOf2ewFYAlIFgzVA== @@ -19543,22 +17560,6 @@ tar@^6.1.0, 
tar@^6.1.11, tar@^6.1.2: mkdirp "^1.0.3" yallist "^4.0.0" -targz@^1.0.1: - version "1.0.1" - resolved "https://registry.yarnpkg.com/targz/-/targz-1.0.1.tgz#8f76a523694cdedfbb5d60a4076ff6eeecc5398f" - integrity sha512-6q4tP9U55mZnRuMTBqnqc3nwYQY3kv+QthCFZuMk+Tn1qYUnMPmL/JZ/mzgXINzFpSqfU+242IFmFU9VPvqaQw== - dependencies: - tar-fs "^1.8.1" - -teeny-request@^3.11.3: - version "3.11.3" - resolved "https://registry.yarnpkg.com/teeny-request/-/teeny-request-3.11.3.tgz#335c629f7645e5d6599362df2f3230c4cbc23a55" - integrity sha512-CKncqSF7sH6p4rzCgkb/z/Pcos5efl0DmolzvlqRQUNcpRIruOhY9+T1FsIlyEbfWd7MsFpodROOwHYh2BaXzw== - dependencies: - https-proxy-agent "^2.2.1" - node-fetch "^2.2.0" - uuid "^3.3.2" - temp-dir@^1.0.0: version "1.0.0" resolved "https://registry.yarnpkg.com/temp-dir/-/temp-dir-1.0.0.tgz#0a7c0ea26d3a39afa7e0ebea9c1fc0bc4daa011d" @@ -19613,21 +17614,7 @@ then-request@^6.0.0: promise "^8.0.0" qs "^6.4.0" -thenify-all@^1.0.0: - version "1.6.0" - resolved "https://registry.yarnpkg.com/thenify-all/-/thenify-all-1.6.0.tgz#1a1918d402d8fc3f98fbf234db0bcc8cc10e9726" - integrity sha512-RNxQH/qI8/t3thXJDwcstUO4zeqo64+Uy/+sNVRBx4Xn2OX+OZ9oP+iJnNFqplFra2ZUVeKCSa2oVWi3T4uVmA== - dependencies: - thenify ">= 3.1.0 < 4" - -"thenify@>= 3.1.0 < 4": - version "3.3.1" - resolved "https://registry.yarnpkg.com/thenify/-/thenify-3.3.1.tgz#8932e686a4066038a016dd9e2ca46add9838a95f" - integrity sha512-RVZSIV5IG10Hk3enotrhvz0T9em6cyHBLkH/YAZuKqd8hRkKhSfCGIcP2KUY0EPxndzANBmNllzWPwak+bheSw== - dependencies: - any-promise "^1.0.0" - -through2@3.0.2, through2@^3.0.1: +through2@3.0.2: version "3.0.2" resolved "https://registry.yarnpkg.com/through2/-/through2-3.0.2.tgz#99f88931cfc761ec7678b41d5d7336b5b6a07bf4" integrity sha512-enaDQ4MUyP2W6ZyT6EsMzqBPZaM/avg8iuo+l2d3QCs0J+6RaqkHV/2/lOwDTueBHeJ/2LG9lrLW3d5rWPucuQ== @@ -19643,14 +17630,6 @@ through2@^2.0.0: readable-stream "~2.3.6" xtend "~4.0.1" -through2@^3.0.0: - version "3.0.0" - resolved 
"https://registry.yarnpkg.com/through2/-/through2-3.0.0.tgz#468b461df9cd9fcc170f22ebf6852e467e578ff2" - integrity sha512-8B+sevlqP4OiCjonI1Zw03Sf8PuV1eRsYQgLad5eonILOdyeRsY27A/2Ze8IlvlMvq31OH+3fz/styI7Ya62yQ== - dependencies: - readable-stream "2 || 3" - xtend "~4.0.1" - through2@^4.0.0: version "4.0.2" resolved "https://registry.yarnpkg.com/through2/-/through2-4.0.2.tgz#a7ce3ac2a7a8b0b966c80e7c49f0484c3b239764" @@ -19675,18 +17654,6 @@ tiny-async-pool@^1.0.4: dependencies: semver "^5.5.0" -tiny-emitter@2.1.0: - version "2.1.0" - resolved "https://registry.yarnpkg.com/tiny-emitter/-/tiny-emitter-2.1.0.tgz#1d1a56edfc51c43e863cbb5382a72330e3555423" - integrity sha512-NB6Dk1A9xgQPMoGqC5CVXn123gWyte215ONT5Pp5a0yt4nlEoO1ZWeCwpncaekPHXO60i47ihFnZPiRPjRMq4Q== - -tiny-secp256k1@2.2.1: - version "2.2.1" - resolved "https://registry.yarnpkg.com/tiny-secp256k1/-/tiny-secp256k1-2.2.1.tgz#a61d4791b7031aa08a9453178a131349c3e10f9b" - integrity sha512-/U4xfVqnVxJXN4YVsru0E6t5wVncu2uunB8+RVR40fYUxkKYUPS10f+ePQZgFBoE/Jbf9H1NBveupF2VmB58Ng== - dependencies: - uint8array-tools "0.0.7" - tiny-typed-emitter@^2.1.0: version "2.1.0" resolved "https://registry.yarnpkg.com/tiny-typed-emitter/-/tiny-typed-emitter-2.1.0.tgz#b3b027fdd389ff81a152c8e847ee2f5be9fad7b5" @@ -20096,11 +18063,21 @@ tunnel@0.0.6: resolved "https://registry.yarnpkg.com/tunnel/-/tunnel-0.0.6.tgz#72f1314b34a5b192db012324df2cc587ca47f92c" integrity sha512-1h/Lnq9yajKY2PEbBadPXj3VxsDDu844OnaAo52UVmIzIvwwtBPIuNvkjuzBlTWpfJyUbG3ez0KSBibQkj4ojg== +tweetnacl-util@^0.15.1: + version "0.15.1" + resolved "https://registry.yarnpkg.com/tweetnacl-util/-/tweetnacl-util-0.15.1.tgz#b80fcdb5c97bcc508be18c44a4be50f022eea00b" + integrity sha512-RKJBIj8lySrShN4w6i/BonWp2Z/uxwC3h4y7xsRrpP59ZboCd0GpEVsOnMDYLMmKBpYhb5TgHzZXy7wTfYFBRw== + tweetnacl@^0.14.3, tweetnacl@~0.14.0: version "0.14.5" resolved "https://registry.yarnpkg.com/tweetnacl/-/tweetnacl-0.14.5.tgz#5ae68177f192d4456269d108afa93ff8743f4f64" integrity 
sha512-KXXFFdAbFXY4geFIwoyNK+f5Z1b7swfXABfL7HXCmoIWMKU3dmS26672A4EeQtDzLKy7SXmfBu51JolvEKwtGA== +tweetnacl@^1.0.3: + version "1.0.3" + resolved "https://registry.yarnpkg.com/tweetnacl/-/tweetnacl-1.0.3.tgz#ac0af71680458d8a6378d0d0d050ab1407d35596" + integrity sha512-6rt+RN7aOi1nGMyC4Xa5DdYiukl2UWCbcJft7YhxReBGQD7OAM8Pbxw6YMo4r2diNEA8FEmu32YOn9rhaiE5yw== + type-check@^0.4.0, type-check@~0.4.0: version "0.4.0" resolved "https://registry.yarnpkg.com/type-check/-/type-check-0.4.0.tgz#07b8203bfa7056c0657050e3ccd2c37730bab8f1" @@ -20232,11 +18209,6 @@ typed-array-length@^1.0.4: for-each "^0.3.3" is-typed-array "^1.1.9" -typed-function@1.1.0: - version "1.1.0" - resolved "https://registry.yarnpkg.com/typed-function/-/typed-function-1.1.0.tgz#ea149706e0fb42aca1791c053a6d94ccd6c4fdcb" - integrity sha512-TuQzwiT4DDg19beHam3E66oRXhyqlyfgjHB/5fcvsRXbfmWPJfto9B4a0TBdTrQAPGlGmXh/k7iUI+WsObgORA== - typedarray-to-buffer@^3.1.5: version "3.1.5" resolved "https://registry.yarnpkg.com/typedarray-to-buffer/-/typedarray-to-buffer-3.1.5.tgz#a97ee7a9ff42691b9f783ff1bc5112fe3fca9080" @@ -20249,11 +18221,6 @@ typedarray@^0.0.6: resolved "https://registry.yarnpkg.com/typedarray/-/typedarray-0.0.6.tgz#867ac74e3864187b1d3d47d996a78ec5c8830777" integrity sha512-/aCDEGatGvZ2BIk+HmLf4ifCJFwvKFNb9/JeZPMulfgFracn9QFcAf5GO8B/mweUjSoblS5In0cWhqpfs/5PQA== -typeforce@^1.11.5: - version "1.18.0" - resolved "https://registry.yarnpkg.com/typeforce/-/typeforce-1.18.0.tgz#d7416a2c5845e085034d70fcc5b6cc4a90edbfdc" - integrity sha512-7uc1O8h1M1g0rArakJdf0uLRSSgFcYexrVoKo+bzJd32gd4gDy2L/Z+8/FjPnU9ydY3pEnVPtr9FyscYY60K1g== - typescript-compare@^0.0.2: version "0.0.2" resolved "https://registry.yarnpkg.com/typescript-compare/-/typescript-compare-0.0.2.tgz#7ee40a400a406c2ea0a7e551efd3309021d5f425" @@ -20293,11 +18260,6 @@ uglify-js@^3.1.4: resolved "https://registry.yarnpkg.com/uglify-js/-/uglify-js-3.17.4.tgz#61678cf5fa3f5b7eb789bb345df29afb8257c22c" integrity 
sha512-T9q82TJI9e/C1TAxYvfb16xO120tMVFZrGA3f9/P4424DNu6ypK103y0GPFVa17yotwSyZW5iYXgjYHkGrJW/g== -uint8array-tools@0.0.7: - version "0.0.7" - resolved "https://registry.yarnpkg.com/uint8array-tools/-/uint8array-tools-0.0.7.tgz#a7a2bb5d8836eae2fade68c771454e6a438b390d" - integrity sha512-vrrNZJiusLWoFWBqz5Y5KMCgP9W9hnjZHzZiZRT8oNAkq3d5Z5Oe76jAvVVSRh4U8GGR90N2X1dWtrhvx6L8UQ== - ultron@~1.1.0: version "1.1.1" resolved "https://registry.yarnpkg.com/ultron/-/ultron-1.1.1.tgz#9fe1536a10a664a65266a1e3ccf85fd36302bc9c" @@ -20437,14 +18399,6 @@ url-parse-lax@^3.0.0: dependencies: prepend-http "^2.0.0" -url-parse@1.4.4: - version "1.4.4" - resolved "https://registry.yarnpkg.com/url-parse/-/url-parse-1.4.4.tgz#cac1556e95faa0303691fec5cf9d5a1bc34648f8" - integrity sha512-/92DTTorg4JjktLNLe6GPS2/RvAd/RGr6LuktmWSMLEOa6rjnlrFXNgSbSmkNvCoL2T028A0a1JaJLzRMlFoHg== - dependencies: - querystringify "^2.0.0" - requires-port "^1.0.0" - url-parse@^1.5.3: version "1.5.10" resolved "https://registry.yarnpkg.com/url-parse/-/url-parse-1.5.10.tgz#9d3c2f736c1d75dd3bd2be507dcc111f1e2ea9c1" @@ -20458,24 +18412,11 @@ url-set-query@^1.0.0: resolved "https://registry.yarnpkg.com/url-set-query/-/url-set-query-1.0.0.tgz#016e8cfd7c20ee05cafe7795e892bd0702faa339" integrity sha512-3AChu4NiXquPfeckE5R5cGdiHCMWJx1dwCWOmWIL4KHAziJNOFIYJlpGFeKDvwLPHovZRCxK3cYlwzqI9Vp+Gg== -url-template@^2.0.8: - version "2.0.8" - resolved "https://registry.yarnpkg.com/url-template/-/url-template-2.0.8.tgz#fc565a3cccbff7730c775f5641f9555791439f21" - integrity sha512-XdVKMF4SJ0nP/O7XIPB0JwAEuT9lDIYnNsK8yGVe43y0AWoKeJNdv3ZNWh7ksJ6KqQFjOO6ox/VEitLnaVNufw== - url-to-options@^1.0.1: version "1.0.1" resolved "https://registry.yarnpkg.com/url-to-options/-/url-to-options-1.0.1.tgz#1505a03a289a48cbd7a434efbaeec5055f5633a9" integrity sha512-0kQLIzG4fdk/G5NONku64rSH/x32NOA39LVQqlK8Le6lvTF6GGRJpqaQFGgU+CLwySIqBSMdwYM0sYcW9f6P4A== -usb@^1.7.0: - version "1.9.2" - resolved 
"https://registry.yarnpkg.com/usb/-/usb-1.9.2.tgz#fb6b36f744ecc707a196c45a6ec72442cb6f2b73" - integrity sha512-dryNz030LWBPAf6gj8vyq0Iev3vPbCLHCT8dBw3gQRXRzVNsIdeuU+VjPp3ksmSPkeMAl1k+kQ14Ij0QHyeiAg== - dependencies: - node-addon-api "^4.2.0" - node-gyp-build "^4.3.0" - utf-8-validate@5.0.7: version "5.0.7" resolved "https://registry.yarnpkg.com/utf-8-validate/-/utf-8-validate-5.0.7.tgz#c15a19a6af1f7ad9ec7ddc425747ca28c3644922" @@ -20490,11 +18431,6 @@ utf-8-validate@^5.0.2: dependencies: node-gyp-build "^4.3.0" -utf8@2.1.1: - version "2.1.1" - resolved "https://registry.yarnpkg.com/utf8/-/utf8-2.1.1.tgz#2e01db02f7d8d0944f77104f1609eb0c304cf768" - integrity sha512-FzZp4f0vPa0AfWf+eav6hqZEqbn7TU1my/GUexpF9e0Afe/fnuLQvgdq5KgD3ggUpu3DpwRUGC0iS8q35eVBLQ== - utf8@3.0.0, utf8@^3.0.0: version "3.0.0" resolved "https://registry.yarnpkg.com/utf8/-/utf8-3.0.0.tgz#f052eed1364d696e769ef058b183df88c87f69d1" @@ -20554,7 +18490,7 @@ uuid@8.3.2, uuid@^8.3.2: resolved "https://registry.yarnpkg.com/uuid/-/uuid-8.3.2.tgz#80d5b5ced271bb9af6c445f21a1a04c606cefbe2" integrity sha512-+NYs2QeMWy+GWFOEm9xnn6HCDp0l7QBD7ml8zLUmJ+93Q5NF0NocErnwkTkXVFNiX3/fpC6afS8Dhb/gz7R7eg== -uuid@^3.2.1, uuid@^3.3.2, uuid@^3.3.3: +uuid@^3.3.2, uuid@^3.3.3: version "3.4.0" resolved "https://registry.yarnpkg.com/uuid/-/uuid-3.4.0.tgz#b23e4358afa8a202fe7a100af1f5f883f02007ee" integrity sha512-HjSDRw6gZE5JMggctHBcjVak08+KEVhSIiDzFnT9S9aegmp85S/bReBVTb4QTFaRNptJ9kuYaNhnbNEOkbKb/A== @@ -20583,11 +18519,6 @@ v8-to-istanbul@^9.0.1: "@types/istanbul-lib-coverage" "^2.0.1" convert-source-map "^1.6.0" -valid-url@^1.0.9: - version "1.0.9" - resolved "https://registry.yarnpkg.com/valid-url/-/valid-url-1.0.9.tgz#1c14479b40f1397a75782f115e4086447433a200" - integrity sha512-QQDsV8OnSf5Uc30CKSwG9lnhMPe6exHtTXLRYX8uMwKENy640pU+2BgBL0LRbDh/eYRahNCS7aewCx0wf3NYVA== - validate-npm-package-license@^3.0.1, validate-npm-package-license@^3.0.4: version "3.0.4" resolved 
"https://registry.yarnpkg.com/validate-npm-package-license/-/validate-npm-package-license-3.0.4.tgz#fc91f6b9c7ba15c857f4cb2c5defeec39d4f410a" @@ -20639,6 +18570,20 @@ verror@1.10.0: core-util-is "1.0.2" extsprintf "^1.2.0" +viem@2.29.4: + version "2.29.4" + resolved "https://registry.yarnpkg.com/viem/-/viem-2.29.4.tgz#f727c868056198de854a7c18f672a7ead7875226" + integrity sha512-Dhyae+w1LKKpYVXypGjBnZ3WU5EHl/Uip5RtVwVRYSVxD5VvHzqKzIfbFU1KP4vnnh3++ZNgLjBY/kVT/tPrrg== + dependencies: + "@noble/curves" "1.8.2" + "@noble/hashes" "1.7.2" + "@scure/bip32" "1.6.2" + "@scure/bip39" "1.5.4" + abitype "1.0.8" + isows "1.0.7" + ox "0.6.9" + ws "8.18.1" + viem@^1.0.0: version "1.19.0" resolved "https://registry.yarnpkg.com/viem/-/viem-1.19.0.tgz#e9f2b5084795d217c4adeb4b1c46d2afa949967c" @@ -20663,11 +18608,6 @@ walk-up-path@^1.0.0: resolved "https://registry.yarnpkg.com/walk-up-path/-/walk-up-path-1.0.0.tgz#d4745e893dd5fd0dbb58dd0a4c6a33d9c9fec53e" integrity sha512-hwj/qMDUEjCU5h0xr90KGCf0tg0/LgJbmOWgrWKYlcJZM7XvquvUJZ0G/HMGr7F7OQMOUuPHWP9JpriinkAlkg== -walkdir@^0.3.0, walkdir@^0.3.2: - version "0.3.2" - resolved "https://registry.yarnpkg.com/walkdir/-/walkdir-0.3.2.tgz#ac8437a288c295656848ebc19981ebc677a5f590" - integrity sha512-0Twghia4Z5wDGDYWURlhZmI47GvERMCsXIu0QZWVVZyW9ZjpbbZvD9Zy9M6cWiQQRRbAcYajIyKNavaZZDt1Uw== - walker@^1.0.8: version "1.0.8" resolved "https://registry.yarnpkg.com/walker/-/walker-1.0.8.tgz#bd498db477afe573dc04185f011d3ab8a8d7653f" @@ -20682,11 +18622,6 @@ wcwidth@^1.0.0, wcwidth@^1.0.1: dependencies: defaults "^1.0.3" -weak-map@^1.0.5: - version "1.0.8" - resolved "https://registry.yarnpkg.com/weak-map/-/weak-map-1.0.8.tgz#394c18a9e8262e790544ed8b55c6a4ddad1cb1a3" - integrity sha512-lNR9aAefbGPpHO7AEnY0hCFjz1eTkWCXYvkTRrTHs9qv8zJp+SkVYpzfLIFXQQiG3tVvbNFQgVg2bQS8YGgxyw== - web-streams-polyfill@^3.0.3: version "3.2.1" resolved "https://registry.yarnpkg.com/web-streams-polyfill/-/web-streams-polyfill-3.2.1.tgz#71c2718c52b45fd49dbeee88634b3a60ceab42a6" @@ 
-20701,15 +18636,6 @@ web3-bzz@1.10.0: got "12.1.0" swarm-js "^0.1.40" -web3-bzz@1.10.4: - version "1.10.4" - resolved "https://registry.yarnpkg.com/web3-bzz/-/web3-bzz-1.10.4.tgz#dcc787970767d9004c73d11d0eeef774ce16b880" - integrity sha512-ZZ/X4sJ0Uh2teU9lAGNS8EjveEppoHNQiKlOXAjedsrdWuaMErBPdLQjXfcrYvN6WM6Su9PMsAxf3FXXZ+HwQw== - dependencies: - "@types/node" "^12.12.6" - got "12.1.0" - swarm-js "^0.1.40" - web3-bzz@1.2.2: version "1.2.2" resolved "https://registry.yarnpkg.com/web3-bzz/-/web3-bzz-1.2.2.tgz#a3b9f613c49fd3e120e0997088a73557d5adb724" @@ -20738,17 +18664,6 @@ web3-bzz@1.8.2: got "12.1.0" swarm-js "^0.1.40" -web3-core-helpers@1.0.0-beta.55: - version "1.0.0-beta.55" - resolved "https://registry.yarnpkg.com/web3-core-helpers/-/web3-core-helpers-1.0.0-beta.55.tgz#832b8499889f9f514b1d174f00172fd3683d63de" - integrity sha512-suj9Xy/lIqajaYLJTEjr2rlFgu6hGYwChHmf8+qNrC2luZA6kirTamtB9VThWMxbywx7p0bqQFjW6zXogAgWhg== - dependencies: - "@babel/runtime" "^7.3.1" - lodash "^4.17.11" - web3-core "1.0.0-beta.55" - web3-eth-iban "1.0.0-beta.55" - web3-utils "1.0.0-beta.55" - web3-core-helpers@1.10.0: version "1.10.0" resolved "https://registry.yarnpkg.com/web3-core-helpers/-/web3-core-helpers-1.10.0.tgz#1016534c51a5df77ed4f94d1fcce31de4af37fad" @@ -20757,14 +18672,6 @@ web3-core-helpers@1.10.0: web3-eth-iban "1.10.0" web3-utils "1.10.0" -web3-core-helpers@1.10.4: - version "1.10.4" - resolved "https://registry.yarnpkg.com/web3-core-helpers/-/web3-core-helpers-1.10.4.tgz#bd2b4140df2016d5dd3bb2b925fc29ad8678677c" - integrity sha512-r+L5ylA17JlD1vwS8rjhWr0qg7zVoVMDvWhajWA5r5+USdh91jRUYosp19Kd1m2vE034v7Dfqe1xYRoH2zvG0g== - dependencies: - web3-eth-iban "1.10.4" - web3-utils "1.10.4" - web3-core-helpers@1.2.2: version "1.2.2" resolved "https://registry.yarnpkg.com/web3-core-helpers/-/web3-core-helpers-1.2.2.tgz#484974f4bd4a487217b85b0d7cfe841af0907619" @@ -20790,20 +18697,6 @@ web3-core-helpers@1.8.2: web3-eth-iban "1.8.2" web3-utils "1.8.2" 
-web3-core-method@1.0.0-beta.55: - version "1.0.0-beta.55" - resolved "https://registry.yarnpkg.com/web3-core-method/-/web3-core-method-1.0.0-beta.55.tgz#0af994295ac2dd64ccd53305b7df8da76e11da49" - integrity sha512-w1cW/s2ji9qGELHk2uMJCn1ooay0JJLVoPD1nvmsW6OTRWcVjxa62nJrFQhe6P5lEb83Xk9oHgmCxZoVUHibOw== - dependencies: - "@babel/runtime" "^7.3.1" - eventemitter3 "3.1.0" - lodash "^4.17.11" - rxjs "^6.4.0" - web3-core "1.0.0-beta.55" - web3-core-helpers "1.0.0-beta.55" - web3-core-subscriptions "1.0.0-beta.55" - web3-utils "1.0.0-beta.55" - web3-core-method@1.10.0: version "1.10.0" resolved "https://registry.yarnpkg.com/web3-core-method/-/web3-core-method-1.10.0.tgz#82668197fa086e8cc8066742e35a9d72535e3412" @@ -20815,17 +18708,6 @@ web3-core-method@1.10.0: web3-core-subscriptions "1.10.0" web3-utils "1.10.0" -web3-core-method@1.10.4: - version "1.10.4" - resolved "https://registry.yarnpkg.com/web3-core-method/-/web3-core-method-1.10.4.tgz#566b52f006d3cbb13b21b72b8d2108999bf5d6bf" - integrity sha512-uZTb7flr+Xl6LaDsyTeE2L1TylokCJwTDrIVfIfnrGmnwLc6bmTWCCrm71sSrQ0hqs6vp/MKbQYIYqUN0J8WyA== - dependencies: - "@ethersproject/transactions" "^5.6.2" - web3-core-helpers "1.10.4" - web3-core-promievent "1.10.4" - web3-core-subscriptions "1.10.4" - web3-utils "1.10.4" - web3-core-method@1.2.2: version "1.2.2" resolved "https://registry.yarnpkg.com/web3-core-method/-/web3-core-method-1.2.2.tgz#d4fe2bb1945b7152e5f08e4ea568b171132a1e56" @@ -20866,13 +18748,6 @@ web3-core-promievent@1.10.0: dependencies: eventemitter3 "4.0.4" -web3-core-promievent@1.10.4: - version "1.10.4" - resolved "https://registry.yarnpkg.com/web3-core-promievent/-/web3-core-promievent-1.10.4.tgz#629b970b7934430b03c5033c79f3bb3893027e22" - integrity sha512-2de5WnJQ72YcIhYwV/jHLc4/cWJnznuoGTJGD29ncFQHAfwW/MItHFSVKPPA5v8AhJe+r6y4Y12EKvZKjQVBvQ== - dependencies: - eventemitter3 "4.0.4" - web3-core-promievent@1.2.2: version "1.2.2" resolved 
"https://registry.yarnpkg.com/web3-core-promievent/-/web3-core-promievent-1.2.2.tgz#3b60e3f2a0c96db8a891c927899d29d39e66ab1c" @@ -20906,17 +18781,6 @@ web3-core-requestmanager@1.10.0: web3-providers-ipc "1.10.0" web3-providers-ws "1.10.0" -web3-core-requestmanager@1.10.4: - version "1.10.4" - resolved "https://registry.yarnpkg.com/web3-core-requestmanager/-/web3-core-requestmanager-1.10.4.tgz#eb1f147e6b9df84e3a37e602162f8925bdb4bb9a" - integrity sha512-vqP6pKH8RrhT/2MoaU+DY/OsYK9h7HmEBNCdoMj+4ZwujQtw/Mq2JifjwsJ7gits7Q+HWJwx8q6WmQoVZAWugg== - dependencies: - util "^0.12.5" - web3-core-helpers "1.10.4" - web3-providers-http "1.10.4" - web3-providers-ipc "1.10.4" - web3-providers-ws "1.10.4" - web3-core-requestmanager@1.2.2: version "1.2.2" resolved "https://registry.yarnpkg.com/web3-core-requestmanager/-/web3-core-requestmanager-1.2.2.tgz#667ba9ac724c9c76fa8965ae8a3c61f66e68d8d6" @@ -20950,15 +18814,6 @@ web3-core-requestmanager@1.8.2: web3-providers-ipc "1.8.2" web3-providers-ws "1.8.2" -web3-core-subscriptions@1.0.0-beta.55: - version "1.0.0-beta.55" - resolved "https://registry.yarnpkg.com/web3-core-subscriptions/-/web3-core-subscriptions-1.0.0-beta.55.tgz#105902c13db53466fc17d07a981ad3d41c700f76" - integrity sha512-pb3oQbUzK7IoyXwag8TYInQddg0rr7BHxKc+Pbs/92hVNQ5ps4iGMVJKezdrjlQ1IJEEUiDIglXl4LZ1hIuMkw== - dependencies: - "@babel/runtime" "^7.3.1" - eventemitter3 "^3.1.0" - lodash "^4.17.11" - web3-core-subscriptions@1.10.0: version "1.10.0" resolved "https://registry.yarnpkg.com/web3-core-subscriptions/-/web3-core-subscriptions-1.10.0.tgz#b534592ee1611788fc0cb0b95963b9b9b6eacb7c" @@ -20967,14 +18822,6 @@ web3-core-subscriptions@1.10.0: eventemitter3 "4.0.4" web3-core-helpers "1.10.0" -web3-core-subscriptions@1.10.4: - version "1.10.4" - resolved "https://registry.yarnpkg.com/web3-core-subscriptions/-/web3-core-subscriptions-1.10.4.tgz#2f4dcb404237e92802a563265d11a33934dc38e6" - integrity 
sha512-o0lSQo/N/f7/L76C0HV63+S54loXiE9fUPfHFcTtpJRQNDBVsSDdWRdePbWwR206XlsBqD5VHApck1//jEafTw== - dependencies: - eventemitter3 "4.0.4" - web3-core-helpers "1.10.4" - web3-core-subscriptions@1.2.2: version "1.2.2" resolved "https://registry.yarnpkg.com/web3-core-subscriptions/-/web3-core-subscriptions-1.2.2.tgz#bf4ba23a653a003bdc3551649958cc0b080b068e" @@ -21000,19 +18847,6 @@ web3-core-subscriptions@1.8.2: eventemitter3 "4.0.4" web3-core-helpers "1.8.2" -web3-core@1.0.0-beta.55: - version "1.0.0-beta.55" - resolved "https://registry.yarnpkg.com/web3-core/-/web3-core-1.0.0-beta.55.tgz#26b9abbf1bc1837c9cc90f06ecbc4ed714f89b53" - integrity sha512-AMMp7TLEtE7u8IJAu/THrRhBTZyZzeo7Y6GiWYNwb5+KStC9hIGLr9cI1KX9R6ZioTOLRHrqT7awDhnJ1ku2mg== - dependencies: - "@babel/runtime" "^7.3.1" - "@types/bn.js" "^4.11.4" - "@types/node" "^10.12.18" - lodash "^4.17.11" - web3-core-method "1.0.0-beta.55" - web3-providers "1.0.0-beta.55" - web3-utils "1.0.0-beta.55" - web3-core@1.10.0: version "1.10.0" resolved "https://registry.yarnpkg.com/web3-core/-/web3-core-1.10.0.tgz#9aa07c5deb478cf356c5d3b5b35afafa5fa8e633" @@ -21026,19 +18860,6 @@ web3-core@1.10.0: web3-core-requestmanager "1.10.0" web3-utils "1.10.0" -web3-core@1.10.4: - version "1.10.4" - resolved "https://registry.yarnpkg.com/web3-core/-/web3-core-1.10.4.tgz#639de68b8b9871d2dc8892e0dd4e380cb1361a98" - integrity sha512-B6elffYm81MYZDTrat7aEhnhdtVE3lDBUZft16Z8awYMZYJDbnykEbJVS+l3mnA7AQTnSDr/1MjWofGDLBJPww== - dependencies: - "@types/bn.js" "^5.1.1" - "@types/node" "^12.12.6" - bignumber.js "^9.0.0" - web3-core-helpers "1.10.4" - web3-core-method "1.10.4" - web3-core-requestmanager "1.10.4" - web3-utils "1.10.4" - web3-core@1.2.2: version "1.2.2" resolved "https://registry.yarnpkg.com/web3-core/-/web3-core-1.2.2.tgz#334b99c8222ef9cfd0339e27352f0b58ea789a2f" @@ -21110,7 +18931,7 @@ web3-eth-abi@1.8.1: "@ethersproject/abi" "^5.6.3" web3-utils "1.8.1" -web3-eth-abi@1.8.2, web3-eth-abi@^1.0.0-beta.24: +web3-eth-abi@1.8.2: version 
"1.8.2" resolved "https://registry.yarnpkg.com/web3-eth-abi/-/web3-eth-abi-1.8.2.tgz#16e1e9be40e2527404f041a4745111211488f31a" integrity sha512-Om9g3kaRNjqiNPAgKwGT16y+ZwtBzRe4ZJFGjLiSs6v5I7TPNF+rRMWuKnR6jq0azQZDj6rblvKFMA49/k48Og== @@ -21134,22 +18955,6 @@ web3-eth-accounts@1.10.0: web3-core-method "1.10.0" web3-utils "1.10.0" -web3-eth-accounts@1.10.4: - version "1.10.4" - resolved "https://registry.yarnpkg.com/web3-eth-accounts/-/web3-eth-accounts-1.10.4.tgz#df30e85a7cd70e475f8cf52361befba408829e34" - integrity sha512-ysy5sVTg9snYS7tJjxVoQAH6DTOTkRGR8emEVCWNGLGiB9txj+qDvSeT0izjurS/g7D5xlMAgrEHLK1Vi6I3yg== - dependencies: - "@ethereumjs/common" "2.6.5" - "@ethereumjs/tx" "3.5.2" - "@ethereumjs/util" "^8.1.0" - eth-lib "0.2.8" - scrypt-js "^3.0.1" - uuid "^9.0.0" - web3-core "1.10.4" - web3-core-helpers "1.10.4" - web3-core-method "1.10.4" - web3-utils "1.10.4" - web3-eth-accounts@1.2.2: version "1.2.2" resolved "https://registry.yarnpkg.com/web3-eth-accounts/-/web3-eth-accounts-1.2.2.tgz#c187e14bff6baa698ac352220290222dbfd332e5" @@ -21201,19 +19006,6 @@ web3-eth-accounts@1.8.2: web3-core-method "1.8.2" web3-utils "1.8.2" -web3-eth-admin@1.0.0-beta.55: - version "1.0.0-beta.55" - resolved "https://registry.yarnpkg.com/web3-eth-admin/-/web3-eth-admin-1.0.0-beta.55.tgz#dcbcc5be4e3a008ce367c2ac83432b9a711f39e9" - integrity sha512-7IbnTsIJ5zx5K/Nw5f8u1cFj6qwgTAdr/1MlL2/V5gE8IsO2qsSjbPHDZEawbgfpCLGuoHpOnUDIrce/HOYHbw== - dependencies: - "@babel/runtime" "^7.3.1" - web3-core "1.0.0-beta.55" - web3-core-helpers "1.0.0-beta.55" - web3-core-method "1.0.0-beta.55" - web3-net "1.0.0-beta.55" - web3-providers "1.0.0-beta.55" - web3-utils "1.0.0-beta.55" - web3-eth-contract@1.10.0: version "1.10.0" resolved "https://registry.yarnpkg.com/web3-eth-contract/-/web3-eth-contract-1.10.0.tgz#8e68c7654576773ec3c91903f08e49d0242c503a" @@ -21228,20 +19020,6 @@ web3-eth-contract@1.10.0: web3-eth-abi "1.10.0" web3-utils "1.10.0" -web3-eth-contract@1.10.4: - version "1.10.4" - resolved 
"https://registry.yarnpkg.com/web3-eth-contract/-/web3-eth-contract-1.10.4.tgz#22d39f04e11d9ff4e726e8025a56d78e843a2c3d" - integrity sha512-Q8PfolOJ4eV9TvnTj1TGdZ4RarpSLmHnUnzVxZ/6/NiTfe4maJz99R0ISgwZkntLhLRtw0C7LRJuklzGYCNN3A== - dependencies: - "@types/bn.js" "^5.1.1" - web3-core "1.10.4" - web3-core-helpers "1.10.4" - web3-core-method "1.10.4" - web3-core-promievent "1.10.4" - web3-core-subscriptions "1.10.4" - web3-eth-abi "1.10.4" - web3-utils "1.10.4" - web3-eth-contract@1.2.2: version "1.2.2" resolved "https://registry.yarnpkg.com/web3-eth-contract/-/web3-eth-contract-1.2.2.tgz#84e92714918a29e1028ee7718f0712536e14e9a1" @@ -21299,20 +19077,6 @@ web3-eth-ens@1.10.0: web3-eth-contract "1.10.0" web3-utils "1.10.0" -web3-eth-ens@1.10.4: - version "1.10.4" - resolved "https://registry.yarnpkg.com/web3-eth-ens/-/web3-eth-ens-1.10.4.tgz#3d991adac52bc8e598f1f1b8528337fa6291004c" - integrity sha512-LLrvxuFeVooRVZ9e5T6OWKVflHPFgrVjJ/jtisRWcmI7KN/b64+D/wJzXqgmp6CNsMQcE7rpmf4CQmJCrTdsgg== - dependencies: - content-hash "^2.5.2" - eth-ens-namehash "2.0.8" - web3-core "1.10.4" - web3-core-helpers "1.10.4" - web3-core-promievent "1.10.4" - web3-eth-abi "1.10.4" - web3-eth-contract "1.10.4" - web3-utils "1.10.4" - web3-eth-ens@1.2.2: version "1.2.2" resolved "https://registry.yarnpkg.com/web3-eth-ens/-/web3-eth-ens-1.2.2.tgz#0a4abed1d4cbdacbf5e1ab06e502d806d1192bc6" @@ -21355,15 +19119,6 @@ web3-eth-ens@1.8.2: web3-eth-contract "1.8.2" web3-utils "1.8.2" -web3-eth-iban@1.0.0-beta.55: - version "1.0.0-beta.55" - resolved "https://registry.yarnpkg.com/web3-eth-iban/-/web3-eth-iban-1.0.0-beta.55.tgz#15146a69de21addc99e7dbfb2920555b1e729637" - integrity sha512-a2Fxsb5Mssa+jiXgjUdIzJipE0175IcQXJbZLpKft2+zeSJWNTbaa3PQD2vPPpIM4W789q06N+f9Zc0Fyls+1g== - dependencies: - "@babel/runtime" "^7.3.1" - bn.js "4.11.8" - web3-utils "1.0.0-beta.55" - web3-eth-iban@1.10.0: version "1.10.0" resolved 
"https://registry.yarnpkg.com/web3-eth-iban/-/web3-eth-iban-1.10.0.tgz#5a46646401965b0f09a4f58e7248c8a8cd22538a" @@ -21372,14 +19127,6 @@ web3-eth-iban@1.10.0: bn.js "^5.2.1" web3-utils "1.10.0" -web3-eth-iban@1.10.4: - version "1.10.4" - resolved "https://registry.yarnpkg.com/web3-eth-iban/-/web3-eth-iban-1.10.4.tgz#bc61b4a1930d19b1df8762c606d669902558e54d" - integrity sha512-0gE5iNmOkmtBmbKH2aTodeompnNE8jEyvwFJ6s/AF6jkw9ky9Op9cqfzS56AYAbrqEFuClsqB/AoRves7LDELw== - dependencies: - bn.js "^5.2.1" - web3-utils "1.10.4" - web3-eth-iban@1.2.2: version "1.2.2" resolved "https://registry.yarnpkg.com/web3-eth-iban/-/web3-eth-iban-1.2.2.tgz#76bec73bad214df7c4192388979a59fc98b96c5a" @@ -21416,18 +19163,6 @@ web3-eth-personal@1.10.0: web3-net "1.10.0" web3-utils "1.10.0" -web3-eth-personal@1.10.4: - version "1.10.4" - resolved "https://registry.yarnpkg.com/web3-eth-personal/-/web3-eth-personal-1.10.4.tgz#e2ee920f47e84848288e03442659cdbb2c4deea2" - integrity sha512-BRa/hs6jU1hKHz+AC/YkM71RP3f0Yci1dPk4paOic53R4ZZG4MgwKRkJhgt3/GPuPliwS46f/i5A7fEGBT4F9w== - dependencies: - "@types/node" "^12.12.6" - web3-core "1.10.4" - web3-core-helpers "1.10.4" - web3-core-method "1.10.4" - web3-net "1.10.4" - web3-utils "1.10.4" - web3-eth-personal@1.2.2: version "1.2.2" resolved "https://registry.yarnpkg.com/web3-eth-personal/-/web3-eth-personal-1.2.2.tgz#eee1c86a8132fa16b5e34c6d421ca92e684f0be6" @@ -21482,24 +19217,6 @@ web3-eth@1.10.0: web3-net "1.10.0" web3-utils "1.10.0" -web3-eth@1.10.4: - version "1.10.4" - resolved "https://registry.yarnpkg.com/web3-eth/-/web3-eth-1.10.4.tgz#3a908c635cb5d935bd30473e452f3bd7f2ee66a5" - integrity sha512-Sql2kYKmgt+T/cgvg7b9ce24uLS7xbFrxE4kuuor1zSCGrjhTJ5rRNG8gTJUkAJGKJc7KgnWmgW+cOfMBPUDSA== - dependencies: - web3-core "1.10.4" - web3-core-helpers "1.10.4" - web3-core-method "1.10.4" - web3-core-subscriptions "1.10.4" - web3-eth-abi "1.10.4" - web3-eth-accounts "1.10.4" - web3-eth-contract "1.10.4" - web3-eth-ens "1.10.4" - web3-eth-iban "1.10.4" - 
web3-eth-personal "1.10.4" - web3-net "1.10.4" - web3-utils "1.10.4" - web3-eth@1.2.2: version "1.2.2" resolved "https://registry.yarnpkg.com/web3-eth/-/web3-eth-1.2.2.tgz#65a1564634a23b990efd1655bf94ad513904286c" @@ -21555,19 +19272,6 @@ web3-eth@1.8.2: web3-net "1.8.2" web3-utils "1.8.2" -web3-net@1.0.0-beta.55: - version "1.0.0-beta.55" - resolved "https://registry.yarnpkg.com/web3-net/-/web3-net-1.0.0-beta.55.tgz#daf24323df16a890a0bac6c6eda48b6e8c7e96ef" - integrity sha512-do2WY8+/GArJSWX7k/zZ7nBnV9Y3n6LhPYkwT3LeFqDzD515bKwlomaNC8hOaTc6UQyXIoPprYTK2FevL7jrZw== - dependencies: - "@babel/runtime" "^7.3.1" - lodash "^4.17.11" - web3-core "1.0.0-beta.55" - web3-core-helpers "1.0.0-beta.55" - web3-core-method "1.0.0-beta.55" - web3-providers "1.0.0-beta.55" - web3-utils "1.0.0-beta.55" - web3-net@1.10.0: version "1.10.0" resolved "https://registry.yarnpkg.com/web3-net/-/web3-net-1.10.0.tgz#be53e7f5dafd55e7c9013d49c505448b92c9c97b" @@ -21577,15 +19281,6 @@ web3-net@1.10.0: web3-core-method "1.10.0" web3-utils "1.10.0" -web3-net@1.10.4: - version "1.10.4" - resolved "https://registry.yarnpkg.com/web3-net/-/web3-net-1.10.4.tgz#20e12c60e4477d4298979d8d5d66b9abf8e66a09" - integrity sha512-mKINnhOOnZ4koA+yV2OT5s5ztVjIx7IY9a03w6s+yao/BUn+Luuty0/keNemZxTr1E8Ehvtn28vbOtW7Ids+Ow== - dependencies: - web3-core "1.10.4" - web3-core-method "1.10.4" - web3-utils "1.10.4" - web3-net@1.2.2: version "1.2.2" resolved "https://registry.yarnpkg.com/web3-net/-/web3-net-1.2.2.tgz#5c3226ca72df7c591422440ce6f1203fd42ddad9" @@ -21613,15 +19308,16 @@ web3-net@1.8.2: web3-core-method "1.8.2" web3-utils "1.8.2" -web3-provider-engine@16.0.4: - version "16.0.4" - resolved "https://registry.yarnpkg.com/web3-provider-engine/-/web3-provider-engine-16.0.4.tgz#a6565d85f3cfdc2da68f141af8728f90ad198f3f" - integrity sha512-f5WxJ9+LTF+4aJo4tCOXtQ6SDytBtLkhvV+qh/9gImHAuG9sMr6utY0mn/pro1Rx7O3hbztBxvQKjGMdOo8muw== +web3-provider-engine@16.0.3: + version "16.0.3" + resolved 
"https://registry.yarnpkg.com/web3-provider-engine/-/web3-provider-engine-16.0.3.tgz#8ff93edf3a8da2f70d7f85c5116028c06a0d9f07" + integrity sha512-Q3bKhGqLfMTdLvkd4TtkGYJHcoVQ82D1l8jTIwwuJp/sAp7VHnRYb9YJ14SW/69VMWoOhSpPLZV2tWb9V0WJoA== dependencies: "@ethereumjs/tx" "^3.3.0" async "^2.5.0" backoff "^2.5.0" clone "^2.0.0" + cross-fetch "^2.1.0" eth-block-tracker "^4.4.2" eth-json-rpc-filters "^4.2.1" eth-json-rpc-infura "^5.1.0" @@ -21697,16 +19393,6 @@ web3-providers-http@1.10.0: es6-promise "^4.2.8" web3-core-helpers "1.10.0" -web3-providers-http@1.10.4: - version "1.10.4" - resolved "https://registry.yarnpkg.com/web3-providers-http/-/web3-providers-http-1.10.4.tgz#ca7aa58aeaf8123500c24ffe0595896319f830e8" - integrity sha512-m2P5Idc8hdiO0l60O6DSCPw0kw64Zgi0pMjbEFRmxKIck2Py57RQMu4bxvkxJwkF06SlGaEQF8rFZBmuX7aagQ== - dependencies: - abortcontroller-polyfill "^1.7.5" - cross-fetch "^4.0.0" - es6-promise "^4.2.8" - web3-core-helpers "1.10.4" - web3-providers-http@1.2.2: version "1.2.2" resolved "https://registry.yarnpkg.com/web3-providers-http/-/web3-providers-http-1.2.2.tgz#155e55c1d69f4c5cc0b411ede40dea3d06720956" @@ -21743,14 +19429,6 @@ web3-providers-ipc@1.10.0: oboe "2.1.5" web3-core-helpers "1.10.0" -web3-providers-ipc@1.10.4: - version "1.10.4" - resolved "https://registry.yarnpkg.com/web3-providers-ipc/-/web3-providers-ipc-1.10.4.tgz#2e03437909e4e7771d646ff05518efae44b783c3" - integrity sha512-YRF/bpQk9z3WwjT+A6FI/GmWRCASgd+gC0si7f9zbBWLXjwzYAKG73bQBaFRAHex1hl4CVcM5WUMaQXf3Opeuw== - dependencies: - oboe "2.1.5" - web3-core-helpers "1.10.4" - web3-providers-ipc@1.2.2: version "1.2.2" resolved "https://registry.yarnpkg.com/web3-providers-ipc/-/web3-providers-ipc-1.2.2.tgz#c6d165a12bc68674b4cdd543ea18aec79cafc2e8" @@ -21785,15 +19463,6 @@ web3-providers-ws@1.10.0: web3-core-helpers "1.10.0" websocket "^1.0.32" -web3-providers-ws@1.10.4: - version "1.10.4" - resolved 
"https://registry.yarnpkg.com/web3-providers-ws/-/web3-providers-ws-1.10.4.tgz#55d0c3ba36c6a79d105f02e20a707eb3978e7f82" - integrity sha512-j3FBMifyuFFmUIPVQR4pj+t5ILhAexAui0opgcpu9R5LxQrLRUZxHSnU+YO25UycSOa/NAX8A+qkqZNpcFAlxA== - dependencies: - eventemitter3 "4.0.4" - web3-core-helpers "1.10.4" - websocket "^1.0.32" - web3-providers-ws@1.2.2: version "1.2.2" resolved "https://registry.yarnpkg.com/web3-providers-ws/-/web3-providers-ws-1.2.2.tgz#d2c05c68598cea5ad3fa6ef076c3bcb3ca300d29" @@ -21821,23 +19490,6 @@ web3-providers-ws@1.8.2: web3-core-helpers "1.8.2" websocket "^1.0.32" -web3-providers@1.0.0-beta.55: - version "1.0.0-beta.55" - resolved "https://registry.yarnpkg.com/web3-providers/-/web3-providers-1.0.0-beta.55.tgz#639503517741b69baaa82f1f940630df6a25992b" - integrity sha512-MNifc7W+iF6rykpbDR1MuX152jshWdZXHAU9Dk0Ja2/23elhIs4nCWs7wOX9FHrKgdrQbscPoq0uy+0aGzyWVQ== - dependencies: - "@babel/runtime" "^7.3.1" - "@types/node" "^10.12.18" - eventemitter3 "3.1.0" - lodash "^4.17.11" - url-parse "1.4.4" - web3-core "1.0.0-beta.55" - web3-core-helpers "1.0.0-beta.55" - web3-core-method "1.0.0-beta.55" - web3-utils "1.0.0-beta.55" - websocket "^1.0.28" - xhr2-cookies "1.1.0" - web3-shh@1.10.0: version "1.10.0" resolved "https://registry.yarnpkg.com/web3-shh/-/web3-shh-1.10.0.tgz#c2979b87e0f67a7fef2ce9ee853bd7bfbe9b79a8" @@ -21848,16 +19500,6 @@ web3-shh@1.10.0: web3-core-subscriptions "1.10.0" web3-net "1.10.0" -web3-shh@1.10.4: - version "1.10.4" - resolved "https://registry.yarnpkg.com/web3-shh/-/web3-shh-1.10.4.tgz#9852d6f3d05678e31e49235a60fea10ca7a9e21d" - integrity sha512-cOH6iFFM71lCNwSQrC3niqDXagMqrdfFW85hC9PFUrAr3PUrIem8TNstTc3xna2bwZeWG6OBy99xSIhBvyIACw== - dependencies: - web3-core "1.10.4" - web3-core-method "1.10.4" - web3-core-subscriptions "1.10.4" - web3-net "1.10.4" - web3-shh@1.2.2: version "1.2.2" resolved "https://registry.yarnpkg.com/web3-shh/-/web3-shh-1.2.2.tgz#44ed998f2a6ba0ec5cb9d455184a0f647826a49c" @@ -21888,22 +19530,6 @@ 
web3-shh@1.8.2: web3-core-subscriptions "1.8.2" web3-net "1.8.2" -web3-utils@1.0.0-beta.55: - version "1.0.0-beta.55" - resolved "https://registry.yarnpkg.com/web3-utils/-/web3-utils-1.0.0-beta.55.tgz#beb40926b7c04208b752d36a9bc959d27a04b308" - integrity sha512-ASWqUi8gtWK02Tp8ZtcoAbHenMpQXNvHrakgzvqTNNZn26wgpv+Q4mdPi0KOR6ZgHFL8R/9b5BBoUTglS1WPpg== - dependencies: - "@babel/runtime" "^7.3.1" - "@types/bn.js" "^4.11.4" - "@types/node" "^10.12.18" - bn.js "4.11.8" - eth-lib "0.2.8" - ethjs-unit "^0.1.6" - lodash "^4.17.11" - number-to-bn "1.7.0" - randombytes "^2.1.0" - utf8 "2.1.1" - web3-utils@1.10.0: version "1.10.0" resolved "https://registry.yarnpkg.com/web3-utils/-/web3-utils-1.10.0.tgz#ca4c1b431a765c14ac7f773e92e0fd9377ccf578" @@ -22010,19 +19636,6 @@ web3@1.10.0: web3-shh "1.10.0" web3-utils "1.10.0" -web3@1.10.4: - version "1.10.4" - resolved "https://registry.yarnpkg.com/web3/-/web3-1.10.4.tgz#5d5e59b976eaf758b060fe1a296da5fe87bdc79c" - integrity sha512-kgJvQZjkmjOEKimx/tJQsqWfRDPTTcBfYPa9XletxuHLpHcXdx67w8EFn5AW3eVxCutE9dTVHgGa9VYe8vgsEA== - dependencies: - web3-bzz "1.10.4" - web3-core "1.10.4" - web3-eth "1.10.4" - web3-eth-personal "1.10.4" - web3-net "1.10.4" - web3-shh "1.10.4" - web3-utils "1.10.4" - web3@1.2.2: version "1.2.2" resolved "https://registry.yarnpkg.com/web3/-/web3-1.2.2.tgz#b1b8b69aafdf94cbaeadbb68a8aa1df2ef266aec" @@ -22076,7 +19689,7 @@ webidl-conversions@^3.0.0: resolved "https://registry.yarnpkg.com/webidl-conversions/-/webidl-conversions-3.0.1.tgz#24534275e2a7bc6be7bc86611cc16ae0a5654871" integrity sha512-2JAn3z8AR6rjK8Sm8orRC0h/bcl/DqL7tRPdGZ4I1CjdF+EaMLmYxBHyXuKL849eucPFhvBoxMsflfOb8kxaeQ== -websocket@^1.0.28, websocket@^1.0.32: +websocket@^1.0.32: version "1.0.34" resolved "https://registry.yarnpkg.com/websocket/-/websocket-1.0.34.tgz#2bdc2602c08bf2c82253b730655c0ef7dcab3111" integrity sha512-PRDso2sGwF6kM75QykIesBijKSVceR6jL2G8NGYyq2XrItNC2P5/qL5XeR056GhA+Ly7JMFvJb9I312mJfmqnQ== @@ -22098,11 +19711,16 @@ websocket@^1.0.28, 
websocket@^1.0.32: typedarray-to-buffer "^3.1.5" yaeti "^0.0.6" -whatwg-fetch@>=0.10.0, whatwg-fetch@^3.4.1: +whatwg-fetch@>=0.10.0: version "3.6.2" resolved "https://registry.yarnpkg.com/whatwg-fetch/-/whatwg-fetch-3.6.2.tgz#dced24f37f2624ed0281725d51d0e2e3fe677f8c" integrity sha512-bJlen0FcuU/0EMLrdbJ7zOnW6ITZLrZMIarMUVmdKtsGvZna8vxKYaexICWPfZ8qwf9fzNq+UEIZrnSaApt6RA== +whatwg-fetch@^2.0.4: + version "2.0.4" + resolved "https://registry.yarnpkg.com/whatwg-fetch/-/whatwg-fetch-2.0.4.tgz#dde6a5df315f9d39991aa17621853d720b85566f" + integrity sha512-dcQ1GWpOD/eEQ97k66aiEVpNnapVj90/+R+SXTPYGHpYBBypfKJEQjLrvMZ7YXbKm21gXd4NcuxUTjiv1YtLng== + whatwg-mimetype@^3.0.0: version "3.0.0" resolved "https://registry.yarnpkg.com/whatwg-mimetype/-/whatwg-mimetype-3.0.0.tgz#5fa1a7623867ff1af6ca3dc72ad6b8a4208beba7" @@ -22167,7 +19785,7 @@ which@2.0.2, which@^2.0.1, which@^2.0.2: dependencies: isexe "^2.0.0" -which@^1.1.1, which@^1.2.12, which@^1.2.14, which@^1.2.9: +which@^1.2.12, which@^1.2.14, which@^1.2.9: version "1.3.1" resolved "https://registry.yarnpkg.com/which/-/which-1.3.1.tgz#a45043d54f5805316da8d62f9f50918d3da70b0a" integrity sha512-HxJdYWq1MTIQbJ3nw0cqssHoTNU267KlrDuGZ1WYlxDStUtKUhOaJmh112/TZmHxxUfuJqPXSOm7tDyas0OSIQ== @@ -22181,25 +19799,13 @@ wide-align@1.1.3: dependencies: string-width "^1.0.2 || 2" -wide-align@^1.1.0, wide-align@^1.1.5: +wide-align@^1.1.5: version "1.1.5" resolved "https://registry.yarnpkg.com/wide-align/-/wide-align-1.1.5.tgz#df1d4c206854369ecf3c9a4898f1b23fbd9d15d3" integrity sha512-eDMORYaPNZ4sQIuuYPDHdQvf4gyCF9rEEV/yPxGfwPkRodwEgiMUUXTx/dex+Me0wxx53S+NgUHaP7y3MGlDmg== dependencies: string-width "^1.0.2 || 2 || 3 || 4" -wif@^2.0.6: - version "2.0.6" - resolved "https://registry.yarnpkg.com/wif/-/wif-2.0.6.tgz#08d3f52056c66679299726fade0d432ae74b4704" - integrity sha512-HIanZn1zmduSF+BQhkE+YXIbEiH0xPr1012QbFEGB0xsKqJii0/SqJjyn8dFv6y36kOznMgMB+LGcbZTJ1xACQ== - dependencies: - bs58check "<3.0.0" - -window-size@^0.1.4: - version "0.1.4" - resolved 
"https://registry.yarnpkg.com/window-size/-/window-size-0.1.4.tgz#f8e1aa1ee5a53ec5bf151ffa09742a6ad7697876" - integrity sha512-2thx4pB0cV3h+Bw7QmMXcEbdmOzv9t0HFplJH/Lz6yu60hXYy5RT8rUu+wlIreVxWsGN20mo+MHeCSfUpQBwPw== - window-size@^0.2.0: version "0.2.0" resolved "https://registry.yarnpkg.com/window-size/-/window-size-0.2.0.tgz#b4315bb4214a3d7058ebeee892e13fa24d98b075" @@ -22237,7 +19843,7 @@ workerpool@6.2.1: resolved "https://registry.yarnpkg.com/workerpool/-/workerpool-6.2.1.tgz#46fc150c17d826b86a008e5a4508656777e9c343" integrity sha512-ILEIE97kDZvF9Wb9f6h5aXK4swSlKGUcOEGiIYb2OOu/IrDU9iwj0fD//SsA6E5ibwJxpEvhullJY4Sl4GcpAw== -"wrap-ansi-cjs@npm:wrap-ansi@^7.0.0", wrap-ansi@^7.0.0: +"wrap-ansi-cjs@npm:wrap-ansi@^7.0.0": version "7.0.0" resolved "https://registry.yarnpkg.com/wrap-ansi/-/wrap-ansi-7.0.0.tgz#67e145cff510a6a6984bdf1152911d69d2eb9e43" integrity sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q== @@ -22263,6 +19869,15 @@ wrap-ansi@^5.1.0: string-width "^3.0.0" strip-ansi "^5.0.0" +wrap-ansi@^7.0.0: + version "7.0.0" + resolved "https://registry.yarnpkg.com/wrap-ansi/-/wrap-ansi-7.0.0.tgz#67e145cff510a6a6984bdf1152911d69d2eb9e43" + integrity sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q== + dependencies: + ansi-styles "^4.0.0" + string-width "^4.1.0" + strip-ansi "^6.0.0" + wrap-ansi@^8.1.0: version "8.1.0" resolved "https://registry.yarnpkg.com/wrap-ansi/-/wrap-ansi-8.1.0.tgz#56dc22368ee570face1b49819975d9b9a5ead214" @@ -22361,6 +19976,11 @@ ws@8.13.0: resolved "https://registry.yarnpkg.com/ws/-/ws-8.13.0.tgz#9a9fb92f93cf41512a0735c8f4dd09b8a1211cd0" integrity sha512-x9vcZYTrFPC7aSIbj7sRCYo7L/Xb8Iy+pW0ng0wt2vCJv7M9HOMy0UoN3rr+IFC7hb7vXoqS+P9ktyLLLhO+LA== +ws@8.18.1: + version "8.18.1" + resolved "https://registry.yarnpkg.com/ws/-/ws-8.18.1.tgz#ea131d3784e1dfdff91adb0a4a116b127515e3cb" + integrity 
sha512-RKW2aJZMXeMxVpnZ6bck+RswznaxmzdULiBr6KY7XkTnW8uvt0iT9H5DkHUChXrc+uurzwa0rVI16n/Xzjdz1w== + ws@8.2.3: version "8.2.3" resolved "https://registry.yarnpkg.com/ws/-/ws-8.2.3.tgz#63a56456db1b04367d0b721a0b80cae6d8becbba" @@ -22459,7 +20079,7 @@ xtend@~2.1.1: dependencies: object-keys "~0.4.0" -y18n@^3.2.0, y18n@^3.2.1: +y18n@^3.2.1: version "3.2.2" resolved "https://registry.yarnpkg.com/y18n/-/y18n-3.2.2.tgz#85c901bd6470ce71fc4bb723ad209b70f7f28696" integrity sha512-uGZHXkHnhF0XeeAPgnKfPv1bgKAYyVvmNL1xlKsPYZPaIHxGti2hHqvOCQv71XMsLxu1QjergkqogUnms5D3YQ== @@ -22588,19 +20208,6 @@ yargs@16.2.0, yargs@^16.2.0: y18n "^5.0.5" yargs-parser "^20.2.2" -yargs@17.7.2: - version "17.7.2" - resolved "https://registry.yarnpkg.com/yargs/-/yargs-17.7.2.tgz#991df39aca675a192b816e1e0363f9d75d2aa269" - integrity sha512-7dSzzRQ++CKnNI/krKnYRV7JKKPUXMEh61soaHKg9mrWEhzFWhFnxPxGl+69cD1Ou63C13NUPCnmIcrvqCuM6w== - dependencies: - cliui "^8.0.1" - escalade "^3.1.1" - get-caller-file "^2.0.5" - require-directory "^2.1.1" - string-width "^4.2.3" - y18n "^5.0.5" - yargs-parser "^21.1.1" - yargs@^11.0.0: version "11.1.1" resolved "https://registry.yarnpkg.com/yargs/-/yargs-11.1.1.tgz#5052efe3446a4df5ed669c995886cc0f13702766" @@ -22636,7 +20243,7 @@ yargs@^14.0.0: y18n "^4.0.0" yargs-parser "^15.0.1" -yargs@^17.3.1, yargs@^17.5.1, yargs@^17.6.2: +yargs@^17.3.1, yargs@^17.6.2: version "17.7.1" resolved "https://registry.yarnpkg.com/yargs/-/yargs-17.7.1.tgz#34a77645201d1a8fc5213ace787c220eabbd0967" integrity sha512-cwiTb08Xuv5fqF4AovYacTFNxk62th7LKJ6BL9IGUpTJrWoU7/7WdQGTP2SjKf1dUNBGzDd28p/Yfs/GI6JrLw== @@ -22649,19 +20256,6 @@ yargs@^17.3.1, yargs@^17.5.1, yargs@^17.6.2: y18n "^5.0.5" yargs-parser "^21.1.1" -yargs@^3.10.0: - version "3.32.0" - resolved "https://registry.yarnpkg.com/yargs/-/yargs-3.32.0.tgz#03088e9ebf9e756b69751611d2a5ef591482c995" - integrity sha512-ONJZiimStfZzhKamYvR/xvmgW3uEkAUFSP91y2caTEPhzF6uP2JfPiVZcq66b/YR0C3uitxSV7+T1x8p5bkmMg== - dependencies: - camelcase "^2.0.1" - 
cliui "^3.0.3" - decamelize "^1.1.1" - os-locale "^1.4.0" - string-width "^1.0.1" - window-size "^0.1.4" - y18n "^3.2.0" - yargs@^4.7.1: version "4.8.1" resolved "https://registry.yarnpkg.com/yargs/-/yargs-4.8.1.tgz#c0c42924ca4aaa6b0e6da1739dfb216439f9ddc0"