### Bug Fixes

* filter BNS processing for successful txs only ([#1309](https://github.com/hirosystems/stacks-blockchain-api/issues/1309)) ([6a12936](https://github.com/hirosystems/stacks-blockchain-api/commit/6a129369c6d9fcdc79b5a7ad288d37784cbe77cc))
* add indexes for index_block_hash on BNS tables ([#1304](https://github.com/hirosystems/stacks-blockchain-api/issues/1304)) ([bbf4b2d](https://github.com/hirosystems/stacks-blockchain-api/commit/bbf4b2d2b8c7f6ed30bfda6eaa430d5c2e84cdf5))
* detect name transfers and renewals in special circumstances ([#1303](https://github.com/hirosystems/stacks-blockchain-api/issues/1303)) ([cd381a9](https://github.com/hirosystems/stacks-blockchain-api/commit/cd381a95b4d0d3f4bb08e447500153c3f652eff6))
* import BNS v1 data during event replay ([#1301](https://github.com/hirosystems/stacks-blockchain-api/issues/1301)) ([bc59817](https://github.com/hirosystems/stacks-blockchain-api/commit/bc59817aa98dd3a978a27b73d14738b64eb823f9))
### Features
* optimize tables and improve canonical treatment of BNS data ([#1287](https://github.com/hirosystems/stacks-blockchain-api/issues/1287)) ([1f64818](https://github.com/hirosystems/stacks-blockchain-api/commit/1f648187b8c701e802a06bac52b077fd10571ff7))
This command has two modes of operation, specified by the `--mode` option:
* `archival` (default): The process will import and ingest *all* blockchain events that have happened since the first block.
* `pruned`: The import process will ignore some prunable events (mempool, microblocks) until the import block height has reached `chain tip - 256` blocks. This saves a considerable amount of time during import, but sacrifices some historical data. You can use this mode if you're mostly interested in running an API that prioritizes real-time information.

Alternatively, instead of performing the `export-events` command in step 1, an environment variable can be set which enables events to be streamed to a file as they are received, while the application is running normally. To enable this feature, set the `STACKS_EXPORT_EVENTS_FILE` env var to the file path where events should be appended. Example:
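A minimal sketch of the streaming setup (the file path below is a placeholder, not prescribed by this README):

```shell
# Placeholder path: events received by the API are appended to this file
# while the application runs normally.
export STACKS_EXPORT_EVENTS_FILE=/tmp/stacks-node-events.tsv
```

The API process is then started as usual; the resulting file can later be used for the import step in place of an `export-events` dump.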
  if [ "$(sha256sum "$file" | awk '{print $1}')" = "$(cat "${file}.sha256")" ]; then
    echo "sha256 Matched $file"
  else
    echo "sha256 Mismatch $file"
  fi
done
```

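As an aside, GNU `sha256sum --check` can perform the same comparison; since the `.sha256` files here hold only the bare digest, a `<digest>  <filename>` line has to be synthesized first. File names below are hypothetical:

```shell
# Hypothetical archive name; create sample data plus a digest-only .sha256 file
cd "$(mktemp -d)"
file=archive.tar.gz
printf 'example data' > "$file"
sha256sum "$file" | awk '{print $1}' > "${file}.sha256"

# sha256sum --check expects "<digest>  <filename>" lines, so synthesize one;
# it reports OK when the digest matches and exits non-zero otherwise
printf '%s  %s\n' "$(cat "${file}.sha256")" "$file" | sha256sum --check -
```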
## Postgres
The `postgres:alpine` image can be run with default settings; the only requirement is that a password environment variable is set for the `postgres` user, e.g. `POSTGRES_PASSWORD=postgres`.
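For example, a throwaway local instance could be started like this (the container name and host port mapping are illustrative choices, not requirements):

```shell
# Run Postgres in the background with the required password env var set;
# -p exposes the default Postgres port 5432 on the host.
docker run -d --name stacks-api-pg \
  -e POSTGRES_PASSWORD=postgres \
  -p 5432:5432 \
  postgres:alpine
```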