
Commit 86dfdb5
feat: run inserts in batch and in parallel when processing new block (#1818)
* feat: run inserts in batch and in parallel when processing new block
* chore: use asserts
* chore: make insert concurrency configurable
* chore: rename env to STACKS_BLOCK_DATA_INSERT_CONCURRENCY and enforce min value
* chore: fix switch-case fall-through
* chore: fix env entry wording

Signed-off-by: bestmike007 <[email protected]>
1 parent 867ed30 commit 86dfdb5

File tree

5 files changed: +273 −191 lines


.env

Lines changed: 4 additions & 0 deletions
@@ -43,6 +43,10 @@ PG_APPLICATION_NAME=stacks-blockchain-api
 # Limit to how many concurrent connections can be created, defaults to 10
 # PG_CONNECTION_POOL_MAX=10
 
+# Insert concurrency when processing new blocks
+# If your PostgreSQL is operating on SSD and has multiple CPU cores, consider raising this value, for instance, to 8 or 16.
+# STACKS_BLOCK_DATA_INSERT_CONCURRENCY=4
+
 # If specified, controls the Stacks Blockchain API mode. The possible values are:
 # * `readonly`: Runs the API endpoints without an Event Server that listens to events from a node and
 #   writes them to the local database. The API will only read data from the PG database

0 commit comments