
Conversation

@p8 p8 (Contributor) commented Nov 3, 2024

Reduce connection pool size to avoid connection errors like:

 psycopg2.OperationalError: connection to server at "10.0.0.2", port
 5432 failed: FATAL:  sorry, too many clients already

@NateBrady23 NateBrady23 merged commit aecc2a4 into TechEmpower:master Nov 4, 2024
4 checks passed
@p8 p8 deleted the lucky/fix-test branch November 4, 2024 21:02
RUN shards build bench --release --no-debug

-ENV DATABASE_URL postgres://benchmarkdbuser:benchmarkdbpass@tfb-database:5432/hello_world?initial_pool_size=56&max_idle_pool_size=56
+ENV DATABASE_URL postgres://benchmarkdbuser:benchmarkdbpass@tfb-database:5432/hello_world?initial_pool_size=10&max_idle_pool_size=10
Sorry, just a small question!
Everyone is using different pool sizes for the databases.

But are we learning (across all languages) which one is best?

@joanhey joanhey (Contributor) Nov 10, 2024


That's just a question for reflection, so we can all learn together.

Nothing against this PR.

PS: try to manage the pool size relative to the CPU cores, the server, etc., rather than hardcoding it.
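
The suggestion above could be sketched as an entrypoint that derives the pool size from the core count at startup. This is only an illustration, not part of this PR: the "two connections per core" factor and the use of `nproc` are assumptions.

```shell
#!/bin/sh
# Hypothetical sketch: size the connection pool from the CPU core count
# instead of hardcoding it in the Dockerfile. The "2 connections per core"
# factor below is an assumption for illustration only.
CORES=$(nproc 2>/dev/null || echo 1)
POOL_SIZE=$(( CORES * 2 ))
export DATABASE_URL="postgres://benchmarkdbuser:benchmarkdbpass@tfb-database:5432/hello_world?initial_pool_size=${POOL_SIZE}&max_idle_pool_size=${POOL_SIZE}"
echo "$DATABASE_URL"
```

A `RUN`/`ENV` pair in a Dockerfile cannot compute this at runtime, which is why a sketch like this would live in an entrypoint script rather than the `ENV DATABASE_URL` line the PR edits.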

@p8 p8 (Contributor, Author) Nov 11, 2024


No problem @joanhey 😄
I think the pool size is already per CPU core.
This was causing the runs to fail because it opened too many connections (the new machine has more cores).
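
A rough back-of-the-envelope check of why the old setting could fail. The worker count and Postgres's default `max_connections` of 100 are assumptions for illustration; the PR itself does not state them:

```shell
# Hypothetical arithmetic: with a per-worker pool of 56, even a handful of
# workers exceeds Postgres's default max_connections (100), which triggers
# "FATAL: sorry, too many clients already".
WORKERS=4            # assumed number of worker processes
POOL_PER_WORKER=56   # the old hardcoded pool size from this Dockerfile
MAX_CONNECTIONS=100  # Postgres default
TOTAL=$(( WORKERS * POOL_PER_WORKER ))
echo "total: $TOTAL (limit: $MAX_CONNECTIONS)"   # prints: total: 224 (limit: 100)
```

With more cores on the new machine, either the worker count or the per-worker pool grows, so the total climbs further past the server limit; shrinking the pool to 10 keeps the product under it.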



3 participants