Discussed in #5032
Originally posted by Sax388 October 17, 2025
Hi there,
I spent a little (too much) time tracking down why my batch job was running into out-of-memory (OOM) errors, until I finally discovered that with PostgreSQL you have to disable auto-commit on your data source to prevent the driver from materializing the full result set in memory. In my case, hikariDataSource.setAutoCommit(false) did the trick:
```java
import com.zaxxer.hikari.HikariDataSource;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.autoconfigure.jdbc.DataSourceProperties;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Bean;

@Bean
@ConfigurationProperties("spring.datasource.revenue-message.configuration")
public HikariDataSource revenueMessagesDataSource(
        @Qualifier("revenueMessagesDataSourceProperties")
        DataSourceProperties revenueMessagesDataSourceProperties) {
    var hikariDataSource = revenueMessagesDataSourceProperties
            .initializeDataSourceBuilder()
            .type(HikariDataSource.class)
            .build();
    // Required to make use of JdbcCursorItemReader from Spring Batch: the
    // PostgreSQL driver only streams results through a cursor when
    // auto-commit is disabled, see
    // https://jdbc.postgresql.org/documentation/query/#getting-results-based-on-a-cursor
    hikariDataSource.setAutoCommit(false);
    return hikariDataSource;
}
```

I looked at the docs at https://docs.spring.io/spring-batch/reference/readers-and-writers/database.html and I would have loved a small admonition box mentioning this oddity <3.
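For anyone hitting the same thing, here is a minimal sketch of the reader side that pairs with the data source above. RevenueMessage, the table, and the SQL are hypothetical placeholders for illustration; the key point is that, per the PostgreSQL driver docs linked above, cursor-based streaming kicks in only when auto-commit is off and a non-zero fetch size is set:

```java
import javax.sql.DataSource;
import org.springframework.batch.item.database.JdbcCursorItemReader;
import org.springframework.batch.item.database.builder.JdbcCursorItemReaderBuilder;

// Hypothetical row type, for illustration only.
record RevenueMessage(long id, String payload) {}

public JdbcCursorItemReader<RevenueMessage> revenueMessageReader(DataSource dataSource) {
    return new JdbcCursorItemReaderBuilder<RevenueMessage>()
            .name("revenueMessageReader")
            // Must be the data source with setAutoCommit(false) from above.
            .dataSource(dataSource)
            // Hypothetical table and columns.
            .sql("SELECT id, payload FROM revenue_message")
            .rowMapper((rs, rowNum) ->
                    new RevenueMessage(rs.getLong("id"), rs.getString("payload")))
            // A non-zero fetch size is what lets the PostgreSQL driver stream
            // rows through a cursor instead of buffering the whole result set.
            .fetchSize(100)
            .build();
}
```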
Thanks in advance 🙂