Replies: 2 comments
I agree and I can relate, because I also spent as much time and energy (and lost some hair during debugging sessions) before discovering that MySQL only streams data when the fetch size is set to `Integer.MIN_VALUE`. A similar caveat exists with PostgreSQL, where the item writer gets stuck when trying to update rows selected by the reader. Even though these are DB-specific issues, it would be great if Spring Batch warned about them in the docs. I will turn this into a documentation enhancement and move it to the issue tracker.
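For illustration, here is a minimal sketch of a cursor-based reader configured for MySQL streaming. The bean name, query, and row type are placeholders, not from this thread:

```java
import javax.sql.DataSource;

import org.springframework.batch.item.database.JdbcCursorItemReader;
import org.springframework.batch.item.database.builder.JdbcCursorItemReaderBuilder;
import org.springframework.jdbc.core.SingleColumnRowMapper;

public class MySqlStreamingReaderConfig {

    // Hypothetical reader configuration: MySQL's Connector/J only streams rows
    // one at a time (instead of buffering the entire result set in memory)
    // when the fetch size is set to Integer.MIN_VALUE.
    public JdbcCursorItemReader<String> reader(DataSource dataSource) {
        return new JdbcCursorItemReaderBuilder<String>()
                .name("personReader")                 // placeholder name
                .dataSource(dataSource)
                .sql("SELECT name FROM person")       // placeholder query
                .fetchSize(Integer.MIN_VALUE)         // magic value that enables streaming on MySQL
                .rowMapper(new SingleColumnRowMapper<>(String.class))
                .build();
    }
}
```

With any other fetch size, the MySQL driver reads the whole result set into memory before the first `read()` call, which is exactly what causes the OOM described above.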
Moved to the issue tracker as #5040
Hi there,
I spent a little (too much) time figuring out why my batch job was running into out-of-memory (OOM) errors until I finally figured out that with PostgreSQL you have to disable auto-commit in your datasource (to prevent materializing the full result set in memory). In my case, `hikariDataSource.setAutoCommit(false)` did the trick.

I looked at the docs here https://docs.spring.io/spring-batch/reference/readers-and-writers/database.html and I would have loved a small admonition box in which this oddity was mentioned <3.
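In case it helps others, here is a minimal sketch of the datasource setup, assuming HikariCP; the JDBC URL and credentials are placeholders:

```java
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

public class DataSourceConfig {

    public HikariDataSource dataSource() {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl("jdbc:postgresql://localhost:5432/mydb"); // placeholder URL
        config.setUsername("user");                                 // placeholder
        config.setPassword("secret");                               // placeholder
        // Crucial for PostgreSQL cursor-based readers: with auto-commit
        // enabled, the pgjdbc driver fetches the entire result set into memory
        // instead of using a server-side cursor, which can trigger OOM errors
        // on large tables.
        config.setAutoCommit(false);
        return new HikariDataSource(config);
    }
}
```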
Thanks in advance 🙂