
Commit e1bb56f

EDsCODE and claude committed
Increase DuckLake max retry count to 100 for concurrent connections
DuckLake uses optimistic concurrency control: when multiple connections try to commit simultaneously, they may conflict on snapshot IDs in the PostgreSQL metadata store. The default retry count of 10 is too low for tools like Fivetran that open many concurrent connections. This was causing errors like:

"Failed to commit DuckLake transaction. Exceeded maximum retry count of 10. duplicate key value violates unique constraint ducklake_snapshot_pkey"

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <[email protected]>
1 parent 8afcfbf commit e1bb56f

File tree

1 file changed: +10 −0 lines changed

server/server.go

Lines changed: 10 additions & 0 deletions
```diff
@@ -406,6 +406,16 @@ func (s *Server) attachDuckLake(db *sql.DB) error {
 	}
 
 	log.Printf("Attached DuckLake catalog successfully")
+
+	// Set DuckLake max retry count to handle concurrent connections
+	// DuckLake uses optimistic concurrency - when multiple connections commit
+	// simultaneously, they may conflict on snapshot IDs. Default of 10 is too low
+	// for tools like Fivetran that open many concurrent connections.
+	if _, err := db.Exec("SET ducklake_max_retry_count = 100"); err != nil {
+		log.Printf("Warning: failed to set ducklake_max_retry_count: %v", err)
+		// Don't fail - this is not critical, DuckLake will use its default
+	}
+
 	return nil
 }
```
