Commit 4093142

exporter: Add retry logging to indexer connection (#4846)
## Motivation

When block exporters fail to connect to the indexer, they retry with exponential backoff but provide no visibility into the retry process. Operators cannot determine whether connection issues are transient (retrying successfully) or persistent (about to fail). This lack of observability makes it difficult to diagnose network connectivity issues and indexer availability problems during deployments and operations.

## Proposal

Add comprehensive logging to the indexer connection retry logic:

- **WARNING level** on each retry attempt, including:
  - the current retry count and the configured maximum number of retries,
  - the calculated backoff delay in milliseconds,
  - the error that triggered the retry.
- **ERROR level** when all retries are exhausted, clearly indicating:
  - the final retry count,
  - the configured maximum number of retries,
  - the error that caused the final failure,
  - an explicit statement that the exporter task will exit.

This gives operators actionable telemetry to distinguish transient network issues from persistent failures.

## Test Plan

1. Deploy a network with an intentionally unreachable indexer endpoint.
2. Observe that the exporter logs show WARNING messages with increasing retry counts and backoff delays.
3. Wait for the maximum number of retries to be exhausted.
4. Verify that an ERROR message appears with a clear failure indication.
5. Confirm that the logs provide enough information to diagnose the connection issue.

## Release Plan

- These changes should be backported to the latest `testnet` branch, then
- released in a validator hotfix.
1 parent ad1f276 · commit 4093142

File tree (1 file changed, +13 −0 lines):

  • linera-service/src/exporter/runloops/indexer/client.rs

linera-service/src/exporter/runloops/indexer/client.rs

Lines changed: 13 additions & 0 deletions
```diff
@@ -98,10 +98,23 @@ impl IndexerClient {
             }
             Err(e) => {
                 if retry_count > self.max_retries {
+                    tracing::error!(
+                        retry_count,
+                        max_retries = self.max_retries,
+                        error = %e,
+                        "Failed to connect to indexer after exhausting all retries, exporter task will exit"
+                    );
                     return Err(ExporterError::SynchronizationFailed(e.into()));
                 }

                 let delay = self.retry_delay.saturating_mul(retry_count);
+                tracing::warn!(
+                    retry_count,
+                    max_retries = self.max_retries,
+                    retry_delay_ms = delay.as_millis(),
+                    error = %e,
+                    "Failed to connect to indexer, retrying with exponential backoff"
+                );
                 sleep(delay).await;
                 retry_count += 1;
             }
```
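For context, here is a minimal, self-contained sketch of what the connection loop around this hunk might look like. The `loop`/`match` shape, the `connect_with_retries` and `try_connect` names, and the simplified `IndexerClient`/`ExporterError` definitions are assumptions made for illustration only; just the two `tracing` calls and the delay handling are taken from the diff above.

```rust
use std::time::Duration;

use tokio::time::sleep;

/// Hypothetical stand-in for the real exporter error type, used only to make
/// this sketch self-contained; the actual definition lives in linera-service.
#[derive(Debug)]
pub enum ExporterError {
    SynchronizationFailed(Box<dyn std::error::Error + Send + Sync>),
}

/// Hypothetical, simplified view of the client; the real `IndexerClient`
/// carries more state (endpoint, transport channel, etc.).
pub struct IndexerClient {
    max_retries: u32,
    retry_delay: Duration,
}

impl IndexerClient {
    /// Assumed connection loop; the logging mirrors the commit above.
    async fn connect_with_retries(&self) -> Result<(), ExporterError> {
        let mut retry_count: u32 = 0;
        loop {
            match self.try_connect().await {
                Ok(()) => return Ok(()),
                Err(e) => {
                    if retry_count > self.max_retries {
                        // All retries exhausted: log at ERROR and bail out.
                        tracing::error!(
                            retry_count,
                            max_retries = self.max_retries,
                            error = %e,
                            "Failed to connect to indexer after exhausting all retries, exporter task will exit"
                        );
                        return Err(ExporterError::SynchronizationFailed(e.into()));
                    }

                    // The delay grows with the retry count, saturating on overflow.
                    let delay = self.retry_delay.saturating_mul(retry_count);
                    tracing::warn!(
                        retry_count,
                        max_retries = self.max_retries,
                        // Cast for the sketch; the commit logs `as_millis()` directly.
                        retry_delay_ms = delay.as_millis() as u64,
                        error = %e,
                        "Failed to connect to indexer, retrying with exponential backoff"
                    );
                    sleep(delay).await;
                    retry_count += 1;
                }
            }
        }
    }

    /// Placeholder for the actual transport-level connection attempt.
    async fn try_connect(&self) -> Result<(), std::io::Error> {
        Ok(())
    }
}
```

Because `retry_count`, `max_retries`, `retry_delay_ms`, and `error` are structured `tracing` fields rather than text interpolated into the message, they remain filterable and parseable regardless of the subscriber's output format.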
