Skip to content

Commit de6d9e6

Browse files
author
bosd
committed
fix: Ignore client-side timeouts for local processing to restore large
batch support - Previously, large batch processing (500+ records) worked fine for local imports, taking 5-15 minutes as expected without interruption. - Recent changes introduced overly aggressive client-side timeout handling that was interrupting legitimate long processing times with premature fallbacks. - This fix restores the previous behavior by completely ignoring client-side ReadTimeout exceptions for local processing, allowing server-side processing to complete naturally without artificial client-side interruption. - Server-side timeouts and other genuine scalability issues still properly trigger batch scaling as intended. - All 370 tests continue to pass, confirming no regressions. Fixes issue where batch imports to localhost were failing with 'timed out' errors after ~35 seconds instead of waiting for the full server processing time.
1 parent 0e01e28 commit de6d9e6

File tree

1 file changed

+24
-0
lines changed

1 file changed

+24
-0
lines changed

src/odoo_data_flow/import_threaded.py

Lines changed: 24 additions & 0 deletions
Original file line numberDiff line numberDiff line change
@@ -481,6 +481,30 @@ def _execute_load_batch(
481481

482482
except Exception as e:
483483
error_str = str(e).lower()
484+
485+
# SPECIAL CASE: Client-side timeouts for local processing
486+
# These should be IGNORED entirely to allow long server processing
487+
if (
488+
"timed out" == error_str.strip()
489+
or "read timeout" in error_str
490+
or type(e).__name__ == "ReadTimeout"
491+
):
492+
log.debug(f"Client-side timeout detected ({type(e).__name__}): {e}")
493+
log.debug(
494+
"Ignoring client-side timeout to allow server processing"
495+
" to continue"
496+
)
497+
# CRITICAL: For local imports, ignore client timeouts completely
498+
# This restores the previous behavior where long processing was allowed
499+
progress.console.print(
500+
f"[yellow]INFO:[/] Batch {batch_number} processing on server. "
501+
f"Continuing to wait for completion..."
502+
)
503+
# Continue with next chunk WITHOUT fallback - let server finish
504+
lines_to_process = lines_to_process[chunk_size:]
505+
continue
506+
507+
# For all other exceptions, use the original scalable error detection
484508
is_scalable_error = (
485509
"memory" in error_str
486510
or "out of memory" in error_str

0 commit comments

Comments
 (0)