
Commit 00c0564

Authored by Yukun He
[None][chore] Remove unnecessary warning log for tuning. (#10077)
Signed-off-by: Yukun He <[email protected]>
1 parent 18b335d commit 00c0564

File tree

1 file changed (+0 −3 lines)


tensorrt_llm/_torch/autotuner.py

Lines changed: 0 additions & 3 deletions
@@ -1444,9 +1444,6 @@ def _maybe_sync_cache_data(self, strategy: DistributedTuningStrategy,
                                custom_op: str):
         """Synchronize cache data across all ranks."""
         if not self._is_distributed():
-            logger.warning(
-                f"[AutoTuner] Not in distributed environment, skipping synchronization"
-            )
             return
 
         if strategy == DistributedTuningStrategy.BROADCAST:
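
For context, here is a minimal sketch of the method after this change, reconstructed from the hunk above. The class scaffolding, the `_is_distributed` body, the enum values, and the `BROADCAST` branch body are assumptions for illustration, not code taken from the repository:

```python
from enum import Enum


class DistributedTuningStrategy(Enum):
    # Only BROADCAST appears in the hunk; any other members are assumed.
    BROADCAST = "broadcast"


class AutoTuner:
    def _is_distributed(self) -> bool:
        # Assumed helper: reports whether tuning runs across multiple ranks.
        raise NotImplementedError

    def _maybe_sync_cache_data(self, strategy: DistributedTuningStrategy,
                               custom_op: str):
        """Synchronize cache data across all ranks."""
        if not self._is_distributed():
            # After this commit the single-process case returns silently
            # instead of logging an [AutoTuner] warning on every call.
            return

        if strategy == DistributedTuningStrategy.BROADCAST:
            ...  # broadcast path, not shown in the hunk
```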
