
Conversation

@yossiovadia
Collaborator

The validation script was failing to detect successful model loading because it used --tail=200 when checking container logs. Model containers emit many health-check log lines after startup, so the "model loaded" messages had already scrolled past the 200-line window by the time the script looked for them.

Changes:

  • Remove --tail=200 from model-a validation (line 98)
  • Remove --tail=200 from model-b validation (line 110)
  • Now scans the entire log history for the "model loaded" message (see the sketch below)

Fixes #391
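
For reference, a minimal before/after sketch of the affected check in deploy/openshift/validate-deployment.sh. The `deployment/model-a` target and the exact grep pattern are assumptions for illustration; the only change this PR makes is dropping `--tail=200`.

```bash
# Before (hypothetical reconstruction): only the last 200 log lines are
# searched. Health-check noise emitted after startup pushes the
# "model loaded" line out of that window, so the check fails even though
# the model loaded successfully.
oc logs deployment/model-a --tail=200 | grep -q "model loaded"

# After: scan the entire log history, so the startup message is found
# no matter how much logging follows it.
oc logs deployment/model-a | grep -q "model loaded"
```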

@netlify

netlify bot commented Oct 10, 2025

Deploy Preview for vllm-semantic-router ready!

| Name | Link |
| --- | --- |
| 🔨 Latest commit | 4a29c4b |
| 🔍 Latest deploy log | https://app.netlify.com/projects/vllm-semantic-router/deploys/68e93f26292fdc0008caee53 |
| 😎 Deploy Preview | https://deploy-preview-392--vllm-semantic-router.netlify.app |

@github-actions

👥 vLLM Semantic Team Notification

The following members have been identified for the changed files in this PR and have been automatically assigned:

📁 deploy

Owners: @rootfs, @Xunzhuo
Files changed:

  • deploy/openshift/validate-deployment.sh


🎉 Thanks for your contributions!

This comment was automatically generated based on the OWNER files in the repository.

@rootfs merged commit 95c222f into vllm-project:main on Oct 11, 2025
9 checks passed
joyful-ii-V-I pushed a commit to joyful-ii-V-I/semantic-router that referenced this pull request on Oct 13, 2025:
…ection (vllm-project#392)


Development

Successfully merging this pull request may close these issues.

fix: validation script fails to detect model loading due to log tail limit
