
Conversation

@ParamThakkar123 (Contributor, Author) commented Nov 4, 2025

After this change, vLLM errors are reported on the frontend like this:

[screenshot: formatted vLLM error message shown in the frontend]

instead of the plain "Error with exit code 1" output.

Fixes transformerlab/transformerlab-app#841

Fixes transformerlab/transformerlab-app#855
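
A minimal sketch of the kind of change described above, assuming the vLLM plugin is launched as a subprocess: instead of reporting only the exit code, the last few lines of stderr are surfaced so the frontend can show the real reason. The function name `run_plugin` and the tail-line count are illustrative, not the actual code in transformerlab/shared/shared.py.

```python
import subprocess
import sys


def run_plugin(cmd: list[str], tail_lines: int = 20) -> str:
    """Run a plugin command and return its stdout, raising a readable error on failure."""
    proc = subprocess.run(cmd, capture_output=True, text=True)
    if proc.returncode != 0:
        # Keep only the last few stderr lines so the frontend sees the
        # actual traceback/reason, not just "Error with exit code N".
        tail = "\n".join(proc.stderr.strip().splitlines()[-tail_lines:])
        raise RuntimeError(f"Plugin exited with code {proc.returncode}:\n{tail}")
    return proc.stdout


if __name__ == "__main__":
    try:
        # Deliberately failing command to demonstrate the formatted error.
        run_plugin([sys.executable, "-c", "raise ValueError('max_model_len missing')"])
    except RuntimeError as err:
        print(err)
```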

@codecov (bot) commented Nov 4, 2025

Codecov Report

❌ Patch coverage is 14.28571% with 6 lines in your changes missing coverage. Please review.

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| transformerlab/shared/shared.py | 14.28% | 6 Missing ⚠️ |


@ParamThakkar123 (Contributor, Author)

The changes to test/api/test_dataset_service.py are unrelated to this PR; the file just got reformatted when I ran `ruff format .`.

@dadmobile (Member)

Also, you will need to increment the version number in the vllm_server info.json!
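
For illustration, the version bump would look roughly like the snippet below. The field names and version value are guesses, not the plugin's actual info.json schema:

```json
{
  "name": "vllm_server",
  "version": "0.1.13"
}
```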

@dadmobile (Member)

Fixes transformerlab/transformerlab-app#855

@deep1401 (Member) commented Nov 4, 2025

I think this isn't being flagged by CodeQL because of the way plugins work for us, but we shouldn't be showing bare errors on the frontend, right?

@ParamThakkar123 (Contributor, Author)

> Also, you will need to increment the version number in the vllm_server info.json!

Yes, I will increment the version number too, since I changed the vllm server plugin file 😅

@ParamThakkar123 (Contributor, Author) commented Nov 5, 2025

> I think this isn't being flagged by CodeQL because of the way plugins work for us, but we shouldn't be showing bare errors on the frontend, right?

Yeah, I think showing bare errors isn't a good idea; it looks messy. Should I modify it to display some kind of custom text?
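
One hedged sketch of what such custom text could look like: map known vLLM failure patterns to friendlier messages and fall back to the stderr tail otherwise. The patterns and wording below are assumptions for illustration, not the repository's actual mapping.

```python
# Illustrative only: translate known vLLM failure patterns into friendlier
# text before showing them on the frontend.
KNOWN_ERRORS = {
    "max_model_len": (
        "vLLM could not start: the max_model_len parameter was not read. "
        "Check the model's context-length setting."
    ),
    "CUDA out of memory": (
        "vLLM ran out of GPU memory. Try a smaller model or reduce max_model_len."
    ),
}


def humanize_error(raw_stderr: str, exit_code: int) -> str:
    """Return a user-facing message, falling back to the raw tail of stderr."""
    for pattern, message in KNOWN_ERRORS.items():
        if pattern in raw_stderr:
            return message
    tail = "\n".join(raw_stderr.strip().splitlines()[-10:])
    return f"vLLM exited with code {exit_code}:\n{tail}"


if __name__ == "__main__":
    print(humanize_error("ValueError: max_model_len is required", 1))
```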

@ParamThakkar123 (Contributor, Author)

Forgot to increment the plugin version number; it's incremented now.

@ParamThakkar123 (Contributor, Author)

@dadmobile Reverted all changes to vllm_server

@ParamThakkar123 merged commit 877abd7 into main on Nov 6, 2025 (7 of 8 checks passed).


Development

Successfully merging this pull request may close these issues:

- vLLM start up failure because max_model_len param not read
- vLLM Errors aren't reporting correctly

4 participants