vLLM error reports on the frontend #663
The reformatting of test/api/test_dataset_service.py is unrelated to this PR; it got picked up when I ran `ruff format .`.
Also, you will need to increment the version number in the vllm_server plugin's info.json!
I think this is not being flagged by CodeQL because of the way plugins work for us, but we shouldn't be showing bare errors on the frontend, right?
Yes. I will increment the version number too, because I made the change to the vllm_server plugin file 😅
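For context, a plugin version bump like this is just an edit to the plugin's info.json. A minimal sketch, assuming the field names below; the actual file layout and version string in the repo may differ:

```json
{
  "uniqueId": "vllm_server",
  "version": "0.1.1"
}
```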
Yeah, I think showing bare errors won't be a good idea, because it's a bit messy. Should I modify it to display some kind of custom text?
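One way to do the custom text: wrap the raw exit code and the last line of the server log into a readable message before it reaches the frontend. A minimal sketch under assumed names (`format_vllm_error` and its parameters are illustrative, not the actual transformerlab-api code):

```python
def format_vllm_error(exit_code: int, stderr_tail: str = "") -> str:
    """Return a frontend-friendly message for a vLLM server failure.

    Hypothetical helper: instead of surfacing the bare error, attach the
    last log line when one is available, or point at the logs otherwise.
    """
    if exit_code == 0:
        return "vLLM server exited normally."
    message = f"vLLM server failed (exit code {exit_code})."
    lines = stderr_tail.strip().splitlines()
    if lines:
        # Show only the final log line; full tracebacks stay in the logs.
        message += f" Last log line: {lines[-1]}"
    else:
        message += " Check the server logs for details."
    return message
```

The design choice here is to keep the full traceback in the logs and show the user only the exit code plus one line of detail, which avoids the "messy" bare-error dump mentioned above.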
…ab-api into fix/vllm_error_logging
Forgot to increment the plugin version number; incremented it now.
@dadmobile Reverted all changes to vllm_server |
…ab-api into fix/vllm_error_logging
After this change, errors from vLLM are reported on the frontend in this way, instead of the plain "Error with exit code 1" output:
Fixes transformerlab/transformerlab-app#841
Fixes transformerlab/transformerlab-app#855