Support bypass tools option on lightspeed-stack /streaming_query API #1737
Conversation
force-pushed from b93fe70 to dae0254
force-pushed from 7b6d07b to ae2dd38
force-pushed from ae2dd38 to 3766792
LGTM 👍
@@ -33,7 +33,9 @@ const botName =
   ANSIBLE_LIGHTSPEED_PRODUCT_NAME;

 export const modelsSupported: LLMModel[] = [
-  { model: "granite-3.3-8b-instruct", provider: "rhoai" },
+  { model: "granite-3.3-8b-instruct", provider: "my_rhoai_dev" },
+  { model: "gemini/gemini-2.5-flash", provider: "gemini" },
Whilst these other models will presumably work with your AI Installer backend, they will break LSIA. Are you planning on updating the SaaS backend to support these too?
Ok. I will revert line 36. I thought it needed to be consistent with our (road-core based) chatbot configuration (this line), which is why I changed it to my_rhoai_dev, but the operator (this line) assumes rhoai as the provider name for granite-3.3-8b-instruct. I think I need to modify https://github.com/ansible/ansible-wisdom-ops/pull/1483 to change the provider name.
(additional update) ... Though it is possible to use different provider names in /aap_chatbot and /ansible_ai_connect_chatbot, I want to keep this code as common as possible...
@TamiTakamiya Operator deployments are different to what we have in ansible-wisdom-ops. It was the Gemini models that'd cause problems, as the SaaS backend only supports Granite in llama-stack's run configuration.
Ok. Then I will revert lines 36 & 37 of useChatbot.ts under /aap_chatbot, but not the one under /ansible_ai_connect_chatbot. Though they are not displayed in non-debug mode, the on-prem chatbot does not support Gemini, so it does not require the change, and the provider for Granite needs to remain rhoai for the on-prem version.
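For reference, a minimal sketch of the end state described in this thread (the LLMModel type, constant names, and exact entries are assumptions for illustration; the actual useChatbot.ts files may differ):

```typescript
// Illustrative sketch only, inferred from the discussion above; not the actual file contents.
type LLMModel = { model: string; provider: string }; // assumed shape, matching the diff entries

// aap_chatbot (on-prem): keep the provider name the operator expects.
const aapModelsSupported: LLMModel[] = [
  { model: "granite-3.3-8b-instruct", provider: "rhoai" },
];

// ansible_ai_connect_chatbot: keep the dev/Gemini entries, which are only shown in debug mode.
const connectModelsSupported: LLMModel[] = [
  { model: "granite-3.3-8b-instruct", provider: "my_rhoai_dev" },
  { model: "gemini/gemini-2.5-flash", provider: "gemini" },
];
```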
force-pushed from bb98a1e to 8516186
LGTM. Thanks @TamiTakamiya
Jira Issue: https://issues.redhat.com/browse/AAP-51359
Assisted-by: n/a
Generated by: n/a
Description
Support the "no_tools" option on lightspeed-stack's /streaming_query and /query APIs. This option is available when the chatbot UI is running in DEBUG mode: a new "Bypass tools" checkbox is added to the UI used for setting the System Prompt Override.
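A minimal sketch of how the checkbox can map to the API option is shown below. Only the no_tools field name comes from this PR; the surrounding request shape, endpoint path, and helper name are illustrative assumptions, not the actual lightspeed-stack contract.

```typescript
// Sketch: send "no_tools" to lightspeed-stack's /streaming_query when the
// DEBUG-only "Bypass tools" checkbox is selected. Fields other than "no_tools"
// and the endpoint path are assumptions for illustration.
interface StreamingQueryRequest {
  query: string;
  system_prompt?: string; // System Prompt Override, when provided
  no_tools?: boolean;     // when true, the server bypasses tool calls
}

export async function sendStreamingQuery(
  query: string,
  options: { systemPrompt?: string; bypassTools?: boolean } = {},
): Promise<Response> {
  const body: StreamingQueryRequest = { query };
  if (options.systemPrompt) {
    body.system_prompt = options.systemPrompt;
  }
  if (options.bypassTools) {
    body.no_tools = true;
  }
  return fetch("/streaming_query", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
}
```

The same flag would apply to the non-streaming /query request body as well, since the PR covers both endpoints.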
Testing
Steps to test
Scenarios tested
Unit tests + manual tests using a local server
Production deployment