ENG-24977 initial draft on how to enable and access the chatbot inter… #808
base: main
Conversation
LGTM :)
```asciidoc
:_module-type: PROCEDURE

[id="accessing-the-chat-bot-interface_{context}"]
= Accessing the chat bot interface
```
@alexcreasy, in RHOAI 2.21 RC1, when following these instructions I see the Chatbot tab appearing in Dashboard but, when writing in the chat, nothing happens. Looking at the Network tab in the Developer Tools there are no network calls.
Is it possible that we need ODH master to run the chatbot?
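One quick check while debugging this is to confirm what the feature flag is actually set to. This is only a sketch: the field path `spec.dashboardConfig.disableLlamaStackClient` is an assumption based on where other dashboard feature flags live, and the sample CR below is hypothetical.

```shell
# Hypothetical excerpt of an OdhDashboardConfig CR; the field path
# spec.dashboardConfig.disableLlamaStackClient is an assumption, not
# verified against the real CRD.
cat > /tmp/odh-dashboard-config-sample.yaml <<'EOF'
apiVersion: opendatahub.io/v1alpha
kind: OdhDashboardConfig
spec:
  dashboardConfig:
    disableLlamaStackClient: false
EOF

# Extract the flag; "false" means the chat bot tab should be enabled.
grep -o 'disableLlamaStackClient: .*' /tmp/odh-dashboard-config-sample.yaml
```

On a live cluster, the equivalent check would be something along the lines of `oc get odhdashboardconfig -A -o yaml | grep disableLlamaStackClient`.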
Force-pushed from bf5d244 to 15acbc3
Walkthrough

A new procedural documentation module has been introduced, detailing the steps required to access and enable a Llama stack LLM chat bot interface within a data science project on the OpenShift platform. The guide outlines prerequisites, configuration steps, navigation instructions, and verification procedures for successful chat bot usage.
Actionable comments posted: 1
🧹 Nitpick comments (4)
modules/accessing-the-chat-bot-interface.adoc (4)
6-8: Use the standard AsciiDoc abstract block

Instead of `[role='_abstract']`, consider using the built-in `[abstract]` block for clarity and consistency. For example:

```diff
-[role='_abstract']
+[abstract]
 Embedding a Llama stack large language model (LLM) chat bot interface…
```
20-23: Rephrase prerequisites as actions

To improve consistency with other procedures, reword bullets to start with an imperative verb. For example:

```diff
-* You have logged in to {productname-long}.
+* Log in to {productname-long}.
-* The `disableLlamaStackClient` feature flag value is set to `false` in the `OdhDashboardConfig` custom resource (CR) in {openshift-platform}.
+* Set the `disableLlamaStackClient` feature flag to `false` in the `OdhDashboardConfig` CR in {openshift-platform}.
```
34-42: Standardize UI element styling

Italicizing tab names (`*Chat bot*`) can clash with list markers. It's clearer to use monospace or quotes for UI elements. For example:

```diff
-. Click the *Chat bot* tab.
+. Click the `Chat bot` tab.
-The LlamaStack chat bot interface displays.
+The LlamaStack chat bot interface displays.
```
45-46: Avoid nested asterisks in bullet lists

Using `*Chat bot*` inside a `*`-bulleted list can break markup. Use underscores for emphasis or backticks:

```diff
-* You can access the *Chat bot* tab from within a data science project.
+* You can access the _Chat bot_ tab from within a data science project.
-* You can successfully send queries to the chat bot and then receive relevant answers.
+* You can successfully send queries to the chat bot and then receive relevant answers.
```
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📒 Files selected for processing (1)

- modules/accessing-the-chat-bot-interface.adoc (1 hunks)
🔇 Additional comments (2)

modules/accessing-the-chat-bot-interface.adoc (2)

1-1: Verify the module attribute name

The attribute `:_module-type:` with a leading underscore may not be recognized by the documentation tool. Confirm whether this should be `:module-type: PROCEDURE` instead of `:_module-type:` to align with the rest of the docs.
9-11: ❓ Verification inconclusive

Confirm the feature-flag field name

The guide refers to the `disableLlamaStackClient` flag in `OdhDashboardConfig`. Please verify that this exact field exists in the CRD and that its default value is `true`.
🏁 Script executed:

```shell
#!/bin/bash
# Verify the CRD schema for disableLlamaStackClient
ast-grep --pattern $'custom-resource-definition {\n  $$$\n  spec {\n    $$$\n    disableLlamaStackClient: $_\n    $$$\n  }\n  $$$\n}'
```

Length of output: 126
🏁 Script executed:

```shell
#!/bin/bash
# Search for any references to disableLlamaStackClient in the repository
rg --color=never -n 'disableLlamaStackClient' .
```

Length of output: 591
🏁 Script executed:

```shell
#!/bin/bash
# Locate CRD definitions for OdhDashboardConfig
echo "Searching for OdhDashboardConfig CRD files..."
crd_files=$(rg --color=never -l 'kind: *OdhDashboardConfig' .)
if [[ -z "$crd_files" ]]; then
  echo "No CRD files found containing 'kind: OdhDashboardConfig'."
  exit 0
fi
echo "Found CRD files:"
echo "$crd_files"
echo

# Search each CRD file for the disableLlamaStackClient property and its default
for file in $crd_files; do
  echo "=== $file ==="
  rg --color=never -C3 'disableLlamaStackClient' "$file" || echo "  disableLlamaStackClient not found in this file."
  echo
done
```

Length of output: 320
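For reference, the pattern these scripts are hunting for would look roughly like the fragment below in a CRD schema. The fragment is hypothetical: the property name and default value shown here are exactly what the review is asking to confirm, not verified facts.

```shell
# Hypothetical CRD schema fragment for OdhDashboardConfig; the property
# name and the "default: true" value are assumptions to be confirmed
# against the upstream CRD.
cat > /tmp/odh-crd-fragment.yaml <<'EOF'
properties:
  disableLlamaStackClient:
    type: boolean
    default: true
EOF

# Pull out the property and its declared default.
grep -A2 'disableLlamaStackClient:' /tmp/odh-crd-fragment.yaml
```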
Confirm OdhDashboardConfig feature-flag existence and default

I was unable to locate the `OdhDashboardConfig` CRD schema in this repo. Please manually verify that the upstream CustomResourceDefinition for `OdhDashboardConfig`:

- Defines the `disableLlamaStackClient` field
- Sets its default value to `true`
This is currently blocked while we clarify whether anything else is involved in the enablement process. When QE tested this, they did not receive a response from the chat bot, so we must verify whether further backend configuration and enablement is required. In addition, the chat bot user interface integration has been delayed until 2.23.
ENG-24977 initial draft on how to enable and access the chatbot interface within a dsp
Description
By using LlamaStack, you can now integrate a Llama 3.2 LLM model into a data science project. This takes the form of a chat bot interface, which can be accessed from the dashboard within a data science project's details page. By default, this feature is disabled and hidden behind a feature flag.
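Since the feature is disabled by default, enabling it means flipping the feature flag in the dashboard config. The sketch below builds the JSON merge patch involved; the resource name, namespace, and field path in the commented `oc` command are assumptions to check against your cluster, not verified values.

```shell
# Build the JSON merge patch that would flip the feature flag off
# (i.e. enable the chat bot UI).
PATCH='{"spec":{"dashboardConfig":{"disableLlamaStackClient":false}}}'
echo "$PATCH"

# Applying it requires cluster access; the resource name and namespace
# below are assumptions:
#   oc patch odhdashboardconfig odh-dashboard-config \
#     -n redhat-ods-applications --type=merge -p "$PATCH"
```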
How Has This Been Tested?
Merge criteria: