> [!NOTE]
> The EMP case study is available online as [HTML][emp-html-demo] and [Streamlit][emp-st-demo] reports.

**3. ChatBot Component**

This case study highlights VueGen’s capability to embed a chatbot component into a report subsection,
enabling interactive conversations inside the report.

Two API modes are supported:

- **Ollama-style streaming chat completion**
  If a `model` parameter is specified in the config file, VueGen assumes the chatbot is using Ollama’s [/api/chat endpoint][ollama_chat].
  Messages are handled as chat history, and the assistant’s responses are streamed in real time for a smooth and responsive experience.
  This mode supports LLMs such as `llama3`, `deepseek`, or `mistral`.

  > [!TIP]
  > See [Ollama’s website][ollama] for more details.

- **Standard prompt-response API**
  If no `model` is provided, VueGen uses a simpler prompt-response flow.
  A single prompt is sent to an endpoint, and a structured JSON object is expected in return.
  Currently, the response can include:
  - `text`: the main textual reply
  - `links`: a list of source URLs (optional)
  - `HTML content`: an HTML snippet with a Pyvis network visualization (optional)

This response structure is currently customized for an internal knowledge graph assistant, but VueGen is being actively developed
to support more flexible and general-purpose response formats in future releases.

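As a rough illustration of the prompt-response mode, the sketch below unpacks a structured reply carrying the fields listed above. The `unpack_chatbot_response` helper and the sample payload are assumptions for illustration only, not VueGen’s actual implementation.

```python
import json

def unpack_chatbot_response(raw):
    """Split a structured chatbot reply into its parts.

    Per the fields above: 'text' is the main reply, while
    'links' and 'HTML content' are optional and may be absent.
    """
    data = json.loads(raw)
    return {
        "text": data["text"],
        "links": data.get("links", []),            # source URLs, if any
        "html": data.get("HTML content"),          # e.g. a Pyvis network snippet
    }

# Hypothetical response from the knowledge graph assistant.
raw = json.dumps({
    "text": "Gene X interacts with protein Y.",
    "links": ["https://example.org/source"],
})
reply = unpack_chatbot_response(raw)
print(reply["text"])  # → Gene X interacts with protein Y.
```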
> [!NOTE]
> You can see a [configuration file example][config-chatbot] for the chatbot component in the `docs/example_config_files` folder.

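The streaming mode described above can be sketched as follows. Ollama’s `/api/chat` endpoint streams newline-delimited JSON objects whose partial text lives in `message.content`, with a final `done: true` object closing the stream (per Ollama’s API docs); the `collect_streamed_reply` helper and the simulated stream here are illustrative assumptions, not VueGen code.

```python
import json

def collect_streamed_reply(ndjson_lines):
    """Accumulate assistant text from an Ollama-style /api/chat stream.

    Each streamed line is a JSON object; partial reply text is in
    message.content, and an object with done=True ends the stream.
    """
    parts = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        parts.append(chunk.get("message", {}).get("content", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Simulated stream, shaped like Ollama's chat-completion chunks.
stream = [
    '{"message": {"role": "assistant", "content": "Hel"}, "done": false}',
    '{"message": {"role": "assistant", "content": "lo!"}, "done": false}',
    '{"message": {"role": "assistant", "content": ""}, "done": true}',
]
print(collect_streamed_reply(stream))  # → Hello!
```

In a live setting the lines would come from the HTTP response body rather than a list, which is what makes the token-by-token display in the report feel responsive.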
## Web application deployment

Once a Streamlit report is generated, it can be deployed as a web application to make it accessible online. There are multiple ways to achieve this:
[ci-docs]: https://github.com/Multiomics-Analytics-Group/vuegen/actions/workflows/docs.yml
[emp-html-demo]: https://multiomics-analytics-group.github.io/vuegen/
[emp-st-demo]: https://earth-microbiome-vuegen-demo.streamlit.app/
[ollama_chat]: https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-chat-completion
[ollama]: https://ollama.com/
[config-chatbot]: https://github.com/Multiomics-Analytics-Group/vuegen/blob/main/docs/example_config_files/Chatbot_example_config.yaml
[issues]: https://github.com/Multiomics-Analytics-Group/vuegen/issues
[pulls]: https://github.com/Multiomics-Analytics-Group/vuegen/pulls
[vuegen-preprint]: https://doi.org/10.1101/2025.03.05.641152