@@ -75,7 +75,7 @@ pip install -e .
VueGen is also available on [Bioconda][vuegen-conda] and can be installed using conda:

```bash
- conda install bioconda::vuegen
+ conda install -c bioconda -c conda-forge vuegen
```

### Dependencies
@@ -332,15 +332,15 @@ available for other report types.

Two API modes are supported (see the sketch after this list):

- - **Ollama-style streaming chat completion**
+ - **Ollama-style streaming chat completion:**
  If a `model` parameter is specified in the config file, VueGen assumes the chatbot is using Ollama’s [/api/chat endpoint][ollama_chat].
  Messages are handled as chat history, and the assistant responses are streamed in real time for a smooth and responsive experience.
  This mode supports LLMs such as `llama3`, `deepseek`, or `mistral`.

> [!TIP]
> See [Ollama’s website][ollama] for more details.

- - **Standard prompt-response API**
+ - **Standard prompt-response API:**
  If no `model` is provided, VueGen uses a simpler prompt-response flow.
  A single prompt is sent to an endpoint, and a structured JSON object is expected in return.
  Currently, the response can include:
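To make the two modes above concrete, here is a minimal Python sketch of how a client might talk to each kind of endpoint. It is illustrative only, not code from the VueGen package: the helper names are hypothetical, the Ollama URL assumes a local server on Ollama’s default port, and the `{"prompt": ...}` payload for the second mode is an assumption, since the exact request and response schema depends on the chatbot service and the VueGen config.

```python
# Minimal sketch of the two chatbot API modes described above.
# Assumptions (not taken from the VueGen source): a local Ollama server on the
# default port for the streaming mode, and a placeholder URL plus a
# {"prompt": ...} payload for the prompt-response mode.
import json

import requests

OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"  # Ollama's /api/chat endpoint


def ollama_streaming_chat(model: str, messages: list) -> str:
    """Stream an assistant reply from Ollama's /api/chat endpoint.

    `messages` is the running chat history, e.g.
    [{"role": "user", "content": "Summarise this report section."}].
    """
    payload = {"model": model, "messages": messages, "stream": True}
    reply = ""
    with requests.post(OLLAMA_CHAT_URL, json=payload, stream=True) as resp:
        resp.raise_for_status()
        # Ollama streams one JSON object per line; each chunk carries a piece
        # of the assistant message until "done" is true.
        for line in resp.iter_lines():
            if not line:
                continue
            chunk = json.loads(line)
            reply += chunk.get("message", {}).get("content", "")
            if chunk.get("done"):
                break
    return reply


def prompt_response(endpoint_url: str, prompt: str) -> dict:
    """Send a single prompt and return the structured JSON object the
    endpoint replies with (the schema is defined by the chatbot service)."""
    resp = requests.post(endpoint_url, json={"prompt": prompt})
    resp.raise_for_status()
    return resp.json()
```

Streaming the Ollama chunks as they arrive is what lets the assistant reply render progressively in the report chatbot, rather than appearing only after the full response is complete.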