Changes to `docs/how-to/llms/use_llms.md` (4 additions, 1 deletion):
### LiteLLM Options
The `LiteLLMOptions` class provides options for remote LLMs, aligning with the LiteLLM API. These options allow you to control the behavior of models from various providers. Each option is described in the [LiteLLM documentation](https://docs.litellm.ai/docs/completion/input) and the [reasoning documentation](https://docs.litellm.ai/docs/reasoning_content).
Example usage:
```python
response = llm.generate("Write a short story about a robot learning to paint.")
print(response)
```
!!! warning
    If you provide `reasoning_effort` to an OpenAI model, [the reasoning content will not be returned](https://platform.openai.com/docs/guides/reasoning?api-mode=responses).
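A minimal sketch of passing `reasoning_effort` through `LiteLLMOptions`; the import path, `model_name`, and `default_options` parameter are assumptions based on the Ragbits docs, not a definitive API reference:

```python
from ragbits.core.llms.litellm import LiteLLM, LiteLLMOptions

# Assumed API: reasoning options set here apply as defaults to every call.
# Note that on OpenAI models the reasoning content itself is not returned,
# even though reasoning_effort still controls how much the model reasons.
options = LiteLLMOptions(reasoning_effort="low")
llm = LiteLLM(model_name="o3-mini", default_options=options)

response = llm.generate("Explain backpropagation in two sentences.")
print(response)
```

With providers that do expose reasoning content, the same options object applies unchanged; see the LiteLLM reasoning documentation linked above for provider-specific behavior.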
## Using Local LLMs
For guidance on setting up and using local models in Ragbits, refer to the [Local LLMs Guide](https://ragbits.deepsense.ai/how-to/llms/use_local_llms/).