`solutions/search/agent-builder/models.md` (1 addition, 1 deletion)
You can connect a locally hosted LLM to Elastic using the OpenAI connector. This …
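Before wiring up the connector, it can help to confirm that your local server exposes the OpenAI-style chat-completions format the connector expects. The sketch below only builds a request body; the base URL and model name are illustrative assumptions, not values from this guide.

```python
import json

# Hypothetical local endpoint and model name -- adjust for your own setup
# (e.g., a local server exposing an OpenAI-compatible API).
BASE_URL = "http://localhost:11434/v1"
MODEL = "mistral:7b-instruct"

def chat_completion_request(prompt: str) -> dict:
    """Build an OpenAI-style /chat/completions request body."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }

# Serialize the body you would POST to f"{BASE_URL}/chat/completions".
body = chat_completion_request("Say hello")
print(json.dumps(body))
```

If a manual request with this body returns a normal chat completion, the endpoint should be usable behind the OpenAI connector.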
### Requirements

**Model selection:**

- Download from trusted sources only
- Consider parameter size, context window, and quantization format for your needs
- Prefer "instruct" variants over "base" or "chat" versions when multiple variants are available, as instruct models are typically better tuned for following instructions

**Integration setup:**

- For Elastic Cloud: Requires a reverse proxy (such as Nginx) to authenticate requests using a bearer token and forward them to your local LLM endpoint
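The reverse-proxy requirement above can be sketched as a minimal Nginx server block. This is an illustrative assumption only: the listen port, certificate paths, upstream port, and token are placeholders you must replace for your deployment.

```nginx
# Minimal sketch: terminate TLS, check a bearer token, and forward
# requests to a local OpenAI-compatible server (upstream port is a
# placeholder assumption).
server {
    listen 8443 ssl;
    ssl_certificate     /etc/nginx/certs/proxy.crt;
    ssl_certificate_key /etc/nginx/certs/proxy.key;

    location /v1/ {
        # Reject requests that do not carry the expected bearer token.
        if ($http_authorization != "Bearer CHANGE_ME_TOKEN") {
            return 401;
        }
        proxy_pass http://127.0.0.1:11434;
        proxy_set_header Host $host;
    }
}
```

With a setup along these lines, the OpenAI connector would point at the proxy's public URL and send the same bearer token in its Authorization header.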