
Commit 832cd31

Docs exposeConfig for LLM connectors (#2437)
Fixes #1003 by documenting how to use the `exposeConfig` parameter for preconfigured connectors to improve debugging for LLM connectors.

Co-authored-by: florent-leborgne <[email protected]>
1 parent 19fac0a commit 832cd31

File tree: 1 file changed (+9 −0 lines)


solutions/security/ai/set-up-connectors-for-large-language-models-llm.md

Lines changed: 9 additions & 0 deletions
@@ -32,7 +32,16 @@ Follow these guides to connect to one or more third-party LLM providers:
 
 You can [connect to LM Studio](/solutions/security/ai/connect-to-own-local-llm.md) to use a custom LLM deployed and managed by you.
 
+## Preconfigured connectors
 
+```{applies_to}
+stack: ga 9.0
+serverless: unavailable
+```
+
+You can use [preconfigured connectors](kibana://reference/connectors-kibana/pre-configured-connectors.md) to set up a third-party LLM connector.
+
+If you use a preconfigured connector for your LLM connector, we recommend that you add the `exposeConfig: true` parameter within the `xpack.actions.preconfigured` section of the `kibana.yml` config file. This parameter makes debugging easier by adding configuration information to the debug logs, including which large language model the connector uses.
 
 
 