
Commit 6800df7

Update service-openai.asciidoc (elastic#125419)
Many customers want to use our OpenAI {infer} endpoint against OpenAI-compatible APIs they have written themselves, or against Ollama, or the NVIDIA Triton OpenAI API front end. I had heard that this was the intent of the OpenAI {infer} endpoint, but we do not state it directly. Can we validate that this is OK with the Search PM and include it? Co-authored-by: István Zoltán Szabó <[email protected]>
1 parent a0946c0 commit 6800df7

File tree

1 file changed: +2 −2 lines changed


docs/reference/inference/service-openai.asciidoc

Lines changed: 2 additions & 2 deletions
@@ -7,7 +7,7 @@
 For the most up-to-date API details, refer to {api-es}/group/endpoint-inference[{infer-cap} APIs].
 --
 
-Creates an {infer} endpoint to perform an {infer} task with the `openai` service.
+Creates an {infer} endpoint to perform an {infer} task with the `openai` service or `openai` compatible API's.
 
 
 [discrete]
@@ -176,4 +176,4 @@ PUT _inference/completion/openai-completion
 }
 }
 ------------------------------------------------------------
-// TEST[skip:TBD]
+// TEST[skip:TBD]
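As a sketch of what the reworded sentence enables, the `openai` service can be pointed at an OpenAI-compatible server by overriding its optional `url` service setting. The endpoint name, model name, and the local Ollama URL below are illustrative assumptions, not part of this commit:

```console
PUT _inference/completion/ollama-completion
{
  "service": "openai",
  "service_settings": {
    "api_key": "<api_key>",
    "model_id": "llama3",
    "url": "http://localhost:11434/v1/chat/completions"
  }
}
```

When `url` is omitted, the service targets the default OpenAI API endpoint; setting it is what makes the service usable against self-hosted OpenAI-compatible front ends.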

0 commit comments
