
Commit dfa3de1

szabosteve and davidkyle committed
[8.16] [DOCS] Rename inference services to inference integrations in docs (elastic#120212)
Co-authored-by: David Kyle <[email protected]>
1 parent ee18ffe commit dfa3de1

16 files changed (+29, -29 lines changed)

docs/reference/inference/inference-apis.asciidoc

Lines changed: 4 additions & 8 deletions
@@ -10,10 +10,8 @@ trained models. However, if you do not plan to use the {infer} APIs to use these
 models or if you want to use non-NLP models, use the
 <<ml-df-trained-models-apis>>.
 
-The {infer} APIs enable you to create {infer} endpoints and use {ml} models of
-different providers - such as Amazon Bedrock, Anthropic, Azure AI Studio,
-Cohere, Google AI, Mistral, OpenAI, or HuggingFace - as a service. Use
-the following APIs to manage {infer} models and perform {infer}:
+The {infer} APIs enable you to create {infer} endpoints and integrate with {ml} models of different services - such as Amazon Bedrock, Anthropic, Azure AI Studio, Cohere, Google AI, Mistral, OpenAI, or HuggingFace.
+Use the following APIs to manage {infer} models and perform {infer}:
 
 * <<delete-inference-api>>
 * <<get-inference-api>>
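In practice, each of these APIs is a single REST call against the `_inference` path. A minimal sketch, using a hypothetical endpoint ID (`my-endpoint`) and illustrative input:

GET _inference/my-endpoint

POST _inference/my-endpoint
{
  "input": "Some text to run through the endpoint"
}

DELETE _inference/my-endpoint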
@@ -30,10 +28,8 @@ An {infer} endpoint enables you to use the corresponding {ml} model without
 manual deployment and apply it to your data at ingestion time through
 <<semantic-search-semantic-text, semantic text>>.
 
-Choose a model from your provider or use ELSER – a retrieval model trained by
-Elastic –, then create an {infer} endpoint by the <<put-inference-api>>.
-Now use <<semantic-search-semantic-text, semantic text>> to perform
-<<semantic-search, semantic search>> on your data.
+Choose a model from your service or use ELSER – a retrieval model trained by Elastic –, then create an {infer} endpoint by the <<put-inference-api>>.
+Now use <<semantic-search-semantic-text, semantic text>> to perform <<semantic-search, semantic search>> on your data.
 
 [discrete]
 [[adaptive-allocations]]
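The flow described here - create an endpoint with the PUT {infer} API, then point a `semantic_text` field at it - looks roughly like this. A minimal sketch with hypothetical names (`my-elser-endpoint`, `my-index`); the ELSER service settings shown are illustrative:

PUT _inference/sparse_embedding/my-elser-endpoint
{
  "service": "elser",
  "service_settings": {
    "num_allocations": 1,
    "num_threads": 1
  }
}

PUT my-index
{
  "mappings": {
    "properties": {
      "content": {
        "type": "semantic_text",
        "inference_id": "my-elser-endpoint"
      }
    }
  }
}

Documents indexed into the `content` field can then be queried with semantic search on that field.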

docs/reference/inference/put-inference.asciidoc

Lines changed: 11 additions & 7 deletions
@@ -36,7 +36,11 @@ include::inference-shared.asciidoc[tag=inference-id]
 include::inference-shared.asciidoc[tag=task-type]
 +
 --
+<<<<<<< HEAD
 Refer to the service list in the <<put-inference-api-desc,API description section>> for the available task types.
+=======
+Refer to the integration list in the <<put-inference-api-desc,API description section>> for the available task types.
+>>>>>>> c60b3be6c72 ([DOCS] Rename inference services to inference integrations in docs (#120212))
 --
 
 
@@ -48,15 +52,15 @@ The create {infer} API enables you to create an {infer} endpoint and configure a
 
 [IMPORTANT]
 ====
-* When creating an inference endpoint, the associated machine learning model is automatically deployed if it is not already running.
+* When creating an {infer} endpoint, the associated {ml} model is automatically deployed if it is not already running.
 * After creating the endpoint, wait for the model deployment to complete before using it. You can verify the deployment status by using the <<get-trained-models-stats, Get trained model statistics>> API. In the response, look for `"state": "fully_allocated"` and ensure the `"allocation_count"` matches the `"target_allocation_count"`.
 * Avoid creating multiple endpoints for the same model unless required, as each endpoint consumes significant resources.
 ====
 
 
-The following services are available through the {infer} API.
-You can find the available task types next to the service name.
-Click the links to review the configuration details of the services:
+The following integrations are available through the {infer} API.
+You can find the available task types next to the integration name.
+Click the links to review the configuration details of the integrations:
 
 * <<infer-service-alibabacloud-ai-search,AlibabaCloud AI Search>> (`completion`, `rerank`, `sparse_embedding`, `text_embedding`)
 * <<infer-service-amazon-bedrock,Amazon Bedrock>> (`completion`, `text_embedding`)
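The deployment check called out in the IMPORTANT block above is a single call to the trained models statistics API. A minimal sketch, assuming an endpoint backed by the ELSER model (the `.elser_model_2` model ID is illustrative):

GET _ml/trained_models/.elser_model_2/_stats

The deployment is ready once the response shows `"state": "fully_allocated"` and the `"allocation_count"` matches the `"target_allocation_count"`.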
@@ -73,14 +77,14 @@ Click the links to review the configuration details of the services:
 * <<infer-service-openai,OpenAI>> (`completion`, `text_embedding`)
 * <<infer-service-watsonx-ai>> (`text_embedding`)
 
-The {es} and ELSER services run on a {ml} node in your {es} cluster. The rest of
-the services connect to external providers.
+The {es} and ELSER services run on a {ml} node in your {es} cluster.
+The rest of the integrations connect to external services.
 
 [discrete]
 [[adaptive-allocations-put-inference]]
 ==== Adaptive allocations
 
-Adaptive allocations allow inference services to dynamically adjust the number of model allocations based on the current load.
+Adaptive allocations allow inference endpoints to dynamically adjust the number of model allocations based on the current load.
 
 When adaptive allocations are enabled:
 
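Adaptive allocations are configured per endpoint in the `service_settings` of the PUT request. A minimal sketch, assuming the ELSER service; the allocation bounds are illustrative and the exact settings for other integrations may differ:

PUT _inference/sparse_embedding/my-elser-endpoint
{
  "service": "elser",
  "service_settings": {
    "adaptive_allocations": {
      "enabled": true,
      "min_number_of_allocations": 1,
      "max_number_of_allocations": 10
    },
    "num_threads": 1
  }
}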

docs/reference/inference/service-alibabacloud-ai-search.asciidoc

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 [[infer-service-alibabacloud-ai-search]]
-=== AlibabaCloud AI Search {infer} service
+=== AlibabaCloud AI Search {infer} integration
 
 Creates an {infer} endpoint to perform an {infer} task with the `alibabacloud-ai-search` service.

docs/reference/inference/service-amazon-bedrock.asciidoc

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 [[infer-service-amazon-bedrock]]
-=== Amazon Bedrock {infer} service
+=== Amazon Bedrock {infer} integration
 
 Creates an {infer} endpoint to perform an {infer} task with the `amazonbedrock` service.

docs/reference/inference/service-anthropic.asciidoc

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 [[infer-service-anthropic]]
-=== Anthropic {infer} service
+=== Anthropic {infer} integration
 
 Creates an {infer} endpoint to perform an {infer} task with the `anthropic` service.

docs/reference/inference/service-azure-ai-studio.asciidoc

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 [[infer-service-azure-ai-studio]]
-=== Azure AI studio {infer} service
+=== Azure AI studio {infer} integration
 
 Creates an {infer} endpoint to perform an {infer} task with the `azureaistudio` service.

docs/reference/inference/service-azure-openai.asciidoc

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 [[infer-service-azure-openai]]
-=== Azure OpenAI {infer} service
+=== Azure OpenAI {infer} integration
 
 Creates an {infer} endpoint to perform an {infer} task with the `azureopenai` service.

docs/reference/inference/service-cohere.asciidoc

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 [[infer-service-cohere]]
-=== Cohere {infer} service
+=== Cohere {infer} integration
 
 Creates an {infer} endpoint to perform an {infer} task with the `cohere` service.
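For reference, such an endpoint is created with the PUT {infer} API. A minimal sketch, assuming the `api_key` and `model_id` service settings; the values are placeholders, and the configuration details page lists the full set of settings:

PUT _inference/text_embedding/cohere-embeddings
{
  "service": "cohere",
  "service_settings": {
    "api_key": "<cohere-api-key>",
    "model_id": "embed-english-v3.0"
  }
}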

docs/reference/inference/service-elasticsearch.asciidoc

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 [[infer-service-elasticsearch]]
-=== Elasticsearch {infer} service
+=== Elasticsearch {infer} integration
 
 Creates an {infer} endpoint to perform an {infer} task with the `elasticsearch` service.
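A sketch of the equivalent call for the `elasticsearch` service, which runs a model on a {ml} node inside the cluster; the E5 model ID and allocation counts are illustrative:

PUT _inference/text_embedding/my-e5-endpoint
{
  "service": "elasticsearch",
  "service_settings": {
    "model_id": ".multilingual-e5-small",
    "num_allocations": 1,
    "num_threads": 1
  }
}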

docs/reference/inference/service-elser.asciidoc

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 [[infer-service-elser]]
-=== ELSER {infer} service
+=== ELSER {infer} integration
 
 Creates an {infer} endpoint to perform an {infer} task with the `elser` service.
 You can also deploy ELSER by using the <<infer-service-elasticsearch>>.
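A sketch of that second option, deploying ELSER through the `elasticsearch` service; the endpoint ID is hypothetical and `.elser_model_2` is assumed to be the current ELSER model ID:

PUT _inference/sparse_embedding/my-elser-endpoint
{
  "service": "elasticsearch",
  "service_settings": {
    "model_id": ".elser_model_2",
    "num_allocations": 1,
    "num_threads": 1
  }
}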
