Commit 1586708

Merge branch 'main' into lucene_snapshot_10_1
2 parents 1a2b5ea + 94720bb commit 1586708

76 files changed: +1821, -609 lines changed


docs/changelog/119743.yaml

Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+pr: 119743
+summary: POC mark read-only
+area: Engine
+type: enhancement
+issues: []

docs/changelog/120354.yaml

Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+pr: 120354
+summary: Move scoring in ES|QL out of snapshot
+area: ES|QL
+type: enhancement
+issues: []

docs/changelog/120370.yaml

Lines changed: 6 additions & 0 deletions
@@ -0,0 +1,6 @@
+pr: 120370
+summary: "Merge field mappers when updating mappings with [subobjects:false]"
+area: Mapping
+type: bug
+issues:
+ - 120216
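
This changelog entry concerns the `subobjects: false` mapping setting. As a purely illustrative sketch (not part of this commit), a mapping using that setting keeps dotted field names as flat leaf fields rather than expanding them into nested objects, and the fix relates to merging field mappers when such a mapping is later updated. The index name `my-metrics` and its fields below are hypothetical:

PUT my-metrics
{
  "mappings": {
    "subobjects": false,
    "properties": {
      "metrics.cpu.usage": { "type": "float" }
    }
  }
}

PUT my-metrics/_mapping
{
  "properties": {
    "metrics.cpu.load": { "type": "float" }
  }
}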

docs/reference/esql/metadata-fields.asciidoc

Lines changed: 2 additions & 0 deletions
@@ -20,6 +20,8 @@ supported ones are:
 * <<mapping-ignored-field,`_ignored`>>: the ignored source document fields. The field is of the type
 <<keyword,keyword>>.
 
+* `_score`: when enabled, the final score assigned to each row matching an ES|QL query. Scoring will be updated when using <<esql-search-functions,full text search functions>>.
+
 To enable the access to these fields, the <<esql-from,`FROM`>> source command needs
 to be provided with a dedicated directive:
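
As an illustration of the `_score` metadata field documented above (a sketch, not part of this commit), the field is requested through the `METADATA` directive and can be sorted on; the `books` index and `title` field are hypothetical, and the `MATCH` full text function is assumed to be available:

POST /_query
{
  "query": "FROM books METADATA _score | WHERE MATCH(title, \"distributed search\") | SORT _score DESC | LIMIT 5"
}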

docs/reference/inference/inference-apis.asciidoc

Lines changed: 4 additions & 8 deletions
@@ -16,10 +16,8 @@ models or if you want to use non-NLP models, use the
 For the most up-to-date API details, refer to {api-es}/group/endpoint-inference[{infer-cap} APIs].
 --
 
-The {infer} APIs enable you to create {infer} endpoints and use {ml} models of
-different providers - such as Amazon Bedrock, Anthropic, Azure AI Studio,
-Cohere, Google AI, Mistral, OpenAI, or HuggingFace - as a service. Use
-the following APIs to manage {infer} models and perform {infer}:
+The {infer} APIs enable you to create {infer} endpoints and integrate with {ml} models of different services - such as Amazon Bedrock, Anthropic, Azure AI Studio, Cohere, Google AI, Mistral, OpenAI, or HuggingFace.
+Use the following APIs to manage {infer} models and perform {infer}:
 
 * <<delete-inference-api>>
 * <<get-inference-api>>
@@ -37,10 +35,8 @@ An {infer} endpoint enables you to use the corresponding {ml} model without
 manual deployment and apply it to your data at ingestion time through
 <<semantic-search-semantic-text, semantic text>>.
 
-Choose a model from your provider or use ELSER – a retrieval model trained by
-Elastic –, then create an {infer} endpoint by the <<put-inference-api>>.
-Now use <<semantic-search-semantic-text, semantic text>> to perform
-<<semantic-search, semantic search>> on your data.
+Choose a model from your service or use ELSER – a retrieval model trained by Elastic –, then create an {infer} endpoint by the <<put-inference-api>>.
+Now use <<semantic-search-semantic-text, semantic text>> to perform <<semantic-search, semantic search>> on your data.
 
 [discrete]
 [[adaptive-allocations]]
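
As a rough sketch of the workflow this updated text describes - create an {infer} endpoint with the put inference API, then use semantic text for semantic search - the requests below are illustrative only and not part of this commit; the endpoint id `my-elser-endpoint`, index `my-index`, and field `content` are hypothetical names:

PUT _inference/sparse_embedding/my-elser-endpoint
{
  "service": "elser",
  "service_settings": {
    "num_allocations": 1,
    "num_threads": 1
  }
}

PUT my-index
{
  "mappings": {
    "properties": {
      "content": {
        "type": "semantic_text",
        "inference_id": "my-elser-endpoint"
      }
    }
  }
}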

docs/reference/inference/put-inference.asciidoc

Lines changed: 8 additions & 8 deletions
@@ -42,7 +42,7 @@ include::inference-shared.asciidoc[tag=inference-id]
 include::inference-shared.asciidoc[tag=task-type]
 +
 --
-Refer to the service list in the <<put-inference-api-desc,API description section>> for the available task types.
+Refer to the integration list in the <<put-inference-api-desc,API description section>> for the available task types.
 --
 
 
@@ -54,15 +54,15 @@ The create {infer} API enables you to create an {infer} endpoint and configure a
 
 [IMPORTANT]
 ====
-* When creating an inference endpoint, the associated machine learning model is automatically deployed if it is not already running.
+* When creating an {infer} endpoint, the associated {ml} model is automatically deployed if it is not already running.
 * After creating the endpoint, wait for the model deployment to complete before using it. You can verify the deployment status by using the <<get-trained-models-stats, Get trained model statistics>> API. In the response, look for `"state": "fully_allocated"` and ensure the `"allocation_count"` matches the `"target_allocation_count"`.
 * Avoid creating multiple endpoints for the same model unless required, as each endpoint consumes significant resources.
 ====
 
 
-The following services are available through the {infer} API.
-You can find the available task types next to the service name.
-Click the links to review the configuration details of the services:
+The following integrations are available through the {infer} API.
+You can find the available task types next to the integration name.
+Click the links to review the configuration details of the integrations:
 
 * <<infer-service-alibabacloud-ai-search,AlibabaCloud AI Search>> (`completion`, `rerank`, `sparse_embedding`, `text_embedding`)
 * <<infer-service-amazon-bedrock,Amazon Bedrock>> (`completion`, `text_embedding`)
@@ -80,14 +80,14 @@ Click the links to review the configuration details of the services:
 * <<infer-service-watsonx-ai>> (`text_embedding`)
 * <<infer-service-jinaai,JinaAI>> (`text_embedding`, `rerank`)
 
-The {es} and ELSER services run on a {ml} node in your {es} cluster. The rest of
-the services connect to external providers.
+The {es} and ELSER services run on a {ml} node in your {es} cluster.
+The rest of the integrations connect to external services.
 
 [discrete]
 [[adaptive-allocations-put-inference]]
 ==== Adaptive allocations
 
-Adaptive allocations allow inference services to dynamically adjust the number of model allocations based on the current load.
+Adaptive allocations allow inference endpoints to dynamically adjust the number of model allocations based on the current load.
 
 When adaptive allocations are enabled:
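
To illustrate the adaptive allocations wording touched above (a sketch, not part of this commit), an endpoint can be created with `adaptive_allocations` in its service settings so the number of model allocations scales with load; the endpoint id and the allocation bounds below are hypothetical:

PUT _inference/sparse_embedding/my-elser-autoscaled
{
  "service": "elasticsearch",
  "service_settings": {
    "model_id": ".elser_model_2",
    "num_threads": 1,
    "adaptive_allocations": {
      "enabled": true,
      "min_number_of_allocations": 1,
      "max_number_of_allocations": 4
    }
  }
}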

docs/reference/inference/service-alibabacloud-ai-search.asciidoc

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 [[infer-service-alibabacloud-ai-search]]
-=== AlibabaCloud AI Search {infer} service
+=== AlibabaCloud AI Search {infer} integration
 
 .New API reference
 [sidebar]

docs/reference/inference/service-amazon-bedrock.asciidoc

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 [[infer-service-amazon-bedrock]]
-=== Amazon Bedrock {infer} service
+=== Amazon Bedrock {infer} integration
 
 .New API reference
 [sidebar]

docs/reference/inference/service-anthropic.asciidoc

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 [[infer-service-anthropic]]
-=== Anthropic {infer} service
+=== Anthropic {infer} integration
 
 .New API reference
 [sidebar]

docs/reference/inference/service-azure-ai-studio.asciidoc

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 [[infer-service-azure-ai-studio]]
-=== Azure AI studio {infer} service
+=== Azure AI studio {infer} integration
 
 .New API reference
 [sidebar]
