Commit ff7f906

Rescoring options moved to additional section
1 parent 8785024 commit ff7f906


docs/reference/search/search-your-data/knn-search.asciidoc

Lines changed: 15 additions & 17 deletions
@@ -1090,19 +1090,7 @@ Generally, we have found that:
 * `int4` requires some rescoring for higher accuracy and larger recall scenarios. Generally, oversampling by 1.5x-2x recovers most of the accuracy loss.
 * `bbq` requires rescoring except on exceptionally large indices or models specifically designed for quantization. We have found that between 3x-5x oversampling is generally sufficient. But for fewer dimensions or vectors that do not quantize well, higher oversampling may be required.
 
-There are three main ways to oversample and rescore:
-
-* <<dense-vector-knn-search-reranking-rescore-parameter>>
-* <<dense-vector-knn-search-reranking-rescore-section>>
-* <<dense-vector-knn-search-reranking-script-score>>
-
-[discrete]
-[[dense-vector-knn-search-reranking-rescore-parameter]]
-===== Use the `rescore_vector` option to rescore per shard
-
-preview:[]
-
-You can use the `rescore_vector` option to automatically perform reranking.
+You can use the `rescore_vector` preview:[] option to automatically perform reranking.
 When a rescore `num_candidates_factor` parameter is specified, the approximate kNN search will retrieve the top `num_candidates * oversample` candidates per shard.
 It will then use the original vectors to rescore them, and return the top `k` results.
 
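For orientation, since the file's own request example is not visible in this hunk, here is a minimal sketch of the kind of search the paragraph describes. The index name `my-index`, the field name `my_vector`, the example vector, and the `oversample` spelling of the factor parameter are assumptions for illustration (the prose above refers to the factor both as `num_candidates_factor` and as `oversample`):

[source,console]
----
POST my-index/_search
{
  "knn": {
    "field": "my_vector",
    "query_vector": [0.04, -0.12, 0.33],
    "k": 10,
    "num_candidates": 100,
    "rescore_vector": {
      "oversample": 2.0
    }
  }
}
----

Following the description above, a request like this would rescore the top `num_candidates * oversample` = 200 candidates per shard with the original vectors and return the top 10 (`k`) results.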

@@ -1134,12 +1122,19 @@ This example will:
 * Rescore the top 200 candidates per shard using the original, non quantized vectors.
 * Merge the rescored candidates from all shards, and return the top 10 (`k`) results.
 
+[discrete]
+[[dense-vector-knn-search-reranking-rescore-additional]]
+===== Additional rescoring techniques
+
+The following sections provide additional ways of rescoring:
 
 [discrete]
 [[dense-vector-knn-search-reranking-rescore-section]]
-===== Use the `rescore_vector` section for top-level kNN search
+====== Use the `rescore_vector` section for top-level kNN search
 
-You can use the <<rescore, rescore section>> in the `_search` request to rescore the top results from a kNN search.
+You can use this option when you don't want to rescore on each shard, but on the top results from all shards.
+
+Use the <<rescore, rescore section>> in the `_search` request to rescore the top results from a kNN search.
 
 Here is an example using the top level `knn` search with oversampling and using `rescore` to rerank the results:
 
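The example referred to in the last context line above is not included in the diff. As a rough sketch of the pattern it describes, assuming the same hypothetical `my-index` index and `my_vector` field: the top-level `knn` search oversamples `k` relative to `size`, and the `rescore` section then re-scores those global top hits against the original vectors via a script.

[source,console]
----
POST my-index/_search
{
  "size": 10,
  "knn": {
    "field": "my_vector",
    "query_vector": [0.04, -0.12, 0.33],
    "k": 20,
    "num_candidates": 50
  },
  "rescore": {
    "window_size": 20,
    "query": {
      "rescore_query": {
        "script_score": {
          "query": { "match_all": {} },
          "script": {
            "source": "cosineSimilarity(params.queryVector, 'my_vector') + 1.0",
            "params": { "queryVector": [0.04, -0.12, 0.33] }
          }
        }
      },
      "query_weight": 0,
      "rescore_query_weight": 1
    }
  }
}
----

With `k: 20` and `size: 10`, this gathers 20 nearest neighbors according to quantized scoring and then rescores them with the original vectors, consistent with the "gathering 20 nearest neighbors according to quantized scoring" context shown in the next hunk.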

@@ -1191,9 +1186,12 @@ gathering 20 nearest neighbors according to quantized scoring and rescoring with
 
 [discrete]
 [[dense-vector-knn-search-reranking-script-score]]
-===== Use a `script_score` query to rescore per shard
+====== Use a `script_score` query to rescore per shard
+
+You can use this option when you want to rescore on each shard and want more fine-grained control on the rescoring
+than the `rescore_vector` option provides.
 
-You can rescore per shard with the <<query-dsl-knn-query, knn query>> and <<query-dsl-script-score-query, script_score query >>.
+Use rescore per shard with the <<query-dsl-knn-query, knn query>> and <<query-dsl-script-score-query, script_score query >>.
 Generally, this means that there will be more rescoring per shard, but this can increase overall recall at the cost of compute.
 
 [source,console]
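The diff cuts off at the `[source,console]` marker, so the file's own example is not shown here. A minimal sketch of the pattern the paragraph describes, a `knn` query wrapped in a `script_score` query so that each shard rescores its candidates with the original vectors, again using the hypothetical `my-index` and `my_vector` names:

[source,console]
----
POST my-index/_search
{
  "size": 10,
  "query": {
    "script_score": {
      "query": {
        "knn": {
          "field": "my_vector",
          "query_vector": [0.04, -0.12, 0.33],
          "num_candidates": 50
        }
      },
      "script": {
        "source": "cosineSimilarity(params.queryVector, 'my_vector') + 1.0",
        "params": { "queryVector": [0.04, -0.12, 0.33] }
      }
    }
  }
}
----

Because the script rescoring happens on each shard's candidates, this gives more fine-grained control than `rescore_vector`, at the cost of additional compute per shard, as the paragraph above notes.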
