Milvus Vector ANN Search #47796
Hello wonderful Milvus team! 👋

Our current OpenSearch query:

```json
{
  "size": "25",
  "query": {
    "knn": {
      "my_vector": {
        "k": 1024
      }
    }
  }
}
```

Our Milvus setup:

```json
{
  "limit": 25,
  "searchParams": {
    "metricType": "COSINE",
    "params": {
      "ef": 1024
    }
  }
}
```

We're getting about 97% recall. We'd be incredibly grateful for any insights, tips, or suggestions you might have. We're really passionate about making the right choice and would love to join the Milvus family!
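For context on where a figure like "97% recall" comes from: recall@k is usually measured by running an exact brute-force search as ground truth and counting how many of those true top-k neighbors the ANN search returned. A minimal, self-contained sketch (pure Python, toy random data; the function and variable names here are illustrative, not from any Milvus or OpenSearch API):

```python
import random

def cosine_sim(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb)

def exact_top_k(query, vectors, k):
    # Brute-force exact KNN: score every vector, keep the k best ids.
    ranked = sorted(range(len(vectors)),
                    key=lambda i: cosine_sim(query, vectors[i]),
                    reverse=True)
    return set(ranked[:k])

def recall_at_k(ann_ids, exact_ids, k):
    # Fraction of the true top-k that the ANN search actually returned.
    return len(set(ann_ids) & exact_ids) / k

random.seed(0)
dim, n, k = 8, 500, 25
vectors = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(n)]
query = [random.gauss(0, 1) for _ in range(dim)]

ground_truth = exact_top_k(query, vectors, k)
# Simulate an ANN result that misses exactly one true neighbor:
# drop one correct id and swap in -1, a sentinel id that cannot
# appear in the ground truth (real ids are 0..n-1).
ann_result = sorted(ground_truth)[:-1] + [-1]
print(recall_at_k(ann_result, ground_truth, k))  # 0.96
```

In a real benchmark, `ground_truth` would come from an exact (FLAT / brute-force) search over your actual collection, and `ann_result` from the HNSW query, averaged over many queries.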
Q1: Can Milvus do exact KNN search with HNSW index?
No, HNSW is an ANN (approximate nearest neighbor) algorithm by design, so exact KNN is not supported with HNSW. However, you can absolutely push recall beyond 97% — achieving 99% or even 99.5% is
realistic by tuning your search parameters (e.g. increasing ef). The 97% you're seeing likely has room for improvement depending on your query parameters and data distribution.
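As one hedged illustration of that tuning, reusing the `searchParams` shape from the question: raising `ef` widens the HNSW graph traversal at query time, which typically trades latency for recall (note that `ef` must be at least `limit`; the value below is just an example, and the actual recall gain depends on your data distribution):

```json
{
  "limit": 25,
  "searchParams": {
    "metricType": "COSINE",
    "params": {
      "ef": 4096
    }
  }
}
```

Index-build-time parameters (`M`, `efConstruction`) also bound the achievable recall, so if a higher `ef` alone doesn't get you past ~99%, rebuilding the index with larger values is the next lever to try.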
Q2: Is it fair to compare OpenSearch KNN results with Milvus ANN results?
Not really — this isn't an apples-to-apples comparison. KNN (brute-force exact search) is significantly slower than ANN, so comparing them directly isn't fair to either system. The right way to
benc…