### Describe the bug
Vector search (`FT.SEARCH`) via the Python Redis connector is completely broken due to two issues in the search path:

- `RedisCollection._inner_search` (line 324) passes a `StorageType` enum to redisvl's `process_results()`, which since redisvl 0.5.0 expects an `IndexSchema` object. Every search call raises `AttributeError: 'StorageType' object has no attribute 'index'`.
- `RedisHashsetCollection._deserialize_store_models_to_dicts` (line 620) unconditionally calls `buffer_to_array(rec[field.name], dtype)` for every vector field. When `include_vectors=False` (the default for search), the vector field is absent from the result dict, causing `KeyError: 'vector'`.

Issue 1 blocks all search calls. Issue 2 blocks hashset search even after issue 1 is fixed, unless the user explicitly passes `include_vectors=True`.
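The first failure mode can be reproduced in isolation: redisvl >= 0.5 dereferences schema attributes such as `.index` on its third argument, so passing the old enum fails exactly as reported. A minimal sketch with a stand-in enum and function, not redisvl's actual classes:

```python
from enum import Enum


class StorageType(Enum):
    # Stand-in for redisvl's StorageType enum; values are illustrative
    HASH = "hash"
    JSON = "json"


def process_results_v05(results, query, schema):
    # redisvl >= 0.5 expects an IndexSchema here and reads attributes
    # such as `schema.index` from it; an enum member has no such attribute
    return schema.index


try:
    process_results_v05([], None, StorageType.HASH)
except AttributeError as exc:
    error_message = str(exc)

print(error_message)  # 'StorageType' object has no attribute 'index'
```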
### Expected behavior

`collection.search(vector=[...], top=N)` should execute `FT.SEARCH` and return results. With `include_vectors=False` (the default), results should contain data fields but not vector fields.
### To Reproduce

Prerequisites: a Redis Stack server reachable on localhost:6379, `redis-cli`, and a working Python SK dev environment.

- Install Python SK per python/DEV_SETUP.md:

  ```shell
  cd python && make install-sk PYTHON_VERSION=3.12
  ```

- Start Redis Stack on port 6379:

  ```shell
  podman run -d -p 6379:6379 docker.io/redis/redis-stack:latest
  ```

- From python/, run the following script:
  ```python
  # repro_search.py
  import asyncio
  from dataclasses import dataclass, field
  from typing import Annotated
  from uuid import uuid4

  from semantic_kernel.connectors.redis import RedisHashsetCollection
  from semantic_kernel.data.vector import VectorStoreField, vectorstoremodel


  @vectorstoremodel
  @dataclass
  class MyModel:
      vector: Annotated[
          list[float] | None,
          VectorStoreField("vector", index_kind="hnsw", dimensions=3,
                           distance_function="cosine_similarity", type="float"),
      ] = None
      id: Annotated[str, VectorStoreField("key", type="str")] = field(
          default_factory=lambda: str(uuid4())
      )
      content: Annotated[str, VectorStoreField("data", type="str")] = "hello"


  async def main():
      async with RedisHashsetCollection(
          record_type=MyModel,
          collection_name="repro_search",
          prefix_collection_name_to_key_names=True,
      ) as col:
          await col.ensure_collection_deleted()
          await col.ensure_collection_exists()
          await col.upsert([MyModel(id="1", content="test", vector=[0.1, 0.2, 0.3])])
          # This raises AttributeError: 'StorageType' object has no attribute 'index'
          results = await col.search(vector=[0.1, 0.2, 0.3], top=1)
          async for r in results.results:
              print(r.record.id, r.record.content)
          await col.ensure_collection_deleted()


  asyncio.run(main())
  ```
  ```shell
  REDIS_CONNECTION_STRING="redis://localhost:6379" uv run python repro_search.py
  ```

Output:

```
VectorSearchExecutionException: An error occurred during the search:
'StorageType' object has no attribute 'index'
```
### Root cause

The connector was written against redisvl 0.4.x, where `process_results(results, query, storage_type: StorageType)` accepted a `StorageType` enum. Upstream redisvl 0.5.0 changed the third parameter to `schema: IndexSchema`. The pyproject.toml pin `redisvl ~= 0.4` resolves under PEP 440 to `>=0.4, <1.0`, so uv.lock picks up redisvl 0.15.0, where the API is incompatible. Tightening the pin to `<0.5` is not possible because redisvl 0.4.x requires `redis <6` while SK requires `redis >=6`.
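The pin-resolution claim can be checked directly with the `packaging` library (assumed available here; it implements PEP 440 specifier semantics):

```python
from packaging.specifiers import SpecifierSet

# "~= 0.4" is a PEP 440 compatible-release pin, equivalent to >= 0.4, == 0.*
# (i.e. anything below 1.0), so 0.15.0 satisfies it
pin = SpecifierSet("~=0.4")

print(pin.contains("0.15.0"))  # True  -> the lockfile may resolve 0.15.0
print(pin.contains("0.4.9"))   # True
print(pin.contains("1.0.0"))   # False -> only a major bump is excluded
```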
The `KeyError: 'vector'` in the hashset deserializer is a separate code defect: the deserialization loop does not check whether the vector field is present in the result dict before attempting to decode it.
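A minimal sketch of the failure and the obvious guard, using `struct` in place of redisvl's `buffer_to_array` (the function names here are illustrative, not the connector's actual helpers):

```python
import struct


def buffer_to_floats(buf: bytes) -> list[float]:
    # Redis hash fields hold vectors as packed little-endian float32 bytes
    return list(struct.unpack(f"<{len(buf) // 4}f", buf))


def deserialize_record(rec: dict, vector_fields: list[str]) -> dict:
    out = dict(rec)
    for name in vector_fields:
        # Guard: with include_vectors=False the vector field is simply
        # absent from the FT.SEARCH result; skip it instead of raising
        if name in out:
            out[name] = buffer_to_floats(out[name])
    return out


# Absent vector (include_vectors=False): record passes through, no KeyError
print(deserialize_record({"content": "test"}, ["vector"]))

# Present vector (include_vectors=True): the byte buffer is decoded to floats
buf = struct.pack("<3f", 0.5, 0.25, 1.0)
print(deserialize_record({"content": "test", "vector": buf}, ["vector"]))
```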
### Platform

- Language: Python
- Source: main branch of microsoft/semantic-kernel (SK version 1.41.2)
  - redis-py version: 6.4.0
  - redisvl version: 0.15.0 (resolved from the `~= 0.4` pin)
- Backend tested: Redis Stack 7.4.7 (RediSearch v21020, ReJSON v20809)
- IDE: Kiro
- OS: macOS 25.4.0 (Darwin), arm64
### Additional context

This affects every Python user who installs SK from the committed uv.lock or resolves dependencies fresh. The redisvl API break happened at 0.5.0; the committed lockfile resolves 0.15.0. No version of redisvl is compatible with both the current SK code and the SK `redis >=6` requirement; the code must be updated to the new redisvl >=0.5 API.

The `KeyError` in the hashset deserializer is independent of the redisvl version and affects any search call with `include_vectors=False` (the default).
### Test coverage gap

I tried to use the Redis connector for vector search and it didn't work. The existing integration tests (tests/integration/memory/test_vector_store.py) only cover single-record upsert → get → delete and never call `collection.search()`, so these bugs have had zero test coverage. I added tests covering the full public surface (vector search, batch CRUD, filters, paging, `include_vectors`, prefix mode, etc.), which is how these issues were found. The new tests should be included alongside the fix to prevent regressions.