
Conversation

@seedspirit

resolve issue #3 (Valkey Support)

  1. Valkey Support
  • Added the Valkey cache interface.
  • Improved performance by using valkey-glide instead of valkey-py internally.
  • However, we inject a separately defined config dataclass in case the client needs to be swapped back to valkey-py in the future.
  2. Decorator Factory for DX improvement
  • Initializing a separate CacheInterface and cache decorator each time is cumbersome, so we added a factory class to shorten the process (see the sketch after this list).
  3. Test Code
  • Added a test framework for the Valkey cache decorator.
  • Refactored the tests to inject decorator fixtures, so the same test code does not have to be rewritten for each decorator type.
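A rough sketch of what the factory-based setup is meant to do (class and method names here are illustrative, not necessarily the ones in this PR):

# Hypothetical usage sketch; ValkeyConfig, ValkeyCache and CacheDecoratorFactory
# are illustrative names, not the exact API of this PR.
config = ValkeyConfig(host="localhost", port=6379)

# Before: wire the cache backend and the decorator by hand.
cache = ValkeyCache(config)
cached = ValkeyCacheDecorator(cache)

# After: the factory builds both in one call.
cached = CacheDecoratorFactory.create(backend="valkey", config=config)

@cached(ttl=60)
async def get_user(user_id: int) -> dict:
    ...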


@Ilevk Ilevk left a comment

Here are some comments.


result = await func(*args, **kwargs)
if result is not None:
    await self.cache.set(_key, result, ttl=current_ttl)

p3. Even if the result is None, I think it's better to cache it. That keeps the code from querying the source repeatedly.

But Redis / Valkey don't allow setting a None value, so you need another way.
In my case, I used a special value to represent None in Valkey, e.g. '_NA'.
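A minimal sketch of that sentinel approach, applied to the set/get paths of the decorator above ('_NA' and the surrounding code are illustrative, not this PR's implementation):

_NA = "_NA"  # sentinel stored in the cache when the wrapped function returned None

# Write path: cache the sentinel instead of skipping the set, so a None
# result does not hit the source again on every call.
result = await func(*args, **kwargs)
await self.cache.set(_key, _NA if result is None else result, ttl=current_ttl)

# Read path: translate the sentinel back to None before returning.
cached = await self.cache.get(_key)
if cached is not None:
    return None if cached == _NA else cached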

logger = logging.getLogger(__name__)


class ValkeyCacheDecorator(CacheDecoratorInterface):

p3. I'll return to a more fundamental question: do we need a decorator implementation for each cache?

I think a single sync/async decorator is enough; adding another kind of cache this way duplicates a lot of source code.

For example, the Redis cache decorator and the Valkey cache decorator have a similar interface and logic (mostly the same).
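For illustration, a single backend-agnostic async decorator could look roughly like this, assuming every backend implements the same CacheInterface (the key-building scheme is a placeholder):

import functools
import hashlib
import pickle


class AsyncCacheDecorator:
    """One decorator for every backend that satisfies CacheInterface."""

    def __init__(self, cache: CacheInterface, ttl: int = 60):
        self.cache = cache
        self.ttl = ttl

    def __call__(self, func):
        @functools.wraps(func)
        async def wrapper(*args, **kwargs):
            # Derive the cache key from the function name and its arguments.
            raw = pickle.dumps((func.__qualname__, args, kwargs))
            key = f"{func.__qualname__}:{hashlib.sha256(raw).hexdigest()}"

            cached = await self.cache.get(key)
            if cached is not None:
                return cached

            result = await func(*args, **kwargs)
            await self.cache.set(key, result, ttl=self.ttl)
            return result

        return wrapper

The Redis and Valkey specifics would then live only in their CacheInterface implementations, not in the decorator.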

cached_keys: list[str] = await self.cache.get_keys_regex(
    target_func_name=target_func_name, pattern=pattern
)
for cache_key in cached_keys:

p3. You can use a pipeline to send the commands at once, but that might require adding a pipeline interface to the cache class.
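For example, a pipeline-backed bulk delete on the cache class might look like this with redis.asyncio (valkey-glide exposes batching through its own API, so this is only a sketch of the interface shape, not this PR's code):

async def delete_many(self, keys: list[str]) -> None:
    # Queue all DEL commands and send them in a single round trip.
    async with self.client.pipeline(transaction=False) as pipe:
        for key in keys:
            pipe.delete(key)
        await pipe.execute()

The loop above would then become a single await self.cache.delete_many(cached_keys).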

Comment on lines 26 to 29
dependencies = [
    "redis>=6.2.0",
    "valkey-glide>=2.0.1",
]
Author

Make the dependency optional.
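One possible shape for this (a sketch of the pyproject.toml layout, not the final one) is to move the backend clients into optional extras:

[project.optional-dependencies]
redis = ["redis>=6.2.0"]
valkey = ["valkey-glide>=2.0.1"]

Users would then install only the backend they need, e.g. pip install "<package-name>[valkey]" (the package name is a placeholder here).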

Comment on lines +29 to +37
async def _serialize(self, value: Any) -> bytes:
    data = pickle.dumps(value)
    return data

async def _deserialize(self, data: bytes | None) -> Any:
    if data is None:
        return None
    data = pickle.loads(data)  # noqa: S301
    return data
Author

  1. I think we can serialize with msgpack.
  2. Can we separate the serializer/deserializer into its own layer? (Rough sketch below.)
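A minimal sketch of a separate serializer layer using msgpack (assuming cached values are msgpack-compatible types; arbitrary Python objects would still need pickle or a custom default hook):

from typing import Any

import msgpack


class Serializer:
    """Pluggable (de)serialization layer, decoupled from the cache backend."""

    def serialize(self, value: Any) -> bytes:
        raise NotImplementedError

    def deserialize(self, data: bytes | None) -> Any:
        raise NotImplementedError


class MsgpackSerializer(Serializer):
    def serialize(self, value: Any) -> bytes:
        return msgpack.packb(value)

    def deserialize(self, data: bytes | None) -> Any:
        if data is None:
            return None
        return msgpack.unpackb(data, raw=False)

The cache implementation would then receive a Serializer in its constructor instead of calling pickle directly.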

@Ilevk Ilevk Jul 29, 2025

We need @sigridjineth's opinion
