Why not rely on SK connector? #575
Replies: 4 comments
-
hi @kbeaugrand. In SK, "Memory" is one hardcoded Skill (aka Plugin) automatically instantiated when providing one embedding generator and one vector storage (which I think is what you refer to as the "SK connector"). While that design works, it also leads to some problems:
We've received several requests about having the ability to "talk to my data", concurrently across different data locations, and this solution builds on the code matured under SK, adding new features and allowing the use of multiple storage types at the same time. On the SK side, we're planning to remove the hardcoded Memory instance and recommend using Memory Plugins instead. As for support being limited to Qdrant and Azure Cognitive Search, it's only a matter of time: to support other engines we need to add new features that are not available in SK Memory, such as the ability to filter memories.
This is already possible today, either using one of the included engines by configuration, or implementing a custom storage engine.
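For illustration only, here is a minimal sketch of what such a custom engine could look like. The interface and type names below (IVectorStorage, MemoryRecord, MyCustomVectorStorage) are hypothetical placeholders invented for this example, not the actual Semantic Memory contract, which may differ in shape and naming:

```csharp
using System.Collections.Generic;
using System.Runtime.CompilerServices;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical record shape, for illustration only.
public sealed record MemoryRecord(
    string Id,
    float[] Embedding,
    IDictionary<string, string> Tags,
    string Payload);

// Hypothetical storage contract; the real Semantic Memory interface
// may differ in name and members.
public interface IVectorStorage
{
    Task UpsertAsync(string index, MemoryRecord record, CancellationToken ct = default);

    // Tag-based filtering is the kind of feature, mentioned above, that the
    // existing SK memory connectors do not expose.
    IAsyncEnumerable<MemoryRecord> SearchAsync(
        string index,
        float[] queryEmbedding,
        IDictionary<string, string>? tagFilters = null,
        int limit = 10,
        CancellationToken ct = default);
}

// A custom engine (e.g. backed by SQL Server or MongoDB) implements the contract
// and can then be used next to the built-in Qdrant / Azure Cognitive Search engines.
public sealed class MyCustomVectorStorage : IVectorStorage
{
    public Task UpsertAsync(string index, MemoryRecord record, CancellationToken ct = default)
    {
        // Persist record.Embedding, record.Tags and record.Payload in the backing store.
        return Task.CompletedTask;
    }

    public async IAsyncEnumerable<MemoryRecord> SearchAsync(
        string index,
        float[] queryEmbedding,
        IDictionary<string, string>? tagFilters = null,
        int limit = 10,
        [EnumeratorCancellation] CancellationToken ct = default)
    {
        // Run a vector similarity query restricted by tagFilters and yield the top matches.
        await Task.CompletedTask;
        yield break;
    }
}
```

The point of the sketch is the tagFilters parameter: per-record tag filtering is exactly the kind of capability, mentioned above, that the current SK memory connectors don't provide.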
-
Right, I was talking about the SK Memory plugin. My question was more about the already existing connectors for Semantic Kernel. I developed two connectors in the past weeks (SQL Server and MongoDB) that work with Semantic Kernel, and those connectors cannot be used inside semantic-memory; I'll have to create a new implementation dedicated to semantic memory. But I guess that using SK directly in the orchestrator of semantic memory would be better.
-
Unfortunately the SK connectors don't have all the features we need here, so we can't exactly "reuse" them. On the other hand, the long-term plan is to "move" them here, so we will do the migration: for instance, we will port the core connectors over from SK to SM, while we're still thinking about the best approach to support all the third-party vector DBs. In that case I believe we should invite third parties to maintain a NuGet package extending SM.
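As a sketch of how such a third-party NuGet could plug in, the extension-method pattern below is a common .NET convention; the builder, package, and method names (MemoryServiceBuilder, WithMongoDbStorage, MongoDbStorageEngine) are invented for illustration and are not the current SM API:

```csharp
using System.Collections.Generic;

// Hypothetical builder standing in for whatever SM eventually exposes to compose
// the memory service; the real API may differ.
public sealed class MemoryServiceBuilder
{
    private readonly List<object> _engines = new();

    public MemoryServiceBuilder AddStorageEngine(object engine)
    {
        _engines.Add(engine);
        return this;
    }
}

// Placeholder for the actual MongoDB-backed engine implementation.
public sealed class MongoDbStorageEngine
{
    public MongoDbStorageEngine(string connectionString) => ConnectionString = connectionString;
    public string ConnectionString { get; }
}

// A third-party package (e.g. "SemanticMemory.Extensions.MongoDb", name invented here)
// would ship an extension method so users can opt in with a single call.
public static class MongoDbMemoryBuilderExtensions
{
    public static MemoryServiceBuilder WithMongoDbStorage(
        this MemoryServiceBuilder builder, string connectionString)
    {
        // Construct and register the MongoDB-backed storage engine.
        return builder.AddStorageEngine(new MongoDbStorageEngine(connectionString));
    }
}
```

A third-party maintainer would only need to publish the package; users would opt in with a single builder call, e.g. new MemoryServiceBuilder().WithMongoDbStorage("mongodb://localhost:27017").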
-
Thank you for your collaboration and contribution to this issue! As the matter at hand appears to be resolved, I'll be closing this issue for now. However, please don't hesitate to reach out if you have any further questions or concerns. We're more than happy to reopen the issue or continue the conversation as needed. Wishing you all the best, and happy coding! 😊
-
Semantic Memory seems excellent for the projects I'm addressing and I think I'll consider going with it fairly soon.
On the other hand, I'm surprised to see that this solution doesn't rely on the Semantic Kernel connectors.
What's more, it doesn't provide an abstraction package enabling us to "easily" integrate other storage providers.
I understand that the functionality needed for this solution wasn't available, but perhaps evolving the existing connectors would have been an option?
At present, only Azure Cognitive Search and Qdrant are supported, which is good enough to start with, but may slow down the scale-up of these solutions in the enterprise.
Are there any plans to allow configuring the memory service with your own storage provider?