## Problem/Motivation
*(Solution inspired by the langmem project.)*
Currently, the EAT framework's "Smart Memory" is purely episodic. The `eat_agent_experiences` collection stores a raw, chronological log of everything an agent does. While this is invaluable for detailed analysis, it is inefficient for the `SystemAgent` to sift through raw episodes every time it needs to make a decision.
The system lacks a mechanism for storing generalized knowledge, or "wisdom," derived from these experiences. For example, if the system successfully processes invoices using a specific toolchain five times, this learning should be distilled into a single, high-confidence "fact" or "best practice."
This new layer of memory, inspired by the "Semantic Memory" concept in the langmem project, will allow agents to quickly access established patterns and facts, making their planning and decision-making faster and more effective.
## Proposed Solution
We will introduce a new "Semantic Memory" store backed by a dedicated MongoDB collection, `eat_semantic_memory`. This collection will store distilled pieces of knowledge (facts, patterns, heuristics) that are generated by a background process (to be implemented in a separate issue).
This issue covers the creation of the storage backend, the data model, and the necessary tools to interact with it.
## Implementation Details
**1. Define MongoDB Schema**

- Create a new schema definition file: `eat_semantic_memory_schema.md`.
- The schema for the `eat_semantic_memory` collection should include the following fields:

| Field Name | Data Type | Description |
| --- | --- | --- |
| `fact_id` | String (UUID) | Primary key. Unique identifier for the semantic fact. |
| `fact_text` | String | The distilled piece of knowledge in natural language (e.g., "Using ToolA followed by ToolB is an effective pattern for 'invoice data extraction'."). |
| `fact_embedding` | Array (Vector) | The vector embedding of `fact_text`, for semantic search. |
| `confidence_score` | Float | A score from 0.0 to 1.0 indicating the system's confidence in this fact. |
| `source_experience_ids` | List of Strings | The `experience_id`s from `eat_agent_experiences` that were used to derive this fact. |
| `domain` | String | The operational domain this fact applies to (e.g., 'finance', 'code_generation'). Indexed. |
| `tags` | List of Strings | Keywords for filtering and categorization. Indexed. |
| `created_at` | ISODate | Timestamp of when the fact was created. |
| `last_accessed_at` | ISODate | Timestamp of when the fact was last retrieved. |
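For reference, a document in this collection might look like the following. This is a hypothetical, illustrative example; the field values (and the truncated embedding) are not real data:

```python
from datetime import datetime, timezone
from uuid import uuid4

# Hypothetical example document for the eat_semantic_memory collection.
# The embedding is truncated here; a real vector has the embedding model's
# full dimensionality.
example_fact = {
    "fact_id": str(uuid4()),
    "fact_text": (
        "Using ToolA followed by ToolB is an effective pattern "
        "for 'invoice data extraction'."
    ),
    "fact_embedding": [0.012, -0.034, 0.221],  # truncated for readability
    "confidence_score": 0.9,
    "source_experience_ids": ["exp-001", "exp-002"],
    "domain": "finance",
    "tags": ["invoices", "toolchain", "best_practice"],
    "created_at": datetime.now(timezone.utc),
    "last_accessed_at": datetime.now(timezone.utc),
}
```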
**2. Create Pydantic Data Model**

- In a new file, `evolving_agents/memory/models.py`, create a Pydantic model `SemanticFact` that mirrors the MongoDB schema. This will ensure type safety when working with the data in Python.
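A minimal sketch of what `SemanticFact` could look like. Field names follow the schema above; the defaults and the confidence-range constraint are assumptions, not a final design:

```python
from datetime import datetime, timezone
from typing import List
from uuid import uuid4

from pydantic import BaseModel, Field


class SemanticFact(BaseModel):
    """Mirrors a document in the eat_semantic_memory collection (sketch)."""

    fact_id: str = Field(default_factory=lambda: str(uuid4()))
    fact_text: str
    fact_embedding: List[float] = Field(default_factory=list)
    confidence_score: float = Field(ge=0.0, le=1.0)  # assumed range check
    source_experience_ids: List[str] = Field(default_factory=list)
    domain: str
    tags: List[str] = Field(default_factory=list)
    created_at: datetime = Field(default_factory=lambda: datetime.now(timezone.utc))
    last_accessed_at: datetime = Field(default_factory=lambda: datetime.now(timezone.utc))
```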
**3. Implement the Storage Tool**

- Create a new tool: `evolving_agents/tools/internal/mongo_semantic_memory_tool.py`.
- This tool, `MongoSemanticMemoryTool`, will be responsible for all interactions with the `eat_semantic_memory` collection.
- It should be initialized with an `LLMService` and a `MongoDBClient`.
- Implement the following async methods:
  - `add_fact(fact_text: str, confidence: float, source_ids: List[str], ...)`: Takes the fact details, generates an embedding for `fact_text` using the `LLMService`, creates a `SemanticFact` object, and inserts it into MongoDB.
  - `search_facts(query: str, top_k: int = 5, ...)`: Embeds the `query` and performs a `$vectorSearch` on the `fact_embedding` field in MongoDB to retrieve the most relevant facts.
  - `update_fact_confidence(fact_id: str, new_confidence: float)`: Updates the confidence of an existing fact.
  - `find_fact_by_id(fact_id: str)`: Retrieves a single fact by its ID.
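The `$vectorSearch` stage that `search_facts` would run can be sketched as a pure pipeline-builder, shown here decoupled from the database client so its shape is easy to unit-test. The index name, the oversampling heuristic, and the projection are assumptions based on this issue, not a fixed design:

```python
from typing import Any, Dict, List


def build_fact_search_pipeline(
    query_embedding: List[float],
    top_k: int = 5,
    index_name: str = "vector_index_semantic_facts_default",
) -> List[Dict[str, Any]]:
    """Build the aggregation pipeline search_facts would pass to MongoDB."""
    return [
        {
            "$vectorSearch": {
                "index": index_name,
                "path": "fact_embedding",
                "queryVector": query_embedding,
                # Oversample candidates before the final top-k cut; a common
                # heuristic, tune for your data.
                "numCandidates": max(top_k * 10, 100),
                "limit": top_k,
            }
        },
        # Surface the similarity score alongside each fact.
        {"$addFields": {"score": {"$meta": "vectorSearchScore"}}},
        # Drop the large embedding vector from the returned documents.
        {"$project": {"fact_embedding": 0}},
    ]
```

`search_facts` would then embed the query via the `LLMService` and run something like `collection.aggregate(build_fact_search_pipeline(embedding, top_k))`.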
**4. Update Database Setup**

- Modify `docs/MONGO-SETUP.md` to include instructions for creating the new `eat_semantic_memory` collection.
- Crucially, add the definition for the new Atlas Vector Search index on the `fact_embedding` field. The index should be named something like `vector_index_semantic_facts_default`.
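For `docs/MONGO-SETUP.md`, the Atlas Vector Search index definition might look like the following. The dimensionality (`1536`), the similarity metric, and the filter fields are assumptions: the dimension must match the embedding model the `LLMService` actually uses, and the filters mirror the indexed `domain`/`tags` fields from the schema above:

```json
{
  "name": "vector_index_semantic_facts_default",
  "definition": {
    "fields": [
      {
        "type": "vector",
        "path": "fact_embedding",
        "numDimensions": 1536,
        "similarity": "cosine"
      },
      { "type": "filter", "path": "domain" },
      { "type": "filter", "path": "tags" }
    ]
  }
}
```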
**5. Integration with Dependency Container**

- Update `dependency_container.py` (or the main application setup) to instantiate `MongoSemanticMemoryTool` and register it in the container so other components can access it.
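Assuming the container exposes simple register/get methods (the actual API in `dependency_container.py` may differ), the registration pattern can be sketched with stand-in classes:

```python
# Tiny stand-ins to illustrate the registration pattern; the real
# DependencyContainer and MongoSemanticMemoryTool may have different APIs.
class DependencyContainer:
    def __init__(self):
        self._components = {}

    def register(self, name: str, component) -> None:
        self._components[name] = component

    def get(self, name: str):
        return self._components[name]


class MongoSemanticMemoryTool:  # stand-in for the real tool
    def __init__(self, llm_service=None, mongodb_client=None):
        self.llm_service = llm_service
        self.mongodb_client = mongodb_client


container = DependencyContainer()
container.register(
    "semantic_memory_tool",
    MongoSemanticMemoryTool(llm_service=None, mongodb_client=None),
)

# Other components can now resolve the tool by name.
tool = container.get("semantic_memory_tool")
```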
## Acceptance Criteria

- The `eat_semantic_memory_schema.md` file is created and defines the new collection structure.
- The `MongoSemanticMemoryTool` is implemented with methods for adding and searching semantic facts.
- Unit tests are created for `MongoSemanticMemoryTool` to verify that facts can be added and semantically searched.
- The `docs/MONGO-SETUP.md` guide is updated with instructions for the new collection and its vector index.
- The new tool is successfully registered and retrievable from the `DependencyContainer`.