feat: implement memory agents with session summaries, agentic management, and enhanced references #978

Commit 2c5a455

Cursor / Cursor BugBot completed Jul 17, 2025 in 2m 35s

BugBot Review

BugBot Analysis Progress (2m 36s elapsed)

✅ Gathered PR context (1s)
✅ Analyzed code changes (0s)
✅ Completed bug detection — 2 potential bugs found (2m 33s)
✅ Validation and filtering completed (0s)
✅ Posted analysis results — 2 bugs reported (2s)
✅ Analysis completed successfully (0s)

Final Result: BugBot completed review and found 2 potential issues

Request ID: serverGenReqId_a416ce11-98b3-4a2a-bf95-7c3499fc34dc

Details

Bug: Undefined Variable in Conditional Logic

A NameError occurs in the remember() method when agentic memory is enabled but auto-classification is disabled. The importance_score variable is referenced in the metadata update but is only assigned when self.auto_classify is True, causing it to be undefined when self.auto_classify is False.

src/praisonai-agents/praisonaiagents/memory/memory.py#L1268-L1292

# -------------------------------------------------------------------------
def remember(self, fact: str, metadata: Optional[Dict[str, Any]] = None) -> bool:
    """Store important information with agentic classification"""
    if not self.agentic_enabled:
        # Fallback to regular long-term storage
        self.store_long_term(fact, metadata=metadata)
        return True

    # Auto-classify the importance if enabled
    if self.auto_classify:
        importance_score = self._classify_importance(fact)
        if importance_score < self.confidence_threshold:
            self._log_verbose(f"Fact importance {importance_score} below threshold {self.confidence_threshold}")
            return False

    # Store with agentic metadata
    agentic_metadata = metadata or {}
    agentic_metadata.update({
        "stored_by": "agentic_memory",
        "importance_score": importance_score if self.auto_classify else 1.0,
        "auto_classified": self.auto_classify
    })

    self.store_long_term(fact, metadata=agentic_metadata)
    return True
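
A minimal sketch of one possible fix, assuming the method otherwise stays as quoted above (illustrative only, not the maintainers' chosen patch): bind importance_score to a default before the conditional so the metadata update never depends on whether the auto-classification branch ran.

    # Sketch only: give importance_score a value on every path.
    importance_score = 1.0
    if self.auto_classify:
        importance_score = self._classify_importance(fact)
        if importance_score < self.confidence_threshold:
            self._log_verbose(f"Fact importance {importance_score} below threshold {self.confidence_threshold}")
            return False

    # Store with agentic metadata; importance_score is now always defined
    agentic_metadata = metadata or {}
    agentic_metadata.update({
        "stored_by": "agentic_memory",
        "importance_score": importance_score,
        "auto_classified": self.auto_classify
    })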



Bug: Metadata Loss and Data Inconsistency

The update_memory method has two issues:

  1. It incorrectly replaces all existing metadata with only update flags in both the SQLite database and ChromaDB, leading to the loss of original metadata (e.g., quality, importance scores).
  2. In ChromaDB, if embedding generation fails after a memory is deleted for an update, the memory is not re-added, causing data inconsistency where the record exists in SQLite but is missing from the vector store.

src/praisonai-agents/praisonaiagents/memory/memory.py#L1299-L1329

c = conn.cursor()
c.execute(
    "UPDATE long_mem SET content = ?, meta = ? WHERE id = ?",
    (new_fact, json.dumps({"updated": True, "updated_at": time.time()}), memory_id)
)
updated = c.rowcount > 0
conn.commit()
conn.close()

# Update in vector store if available
if self.use_rag and hasattr(self, "chroma_col"):
    try:
        # ChromaDB doesn't support direct updates, so we delete and re-add
        self.chroma_col.delete(ids=[memory_id])

        if LITELLM_AVAILABLE:
            import litellm
            response = litellm.embedding(
                model=self.embedding_model,
                input=new_fact
            )
            embedding = response.data[0]["embedding"]
        elif OPENAI_AVAILABLE:
            from openai import OpenAI
            client = OpenAI()
            response = client.embeddings.create(
                input=new_fact,
                model=self.embedding_model
            )
            embedding = response.data[0].embedding
        else:
            return updated
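
A minimal sketch of one possible fix for both issues, assuming the names visible in the excerpt (the SELECT on the meta column, the chroma_col.add call, and the _log_verbose error handling are illustrative assumptions, not the project's confirmed patch): merge the update flags into the existing metadata instead of replacing it, and compute the new embedding before deleting the old vector so a failure leaves the store untouched.

    # Sketch only: preserve existing metadata and add the update flags to it.
    c = conn.cursor()
    c.execute("SELECT meta FROM long_mem WHERE id = ?", (memory_id,))
    row = c.fetchone()
    existing_meta = json.loads(row[0]) if row and row[0] else {}
    existing_meta.update({"updated": True, "updated_at": time.time()})
    c.execute(
        "UPDATE long_mem SET content = ?, meta = ? WHERE id = ?",
        (new_fact, json.dumps(existing_meta), memory_id)
    )
    updated = c.rowcount > 0
    conn.commit()
    conn.close()

    # Update in vector store if available
    if self.use_rag and hasattr(self, "chroma_col"):
        try:
            # Compute the new embedding first; a failure here leaves the old vector intact.
            embedding = None
            if LITELLM_AVAILABLE:
                import litellm
                response = litellm.embedding(model=self.embedding_model, input=new_fact)
                embedding = response.data[0]["embedding"]
            elif OPENAI_AVAILABLE:
                from openai import OpenAI
                client = OpenAI()
                response = client.embeddings.create(input=new_fact, model=self.embedding_model)
                embedding = response.data[0].embedding

            if embedding is not None:
                # Only now replace the old record (ChromaDB has no in-place update).
                self.chroma_col.delete(ids=[memory_id])
                self.chroma_col.add(
                    documents=[new_fact],
                    metadatas=[existing_meta],
                    ids=[memory_id],
                    embeddings=[embedding]
                )
        except Exception as e:
            self._log_verbose(f"Failed to update memory in vector store: {e}")

    return updated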


