
feat(eval): add longmemeval evaluation pipeline#104

Merged
Ki-Seki merged 25 commits into MemTensor:dev from Duguce:dev
Jul 17, 2025

Conversation


Duguce (Contributor) commented Jul 16, 2025

Description

Summary: (summary)

Fix: #(issue)

Reviewer: @hush-cd

Checklist:

  • I have performed a self-review of my own code
  • I have commented my code in hard-to-understand areas
  • I have added tests that prove my fix is effective or that my feature works
  • I have added necessary documentation (if applicable)
  • I have linked the issue to this PR (if applicable)
  • I have mentioned the person who will review this PR

@Duguce Duguce marked this pull request as ready for review July 16, 2025 15:48
Ki-Seki (Member) commented Jul 17, 2025

@Duguce It looks like this PR can be merged, right?

Duguce (Contributor, Author) commented Jul 17, 2025

@Ki-Seki yeah, thank you!

@Ki-Seki Ki-Seki merged commit 2387f6d into MemTensor:dev Jul 17, 2025
16 checks passed
tianxing02 pushed a commit to tianxing02/MemOS that referenced this pull request Feb 24, 2026
* feat(eval): add eval dependencies

* feat(eval): add configs example

* docs(eval): update README.md

* feat(eval): remove the dependency (pydantic)

* feat(eval): add run locomo eval script

* fix(eval): delete about memos redundant search branches

* chore: fix format

* feat(eval): add openai memory on locomo - eval guide

* docs(eval): modify openai memory on locomo - eval guide

* feat(eval): add longmemeval evaluation pipeline

* chore(eval): formatter

* chore: update

* feat(eval): add configs example
