Conversation

Copilot AI commented Nov 4, 2025

Implements a memory management system for BotSharp using lifecycle hooks that inject context before the AI model is invoked and persist the conversation afterward, modeled on the AIContextProvider abstraction in Microsoft's agent-framework.

Changes

Core Abstraction (BotSharp.Abstraction/AIContext/)

  • IAIContextProvider interface with two lifecycle methods:
    • InvokingAsync(InvokingContext) - retrieve/inject context before AI call
    • InvokedAsync(InvokedContext) - save conversation after AI response
  • Context models: InvokingContext, InvokedContext, AIContext
  • AIContextProviderBase - base implementation with priority support
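Based on the description above, the lifecycle contract can be sketched as follows; the exact signatures in BotSharp may differ:

// Sketch of the IAIContextProvider contract (signatures inferred from
// the summary above, not copied from the source).
public interface IAIContextProvider
{
    // Lower values run earlier; used to order providers.
    int Priority { get; }

    // Called before the AI model is invoked; returns context to inject,
    // or null when this provider has nothing to contribute.
    Task<AIContext?> InvokingAsync(InvokingContext context);

    // Called after the AI model responds; persists the conversation.
    Task InvokedAsync(InvokedContext context);
}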

Integration (BotSharp.Core/Routing/)

  • Modified RoutingService.InvokeAgent to execute registered providers in priority order
  • Providers inject context messages into dialog list before model invocation
  • Error handling isolates provider failures
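The three bullets above could combine into a loop like the following inside RoutingService.InvokeAgent; this is an illustrative sketch, and any names beyond those listed above (the service/logger fields, the dialogs list) are assumptions:

// Illustrative sketch of the integration point, not the actual BotSharp code.
var providers = _services.GetServices<IAIContextProvider>()
                         .OrderBy(p => p.Priority);

foreach (var provider in providers)
{
    try
    {
        var aiContext = await provider.InvokingAsync(new InvokingContext { Dialogs = dialogs });
        if (aiContext?.ContextMessages != null)
        {
            // Inject retrieved context ahead of the model invocation
            dialogs.InsertRange(0, aiContext.ContextMessages);
        }
    }
    catch (Exception ex)
    {
        // A failing provider must not break the agent invocation
        _logger.LogWarning(ex, "AI context provider {Provider} failed", provider.GetType().Name);
    }
}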

Default Plugin (BotSharp.Plugin.AIMemory/)

  • ConversationMemoryProvider - sample implementation demonstrating the pattern
  • Retrieves recent conversation history in InvokingAsync
  • Logs memory save operations in InvokedAsync

Usage

// Custom long-term memory provider backed by a vector store.
// IVectorDb and BuildContextFrom are illustrative placeholders.
public class VectorMemoryProvider : AIContextProviderBase
{
    private readonly IVectorDb _vectorDb;

    public VectorMemoryProvider(IVectorDb vectorDb)
    {
        _vectorDb = vectorDb;
    }

    public override int Priority => 10; // Execute after session context (priority 0)

    public override async Task<AIContext?> InvokingAsync(InvokingContext context)
    {
        // Search the vector store for content similar to the latest message
        var similar = await _vectorDb.SearchAsync(context.Dialogs.Last().Content);
        return new AIContext { ContextMessages = BuildContextFrom(similar) };
    }

    public override async Task InvokedAsync(InvokedContext context)
    {
        // Persist the model response so it becomes retrievable memory
        await _vectorDb.StoreAsync(context.Response);
    }
}

// Register in DI
services.AddScoped<IAIContextProvider, VectorMemoryProvider>();

Priority-based execution allows layering: session context (0) → user preferences (5) → vector search (10) → knowledge graphs (20).
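A layered setup like the one above could be registered as follows; the provider class names other than VectorMemoryProvider are hypothetical:

// Hypothetical providers layered by their Priority values
services.AddScoped<IAIContextProvider, SessionContextProvider>();   // Priority 0
services.AddScoped<IAIContextProvider, UserPreferenceProvider>();   // Priority 5
services.AddScoped<IAIContextProvider, VectorMemoryProvider>();     // Priority 10
services.AddScoped<IAIContextProvider, KnowledgeGraphProvider>();   // Priority 20

Since all registrations share the IAIContextProvider service type, resolving the collection yields every provider, and the routing layer orders them by Priority at invocation time.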

Original prompt

(Translated from Chinese.) Refer to AIContextProvider, a core abstract base class in the microsoft/agent-framework. You can think of it as a "context manager" or "memory plugin" for an AI agent. Its main responsibility is to dynamically manage and inject context information before and after the AI model (e.g., a large language model) is invoked. Core role: AIContextProvider has two key lifecycle methods; by overriding them, developers can implement their own context-management logic. InvokingAsync(InvokingContext context, ...): before invocation. Timing: executes before the framework sends the request to the AI model. Role: this is the core step for supplying context. It collects, retrieves, or generates additional background information, packages it into an AIContext object, and returns it. This information is then inserted into the final request sent to the AI model. In this way, the current conversation becomes future "memory" that can be retrieved in the next InvokingAsync phase.
Design a memory plugin for BotSharp and use it in the RoutingService.InvokeAgent method.



Copilot AI changed the title [WIP] Add memory plugin for BotSharp using AIContextProvider Add AI Memory Plugin with AIContextProvider pattern inspired by Microsoft Agent Framework Nov 4, 2025
Copilot AI requested a review from geffzhang November 4, 2025 13:31
@geffzhang geffzhang closed this Nov 17, 2025
