Chat Memory Stops Working When Using Custom Model Naming Configuration #1889

@labidiaymen

Description

Issue Summary

Chat memory (@MemoryId) stops working when using a custom named model configuration (e.g., quarkus.langchain4j.openai.default.*). Memory works correctly with the standard unnamed configuration but breaks when switching to the named-model configuration pattern.

Environment

  • Quarkus Version: 3.28.4
  • Quarkus LangChain4j BOM Version: 3.28.4
  • Java Version: 24
  • LangChain4j Extensions:
    • quarkus-langchain4j-core
    • quarkus-langchain4j-openai
    • quarkus-langchain4j-pgvector
    • quarkus-langchain4j-mcp
    • quarkus-langchain4j-agentic
  • Build Tool: Maven
  • OS: macOS

AI Service Interface

@RegisterAiService(
    retrievalAugmentor = RagRetriever.class,
    modelName = "default"
)
public interface AssistantService {

    @McpToolBox
    @ToolBox(TimeTools.class) 
    @SystemMessage("{systemMessage}")
    @UserMessage("{userMessage}")
    Multi<String> generateChatResponse(
        @MemoryId String conversationId, 
        String systemMessage, 
        String userMessage
    );
}
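For context, the service is invoked roughly like this. This is an illustrative sketch only: the resource class, request record, and system prompt are assumptions, not part of the original report. The key point is that the same conversationId is passed on every call so @MemoryId can accumulate history.

```java
import io.smallrye.mutiny.Multi;
import jakarta.inject.Inject;
import jakarta.ws.rs.POST;
import jakarta.ws.rs.Path;

@Path("/chat")
public class ChatResource {

    @Inject
    AssistantService assistant; // the AI service defined above

    // Hypothetical request payload carrying a stable conversation id
    public record ChatRequest(String conversationId, String message) {}

    @POST
    public Multi<String> chat(ChatRequest request) {
        // Reusing the same conversationId across calls is what lets
        // @MemoryId look up the existing ChatMemory for this conversation.
        return assistant.generateChatResponse(
                request.conversationId(),
                "You are a helpful assistant.",
                request.message());
    }
}
```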

Working Configuration (Standard Unnamed Model)

When using the standard unnamed configuration, chat memory works correctly:

# Standard unnamed model configuration
quarkus.langchain4j.chat-model.provider=openai
quarkus.langchain4j.openai.api-key=sk-proj-***
quarkus.langchain4j.openai.chat-model.model-name=gpt-4

# Memory configuration - WORKS with standard unnamed model
quarkus.langchain4j.chat-memory.type=MESSAGE_WINDOW
quarkus.langchain4j.chat-memory.memory-window.max-messages=20

Broken Configuration (Custom Named Model)

When switching to a custom named model configuration (e.g., using the .default. prefix), chat memory stops working:

# Custom named model configuration using .default.
quarkus.langchain4j.chat-model.provider=openai
quarkus.langchain4j.default.chat-model.provider=openai
quarkus.langchain4j.openai.default.api-key=sk-proj-***
quarkus.langchain4j.openai.default.chat-model.model-name=gpt-4

# Attempted memory configuration 1 - DOESN'T WORK
quarkus.langchain4j.chat-memory.type=MESSAGE_WINDOW
quarkus.langchain4j.chat-memory.memory-window.max-messages=20

Alternative Memory Configuration Attempts (All Failed)

# Attempt 2 - Using .default. prefix for memory - DOESN'T WORK
quarkus.langchain4j.default.chat-memory.type=MESSAGE_WINDOW
quarkus.langchain4j.default.chat-memory.memory-window.max-messages=20
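As a possible programmatic workaround, a ChatMemoryProvider CDI bean can be supplied instead of relying on the property-based memory configuration. This is a sketch assuming the standard LangChain4j ChatMemoryProvider and MessageWindowChatMemory APIs; I have not verified how it interacts with named-model resolution:

```java
import dev.langchain4j.memory.chat.ChatMemoryProvider;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.enterprise.inject.Produces;

public class MemoryConfig {

    // Programmatic equivalent of chat-memory.type=MESSAGE_WINDOW
    // with memory-window.max-messages=20
    @Produces
    @ApplicationScoped
    public ChatMemoryProvider chatMemoryProvider() {
        return memoryId -> MessageWindowChatMemory.withMaxMessages(20);
    }
}
```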

Is there any documentation or an example showing:

  • Custom named model configuration + chat memory working together
  • Multiple named models with separate memory configurations

Thanks
