Bug: LLMConfig.extra forbidden does not allow passing LLM-specific options. #1791

Description

@harishmohanraj

Original issue was reported on Discord:


Hi Team 👋,

I was exploring the websocket streaming example and wanted to stream responses similarly.

It worked fine in AG2 v0.8.1, but now fails in v0.8.7.


✅ Works in v0.8.1

router_agent = ConversableAgent(
    llm_config={
        "config_list": config_list,
        "stream": True,
    },
    name="router_agent",
    ...
)

❌ Fails in v0.8.7

ConversableAgent(
    name=WELCOME_AGENT_NAME,
    system_message=greeting_prompt,
    llm_config={"config_list": config_list, "stream": True},
)

Error:

1 validation error for _LLMConfig
stream
Extra inputs are not permitted


🔍 Diff Observed
In llm_config.py, line 84:

v0.8.1:

model_config = ConfigDict(extra="allow")

v0.8.7:

model_config = ConfigDict(extra="forbid")

Changing extra back to "allow" in v0.8.7 makes streaming work again.
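This matches standard Pydantic v2 behavior: with `extra="forbid"`, any field not declared on the model raises a validation error, while `extra="allow"` keeps unknown fields on the instance. A minimal standalone reproduction (the class names here are hypothetical stand-ins for `_LLMConfig`, not AG2 code):

```python
from pydantic import BaseModel, ConfigDict, ValidationError


class ForbidConfig(BaseModel):
    # Mirrors v0.8.7: unknown fields such as "stream" are rejected.
    model_config = ConfigDict(extra="forbid")
    config_list: list


class AllowConfig(BaseModel):
    # Mirrors v0.8.1: unknown fields are accepted and kept on the model.
    model_config = ConfigDict(extra="allow")
    config_list: list


try:
    ForbidConfig(config_list=[], stream=True)
except ValidationError as e:
    # The error message mentions that extra inputs are not permitted,
    # matching the traceback in this report.
    print(e)

cfg = AllowConfig(config_list=[], stream=True)
print(cfg.stream)  # the extra field is retained and accessible
```

This suggests the regression is purely in the model's `extra` policy, not in the streaming code path itself.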


❓ Questions
Was there a reason for switching to extra="forbid"?
Is "stream": True deprecated in newer versions?
What's the correct way to stream in v0.9+?
If streaming isn't viable, what's the best way to emit output from a Swarm agent: through events, or by building up context?

Thanks for the support! 🙏

Labels: bug
Status: Todo