
add globals.config and OnlineChat support dynamic source#1067

Merged
wzh1994 merged 17 commits into LazyAGI:main from wzh1994:wzh/tool_key
Mar 31, 2026
Conversation

@wzh1994
Contributor

@wzh1994 wzh1994 commented Mar 23, 2026

📌 PR Description

This PR introduces two complementary core features: a session-level dynamic configuration system (globals.config) and dynamic source routing for OnlineChatModule.

1. globals.config — session-level dynamic configuration

A new _GlobalConfig class (accessed via globals.config) adds a session-level dynamic override layer on top of the existing lazyllm.config (process-level static configuration):

  • globals.config.add(name, type, default, env) — registers a config item that supports dynamic overrides; default values and environment variables are still registered through lazyllm.config
  • globals.config[key] — reads first from the current session's globals['config']; if the value is a ConfigsDict (keyed by module id), the most specific entry is matched against the current call stack (globals.current_stack()); otherwise it falls back to lazyllm.config
  • globals.config[key] = value — writes the value into the current session's globals['config'] without affecting other sessions
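The read/write semantics above can be sketched as follows. `SessionConfig` and `process_config` are illustrative stand-ins for `globals['config']` and `lazyllm.config`, not the actual LazyLLM internals; note the explicit `in` check, which lets falsy session overrides (False, 0, '') take effect:

```python
# Stand-in for lazyllm.config (process-level defaults); keys are hypothetical.
process_config = {'dynamic_llm_source': None, 'timeout': 30}

class SessionConfig:
    """Per-session overlay that falls back to the process-level config."""
    def __init__(self):
        self._session = {}  # stands in for this session's globals['config']

    def __getitem__(self, key):
        # Check membership explicitly so falsy overrides (False, 0, '') win.
        if key in self._session:
            return self._session[key]
        return process_config[key]

    def __setitem__(self, key, value):
        self._session[key] = value  # only this session's view is affected

cfg = SessionConfig()
assert cfg['timeout'] == 30   # no override -> process-level default
cfg['timeout'] = 0            # falsy session-level override
assert cfg['timeout'] == 0    # override wins even though it is falsy
```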

To support module-granularity config lookup, call-stack management has been added to Globals:

  • push_stack / pop_stack / current_stack / stack_enter(module_id) — maintain the call stack during a module's forward call, so that ConfigsDict can match configs precisely by module id

A new SessionConfigableBase mixin gives objects that participate in dynamic configuration a unified identity (id / name / group_id) and an identities property used during config lookup.

2. Dynamic source routing for OnlineChatModule

The new _DynamicSourceRouterMixin and dynamic_model_config_context allow OnlineChatModule (and, later, OnlineEmbeddingModule and OnlineMultiModalModule) to switch source / model / url / skip_auth dynamically at runtime:

  • source='dynamic' — no concrete supplier is bound at construction time; instead, each forward call reads the currently effective config from globals.config['dynamic_model_configs'] and lazily loads the corresponding supplier (with a thread-safe cache)
  • dynamic_auth=True — allows an api_key to be passed dynamically via forward kwargs or globals.config, useful for multi-tenant scenarios
  • dynamic_chat_config(modules, source, model, url, skip_auth) — a context manager that temporarily overrides the config for the given modules (or 'default') inside the with block and automatically restores the snapshot on exit; thread- and session-safe
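A minimal sketch of the snapshot/restore behaviour described for dynamic_chat_config; the function name and the `session_config` dict below are illustrative stand-ins, not the real API:

```python
from contextlib import contextmanager
import copy

# Stand-in for the session-scoped globals.config storage.
session_config = {'dynamic_model_configs': {}}

@contextmanager
def dynamic_chat_config_sketch(module_id='default', **overrides):
    """Temporarily override a module's dynamic config; restore on exit."""
    snapshot = copy.deepcopy(session_config['dynamic_model_configs'])
    session_config['dynamic_model_configs'][module_id] = overrides
    try:
        yield
    finally:
        session_config['dynamic_model_configs'] = snapshot  # restore snapshot

with dynamic_chat_config_sketch(source='openai', model='gpt-4o'):
    # Inside the with block, the override is visible to forward calls.
    assert session_config['dynamic_model_configs']['default']['model'] == 'gpt-4o'
assert session_config['dynamic_model_configs'] == {}  # snapshot restored
```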

🔍 Related Issue

  • N/A

✅ Type of Change

  • New feature (non-breaking change that adds functionality)

🧪 How Has This Been Tested?

  1. Construct OnlineChatModule(source='dynamic', dynamic_auth=True)
  2. Use the dynamic_chat_config context manager to set a different source/model/api_key in each session and verify that sessions do not interfere with one another
  3. Verify that config items registered via globals.config.add fall back correctly to the lazyllm.config default when there is no session override
  4. Verify that ConfigsDict matches configs precisely by module id, with the 'default' key as the fallback

⚡ Usage After Update

import lazyllm
from lazyllm.module.llms.onlinemodule.chat import OnlineChatModule, dynamic_chat_config

# Construct a dynamically routed chat module without pre-binding a source
m = OnlineChatModule(source='dynamic', dynamic_auth=True)

# Dynamically set the source and api_key within the session
with dynamic_chat_config(m, source='openai', model='gpt-4o'):
    # forward reads source/model from globals.config and lazily loads the supplier
    result = m('Hello')

# Alternatively, override at session level via globals.config
from lazyllm import globals
globals.config['dynamic_model_configs'] = {
    'default': {'chat': {'source': 'qwen', 'model': 'qwen-turbo'}}
}
result = m('Hello')

⚠️ Additional Notes

  • globals.config writes to the current session's config; sessions (threads/coroutines) are isolated and do not affect one another
  • The ConfigsDict lookup order is: each module id on the current call stack → 'default', so module-level config takes priority over the global default
  • dynamic_auth=True requires source='dynamic' and skip_auth=False; if skip_auth=True is also set, a base_url must be provided
  • Supplier instances are cached thread-safely, keyed by (source, skip_auth); the same parameters never create a duplicate instance
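The caching behaviour in the last note can be sketched as follows; `make_supplier` and the module-level cache are hypothetical stand-ins for the real supplier construction:

```python
import threading

_lock = threading.Lock()
_suppliers = {}  # (source, skip_auth) -> supplier instance

def make_supplier(source, skip_auth):
    """Placeholder for constructing the real supplier module."""
    return object()

def get_supplier(source, skip_auth):
    key = (source, skip_auth)
    if key not in _suppliers:
        with _lock:  # check again under the lock to avoid duplicate creation
            if key not in _suppliers:
                _suppliers[key] = make_supplier(source, skip_auth)
    return _suppliers[key]

a = get_supplier('openai', False)
b = get_supplier('openai', False)
assert a is b  # identical (source, skip_auth) -> same cached instance
```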

@wzh1994 wzh1994 requested review from a team as code owners March 23, 2026 07:38
@gemini-code-assist

Summary of Changes

Hello, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request introduces a robust global configuration mechanism and enables dynamic selection of online LLM sources, significantly improving the system's adaptability. It also expands the core file system abstraction with new copy and move functionalities across multiple cloud providers, accompanied by extensive documentation. These changes collectively enhance the framework's flexibility, configurability, and file management capabilities.

Highlights

  • Global Configuration System: Introduced a new global configuration system (globals.config) allowing dynamic management of application-wide settings. This includes adding a config attribute to the Globals class and defining a _GlobalConfig class to handle configuration access and modification.
  • Dynamic Online Chat Module Source: Enhanced the OnlineChatModule to support dynamic selection of LLM sources. This allows the module to instantiate and delegate calls to different online LLM providers based on a globally configured dynamic_llm_source setting.
  • Expanded File System Operations: Implemented copy and move operations across various cloud file system suppliers (e.g., Confluence, Feishu, Google Drive, OneDrive, Obsidian, S3, Ones). This significantly broadens the capabilities of the LazyLLMFSBase for file and directory management, with appropriate NotImplementedError for unsupported platforms.
  • Centralized LLM Static Parameters: Refactored StaticParams into servermodule.py and integrated it into LLMBase, centralizing the management of static parameters for LLM instances. The share method now includes an option to copy these static parameters.
  • Improved Module Initialization: Modified ModuleBase to accept optional id, name, and group_id during initialization, providing more flexibility for module identification and grouping.
  • Comprehensive File System Documentation: Added detailed Chinese and English documentation for new and existing LazyLLMFSBase methods, including exists, read_bytes, read_file, write_file, copy, and move, clarifying their functionality and platform support.


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This PR contains two main feature enhancements:

  1. The OnlineChat module gains dynamic source support, and globals.config is introduced to enable request-level config overrides.
  2. copy and move operations are added to the file-system abstraction (LazyLLMFSBase) and implemented for multiple backends such as S3, Google Drive, and Feishu.

In addition, some refactoring was done, e.g. moving the StaticParams handling logic into LLMBase, which improves maintainability.

Overall this is an important feature update. I found a few areas for improvement, mainly around authentication handling for dynamic sources, a potential bug in the global config, a performance issue in the S3 backend's move operation, and a parameter name that shadows Python's built-in type. Please see the individual review comments.

Comment on lines +354 to +355
assert __key in self._supported_configs, f'Config {__key} is not supported'
return globals['config'].get(__key) or config[__key]

high

__getitem__ 的实现中存在一个潜在的 bug。当 globals['config'].get(__key) 返回一个 "falsy" 值(如 False, 0, '')时,or 表达式会错误地回退到 config[__key],导致无法通过 globals 正确地覆盖这些值。

例如,如果全局配置中一个键的值是 True,而在某个请求中希望通过 globals 将其覆盖为 False,当前实现会失败。

建议修改逻辑,明确检查 globals['config'] 中是否存在该键,而不是依赖 or 的短路求值。

Suggested change
-    assert __key in self._supported_configs, f'Config {__key} is not supported'
-    return globals['config'].get(__key) or config[__key]
+    assert __key in self._supported_configs, f'Config {__key} is not supported'
+    if __key in globals['config']:
+        return globals['config'][__key]
+    return config[__key]

Comment on lines +63 to +82
def __init__(self, model: str = None, source: str = None, base_url: str = None, stream: bool = True,
             return_trace: bool = False, skip_auth: bool = True, type: Optional[str] = None,
             api_key: str = None, static_params: Optional[StaticParams] = None, **kwargs):
    assert model is None, 'model should be given in forward method or global config.'
    assert base_url is None, 'base_url should be given in forward method or global config.'
    assert api_key is None, 'api_key should be given in forward method or global config.'
    assert skip_auth is True, 'skip_auth should be True for dynamic LLM source.'
    super().__init__(stream=stream, type=type, static_params=static_params)
    self._return_trace = return_trace
    self._kwargs = kwargs
    self._suppliers: Dict[str, LLMBase] = {}

def _get_supplier(self):
    if (source := globals.config['dynamic_llm_source']) is None:
        raise KeyError('No source is configured for dynamic LLM source.')
    if source not in self._suppliers:
        self._suppliers[source] = getattr(lazyllm.online.chat, source)(
            stream=self._stream, type=self._type, static_params=self._static_params,
            skip_auth=True, return_trace=self._return_trace, **self._kwargs)
    return self._suppliers[source]

high

The dynamic OnlineChatModule implementation forces skip_auth=True, which makes dynamic sources that require authentication unusable.

  1. The assert skip_auth is True in __init__ is too strict and prevents users from creating dynamic modules that need authentication.
  2. _get_supplier hard-codes skip_auth=True when creating the supplier instance.

Consider allowing the authentication behaviour to be configured for dynamic modules: store the skip_auth parameter on the instance and pass it through when creating the supplier.

Suggested change
-    def __init__(self, model: str = None, source: str = None, base_url: str = None, stream: bool = True,
-                 return_trace: bool = False, skip_auth: bool = True, type: Optional[str] = None,
-                 api_key: str = None, static_params: Optional[StaticParams] = None, **kwargs):
-        assert model is None, 'model should be given in forward method or global config.'
-        assert base_url is None, 'base_url should be given in forward method or global config.'
-        assert api_key is None, 'api_key should be given in forward method or global config.'
-        assert skip_auth is True, 'skip_auth should be True for dynamic LLM source.'
-        super().__init__(stream=stream, type=type, static_params=static_params)
-        self._return_trace = return_trace
-        self._kwargs = kwargs
-        self._suppliers: Dict[str, LLMBase] = {}
-
-    def _get_supplier(self):
-        if (source := globals.config['dynamic_llm_source']) is None:
-            raise KeyError('No source is configured for dynamic LLM source.')
-        if source not in self._suppliers:
-            self._suppliers[source] = getattr(lazyllm.online.chat, source)(
-                stream=self._stream, type=self._type, static_params=self._static_params,
-                skip_auth=True, return_trace=self._return_trace, **self._kwargs)
-        return self._suppliers[source]
+    def __init__(self, model: str = None, source: str = None, base_url: str = None, stream: bool = True,
+                 return_trace: bool = False, skip_auth: bool = False, type: Optional[str] = None,
+                 api_key: str = None, static_params: Optional[StaticParams] = None, **kwargs):
+        assert model is None, 'model should be given in forward method or global config.'
+        assert base_url is None, 'base_url should be given in forward method or global config.'
+        assert api_key is None, 'api_key should be given in forward method or global config.'
+        super().__init__(stream=stream, type=type, static_params=static_params)
+        self._return_trace = return_trace
+        self._skip_auth = skip_auth
+        self._kwargs = kwargs
+        self._suppliers: Dict[str, LLMBase] = {}
+
+    def _get_supplier(self):
+        if (source := globals.config['dynamic_llm_source']) is None:
+            raise KeyError('No source is configured for dynamic LLM source.')
+        if source not in self._suppliers:
+            self._suppliers[source] = getattr(lazyllm.online.chat, source)(
+                stream=self._stream, type=self._type, static_params=self._static_params,
+                skip_auth=self._skip_auth, return_trace=self._return_trace, **self._kwargs)
+        return self._suppliers[source]

Comment on lines +169 to +178
paginator = self._s3_client.get_paginator('list_objects_v2')
to_delete: List[str] = []
for page in paginator.paginate(Bucket=src_bucket, Prefix=src_prefix):
    for obj in page.get('Contents', []):
        rel = obj['Key'][len(src_prefix):]
        self._s3_client.copy_object(CopySource={'Bucket': src_bucket, 'Key': obj['Key']},
                                    Bucket=dst_bucket, Key=dst_prefix + rel)
        to_delete.append(obj['Key'])
for key in to_delete:
    self._s3_client.delete_object(Bucket=src_bucket, Key=key)

medium

move 方法中,对于目录的移动,当前实现是先复制所有对象,然后逐个删除源对象。当处理包含大量对象的目录时,逐个删除效率很低。

建议使用 S3 的 delete_objects API 进行批量删除,可以显著提升性能并减少 API 调用次数。可以在遍历每个分页(page)时,复制完该分页的对象后,就批量删除它们。

Suggested change
-        paginator = self._s3_client.get_paginator('list_objects_v2')
-        to_delete: List[str] = []
-        for page in paginator.paginate(Bucket=src_bucket, Prefix=src_prefix):
-            for obj in page.get('Contents', []):
-                rel = obj['Key'][len(src_prefix):]
-                self._s3_client.copy_object(CopySource={'Bucket': src_bucket, 'Key': obj['Key']},
-                                            Bucket=dst_bucket, Key=dst_prefix + rel)
-                to_delete.append(obj['Key'])
-        for key in to_delete:
-            self._s3_client.delete_object(Bucket=src_bucket, Key=key)
+        paginator = self._s3_client.get_paginator('list_objects_v2')
+        for page in paginator.paginate(Bucket=src_bucket, Prefix=src_prefix):
+            contents = page.get('Contents', [])
+            if not contents:
+                continue
+            for obj in contents:
+                rel = obj['Key'][len(src_prefix):]
+                self._s3_client.copy_object(CopySource={'Bucket': src_bucket, 'Key': obj['Key']},
+                                            Bucket=dst_bucket, Key=dst_prefix + rel)
+            to_delete_payload = {'Objects': [{'Key': obj['Key']} for obj in contents]}
+            self._s3_client.delete_objects(Bucket=src_bucket, Delete=to_delete_payload)

@wzh1994 wzh1994 merged commit a0ddc95 into LazyAGI:main Mar 31, 2026
4 of 5 checks passed