---
slug: Config
title: Config & Parameters
description: MS-Agent Configuration & Parameters
---

# Config

MS-Agent uses a YAML file for configuration management, typically named `agent.yaml`; different scenarios can read different configuration files. The fields contained in this file are:

## Type Configuration

Optional

```yaml
# type: codeagent
type: llmagent
```

Identifies the agent type for this configuration. Two types are supported: `llmagent` and `codeagent`; the default is `llmagent`. If the YAML contains a `code_file` field, `code_file` takes priority.

## Custom Code

Optional; used when customizing `LLMAgent`

```yaml
code_file: custom_agent
```

An external agent class can be used; it must inherit from `LLMAgent`, and several of its methods can be overridden. If `code_file` is set, the `type` field does not take effect.

## LLM Configuration

Required

```yaml
llm:
  # LLM service backend
  service: modelscope
  # Model id
  model: Qwen/Qwen3-235B-A22B-Instruct-2507
  # Model api_key
  modelscope_api_key:
  # Model base_url
  modelscope_base_url: https://api-inference.modelscope.cn/v1
```

## Inference Configuration

Required

```yaml
generation_config:
  # These are standard OpenAI SDK parameters; other parameters supported by the OpenAI SDK can also be configured here.
  top_p: 0.6
  temperature: 0.2
  top_k: 20
  stream: true
  extra_body:
    enable_thinking: false
```

## system and query

Optional, but system is recommended

```yaml
prompt:
  # LLM system prompt; defaults to `you are a helpful assistant.` if not provided
  system:
  # Initial LLM query, usually not needed
  query:
```

## callbacks

Optional, recommended

```yaml
callbacks:
  # User input callback; automatically waits for user input after each assistant reply
  - input_callback
```

## Tool Configuration

Optional, recommended

```yaml
tools:
  # Tool name
  file_system:
    # Whether the tool is an MCP tool
    mcp: false
    # Excluded functions, can be empty
    exclude:
      - create_directory
      - write_file
  amap-maps:
    mcp: true
    type: sse
    url: https://mcp.api-inference.modelscope.net/xxx/sse
    exclude:
      - map_geo
  # Local codebase / document search (sirchmunk), exposed as the `localsearch` tool
  localsearch:
    mcp: false
    paths:
      - ./src
      - ./docs
    work_path: ./.sirchmunk
    mode: FAST
    # Optional: llm_api_key, llm_base_url, llm_model_name (otherwise inherited from `llm`)
    # When true, a shallow sirchmunk DirectoryScanner run at tool connect injects file titles/previews
    # into the `localsearch` tool description (default: false)
    # description_catalog: false
    # description_catalog_max_files: 120
    # description_catalog_max_depth: 5
    # description_catalog_max_chars: 10000
    # description_catalog_max_preview_chars: 400
    # description_catalog_cache_ttl_seconds: 300
    # description_catalog_exclude: []  # extra globs / dir names merged with sirchmunk defaults
```

For the complete list of supported tools and for custom tools, please refer here.

## Others

Optional, configure as needed

```yaml
# Maximum number of automatic conversation rounds, default is 20
max_chat_round: 9999

# Tool call timeout, in seconds
tool_call_timeout: 30000

# Output artifact directory
output_dir: output

# Help information, usually shown after runtime errors
help: |
  A commonly used config, try whatever you want!
```

## config_handler

To make it easy to customize the config at the beginning of a task, MS-Agent provides a mechanism called `ConfigLifecycleHandler`. This is a callback class; developers can add the following configuration to the YAML file:

```yaml
handler: custom_handler
```

This means there is a `custom_handler.py` file at the same level as the YAML file, containing a class that inherits from `ConfigLifecycleHandler` and implements two methods:

```python
from omegaconf import DictConfig  # ConfigLifecycleHandler is provided by ms-agent

class CustomHandler(ConfigLifecycleHandler):
    def task_begin(self, config: DictConfig, tag: str) -> DictConfig:
        return config

    def task_end(self, config: DictConfig, tag: str) -> DictConfig:
        return config
```

`task_begin` takes effect when the `LLMAgent` class is constructed; in this method you can modify the config. This mechanism is useful when downstream tasks in your workflow inherit the YAML configuration from upstream. Note the `tag` parameter, which carries the name of the current `LLMAgent`, making it easy to identify the current workflow node.
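
As a hedged sketch of how a handler might use `tag`: the real API passes an omegaconf `DictConfig` and requires inheriting `ConfigLifecycleHandler`, but a plain dict and a bare class stand in here so the snippet runs without ms-agent installed, and the per-tag rule is purely hypothetical:

```python
# Illustrative sketch only: MS-Agent passes an omegaconf DictConfig and the
# handler must inherit ConfigLifecycleHandler; a plain dict and a bare class
# stand in here so the example runs standalone.
class CustomHandler:
    def task_begin(self, config: dict, tag: str) -> dict:
        # Hypothetical rule: lower the sampling temperature for a "summarize" node
        if tag == "summarize":
            config["generation_config"]["temperature"] = 0.1
        return config

    def task_end(self, config: dict, tag: str) -> dict:
        return config

config = {"generation_config": {"temperature": 0.2}}
config = CustomHandler().task_begin(config, tag="summarize")
print(config["generation_config"]["temperature"])  # 0.1
```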

## Command Line Configuration

In addition to YAML configuration, MS-Agent also supports several command line parameters.

- `query`: Initial query; takes priority over `prompt.query` in the YAML
- `config`: Configuration file path; a ModelScope model-id is also supported
- `trust_remote_code`: Whether to trust external code. If a configuration contains external code, this parameter must be set to `true` for it to take effect
- `load_cache`: Continue the conversation from historical messages. The cache is automatically stored under the configured output directory. Default is `False`
- `mcp_server_file`: Reads an external MCP tool configuration, in the format:

  ```json
  {
    "mcpServers": {
      "amap-maps": {
        "type": "sse",
        "url": "https://mcp.api-inference.modelscope.net/..."
      }
    }
  }
  ```
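
Since this file is plain JSON, it can be generated and sanity-checked with the standard `json` module before being passed via `--mcp_server_file`. A minimal sketch (the server name, URL, and file name are placeholders, not real endpoints):

```python
import json

# Build an MCP server configuration in the format shown above.
# All values here are placeholders for illustration.
mcp_config = {
    "mcpServers": {
        "amap-maps": {
            "type": "sse",
            "url": "https://mcp.api-inference.modelscope.net/xxx/sse",
        }
    }
}

# Write the file, then read it back.
with open("mcp_server_file.json", "w", encoding="utf-8") as f:
    json.dump(mcp_config, f, indent=2)

with open("mcp_server_file.json", encoding="utf-8") as f:
    loaded = json.load(f)

# Basic sanity checks: every server entry needs a type and a url.
assert "mcpServers" in loaded
for name, server in loaded["mcpServers"].items():
    assert {"type", "url"} <= set(server)
```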

Any configuration in `agent.yaml` can be overridden with new values via the command line, and can also be read from an environment variable with the same name (case insensitive), for example `--llm.modelscope_api_key xxx-xxx`.

- `knowledge_search_paths`: Comma-separated local search paths. Merged into `tools.localsearch.paths`; registers the `localsearch` tool (sirchmunk) for on-demand use by the model, not automatic per-turn injection. LLM settings are inherited from the `llm` module unless you set the `tools.localsearch.llm_*` fields.

## Quick Start for Knowledge Search

Use `--knowledge_search_paths` or define `tools.localsearch` in the YAML so the model can call `localsearch` when needed:

```shell
# Using the default agent.yaml configuration; LLM settings are reused automatically
ms-agent run --query "How to implement user authentication?" --knowledge_search_paths "/path/to/docs"

# Specify a configuration file
ms-agent run --config /path/to/agent.yaml --query "your question" --knowledge_search_paths "/path/to/docs"
```

LLM-related parameters (`api_key`, `base_url`, `model`) are automatically inherited from the `llm` module in the configuration file; there is no need to configure them again. To use a dedicated LLM for sirchmunk, set `tools.localsearch.llm_api_key`, `llm_base_url`, and `llm_model_name` in the YAML. The legacy top-level `knowledge_search` section with the same keys is still read for backward compatibility.