Conversation

@Terrdi commented on Feb 19, 2025

Features

  • Support the response_format field when invoking LLMs
  • Support LLMStudio models

Feature Docs

  • Tested with LLMStudio, OpenAI, Qianfan, and Ollama.
  • The current implementation supports the JSON response format.

Influence
When invoking the LLM, you can specify the response_format field to standardize the output format.
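
For illustration, a minimal sketch of how a JSON response_format is usually passed in an OpenAI-compatible request (the client and field shown follow the standard openai Python SDK; the endpoint URL and model name are placeholders, and the way MetaGPT wires response_format through its own config may differ):

```python
# Minimal sketch, assuming the standard OpenAI Python SDK and an
# OpenAI-compatible local endpoint (e.g. LLMStudio or Ollama in compatibility mode).
from openai import OpenAI

# Hypothetical local endpoint; replace with your provider's base URL and key.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")

resp = client.chat.completions.create(
    model="local-model",  # placeholder model name
    messages=[
        {"role": "system", "content": "Reply only with valid JSON."},
        {"role": "user", "content": "List three project risks as a JSON array."},
    ],
    response_format={"type": "json_object"},  # request JSON-formatted output
)
print(resp.choices[0].message.content)
```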

Result

Other

…ble LLMStudio model deployment

- Implemented response_format to leverage platform capabilities for controlling the output of large models.
- Added support for deploying models using LLMStudio.
…/MetaGPT into feature/response_format/llmstudio