fix: Change the max_token of the Qianfan large model to max_output_token #895
Triggered via pull request: April 22, 2025 10:07
Status: Success
Total duration: 19s
Artifacts: none

llm-code-review.yml

on: pull_request
Job: llm-code-review (15s)
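The PR renames the token-limit parameter sent to the Qianfan large model from `max_token` to `max_output_token`. A minimal sketch of that change, assuming the project builds a plain parameter dict for the Qianfan API (the function name and surrounding fields here are hypothetical, taken only from the PR title):

```python
def build_qianfan_params(max_output_token: int = 1024) -> dict:
    """Build a Qianfan chat-completion parameter dict (illustrative only)."""
    return {
        "temperature": 0.7,
        # Renamed from "max_token": the Qianfan API expects the output-length
        # limit under this key, so the old name was silently ignored.
        "max_output_token": max_output_token,
    }

params = build_qianfan_params(2048)
print(params["max_output_token"])  # → 2048
```

The fix is a pure key rename: the value and its meaning (maximum number of generated tokens) are unchanged, only the field name the API recognizes is corrected.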