
[runtime: TRT-LLM] support prompt audio cache & offline inference mode #447

Triggered via pull request September 8, 2025 10:00
Status Success
Total duration 17s
Artifacts

lint.yml

on: pull_request

quick-checks (5s)
flake8-py3 (13s)

Annotations

2 warnings
quick-checks
The `set-output` command is deprecated and will be disabled soon. Please upgrade to using Environment Files. For more information see: https://github.blog/changelog/2022-10-11-github-actions-deprecating-save-state-and-set-output-commands/
flake8-py3
The `set-output` command is deprecated and will be disabled soon. Please upgrade to using Environment Files. For more information see: https://github.blog/changelog/2022-10-11-github-actions-deprecating-save-state-and-set-output-commands/
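Both warnings call for the same migration: instead of emitting the deprecated `::set-output` workflow command on stdout, a step should append `key=value` to the file GitHub exposes via the `GITHUB_OUTPUT` environment variable. A minimal sketch of the change inside a step's `run` script (the `version` output name and its value are illustrative; outside a real runner the script falls back to a temp file so it stays runnable):

```shell
#!/usr/bin/env sh
# Deprecated form (workflow command printed to stdout):
#   echo "::set-output name=version::1.2.3"

# Current form: append key=value to the $GITHUB_OUTPUT environment file.
# GitHub sets GITHUB_OUTPUT in real runs; use a temp file as a local fallback.
GITHUB_OUTPUT="${GITHUB_OUTPUT:-$(mktemp)}"

echo "version=1.2.3" >> "$GITHUB_OUTPUT"

# Later steps in the same job can then read steps.<step_id>.outputs.version.
cat "$GITHUB_OUTPUT"
```

Applying this change in the `quick-checks` and `flake8-py3` job steps (or in the actions they invoke) removes both annotations.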