
fix(llma): extract model from response for OpenAI stored prompts #1261

Triggered via pull request on December 18, 2025 at 18:45
Status: Success
Total duration: 2m 45s

Workflow: ci.yml (on: pull_request)
Code quality checks: 21s
Django 5 integration tests: 29s
Matrix: tests

Annotations: 6 warnings
All six jobs (Code quality checks, Python 3.10 tests, Python 3.11 tests, Python 3.12 tests, Python 3.13 tests, Python 3.14 tests) emitted the same warning:

Could not find required-version under [tool.uv] in pyproject.toml. Falling back to latest
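The warning suggests the CI tooling looks for uv's `required-version` setting in `pyproject.toml` and falls back to the latest uv release when it is absent. A minimal sketch of a fix, assuming the project wants to pin uv via that setting (the version range shown is a placeholder, not taken from this repository):

```toml
# pyproject.toml (sketch; the version specifier below is hypothetical)
[tool.uv]
# Pin the uv version range this project expects. With this key present,
# CI setup no longer falls back to "latest", which keeps builds reproducible.
required-version = ">=0.5.0"
```

Pinning here would silence the warning across all six jobs, since they share the same `pyproject.toml`.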