
fix(llma): extract model from response for OpenAI stored prompts #1268

Triggered via pull request December 19, 2025 17:39
Status: Success
Total duration: 2m 50s
Artifacts

ci.yml

on: pull_request
Code quality checks: 22s
Django 5 integration tests: 26s
Matrix: tests

Annotations

6 warnings
Code quality checks: Could not find required-version under [tool.uv] in pyproject.toml. Falling back to latest
Python 3.10 tests: Could not find required-version under [tool.uv] in pyproject.toml. Falling back to latest
Python 3.11 tests: Could not find required-version under [tool.uv] in pyproject.toml. Falling back to latest
Python 3.12 tests: Could not find required-version under [tool.uv] in pyproject.toml. Falling back to latest
Python 3.13 tests: Could not find required-version under [tool.uv] in pyproject.toml. Falling back to latest
Python 3.14 tests: Could not find required-version under [tool.uv] in pyproject.toml. Falling back to latest
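Every job emits the same warning because the uv setup step in CI looks for a pinned uv version under `[tool.uv]` in pyproject.toml and, finding none, installs the latest release. A minimal sketch of the setting it expects is below; the `">=0.5.0"` bound is an illustrative assumption, not taken from this run's logs:

```toml
[tool.uv]
# Pin the uv version range the project supports; the CI setup step
# reads this value instead of falling back to the latest release.
# ">=0.5.0" is a placeholder bound chosen for illustration.
required-version = ">=0.5.0"
```

Adding this would silence the warning across all six jobs and make CI runs reproducible against a known uv version range.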