
fix(llma): extract model from response for OpenAI stored prompts #1263
Triggered via pull request December 19, 2025 16:05
Status Success
Total duration 2m 39s

ci.yml
on: pull_request

Jobs:
- Code quality checks — 26s
- Django 5 integration tests — 19s
- Matrix: tests
Annotations

6 warnings — each job emitted the same warning:
"Could not find required-version under [tool.uv] in pyproject.toml. Falling back to latest"

- Code quality checks
- Python 3.10 tests
- Python 3.11 tests
- Python 3.12 tests
- Python 3.13 tests
- Python 3.14 tests
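The warning comes from uv's setup action: when no `required-version` is declared under `[tool.uv]` in pyproject.toml, CI falls back to the latest uv release, which can make builds non-reproducible. A minimal sketch of the fix — the version specifier below is an illustrative placeholder, not the version this project actually requires:

```toml
[tool.uv]
# Pin the uv version CI should use so runs are reproducible.
# ">=0.5.0" is an illustrative placeholder; pick the range the
# project has actually been tested against.
required-version = ">=0.5.0"
```

With this in place, uv refuses to run under a version outside the specifier, and the setup step no longer falls back to latest.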