
fix(llma): extract model from response for OpenAI stored prompts #1266

Triggered via pull request: December 19, 2025 16:21
Status: Success
Total duration: 2m 41s
Artifacts

ci.yml

on: pull_request
Code quality checks (22s)
Django 5 integration tests (23s)
Matrix: tests

Annotations

6 warnings
Code quality checks: Could not find required-version under [tool.uv] in pyproject.toml. Falling back to latest
Python 3.10 tests: Could not find required-version under [tool.uv] in pyproject.toml. Falling back to latest
Python 3.12 tests: Could not find required-version under [tool.uv] in pyproject.toml. Falling back to latest
Python 3.11 tests: Could not find required-version under [tool.uv] in pyproject.toml. Falling back to latest
Python 3.14 tests: Could not find required-version under [tool.uv] in pyproject.toml. Falling back to latest
Python 3.13 tests: Could not find required-version under [tool.uv] in pyproject.toml. Falling back to latest
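All six warnings share one cause: the setup-uv step looks for a pinned uv version under `[tool.uv]` in `pyproject.toml` and falls back to the latest release when none is found. Pinning it would silence the warning and make CI installs reproducible; a sketch of the fix (the version shown is illustrative, not the project's actual pin):

```toml
[tool.uv]
# Pin the uv version used locally and by setup-uv in CI.
# "0.5.0" is an example; pick the version the project actually standardizes on.
required-version = "0.5.0"
```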