Description
Hi. I am facing an issue with the evals visualizer.
I use message histories in the input/output of my evals, which results in long JSON payloads. Since I want to inspect them, I set the row height to "full".
I like to collapse all message histories to get a compact view, and then open them one at a time to inspect model behavior.
However, if the first two JSON rows are collapsed and the rest are open, scrolling to the bottom of the list and back to the top automatically reopens the collapsed rows; they "forget" that I closed them. This makes me lose track of where I am while reviewing the evals.
Here is the URL of my experiment:
https://logfire-us.pydantic.dev/onbot/onbot-backend/evals/compare?experiment=0199b1458ffbe9613d5dae8898c1e1a3-093c1b371a171e2a-2025-10-04T22%3A09%3A10.139Z
Here is what I visualize:

And here is what happens when I scroll back to the top:

Python, Logfire & OS Versions, related packages (not required)
logfire="4.7.0"
platform="Linux-6.12.44-1-lts-x86_64-with-glibc2.42"
python="3.13.7 (main, Aug 15 2025, 12:34:02) [GCC 15.2.1 20250813]"
[related_packages]
requests="2.32.5"
pydantic="2.11.7"
fastapi="0.115.8"
openai="1.109.1"
protobuf="5.29.5"
rich="13.9.4"
executing="2.2.1"
opentelemetry-api="1.37.0"
opentelemetry-exporter-otlp-proto-common="1.37.0"
opentelemetry-exporter-otlp-proto-http="1.37.0"
opentelemetry-instrumentation="0.58b0"
opentelemetry-instrumentation-asgi="0.58b0"
opentelemetry-instrumentation-celery="0.58b0"
opentelemetry-instrumentation-django="0.58b0"
opentelemetry-instrumentation-httpx="0.58b0"
opentelemetry-instrumentation-requests="0.58b0"
opentelemetry-instrumentation-wsgi="0.58b0"
opentelemetry-proto="1.37.0"
opentelemetry-sdk="1.37.0"
opentelemetry-semantic-conventions="0.58b0"
opentelemetry-util-http="0.58b0"