feat(dashboard): two-session AI compare from sessions.html (#421)#433

Open
jonathaneoliver wants to merge 1 commit into feat/llm-brush-range-420 from feat/llm-compare-421

Conversation

@jonathaneoliver (Owner)

Summary

What's new

content/dashboard/llm-compare-modal.js — new file:

  • Own .llmchat-compare-cb checkbox class, separate from the live testing UI's .session-checkbox.
  • Cap at 2: a third click no-ops, and the FAB surfaces the cap message.
  • MutationObserver re-injects checkboxes through picker re-renders and keeps duplicate checkboxes in sync via a shared selected Set.
  • Floating FAB bottom-right, above the 💬 toggle's offset, with a count badge. The modal reuses the .llm-modal-* CSS already defined in llm-analyze-modal.js.
  • Esc and backdrop both close; focus returns to the trigger.

content/dashboard/sessions.html — +1 script tag.

No backend changes — sessions array support landed in #416, the prompt's compare-mode shape (Similarities / Differences / Hypotheses) landed in #418.

What needs your hands-on browser verification

  • Multi-select two sessions of the same content (one with stalls, one without). Click Compare. Confirm the LLM output has the three structured sections.
  • Multi-select different content. Output notes the content difference upfront.
  • Try selecting a 3rd session — checkbox should bounce back unchecked + FAB shows the cap message.
  • Sort or filter the picker after selecting; checkboxes stay checked because the selection set is independent of DOM.
  • FAB position vs. the 💬 Discuss toggle: both sit bottom-right; the FAB is at bottom:84px to clear the 56px toggle. Visual check only.

🤖 Generated with Claude Code

Multi-select checkboxes on the picker rows + a floating "Compare 2
sessions" button that opens a modal posting {sessions:[a,b]} to
/api/session_chat. The forwarder injects "Compare sessions: [...]"
into the system preamble (#416, #418), and the prompt's compare-
mode hints drive the LLM to output Similarities / Differences /
Hypotheses sections.
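
The {sessions:[a,b]} request above can be sketched as a small payload builder. `buildComparePayload` is a hypothetical helper name, and any body fields beyond `sessions` that the real forwarder sends are not shown here.

```javascript
// Hypothetical helper: build the {sessions:[a,b]} body posted to
// /api/session_chat. Only the `sessions` field comes from the PR text;
// the guard and the name are illustrative assumptions.
function buildComparePayload(selected) {
  const sessions = Array.from(selected);
  if (sessions.length !== 2) {
    throw new Error(`compare needs exactly 2 sessions, got ${sessions.length}`);
  }
  return { sessions };
}

// Usage sketch (session IDs are made up):
// fetch('/api/session_chat', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: JSON.stringify(buildComparePayload(new Set(['sess-a', 'sess-b']))),
// });
```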

  content/dashboard/llm-compare-modal.js
    - Own checkbox class .llmchat-compare-cb (separate from the
      live testing UI's .session-checkbox to avoid interfering
      with failure-mode group operations).
    - Cap at 2: third checkbox click no-ops with the cap message
      surfaced on the FAB.
    - MutationObserver re-injects checkboxes after picker re-
      renders (sort, filter, etc.); duplicate checkbox elements
      stay in sync via the shared selected Set.
    - z-index:3 so clicks reach the checkbox, not the row's
      stretched-link overlay.
    - Floating FAB bottom-right, just above the 💬 toggle's
      offset, with a count badge.
    - Modal reuses the .llm-modal-* CSS namespace defined in
      llm-analyze-modal.js.
    - Esc + backdrop close, focus returns to trigger.

  content/dashboard/sessions.html
    +1 script tag.
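
The cap-at-2 and re-sync behavior above can be sketched around the shared Set; `toggleSelection` and `syncCheckboxes` are illustrative names, not the file's actual API.

```javascript
const CAP = 2;

// Toggle a session in the shared Set. Returns false when a third selection
// is attempted, so the caller can bounce the checkbox back unchecked and
// surface the cap message on the FAB.
function toggleSelection(selected, sessionId) {
  if (selected.has(sessionId)) {
    selected.delete(sessionId);
    return true;
  }
  if (selected.size >= CAP) return false;
  selected.add(sessionId);
  return true;
}

// After a picker re-render (sort, filter, ...), re-injected checkboxes and
// any duplicates are restored from the Set, never from stale DOM state.
function syncCheckboxes(selected, checkboxes) {
  for (const cb of checkboxes) {
    cb.checked = selected.has(cb.dataset.sessionId);
  }
}
```

Keeping the Set as the single source of truth is what lets sorting and filtering preserve selections, as noted in the verification steps.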

No backend changes — sessions array support landed in #416, the
prompt's compare-mode shape landed in #418.

What needs your hands-on browser verification:
  - Multi-select two sessions of the same content (one with
    stalls, one without). Click Compare. Confirm the LLM output
    has Similarities / Differences / Hypotheses sections.
  - Multi-select different content. Compare still works; output
    notes the content difference upfront.
  - Try selecting a 3rd session — checkbox should bounce back
    unchecked + FAB shows the cap message.
  - Sort or filter the picker after selecting; checkboxes stay
    checked because the selection set is independent of DOM.

Part of epic #412.
Closes #421.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
github-actions bot added the enhancement (New feature or request) label on May 6, 2026