Readable LLM responses while streaming/generating #5031

@fernandaspets

Description

What specific problem does this solve?

The problem is that when the LLM starts streaming its response, I can't follow along and read it. When I scroll up, the view keeps jumping right back to the bottom, and the box the streaming text appears in isn't big enough to give me time to read almost anything. It would be very nice if it were easy to scroll back and forth, so the place you want to read stays put and doesn't hop back to the bottom of the stream before you get a chance to read it.
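A common way UIs handle this is to suspend auto-scroll whenever the user has scrolled away from the bottom, and resume it once they return. Here's a minimal sketch of that heuristic; the function and variable names are hypothetical, not this project's actual API:

```javascript
// Hypothetical sketch: only auto-scroll while the user is already at
// (or within `threshold` pixels of) the bottom of the chat container.
// If they scrolled up to read, their position is left alone.
function shouldAutoScroll(scrollTop, clientHeight, scrollHeight, threshold = 40) {
  const distanceFromBottom = scrollHeight - (scrollTop + clientHeight);
  return distanceFromBottom <= threshold;
}

// Illustrative use in a streaming token handler (element names assumed):
// if (shouldAutoScroll(el.scrollTop, el.clientHeight, el.scrollHeight)) {
//   el.scrollTop = el.scrollHeight; // follow the stream
// } // otherwise: do nothing, preserving the reader's position
```

The threshold gives a small tolerance so minor scroll jitter near the bottom doesn't permanently disable following the stream.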

Additional context (optional)

No response

Request checklist

  • I've searched existing Issues and Discussions for duplicates
  • This describes a specific problem with clear impact and context

Interested in implementing this?

  • Yes, I'd like to help implement this feature

Implementation requirements

  • I understand this needs approval before implementation begins

How should this be solved? (REQUIRED if contributing, optional otherwise)

No response

How will we know it works? (Acceptance Criteria - REQUIRED if contributing, optional otherwise)

No response

Technical considerations (REQUIRED if contributing, optional otherwise)

No response

Trade-offs and risks (REQUIRED if contributing, optional otherwise)

No response

Metadata

Assignees

No one assigned

    Labels

    Type

    No type

    Projects

    Status

    Done

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests