Cannot get the whole response with llama.cpp #50

@thuongshoo

Description

Describe the bug

Privy chat keeps showing three dots (...) while the Privy log shows "End prompt".
There is also an error log entry with the content: "Error: Type validation failed: Structure:". Please see the screenshot for details.
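The "Type validation failed: Structure" message suggests Privy could not validate the shape of the streamed llama.cpp response. A minimal sketch of what parsing such a stream involves (the `data: {...}` line shape follows llama.cpp's /completion SSE format; the parsing code is illustrative, not Privy's actual implementation):

```python
import json

def parse_sse_chunk(line: str) -> dict:
    """Strip the 'data: ' prefix from one SSE line and decode the JSON payload."""
    prefix = "data: "
    if not line.startswith(prefix):
        raise ValueError("unexpected SSE line: " + line)
    return json.loads(line[len(prefix):])

# Example line as llama.cpp's /completion endpoint streams it
# (field names are an assumption based on llama.cpp server docs, not Privy source).
raw = 'data: {"content": "Hello", "stop": false}'
chunk = parse_sse_chunk(raw)
print(chunk["content"])  # the token text for this chunk
print(chunk["stop"])     # False until generation finishes
```

If the server emits a line whose structure differs from what the client's validator expects, parsing fails partway through and the UI can stall on a loading indicator, which would match the symptom above.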

How to reproduce

  1. Set up a llama.cpp server and choose llama.cpp as the Privy provider
  2. Open Privy chat
  3. Submit a query
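To isolate whether the truncation comes from Privy or from the server itself, the steps above can be approximated with a direct request to llama.cpp's /completion endpoint (the URL, port, and parameters here are assumptions for a default local build, not taken from the report):

```python
import json
import urllib.request

def build_payload(prompt: str, n_predict: int = 128) -> dict:
    """Request body for llama.cpp's /completion endpoint (non-streaming)."""
    return {"prompt": prompt, "n_predict": n_predict}

def query_llama_cpp(prompt: str, url: str = "http://127.0.0.1:8080/completion") -> str:
    """Send a completion request and return the generated text."""
    data = json.dumps(build_payload(prompt)).encode()
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["content"]

if __name__ == "__main__":
    # Requires a llama.cpp server running on the assumed host/port.
    print(query_llama_cpp("Hello"))
```

If the full answer comes back here but not in the chat window, the problem is on the Privy side rather than in llama.cpp.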

Expected behavior

Display the whole answer from llama.cpp

Screenshots

[Screenshot: privy_bug]

Additional information

  • Privy Version 0.2.8
  • WSL Ubuntu 20.04
  • llama.cpp:
    • local build
    • git: commit 924518e2e5726e81f3aeb2518fb85963a500e93a (HEAD -> master, tag: b4466, origin/master, origin/HEAD)
      Author: Eric Curtin ecurtin@redhat.com
      Date: Sun Jan 12 18:23:10 2025 +0000
    • version: build: 4466 (924518e2) with cc (Ubuntu 9.4.0-1ubuntu1~20.04.2) 9.4.0 for x86_64-linux-gnu

Metadata

Assignees: No one assigned
Labels: bug (Something isn't working)
Projects: No projects
Milestone: No milestone
Relationships: None yet
Development: No branches or pull requests