
llama-server with cpu device is not working in docker image #2634

@b-reich

Description

services:
  tabby:
    restart: always
    image: tabbyml/tabby
    entrypoint: /opt/tabby/bin/tabby-cpu
    command: serve --model StarCoder-1B --chat-model Qwen2-1.5B-Instruct
    volumes:
      - ".data/tabby:/data"
    ports:
      - 8080:8080

This configuration, which is documented at https://tabby.tabbyml.com/docs/quick-start/installation/docker-compose/, does not work:

tabby-1  | 2024-07-13T12:53:36.624504Z  WARN llama_cpp_server::supervisor: crates/llama-cpp-server/src/supervisor.rs:99: llama-server <embedding> exited with status code 127
tabby-1  | 2024-07-13T12:53:36.624528Z  WARN llama_cpp_server::supervisor: crates/llama-cpp-server/src/supervisor.rs:111: <embedding>: /opt/tabby/bin/llama-server: error while loading shared libraries: libcuda.so.1: cannot open shared object file: No such file or directory

Originally posted by @b-reich in #2082 (comment)
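The loader error above can be confirmed with `ldd`, which lists the shared libraries a dynamically linked binary requires and marks any the loader cannot resolve. A minimal sketch (assumption: a Linux host with `ldd` available; `/bin/ls` stands in for the real path, since `/opt/tabby/bin/llama-server` only exists inside the container):

```shell
# Diagnostic sketch: `ldd` resolves each shared-library dependency of a
# binary and prints "not found" for any the loader cannot locate.
# Inside the running container the path to check would be
# /opt/tabby/bin/llama-server; /bin/ls is used here so the command is
# runnable anywhere.
ldd /bin/ls

# Against the real binary, a line such as
#   libcuda.so.1 => not found
# would confirm that the bundled llama-server was linked against the CUDA
# runtime even when the CPU entrypoint is selected, which is what surfaces
# as the exit status 127 in the log above.
```

The same check can be run in the container itself with `docker compose exec tabby ldd /opt/tabby/bin/llama-server` (service name `tabby` as in the compose file above).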
