
llama.cpp server-cuda-b5608 (Public · Latest)

Install from the command line
$ docker pull ghcr.io/vishalc-ibm/llama.cpp:server-cuda-b5608
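Once pulled, the container can be started along the following lines. This is a minimal sketch, assuming the image uses the upstream llama.cpp server entrypoint (llama-server) as in the official CUDA server images, and that the NVIDIA Container Toolkit is installed; the model path, model file name, and port are placeholders, not values documented by this package:

$ docker run --gpus all -p 8080:8080 \
    -v /path/to/models:/models \
    ghcr.io/vishalc-ibm/llama.cpp:server-cuda-b5608 \
    -m /models/your-model.gguf --host 0.0.0.0 --port 8080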

Recent tagged image versions

  • Published 3 months ago · Digest sha256:9de463ec25128575aa29caa6a8cf9918d5c5eaca8b0c01b43beaec3acb669c54 · 2 version downloads
  • Published 3 months ago · Digest sha256:779c7d29de6216b3079cf70829754f972e06e2cf91dad3c54ab29ad9a25eb4d3 · 2 version downloads
  • Published 3 months ago · Digest sha256:a1c5e14b5d231f8e4c8df34be70ffdf43a075f8b10933d035e2b5cd34fe40871 · 2 version downloads
  • Published 3 months ago · Digest sha256:254db20bda2d1646295515000a402b50ae5d309ff1dfb43eeff1770df5a807ac · 2 version downloads
  • Published 3 months ago · Digest sha256:2e614ce14bdcc83a82b2ec8306b27a0d4a30b04cd1e4bc03d97cc77a76feaf8d · 2 version downloads
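Any of the digests listed above can be used to pull a specific, immutable image version instead of the moving tag, using standard Docker pull-by-digest syntax, for example:

$ docker pull ghcr.io/vishalc-ibm/llama.cpp@sha256:9de463ec25128575aa29caa6a8cf9918d5c5eaca8b0c01b43beaec3acb669c54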


Details

Last published: 3 months ago
Total downloads: 25