llama.cpp server-cuda-b5307 (Public)

Install from the command line
$ docker pull ghcr.io/isotr0py/llama.cpp:server-cuda-b5307
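Once pulled, the image can be started like a standard llama.cpp server container. The exact entrypoint and flags depend on how this particular build was packaged; the command below is a sketch assuming the usual llama.cpp server image layout, with the model directory, model filename, and port as placeholders to replace with your own:

$ docker run --gpus all -v /path/to/models:/models -p 8080:8080 \
    ghcr.io/isotr0py/llama.cpp:server-cuda-b5307 \
    -m /models/your-model.gguf --host 0.0.0.0 --port 8080

The `--gpus all` flag requires the NVIDIA Container Toolkit on the host so the CUDA build can see the GPU.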

Recent tagged image versions

  • Published 4 months ago · Digest sha256:3be4d0bc58762527d074c606af467fe33599d1bc3f0d97a2caa87af2158bbc06 · 2 version downloads
  • Published 4 months ago · Digest sha256:5a8d7fb02354c5c16c4b53359296becaa212777a9f67a4284f6c4d11c0fa8098 · 2 version downloads
  • Published 4 months ago · Digest sha256:360bcc0c437f763bc4035abd5cf5996765e7bf3b582d7a824e6c960654cbdfea · 2 version downloads
  • Published 4 months ago · Digest sha256:103b53cfcbd9dbf29cdb0c8d49d986340aca66d586cb13a7a2ed43cf88cb9a4d · 2 version downloads
  • Published 4 months ago · Digest sha256:e0ffedfeb41632df83dbaf4372fff4eda2f22cb87a1be2f415dc7e401712ee43 · 2 version downloads

Details

Last published: 4 months ago
Total downloads: 24