
llama.cpp server-cuda-b6123 (public package, latest version)

Install from the command line
$ docker pull ghcr.io/ryan-mangeno/llama.cpp:server-cuda-b6123
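
Once pulled, the image can be started with GPU access. The command below is a minimal sketch: it assumes the NVIDIA Container Toolkit is installed, that the container's entrypoint is the llama.cpp server binary (as in the upstream llama.cpp server-cuda images), and that a GGUF model file named model.gguf sits in ./models on the host. The -m, --host, --port, and --n-gpu-layers flags are standard llama.cpp server options; adjust paths and layer counts for your hardware.

$ docker run --gpus all -p 8080:8080 \
    -v "$(pwd)/models:/models" \
    ghcr.io/ryan-mangeno/llama.cpp:server-cuda-b6123 \
    -m /models/model.gguf --host 0.0.0.0 --port 8080 --n-gpu-layers 99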

Recent tagged image versions

  • Published about 11 hours ago · Digest sha256:b7e70c129a67a7c2d78bd65e9230159a17c8dded6c7f1999ddb613f5e2d6419f · 0 version downloads
  • Published about 11 hours ago · Digest sha256:52c496762e9b739c8c24120ccc4184f34b70ac42da6d2ac34f5b854928ce4a1a · 0 version downloads
  • Published about 12 hours ago · Digest sha256:6e8eb6bf8f2001baa677913d9b0bdb3b2666aea13a27dd301aaf9b71b2490eb6 · 0 version downloads
  • Published about 12 hours ago · Digest sha256:7a87f31d3ba0743a6d6d9f38eb3328173c5d2eb9d60d98d13fd4afcbd1f64d1a · 0 version downloads
  • Published about 12 hours ago · Digest sha256:aa2e3f3756e8970ba569ebde24d07f613c1b5fd27c3ab785f883190b73a7b141 · 0 version downloads
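
Any digest listed above can also be pulled directly, which pins the exact image regardless of where the tag later points. For example, using the most recently published digest:

$ docker pull ghcr.io/ryan-mangeno/llama.cpp@sha256:b7e70c129a67a7c2d78bd65e9230159a17c8dded6c7f1999ddb613f5e2d6419f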


Details

Last published: 11 hours ago
Total downloads: 0