
llama.cpp server-cuda-b6090 (public, latest)

Install from the command line
$ docker pull ghcr.io/shalinib-ibm/llama.cpp:server-cuda-b6090
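
Once pulled, the container can be started with GPU access. The following is a minimal sketch, assuming this image follows the upstream llama.cpp server-cuda convention (an llama-server entrypoint) and that the NVIDIA Container Toolkit is installed on the host; the model path, model file name, and port below are placeholders:

$ docker run --gpus all -p 8080:8080 \
    -v /path/to/models:/models \
    ghcr.io/shalinib-ibm/llama.cpp:server-cuda-b6090 \
    -m /models/model.gguf --host 0.0.0.0 --port 8080 --n-gpu-layers 99

If the entrypoint matches upstream llama-server, the server exposes an HTTP API (including an OpenAI-compatible endpoint) on the published port.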

Recent tagged image versions

  • Published about 11 hours ago · digest sha256:970533ad58ad3fa874e9b71254fb8c609e60ee103946fe4ae4c61ee2353e190b · 0 version downloads
  • Published about 11 hours ago · digest sha256:479439ab042ffcde7288629470dd5c20034f4f264b21cab3fdd73241651b4167 · 0 version downloads
  • Published about 12 hours ago · digest sha256:dd6c400142187dd85f270c05b801dbee03db9769a6ada9bfbc095e11a513e60a · 0 version downloads
  • Published about 12 hours ago · digest sha256:7531306645c5a3d861f96ebdaefefea59ac85ec51d9a63c480bb516f5caa68a4 · 0 version downloads
  • Published about 12 hours ago · digest sha256:8b317b1f1f75d99a7fc7c1bbf6eb3d1cdc60918a389f2ea1db98f3ac51b64095 · 0 version downloads
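
To pin an exact build rather than the mutable tag, any of the digests above can be pulled directly by digest; for example, using the most recently published one from the list:

$ docker pull ghcr.io/shalinib-ibm/llama.cpp@sha256:970533ad58ad3fa874e9b71254fb8c609e60ee103946fe4ae4c61ee2353e190b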


Details


Last published: 11 hours ago
Total downloads: 0