
llama.cpp.mcp server-cuda-b6164 Public Latest

Install from the command line
$ docker pull ghcr.io/klogdotwebsitenotdotcom/llama.cpp.mcp:server-cuda-b6164
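A minimal sketch of running the pulled image follows, assuming the image's entrypoint is llama.cpp's llama-server binary and that a GGUF model is mounted from the host; the model path, port mapping, and mount point are illustrative placeholders, not details confirmed by this package page.

# Run the CUDA server image with GPU access (assumes NVIDIA Container Toolkit is installed).
# /path/to/models and model.gguf are hypothetical; adjust to your local model location.
$ docker run --gpus all --rm -p 8080:8080 \
    -v /path/to/models:/models \
    ghcr.io/klogdotwebsitenotdotcom/llama.cpp.mcp:server-cuda-b6164 \
    -m /models/model.gguf --host 0.0.0.0 --port 8080

If the run succeeds, the server listens on localhost:8080 on the host; the trailing arguments after the image name are passed to the container's entrypoint, so they only apply if that entrypoint is in fact llama-server.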

Recent tagged image versions

  • Published about 20 hours ago · Digest sha256:ae437986ef9d589fc0ee2fd7fc414c21d975f0501f66243b642ab2a3f14a8b1e · 0 version downloads
  • Published about 20 hours ago · Digest sha256:5672039fc17eea7738fe91b38a8f2e87a7fdd4e786f90818ddcf4cd462abd983 · 0 version downloads
  • Published about 20 hours ago · Digest sha256:07cb5224171aebedbd4db54b12bb94fbb41fb34be8cb9ffe15a9687ca0056d18 · 0 version downloads
  • Published about 20 hours ago · Digest sha256:a3307bd273512834bc385554cffbe9f83cdabac1008fb45c990cd28932d7f7d7 · 0 version downloads
  • Published about 20 hours ago · Digest sha256:ecd77ae633c6e1bc7dc8e19106e0f62cd2a7ceb85b8d33cc74d605f0aa087c56 · 0 version downloads

Last published: 20 hours ago

Total downloads: 0