
llama.cpp: server-cuda-b4956

Install from the command line
$ docker pull ghcr.io/neozhangjianyu/llama.cpp:server-cuda-b4956
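
Pulling the image only downloads it; to serve a model, the container is typically started with docker run. The command below is a minimal sketch and not part of this package page: it assumes an NVIDIA GPU with the NVIDIA Container Toolkit installed and a GGUF model file in a local models/ directory, and the model filename, port mapping, and GPU layer count are placeholder values.

$ docker run --gpus all -v "$PWD/models:/models" -p 8080:8080 \
    ghcr.io/neozhangjianyu/llama.cpp:server-cuda-b4956 \
    -m /models/model.gguf --host 0.0.0.0 --port 8080 -ngl 99

Once running, the llama.cpp server listens on the mapped port and exposes its HTTP API (for example, the /completion endpoint at http://localhost:8080).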

Recent tagged image versions

  • Published 5 months ago · Digest sha256:34b92e771c11c4005359b837d721930b96eef97487d0e3d3236b4616634462be · 0 version downloads
  • Published 5 months ago · Digest sha256:bc290851d72b7ade1bc52b02bc3f4b63fa7d95e7982c130213f4c525fd05b688 · 0 version downloads
  • Published 5 months ago · Digest sha256:021a445e57afb9c1239c065809ea231946724a88275d08f0d18d9687624707f9 · 0 version downloads
  • Published 5 months ago · Digest sha256:6c143480661ede123522b0337d4d1357dd0f8a38abcd526b8d2ad70ab4b4f020 · 0 version downloads
  • Published 5 months ago · Digest sha256:42d9d4b1d2268e258af50a072e0f6bd93a90071a45323a5680d9f4a208ef4979 · 0 version downloads

Details

Last published: 5 months ago

Total downloads: 3.06K