llama.cpp-gfx906 · server-cuda-b6643 (public, latest)

Install from the command line
$ docker pull ghcr.io/iacoppbk/llama.cpp-gfx906:server-cuda-b6643
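Once pulled, the container can be started like any other llama.cpp server image. The following is a minimal sketch, assuming the image uses the upstream llama-server entrypoint and that a GGUF model exists on the host under /path/to/models; since gfx906 is an AMD GPU target, ROCm device passthrough is shown here rather than --gpus all, so adjust to match how this particular image was built.

# Sketch: run the server, exposing port 8080 and mounting a host model directory.
# The --device flags assume a ROCm/gfx906 runtime; the model path is a placeholder.
$ docker run --rm -p 8080:8080 \
    --device /dev/kfd --device /dev/dri \
    -v /path/to/models:/models \
    ghcr.io/iacoppbk/llama.cpp-gfx906:server-cuda-b6643 \
    -m /models/model.gguf --host 0.0.0.0 --port 8080

If the container starts cleanly, the server should answer on http://localhost:8080, where llama-server exposes its usual HTTP API, including an OpenAI-compatible /v1/chat/completions endpoint.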

Recent tagged image versions

  • Published about 1 hour ago · sha256:fb898dd5b1637747e5ced7e344a2c49a99cc5525cf88a1c5d50581cdd01ae710 · 0 version downloads
  • Published about 1 hour ago · sha256:4837a543d17f70223dffb9a25362572c94908519dd5a82d097be3c943b24ceb1 · 0 version downloads
  • Published about 1 hour ago · sha256:6a3bb59e20d0177e741be0aaea67fff060d8ceba83c5369be90055929f81188f · 0 version downloads
  • Published about 1 hour ago · sha256:2b1c011c6f56187faaa26db56798faa8eb0f3284e750df0989547c969bca9162 · 0 version downloads
  • Published about 1 hour ago · sha256:9d27997915d49f34baefe0d735a284ed8dff0cbab8089717371ce1cad18e1d07 · 0 version downloads


Last published: 1 hour ago
Discussions: 1
Issues: 2
Total downloads: 24