llama.cpp server-cuda-b4908 (Public)

Install from the command line
$ docker pull ghcr.io/orca-zhang/llama.cpp:server-cuda-b4908
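Once pulled, the image can be started like any llama.cpp server container. A minimal sketch is below; the model path, volume mount, port, and GPU flags are assumptions and must match your own setup (the image tag is the one published above):

```shell
# Run the llama.cpp CUDA server image, exposing its HTTP API on port 8080.
# Assumes an NVIDIA GPU with the NVIDIA Container Toolkit installed, and a
# GGUF model at ./models/model.gguf (hypothetical path -- adjust as needed).
docker run --gpus all -p 8080:8080 \
  -v "$(pwd)/models:/models" \
  ghcr.io/orca-zhang/llama.cpp:server-cuda-b4908 \
  -m /models/model.gguf --host 0.0.0.0 --port 8080 -ngl 99
```

With the container running, the server's health endpoint can be checked with `curl http://localhost:8080/health`.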

Recent tagged image versions

  • Published 5 months ago · Digest sha256:cd01d8855eb7114a47d2e14df4dece8ccfd3de0ac1a31250f63a25a9b9999449 · 2 downloads
  • Published 5 months ago · Digest sha256:92627109247caf00f1604e857c67036a561a5c67b8b31b9f3ec82a585c1b912f · 2 downloads
  • Published 5 months ago · Digest sha256:4f4aa58ec6ec5428f0c5718fd0cf05ad6c00caef925145ee0f75758b2456deac · 2 downloads
  • Published 5 months ago · Digest sha256:eecb9204783024d319f0fa26240859bd17c7d5be7bf2313be028b7ec9be8f3de · 2 downloads
  • Published 5 months ago · Digest sha256:be3495a4a71084c1325d2aeff534e2df39cb051420104d3057f5b320e43399f5 · 2 downloads

Details

Last published: 5 months ago
Total downloads: 20