Actions: ggml-org/llama.cpp
7,068 workflow run results
backend_synchronize
CI #23244: Pull request #13939 opened by lhez

server: update deepseek reasoning format (pass reasoning_content as diffs)
CI #23241: Pull request #13933 synchronize by ochafik

server: update deepseek reasoning format (pass reasoning_content as diffs)
CI #23240: Pull request #13933 synchronize by ochafik

chat: improve llama 3.x handling of <|python_tag|> (+ allow --special combo)
CI #23239: Pull request #13932 synchronize by ochafik

chat: improve llama 3.x handling of <|python_tag|> (+ allow --special combo)
CI #23235: Pull request #13932 synchronize by ochafik

server: update deepseek reasoning format (pass reasoning_content as diffs)
CI #23234: Pull request #13933 synchronize by ochafik

server: update deepseek reasoning format (pass reasoning_content as diffs)
CI #23232: Pull request #13933 opened by ochafik

chat: improve llama 3.x handling of <|python_tag|> (+ allow --special combo)
CI #23230: Pull request #13932 opened by ochafik

chat: allow unclosed thinking tags
CI #23229: Pull request #13931 opened by ochafik