Actions: ngxson/llama.cpp

Python check requirements.txt

443 workflow runs

Run   Commit   Branch      Duration  Commit message
#133  6089b0a  xsn/vision  2m 48s    simple example works
#132  1927378  master      2m 30s    convert : refactor rope_freqs generation (#9396)
#131  08a43d0  master      2m 33s    py : update transfomers version (#9694)
#130  6854ad4  xsn/vision  2m 37s    img pre processing
#129  a75c5c4  xsn/vision  2m 29s    model is loadable
#128  cd806a7  xsn/vision  2m 47s    add llava to conversion
#125  3d6bf69  master      2m 32s    llama : add IBM Granite MoE architecture (#9438)
#123  0d2ec43  master      2m 50s    llama : support IBM Granite architecture (#9412)
#122  d54c21d  master      10m 16s   convert : identify missing model files (#9397)
#121  3c7989f  master      2m 30s    py : add "LLaMAForCausalLM" conversion support (#9485)
#120  e665744  master      2m 33s    llava : fix the script error in MobileVLM README (#9054)
#119  8db003a  master      2m 37s    py : support converting local models (#7547)
#118  00ba2ff  master      2m 31s    metal : fix compile warning with GGML_METAL_NDEBUG (#0)

All of the runs above were triggered by commits pushed by ngxson.
ProTip! You can narrow down the results and go further in time using created:<2024-09-06 or the other filters available.
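
For example, assuming the usual search qualifiers on the workflow runs page (the exact qualifier names are an assumption here, apart from created: which the tip above confirms), a query combining a branch, an actor, and the date cutoff could look like:

    branch:master actor:ngxson created:<2024-09-06

Such a query would narrow the list to runs on master pushed by ngxson before 2024-09-06.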