1164 commits to master since this release
What's Changed
- @FIR-722: ggml-tsi-kernel latest changes updated by @akapoor3518 in #6
- Llama.cpp: Webserver & HTML pages support by @akapoor3518 in #8
- @FIR-733: Llama.cpp: Webserver, add JOB status support for Model by @akapoor3518 in #9
- @FIR-731 - serial_script.py changes to identify end of output by @LewisLui777 in #10
- @FIR-737: Added another endpoint to invoke llama-cli directly in URL by @atrivedi-tsavoritesi in #11
- @FIR-738: Updated the run_llama_cli to be run instead of by @atrivedi-tsavoritesi in #12
- @FIR-736: llama.cpp: Disable all logs except token generation log by @akapoor3518 in #13
- @FIR-746: Changed run_platform_test.sh to run_llama_cli.sh in flaskIfc.py by @LewisLui777 in #14
- @FIR-742: Add system-info, txe-restart functionality and cd to right … by @atrivedi-tsavoritesi in #16
- @FIR-720: GGML: Add TMU (MAT_MUL) kernel by @akapoor3518 in #17
- @FIR-754: Added all parameter parsing for the llama-cli by @atrivedi-tsavoritesi in #18
- @FIR-756: Removed the echo of command in flask output by @atrivedi-tsavoritesi in #19
- @FIR-757: Update SDK to 0.1.4 and update release to 0.0.3 for tsi-ggml by @atrivedi-tsavoritesi in #20
New Contributors
- @LewisLui777 made their first contribution in #10
Full Changelog: tsi-ggml-0.0.2...tsi-ggml-0.0.3