Releases: tsisw/llama.cpp
tsi-ggml-0.0.3
What's Changed
- FIR-722 - ggml-tsi-kernel latest changes updated by @akapoor3518 in #6
- Llama.cpp: Webserver & HTML pages support by @akapoor3518 in #8
- @FIR-733 - Llama.cpp: Webserver, add JOB status support for Model by @akapoor3518 in #9
- @FIR-731 - serial_script.py changes to identify end of output by @LewisLui777 in #10
- @FIR-737: Added another endpoint llama-cli to invoke directly in URL by @atrivedi-tsavoritesi in #11
- @FIR-738: Updated the run_llama_cli to be run instead of by @atrivedi-tsavoritesi in #12
- @FIR-736 - llama.cpp: Disable all logs except token generation log by @akapoor3518 in #13
- FIR-746 - Changed run_platform_test.sh to run_llama_cli.sh in flaskIfc.py by @LewisLui777 in #14
- @FIR-742: Add system-info, txe-restart functionality and cd to right … by @atrivedi-tsavoritesi in #16
- @FIR-720 - GGML: Add TMU (MAT_MUL) kernel by @akapoor3518 in #17
- @FIR-754: Added all parameter parsing for the llama-cli by @atrivedi-tsavoritesi in #18
- @FIR-756: Removed the echo of command in flask output by @atrivedi-tsavoritesi in #19
- @FIR-757: Update SDK to 0.1.4 and update release to 0.0.3 for tsi-ggml by @atrivedi-tsavoritesi in #20
New Contributors
- @LewisLui777 made their first contribution in #10
Full Changelog: tsi-ggml-0.0.2...tsi-ggml-0.0.3
tsi-ggml-0.0.2
What's Changed
- @FIR-702 - llama.cpp: Sync with latest opensource by @akapoor3518 in #1
- @FIR-707: Fix requirement for libgomp and move to new sdk 0.1.2 by @atrivedi-tsavoritesi in #2
- FIR-709 - GGML: Adding SILU Kernel by @akapoor3518 in #4
- FIR-714: Updated the SDK Release r0.1.3 by @atrivedi-tsavoritesi in #5
- Support for SILU
- SDK r0.1.3 Release for MLIR/Runtime
New Contributors
- @akapoor3518 made their first contribution in #1
Full Changelog: https://github.com/tsisw/llama.cpp/commits/tsi-ggml-0.0.2