Releases: withcatai/node-llama-cpp
v2.8.2 (2023-12-09)
Bug Fixes
adapt to breaking changes of llama.cpp (#117) (595a6bc)
v2.8.1 (2023-12-06)
v3.0.0-beta.1 (2023-11-26)
Features
BREAKING CHANGES
completely new API (docs will be updated before a stable version is released)
v2.8.0 (2023-11-06)
Features
v2.7.5 (2023-11-05)
v2.7.4 (2023-10-25)
Bug Fixes
do not download redundant node headers (#80) (ff1644d)
improve cmake custom options handling (#80) (ff1644d)
do not set CMAKE_GENERATOR_TOOLSET for CUDA (#80) (ff1644d)
do not fetch information from GitHub when using a local git bundle (#80) (ff1644d)
GBNF JSON schema string const formatting (#80) (ff1644d)
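For context on the string-const fix above: when a JSON schema's `const` string is rendered into a GBNF grammar, the grammar must match the JSON-encoded form of the string, and quotes and backslashes must then be escaped a second time for GBNF's own double-quoted literal syntax. A toy illustration of the idea — `gbnfStringConst` is a hypothetical helper, not node-llama-cpp's actual implementation:

```typescript
// Toy sketch: render a JSON-schema string `const` as a GBNF literal.
// GBNF literals are double-quoted, so after JSON-encoding the value we
// escape backslashes and quotes once more for the GBNF quoting layer.
// (Illustrative only — not node-llama-cpp's actual implementation.)
function gbnfStringConst(value: string): string {
    const jsonText = JSON.stringify(value);  // the JSON form the model must emit
    const escaped = jsonText
        .replace(/\\/g, "\\\\")              // escape backslashes for GBNF
        .replace(/"/g, "\\\"");              // escape quotes for GBNF
    return `"${escaped}"`;                   // wrap as a GBNF literal
}

console.log(gbnfStringConst("plain"));        // "\"plain\""
console.log(gbnfStringConst('he said "hi"'));
```

Getting either escaping layer wrong produces a grammar that rejects valid output or fails to parse, which is the class of bug the entry above addresses.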
Features
adapt to the latest llama.cpp interface (#80) (ff1644d)
print helpful information to help resolve issues when they happen (#80) (ff1644d)
make portable cmake on Windows more stable (#80) (ff1644d)
update CMakeLists.txt to match llama.cpp better (#80) (ff1644d)
v2.7.3 (2023-10-13)
v2.7.2 (2023-10-12)
Features
save and load chat history in the chat command (#71) (dc88531)
v2.7.1 (2023-10-11)
Bug Fixes
GeneralChatPromptWrapper output (#70) (4ff8189)
improve JSON schema validation error messages (#69) (c41da09)
v2.7.0 (2023-10-11)
Features
add JSON schema grammar support (#68) (8ceac05)
add promptWithMeta function to LlamaChatSession (#68) (8ceac05)
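To illustrate what JSON schema grammar support means in practice: a schema is compiled into a GBNF grammar that constrains token sampling, so the model can only emit JSON matching the schema. A toy converter sketching the idea — this is not node-llama-cpp's actual generator, which covers far more of JSON Schema:

```typescript
// Toy JSON-schema → GBNF converter illustrating the idea behind
// schema-constrained sampling. Supports only string, number, and
// flat objects with a fixed key order and no whitespace.
type Schema =
    | {type: "string"}
    | {type: "number"}
    | {type: "object"; properties: Record<string, Schema>};

function toGbnf(schema: Schema): string {
    if (schema.type === "string")
        return `"\\"" [^"]* "\\""`;      // a quoted string with no inner quotes
    if (schema.type === "number")
        return `[0-9]+ ("." [0-9]+)?`;   // a simple decimal number
    const fields = Object.entries(schema.properties)
        .map(([key, value]) => `"\\"${key}\\":" ${toGbnf(value)}`)
        .join(` "," `);
    return `"{" ${fields} "}"`;
}

console.log("root ::= " + toGbnf({
    type: "object",
    properties: {answer: {type: "string"}}
}));
```

A grammar like the one printed here only admits outputs of the shape `{"answer":"..."}`, which is why schema-constrained generation pairs naturally with `grammar.parse`-style post-processing into a typed object.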