v3.0.0-beta.23
Pre-release
3.0.0-beta.23 (2024-06-09)
Features
- parallel function calling (#225) (95f4645)
- preload prompt (#225) (95f4645)
- prompt completion engine (#225) (95f4645)
- chat wrapper based system message support (#225) (95f4645)
- add prompt completion to the Electron example (#225) (95f4645)
- model compatibility warnings (#225) (95f4645)
- Functionary `v2.llama3` support (#225) (95f4645)
- parallel function calling with plain Llama 3 Instruct (#225) (95f4645)
- improve function calling support for default chat wrapper (#225) (95f4645)
- parallel model downloads (#225) (95f4645)
- improve the electron example (#225) (95f4645)
- `customStopTriggers` for `LlamaCompletion` (#225) (95f4645)
- improve loading status in the Electron example (#226) (4ea0c3c)
Shipped with llama.cpp release `b3091`

To use the latest `llama.cpp` release available, run `npx --no node-llama-cpp download --release latest`. (learn more)