Support for this in llama.cpp is not yet standardized enough to build robust support for it into node-llama-cpp, but it is planned and will be added once it becomes practical to evaluate embeddings together with tokens in llama.cpp.

You can watch #88 to get notified when this becomes available in node-llama-cpp.

Answer selected by giladgd