Replies: 2 comments
-
Yes, please! It's much needed.
-
Definitely, much needed.
Feature request
GitHub: https://github.com/microsoft/BitNet
bitnet.cpp is the official inference framework for 1-bit LLMs (e.g., BitNet b1.58). It offers a suite of optimized kernels that support fast and lossless inference of 1.58-bit models on CPU (with NPU and GPU support coming next).
Motivation
We could use it like other LLM providers, such as Ollama, OpenAI, etc.
Proposal (If applicable)
No response