QORA-4B is a 4-billion parameter language model with built-in vision. Used Burn #4608
blockmandev started this conversation in General
Pure Rust multimodal inference engine based on Qwen3.5-4B. No Python, no CUDA, no external ML frameworks. Single executable + model weights = portable AI that runs on any machine.
GPU accelerated — auto-detects a Vulkan (Windows/Linux) or Metal (macOS) GPU at startup and runs inference on it, falling back to the CPU if no GPU is available. Smart system awareness — detects available RAM and CPU at startup and adjusts generation limits automatically.
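A minimal sketch of what that startup logic could look like — this is illustrative pseudocode in Rust, not the project's actual code; the `Backend` enum, `pick_backend`, `max_new_tokens`, and the RAM thresholds are all hypothetical:

```rust
// Hypothetical sketch of the startup behavior described above:
// pick a GPU backend per platform, fall back to CPU, and scale
// the generation limit with installed RAM.

#[derive(Debug, PartialEq)]
enum Backend {
    Vulkan, // Windows / Linux
    Metal,  // macOS
    Cpu,    // fallback when no GPU is found
}

/// Choose a backend: Metal on macOS, Vulkan elsewhere, CPU if no GPU.
fn pick_backend(gpu_available: bool) -> Backend {
    if !gpu_available {
        Backend::Cpu
    } else if cfg!(target_os = "macos") {
        Backend::Metal
    } else {
        Backend::Vulkan
    }
}

/// Cap the number of generated tokens based on system RAM
/// (threshold values here are made up for illustration).
fn max_new_tokens(ram_gb: u64) -> usize {
    match ram_gb {
        0..=7 => 512,
        8..=15 => 1024,
        _ => 4096,
    }
}

fn main() {
    // Pretend no GPU was detected and 16 GB of RAM is installed.
    println!("backend: {:?}", pick_backend(false));
    println!("max tokens: {}", max_new_tokens(16));
}
```

In real code the `gpu_available` flag and RAM size would come from probing the Vulkan/Metal APIs and the OS at runtime rather than being passed in.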
Try: https://huggingface.co/qoranet/QORA-LLM-4B