GGUF Loader: Frequently Asked Questions (FAQ)
This FAQ answers common questions to help you troubleshoot and get the best experience with GGUF Loader.
GGUF Loader is an open-source local AI model loader and assistant that supports GGUF-format models, enabling offline AI chats with features like a floating button and extensible addons.
How do I load a model?
Click the Load Model button in GGUF Loader.
A window will open, allowing you to select the folder containing your GGUF model files.
Select the folder and click OK; the model will then load.
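If you prefer to script the same step outside the GUI, here is a minimal sketch of loading a GGUF file with the llama-cpp-python bindings. This only illustrates what loading a GGUF model involves; it is not GGUF Loader's own API, and the model path is a placeholder.

```python
# Minimal sketch: loading a GGUF model with llama-cpp-python
# (pip install llama-cpp-python). Illustration only, not GGUF Loader's API.
from llama_cpp import Llama

llm = Llama(
    model_path="models/mistral-7b-instruct.Q4_K_M.gguf",  # placeholder path to your GGUF file
    n_ctx=2048,  # context window size
)

# Run a quick prompt to confirm the model loaded and responds.
reply = llm("Q: What is GGUF? A:", max_tokens=64)
print(reply["choices"][0]["text"])
```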
How does the floating button work?
The floating button activates automatically as soon as the model loads.
Whenever you select any text anywhere on your desktop, the floating button appears near the selection.
Click it to chat instantly with your local AI assistant.
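For readers curious about the idea behind such an overlay, the sketch below shows a frameless, always-on-top button built with PySide6. Assuming a Qt-based desktop GUI is itself an assumption here; this is a conceptual illustration, not GGUF Loader's actual implementation.

```python
# Conceptual sketch of a frameless, always-on-top "floating button" in PySide6.
# Illustration only; GGUF Loader's real overlay and selection tracking are not shown.
import sys

from PySide6.QtCore import Qt
from PySide6.QtWidgets import QApplication, QPushButton

app = QApplication(sys.argv)

button = QPushButton("Ask AI")
# Frameless tool window that stays on top of other applications.
button.setWindowFlags(Qt.FramelessWindowHint | Qt.WindowStaysOnTopHint | Qt.Tool)
# A real overlay would move this next to the current text selection and open
# the chat window on click; here we just print a message.
button.clicked.connect(lambda: print("Open chat with the selected text"))
button.resize(90, 32)
button.move(800, 400)
button.show()

sys.exit(app.exec())
```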
The floating button isn't appearing. What should I check?
- Confirm that a model is loaded; the floating button only works when a model is active.
- Check your operating system permissions for overlay or accessibility features.
- Restart GGUF Loader if needed.
Can I extend GGUF Loader with addons?
Yes! GGUF Loader supports an addon system to extend its features.
See the Addon Guide for instructions on creating and installing addons.
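As a rough picture of what an addon looks like, here is a hypothetical skeleton. The entry-point name, signature, and folder layout are defined in the Addon Guide; `register` and the return value below are assumptions for illustration, not GGUF Loader's confirmed API.

```python
# Hypothetical addon skeleton. "register" is an assumed entry-point name used
# for illustration only; consult the Addon Guide for the real interface.

def register(parent=None):
    """Entry point a host application might call when loading the addon."""
    # A real addon would typically build a widget or hook into the chat pipeline here.
    print("Example addon loaded")
    return None
```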
Does GGUF Loader support GPU acceleration?
GPU acceleration requires configuring a compatible backend such as CUDA or OpenCL.
Refer to the advanced setup wiki page for detailed instructions.
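As one example of what "configuring a backend" can mean, the sketch below offloads model layers to the GPU through llama-cpp-python, assuming a build compiled with CUDA or another GPU backend. These parameters belong to llama-cpp-python and are shown for illustration; GGUF Loader's own GPU settings are covered in the advanced setup wiki page.

```python
# Sketch: GPU layer offloading with llama-cpp-python, assuming a GPU-enabled build.
# Illustration only; not GGUF Loader's documented configuration.
from llama_cpp import Llama

llm = Llama(
    model_path="models/mistral-7b-instruct.Q4_K_M.gguf",  # placeholder path
    n_gpu_layers=-1,  # -1 offloads all layers to the GPU; lower it if VRAM is limited
    n_ctx=4096,
)
```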
Where can I get help?
Join the community Discord or open an issue on the GGUF Loader GitHub repository.
If your question isn't listed here, feel free to add it or contact the maintainers directly.