A Gedit plugin that integrates with OpenAI-API-compatible local LLM servers (such as llama.cpp) to ask questions about selected text.
- Context-Aware Prompts: Automatically includes selected text in your prompt when asking LLaMA questions.
- Streaming Support: Displays responses as they arrive, providing real-time output from the model.
- Customizable Configuration: Easily configure API URL, API key, model name and keyboard shortcut.
- Multi-line Prompt Input: Use a multi-line text area to compose complex prompts.
- Copy to Clipboard: a button to copy the LLM response to the clipboard.
- Gedit 44+
- Python 3.x
- `requests` library (install with `pip install requests` or via your distro's package manager: `python3-requests`)
- Install the schema:

  ```sh
  cp org.gnome.gedit.plugins.gedit_llama.gschema.xml ~/.local/share/glib-2.0/schemas/
  glib-compile-schemas ~/.local/share/glib-2.0/schemas/
  ```
- Copy plugin files:

  ```sh
  mkdir -p ~/.local/share/gedit/plugins
  unzip gedit_LLaMA.zip -d ~/.local/share/gedit/plugins/
  ```
- Enable the plugin:
  - Open Gedit
  - Go to Edit → Preferences → Plugins
  - Enable "Gedit LLaMA"
- Select text in your document (optional)
- Right-click in the editor and choose Gedit LLaMA → Ask LLaMA
- Enter a prompt in the dialog
- View results in a popup dialog that shows real-time streaming output
You can customize:
- API URL (default: `http://127.0.0.1:5000/v1/chat/completions`)
- API Key (if required by your server)
- Model name (default: `llama.cpp`)
- Keyboard shortcut (default: `<Ctrl><Alt>l`)
Access the configuration via:
- Right-click menu → Configure LLaMA
- Select text in your document
- Right-click and select "Ask LLaMA"
- Enter your question or instruction
- Plugin sends selected text (if any) + prompt to your local LLM server
- Response is displayed in a streaming popup dialog
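The request the plugin issues in this flow is a standard OpenAI-style chat completion. A minimal sketch of the same steps, assuming the plugin's default URL and model name (the function names here are illustrative, not the plugin's actual code):

```python
def build_messages(prompt, selection=None):
    """Combine the user's prompt with any selected text into one chat message."""
    content = f"{prompt}\n\n{selection}" if selection else prompt
    return [{"role": "user", "content": content}]

def ask_llama(prompt, selection=None,
              url="http://127.0.0.1:5000/v1/chat/completions",
              model="llama.cpp", api_key=None):
    """Send the prompt to an OpenAI-compatible server and return the response."""
    import requests  # the plugin's only third-party dependency
    headers = {"Content-Type": "application/json"}
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"
    payload = {
        "model": model,
        "messages": build_messages(prompt, selection),
        "stream": True,  # ask the server to send tokens as they are generated
    }
    return requests.post(url, headers=headers, json=payload, stream=True, timeout=60)
```

With `stream=True`, `requests` keeps the connection open so the dialog can render tokens as they arrive instead of waiting for the full reply.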
- Explain selected code snippets
- Generate documentation for code
- Find bugs or suggest improvements
- Summarize selected text
- Translate code comments
- Debugging assistance
- Code generation based on context
- Requires a local LLM server like llama.cpp running at the configured URL
- Supports both streaming and non-streaming responses
- Plugin automatically detects when new tabs are opened and connects to their views
- Uses GSettings for persistent configuration storage
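In streaming mode, OpenAI-compatible servers reply with Server-Sent Events: each payload line looks like `data: {...json...}` and the stream ends with `data: [DONE]`. A sketch of extracting text fragments from such lines (field names follow the OpenAI chat-completions format; this is not the plugin's exact code):

```python
import json

def extract_token(line):
    """Pull the next text fragment out of one SSE line, or None if there is none."""
    line = line.strip()
    if not line.startswith("data:"):
        return None          # blank keep-alive or comment lines
    data = line[len("data:"):].strip()
    if data == "[DONE]":
        return None          # end-of-stream sentinel
    delta = json.loads(data)["choices"][0].get("delta", {})
    return delta.get("content")  # may be None for role-only chunks
```

Each non-`None` fragment can be appended to the streaming dialog's text buffer as it arrives.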
If you encounter issues:
- Ensure your local LLM server is running and accessible at the specified URL
- Verify that `requests` is installed (`pip install requests`)
- Check that the schema file was properly compiled using `glib-compile-schemas`
- Confirm the plugin is enabled in Gedit's preferences
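If the cause is unclear, a quick stdlib-only check of whether anything is listening at the server address, independent of Gedit (`/v1/models` is a common read-only endpoint on OpenAI-compatible servers, but the exact path may vary by server):

```python
import urllib.request
import urllib.error

def server_reachable(url="http://127.0.0.1:5000/v1/models", timeout=3):
    """Return True if the local LLM server answers at all, False if unreachable."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status < 500
    except urllib.error.HTTPError:
        return True   # the server responded, just with an error status
    except OSError:
        return False  # connection refused, timeout, DNS failure, ...
```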
MIT License - see LICENSE file for details.