A VS Code extension that helps developers practice programming by generating questions and providing AI-powered solutions using local LLMs through Ollama. It is available on the VS Code Marketplace.
- Quickly create multiple C++ template files for practice
- Files are created in your current workspace
- Each file comes with a basic C++ structure
- Generate programming questions based on difficulty levels:
  - Beginner
  - Intermediate
  - Expert
- Choose the number of questions you want to generate
- Questions are saved as C++ files with problem statements in comments
- Ask any programming-related question
- Get AI-generated solutions using your local Ollama model
- Solutions are organized in folders named after the question
- Each solution includes:
  - Original problem statement
  - Complete code implementation
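As a rough sketch of how such practice files could be produced (the template and helper below are illustrative assumptions, not the extension's actual code), the extension might write numbered C++ skeletons, each with the problem statement in a leading comment:

```typescript
import * as fs from "node:fs";
import * as path from "node:path";

// Illustrative starter content; the extension's real template may differ.
const CPP_TEMPLATE = `// Problem: <problem statement goes here>
#include <iostream>
using namespace std;

int main() {
    // Your solution here
    return 0;
}
`;

// Hypothetical helper: write `count` numbered practice files into `dir`
// and return the paths that were created.
function createPracticeFiles(dir: string, count: number): string[] {
  const created: string[] = [];
  for (let i = 1; i <= count; i++) {
    const file = path.join(dir, `practice_${i}.cpp`);
    fs.writeFileSync(file, CPP_TEMPLATE);
    created.push(file);
  }
  return created;
}
```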
- VS Code 1.95.0 or higher
- Node.js
- Ollama installed and running locally
- Any Ollama-compatible LLM model (e.g., Llama2, CodeLlama, Mistral)
- Install the extension from VS Code Marketplace
- Install Ollama on your system
- Pull your preferred LLM model using Ollama
- Make sure Ollama is running locally on port 11434 (default port)
- Open the Command Palette (`Ctrl+Shift+P` or `Cmd+Shift+P`)
- Type "Create Files"
- Enter the number of files you want to create
- Open the Command Palette
- Type "Practice Programming Questions"
- Select difficulty level
- Enter the number of questions
- Questions will be generated as C++ files in your workspace
- Open the Command Palette
- Type "Ask Me Anything"
- Enter your programming question
- A new folder will be created with the solution
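Under the hood, answering a question means calling the local Ollama server. The sketch below uses Ollama's REST endpoint `POST /api/generate` with its `model`, `prompt`, and `stream` fields; the function names, prompt wording, and default model are assumptions for illustration, not the extension's actual implementation:

```typescript
// Shape of an Ollama /api/generate request body.
interface GenerateRequest {
  model: string;
  prompt: string;
  stream: boolean;
}

// Hypothetical helper: wrap a user question into a generate request.
function buildGenerateRequest(model: string, question: string): GenerateRequest {
  return {
    model,
    prompt: `Provide a complete C++ solution for:\n${question}`,
    stream: false, // ask for one JSON response instead of a token stream
  };
}

// Hypothetical helper: send the question to a locally running Ollama server.
async function askOllama(question: string, model = "codellama"): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateRequest(model, question)),
  });
  const data = (await res.json()) as { response: string };
  return data.response; // Ollama returns the generated text in `response`
}
```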
This extension contributes the following commands:
- `themzway.createFiles`: Create multiple practice files
- `themzway.practice`: Generate practice questions
- `themzway.ama`: Ask programming questions and get solutions
- Added local LLM support through Ollama
- Improved error handling
- Enhanced question generation
- Better folder organization for solutions
Feel free to submit issues and enhancement requests on our GitHub repository.
This extension is licensed under the MIT License.
- Powered by Ollama and your choice of local LLM models
- Built with VS Code Extension API
Note: Make sure Ollama is running before using the extension. You can check by visiting http://localhost:11434 in your browser.
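The same check can be done programmatically. This is a sketch (not the extension's actual code) that requests Ollama's root endpoint, which replies "Ollama is running" when the server is up:

```typescript
// Sketch: returns true if a local Ollama server answers on the default port.
async function isOllamaRunning(baseUrl = "http://localhost:11434"): Promise<boolean> {
  try {
    const res = await fetch(baseUrl);
    return res.ok; // root endpoint responds with "Ollama is running"
  } catch {
    return false; // connection refused => server not started
  }
}
```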