# deepseek-chat-ext

Code smarter, chat faster.
## Table of Contents

- Overview
- Features
- Project Structure
- Getting Started
- Project Roadmap
- Extension Settings
- Known Issues
- Release Notes
- License
## Overview

deepseek-chat-ext boosts developer productivity by integrating a powerful AI chat interface directly into VS Code. Ask coding questions, get instant answers, and streamline your workflow. This VS Code extension leverages a DeepSeek model via Ollama, providing on-demand code assistance and information retrieval within your editor.
## Features

| Feature | Summary |
|---|---|
| ⚙️ Architecture | A VS Code extension written in TypeScript; a webview-based chat interface communicates with a DeepSeek model served locally through Ollama. |
| 🔩 Code Quality | ESLint (eslint.config.mjs) enforces naming conventions for imports and warns on missing curly braces, loose equality comparisons, literal throws, and missing semicolons. |
| 🔌 Integrations | Uses the ollama library to talk to a local Ollama server running deepseek-coder-v2:16b. |
| 🧩 Modularity | A single extension.ts entry point under src, with tests in src/test. |
## Project Structure

```sh
└── deepseek-chat-ext/
    ├── CHANGELOG.md
    ├── eslint.config.mjs
    ├── LICENSE
    ├── package.json
    ├── README.md
    ├── src
    │   ├── extension.ts
    │   └── test
    ├── tsconfig.json
    └── vsc-extension-quickstart.md
```

### Project Index

**`__root__`**
- **eslint.config.mjs**: Configures ESLint for TypeScript code within the project. It defines linting rules and parser settings to enforce consistent code style and detect potential errors in TypeScript files. Specifically, it mandates naming conventions for imports and warns on missing curly braces, loose equality comparisons, literal throws, and missing semicolons.
- **package.json**: Configures the VS Code extension deepseek-chat-ext. It integrates a DeepSeek model, leveraging Ollama, to provide a chat interface within the IDE. The extension's primary function is to let users interact with the DeepSeek model directly from VS Code, enhancing the development workflow. Necessary dependencies and build scripts are also defined here.
- **tsconfig.json**: Configures the TypeScript compiler for the project. It specifies the JavaScript module system, target ECMAScript version, and output directory, and enables strict type checking to enhance maintainability. The compiler outputs from the src directory to the out directory, targeting ES2022 with the ES2022 and DOM libraries.
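A compiler configuration consistent with that summary might look like the following. This is a sketch, not the repository's exact file; the `module` and `sourceMap` settings in particular are assumptions.

```json
{
  "compilerOptions": {
    "module": "Node16",
    "target": "ES2022",
    "lib": ["ES2022", "DOM"],
    "outDir": "out",
    "rootDir": "src",
    "strict": true,
    "sourceMap": true
  },
  "exclude": ["node_modules"]
}
```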
**`src`**

- **extension.ts**: Integrates a chat interface within VS Code. It leverages the ollama library to communicate with a language model, enabling users to ask questions and receive responses directly within the editor. A webview displays the conversation, and error handling includes restarting the Ollama server if necessary. The extension enhances coding workflows by providing on-demand code assistance and information retrieval.

**`src/test`**
- **extension.test.ts**: Verifies core functionality of the extension using the VS Code testing framework and an assertion library. The current tests are rudimentary examples demonstrating basic assertion capabilities.
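The conversation display described for extension.ts can be sketched as a small rendering step. This is illustrative only: `escapeHtml`, `renderConversation`, and `ChatMessage` are hypothetical names for this sketch, not identifiers from the actual source; the key detail is escaping chat text before injecting it into webview HTML.

```typescript
// Hypothetical helper: escape text before embedding it in webview HTML,
// so user or model output cannot inject markup or scripts into the panel.
function escapeHtml(text: string): string {
  return text
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;");
}

interface ChatMessage {
  role: "user" | "assistant";
  content: string;
}

// Render the conversation as simple HTML for the webview body.
function renderConversation(messages: ChatMessage[]): string {
  return messages
    .map((m) => `<div class="${m.role}">${escapeHtml(m.content)}</div>`)
    .join("\n");
}

console.log(renderConversation([{ role: "user", content: "2 < 3" }]));
// → <div class="user">2 &lt; 3</div>
```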
## Getting Started

### Prerequisites

Before getting started with deepseek-chat-ext, ensure your runtime environment meets the following requirements:
- **IDE:** VS Code, version ^1.96.0
- **JavaScript runtime:** Node.js
- **Package manager:** npm
- **AI model provider server:** Ollama
### Installation

Install deepseek-chat-ext using the following method:

1. Clone the deepseek-chat-ext repository:

   ```sh
   git clone https://github.com/Eyelor/deepseek-chat-ext
   ```

2. Navigate to the project directory:

   ```sh
   cd deepseek-chat-ext
   ```

3. Install the project dependencies using npm:

   ```sh
   npm install
   ```

4. Pull the DeepSeek Coder V2 Lite model using ollama (an Ollama server must already be installed):

   ```sh
   ollama pull deepseek-coder-v2:16b
   ```

### Debugging and Manual Testing

To start debugging and testing, open the project in VS Code and press F5 to launch the extension in a new separate window in watch mode. Helpful information is also available in the vsc-extension-quickstart.md file. To open the chat window, press Ctrl+Shift+P, select DeepSeek Chat, and hit Enter.
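The DeepSeek Chat entry in the command palette comes from the extension's contribution points in package.json. A sketch of the relevant fragment follows; the command ID `deepseek-chat-ext.start` is an assumption for illustration, not the extension's actual identifier:

```json
{
  "contributes": {
    "commands": [
      {
        "command": "deepseek-chat-ext.start",
        "title": "DeepSeek Chat"
      }
    ]
  }
}
```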
### Installing the Packaged Extension

To run deepseek-chat-ext in VS Code as a normal extension, generate a .vsix file using the following command (make sure the out/ directory exists, e.g. generated by an earlier F5 watch-mode run):

```sh
npm run package
```

After the .vsix file is generated, go to the Extensions tab in VS Code, click ... -> Install from VSIX..., and choose the generated .vsix file. From this point on, the extension is available as described in the Debugging and Manual Testing section.
## Project Roadmap

- Task 1: Basic behaviour and styling implemented.
- Task 2: Implement context for conversation.
- Task 3: Implement more features and optimize workflow.
- Task 4: Implement tests.
- Task 5: Implement detailed documentation.
- Task 6: Publish extension.
## Extension Settings

The extension does not add any settings to VS Code by default.
## Known Issues

- Syntax highlighting may not be perfect for all languages.
- If the Ollama server was not started beforehand, an error is returned in the chat; wait a few seconds and try again (starting the server manually with `ollama serve` also works).
## Release Notes

- Initial release of the extension.
- Basic chat functionality with a simple user interface.
## License

This project is protected under the Apache 2.0 License. For more details, refer to the LICENSE file.


