
DEEPSEEK-CHAT-EXT

Code smarter, chat faster.

Built with: npm, JavaScript, TypeScript, and Ollama.


Overview

Deepseek-chat-ext boosts developer productivity by integrating a powerful AI chat interface directly into VS Code. Ask coding questions, get instant answers, and streamline your workflow. This VS Code extension leverages a DeepSeek model via Ollama, providing on-demand code assistance and information retrieval within your editor.



Features

Feature Summary
⚙️ Architecture
  • The extension uses a client-server architecture, with the VS Code extension (src/extension.ts) as the client and the Ollama server hosting the DeepSeek model.
  • A webview is employed to display the chat interface within VS Code.
  • Communication with the language model relies on the ollama library.
  • Error handling includes restarting the Ollama server if necessary, indicating a robust design for handling potential connection issues.
🔩 Code Quality
  • Uses TypeScript for development, enabling static type checking and improved code maintainability. See tsconfig.json.
  • Employs ESLint with a configuration file (eslint.config.mjs) to enforce code style and detect potential errors. This configuration includes rules for naming conventions, missing curly braces, loose equality comparisons, literal throws, and missing semicolons.
🔌 Integrations
  • Integrates with Ollama to interact with the DeepSeek language model.
  • Uses the VS Code extension API to create a webview for the chat interface.
  • Leverages @vscode/test-cli and @vscode/test-electron for testing within the VS Code environment.
  • Dependencies include various TypeScript and testing-related packages, as listed in package.json.
🧩 Modularity
  • The codebase is structured with a src directory containing the main extension logic (src/extension.ts) and a src/test directory for tests.
  • The use of TypeScript promotes modularity through interfaces and classes.
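The streaming half of this architecture can be sketched as follows. This is a hypothetical illustration, not the actual source: `collectStream` and `fakeStream` are invented names, and the `StreamPart` shape assumes the ollama npm package's `chat({ ..., stream: true })` response format. A stubbed stream stands in for the server so the sketch runs without Ollama.

```typescript
// Sketch of the client side: accumulate a streamed reply chunk by chunk,
// notifying the UI (e.g. the webview) after each chunk arrives.
type StreamPart = { message: { content: string } };

async function collectStream(
  parts: AsyncIterable<StreamPart>,
  onChunk: (textSoFar: string) => void,
): Promise<string> {
  let full = '';
  for await (const part of parts) {
    full += part.message.content; // accumulate the reply as it streams in
    onChunk(full);                // e.g. webview.postMessage({ text: full })
  }
  return full;
}

// Stub standing in for ollama.chat({ model, messages, stream: true }).
async function* fakeStream(): AsyncGenerator<StreamPart> {
  for (const chunk of ['Hello', ', ', 'world']) {
    yield { message: { content: chunk } };
  }
}

collectStream(fakeStream(), () => {}).then((reply) => {
  console.log(reply); // "Hello, world"
});
```

In the real extension, the `onChunk` callback would be where partial output is forwarded to the webview so the reply renders incrementally.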

Project Structure

└── deepseek-chat-ext/
    ├── CHANGELOG.md
    ├── eslint.config.mjs
    ├── LICENSE
    ├── package.json
    ├── README.md
    ├── src
    │   ├── extension.ts
    │   └── test
    ├── tsconfig.json
    └── vsc-extension-quickstart.md

Project Index

DEEPSEEK-CHAT-EXT/
__root__
eslint.config.mjs - Configures ESLint for TypeScript code within the project. It defines linting rules and parser settings to enforce consistent code style and detect potential errors in TypeScript files. Specifically, it mandates naming conventions for imports and warns on missing curly braces, loose equality comparisons, literal throws, and missing semicolons.
package.json - Configures the VS Code extension deepseek-chat-ext. It integrates a DeepSeek model, leveraging Ollama, to provide a chat interface within the IDE, enabling users to interact with the model directly from VS Code. Necessary dependencies and build scripts are also defined.
tsconfig.json - Configures the TypeScript compiler for the project. It specifies the JavaScript module system, target ECMAScript version, output directory, and strict type checking: the compiler reads from the src directory and outputs to the out directory, targeting ES2022 with the DOM libraries.
src
extension.ts - Integrates a chat interface within VS Code. It leverages the ollama library to communicate with a language model, enabling users to ask questions and receive responses directly within the editor. A webview displays the conversation, and error handling includes restarting the Ollama server if necessary. The extension enhances coding workflows by providing on-demand code assistance and information retrieval.
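Since the webview renders its content as HTML, model output has to be escaped before it is injected into the page. The following is a minimal sketch of that concern; `escapeHtml` and `renderMessage` are illustrative names, not taken from the actual extension.ts.

```typescript
// Escape the characters that are significant in HTML so raw model output
// (which often contains < > & in code snippets) renders as text.
function escapeHtml(text: string): string {
  return text
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;');
}

// Build one chat bubble for the webview; the role doubles as a CSS class.
function renderMessage(role: 'user' | 'assistant', content: string): string {
  return `<div class="${role}"><pre>${escapeHtml(content)}</pre></div>`;
}

console.log(renderMessage('assistant', 'Use a < b && c'));
// <div class="assistant"><pre>Use a &lt; b &amp;&amp; c</pre></div>
```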
test
extension.test.ts - Verifies core functionality of the extension using the VS Code testing framework and an assertion library. The current tests are rudimentary examples demonstrating basic assertion capabilities.

Getting Started

Prerequisites

Before getting started with deepseek-chat-ext, ensure your runtime environment meets the following requirements:

  • IDE: VS Code (version ^1.96.0)
  • Runtime: Node.js
  • Package Manager: npm
  • AI model server: Ollama

Installation

Install deepseek-chat-ext using the following method:

  1. Clone the deepseek-chat-ext repository:
git clone https://github.com/Eyelor/deepseek-chat-ext
  2. Navigate to the project directory:
cd deepseek-chat-ext
  3. Install the project dependencies:

Using npm

npm install
  4. Pull the DeepSeek Coder V2 16B model (the Ollama server must be installed first):

Using ollama

ollama pull deepseek-coder-v2:16b

Debugging and Manual Testing

To start debugging and testing, open the project in VS Code and press F5 to launch the extension in a separate window in watch mode. You can also find helpful information in the vsc-extension-quickstart.md file. To open the chat window, press Ctrl+Shift+P, select DeepSeek Chat, and hit Enter.

Usage

If you want to run deepseek-chat-ext in VS Code as a normal extension, you need to generate a .vsix file with the following command (make sure the out/ directory has been generated, e.g. by launching watch mode with F5):

Using npm

npm run package

After the .vsix file is generated, go to the Extensions tab in VS Code, click ... -> Install from VSIX..., and choose the generated .vsix file. From this point on, the extension will be available as described in the Debugging and Manual Testing section.


Project Roadmap

  • Task 1: Basic behaviour and styling implemented.
  • Task 2: Implement context for conversation.
  • Task 3: Implement more features and optimize workflow.
  • Task 4: Implement tests.
  • Task 5: Implement detailed documentation.
  • Task 6: Publish extension.

Extension Settings

The extension does not add any settings to VS Code by default.


Known Issues

  • Syntax highlighting may not be perfect for all languages.
  • If the Ollama server has not been started yet, the chat will return an error; wait a few seconds and try again.
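One way to soften the second issue (a sketch only, not what the extension currently does) is to retry the first request a few times with a delay, giving a just-started server time to come up; `withRetry` is a hypothetical helper.

```typescript
// Retry an async call a few times, waiting between attempts, so a
// freshly started Ollama server can finish booting before we give up.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  delayMs = 2000,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err; // remember the failure and wait before retrying
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  throw lastError;
}
```

The chat request would then be wrapped as, e.g., `withRetry(() => ollama.chat({ model, messages }))`.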

Release Notes

0.0.1

  • Initial release of the extension.
  • Basic chat functionality with a simple user interface.

License

This project is licensed under the Apache 2.0 License. For more details, refer to the LICENSE file.


About

VS Code extension that integrates a locally running DeepSeek model using Ollama for chat inside the IDE.
