@@ -0,0 +1,25 @@
{
"env": {
"browser": false,
"commonjs": true,
"es6": true,
"node": true,
"mocha": true
},
"parserOptions": {
"ecmaVersion": 2018,
"ecmaFeatures": {
"jsx": true
},
"sourceType": "module"
},
"rules": {
"no-const-assign": "warn",
"no-this-before-super": "warn",
"no-undef": "warn",
"no-unreachable": "warn",
"no-unused-vars": "warn",
"constructor-super": "warn",
"valid-typeof": "warn"
}
}
@@ -0,0 +1,8 @@
{
// See https://go.microsoft.com/fwlink/?LinkId=733558
// for the documentation about the extensions.json format
"recommendations": [
"dbaeumer.vscode-eslint",
"ms-vscode.extension-test-runner"
]
}
@@ -0,0 +1,17 @@
// A launch configuration that launches the extension inside a new window
// Use IntelliSense to learn about possible attributes.
// Hover to view descriptions of existing attributes.
// For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
{
"version": "0.2.0",
"configurations": [
{
"name": "Run Extension",
"type": "extensionHost",
"request": "launch",
"args": [
"--extensionDevelopmentPath=${workspaceFolder}"
]
}
]
}
@@ -0,0 +1,5 @@
import { defineConfig } from '@vscode/test-cli';

export default defineConfig({
files: 'test/**/*.test.js',
});
10 changes: 10 additions & 0 deletions Generative AI & LLMs/LLM Copilot Ext for VS Code IDE/.vscodeignore
@@ -0,0 +1,10 @@
.vscode/**
.vscode-test/**
test/**
.gitignore
.yarnrc
vsc-extension-quickstart.md
**/jsconfig.json
**/*.map
**/.eslintrc.json
**/.vscode-test.*
@@ -0,0 +1,9 @@
# Change Log

All notable changes to the "llm-copilot" extension will be documented in this file.

Check [Keep a Changelog](http://keepachangelog.com/) for recommendations on how to structure this file.

## [Unreleased]

- Initial release
82 changes: 82 additions & 0 deletions Generative AI & LLMs/LLM Copilot Ext for VS Code IDE/README.md
@@ -0,0 +1,82 @@
LLM Copilot Extension in VS Code
----------------------------------

1. Introduction:
----------------
Welcome to the documentation for the LLM Copilot extension for VS Code. The extension
enhances coding efficiency by integrating advanced language models such as Llama 3.1,
OpenAI's GPT-3.5-turbo, and Gemini 1.5, which provide powerful code generation, code
suggestions, and help with complex coding problems. The extension is designed with a
user-friendly interface and integrates easily into your existing workflow.

2. Prerequisites:
-----------------
Before using this extension, ensure the following are installed on your system:
- Visual Studio Code (version 1.92.0 or higher)
- Node.js (for running the extension)
- TypeScript (for compiling TypeScript files)

You also need API keys from Together.ai, Groq, OpenAI, or another supported provider to
interact with the language models.

3. Installation:
------------------
- Clone this repository.
- Run `npm install` to install all necessary dependencies.
- Run `npm run compile` to compile the TypeScript files.
- To test the extension, press F5 in VS Code to open a new window with the extension
  running, or run the following command in a terminal, adjusting the path to your local
  clone: `code --extensionDevelopmentPath=D:\Extension\extension\`

4. Overview of package.json
----------------------------
The package.json file is crucial as it defines the extension’s metadata, dependencies,
and commands. Key elements include:
- Name and Version: The extension is named codesuggestion with version 0.0.1.
- Engines: Specifies compatibility with VS Code version 1.92.0 or higher.
- Contributes: Defines the command extension.openChat to open the chat interface.
- Scripts: Includes commands for compiling TypeScript, linting, and testing.
- Dependencies: Includes axios for API calls and development dependencies for
TypeScript, linting, and testing.

5. Setting Up the Command in extension.ts:
------------------------------------------
The extension.ts file contains the core logic:
- Imports: Includes necessary modules like vscode and axios.
- Activate Function: The entry point when the extension is activated, where the
command extension.openChat is registered.
- Webview Setup: The command opens a Webview panel that displays the chat interface,
allowing users to interact with the AI (see the sketch after this list).
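
A minimal sketch of how this command registration and Webview setup might look. The view
type, tab title, and the `getChatHtml` placeholder below are illustrative stand-ins, not
the extension's actual values:

```ts
import * as vscode from 'vscode';

export function activate(context: vscode.ExtensionContext) {
  // Register the command that opens the chat interface.
  const disposable = vscode.commands.registerCommand('extension.openChat', () => {
    // Create a Webview panel to host the chat UI next to the active editor.
    const panel = vscode.window.createWebviewPanel(
      'llmCopilotChat',        // internal view type (illustrative)
      'LLM Copilot Chat',      // tab title (illustrative)
      vscode.ViewColumn.Beside,
      { enableScripts: true }  // allow the chat UI's script to run
    );

    // Inject the HTML/CSS/JS that makes up the chat interface.
    panel.webview.html = getChatHtml();
  });

  context.subscriptions.push(disposable);
}

// Placeholder: the real extension returns the full chat interface markup here.
function getChatHtml(): string {
  return `<!DOCTYPE html><html><body><h3>LLM Copilot</h3></body></html>`;
}
```

When launched via F5 (see section 3), the command appears in the Command Palette of the
Extension Development Host window.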

6. Making API Calls to AI Models:
---------------------------------
The getCodeSnippet function is central to the extension’s operation:
- Functionality: Sends a POST request to the selected AI model's API, passing the user’s
query.
- Response Handling: The response is processed and the returned code snippet is displayed in the Webview (see the sketch after this list).
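
A hedged sketch of what a `getCodeSnippet`-style function can look like. The endpoint URL,
request fields, and response shape below are assumptions for illustration; the real values
differ between Together.ai, Groq, OpenAI, and other providers:

```ts
import axios from 'axios';

// Illustrative endpoint and key handling; real values depend on the chosen provider.
const API_URL = 'https://api.example-provider.com/v1/completions';
const API_KEY = process.env.LLM_API_KEY ?? '';

async function getCodeSnippet(query: string, model: string): Promise<string> {
  // Send the user's query to the selected model's completion endpoint.
  const response = await axios.post(
    API_URL,
    { model, prompt: query, max_tokens: 512 },
    {
      headers: {
        Authorization: `Bearer ${API_KEY}`,
        'Content-Type': 'application/json',
      },
    }
  );

  // Pull the generated snippet out of the response body (field names are provider-specific).
  return response.data?.choices?.[0]?.text ?? '';
}
```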

7. User Interface:
-------------------
The user interface is designed using HTML, CSS, and JavaScript:
- Chat Interface: Users can input queries and select the desired AI model from a
dropdown menu.
- Response Display: The AI's response is displayed in a code block format for easy
copying and pasting (see the sketch after this list).
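
The hand-off between the chat UI and the extension can be sketched with the Webview
messaging API. The element IDs (`query`, `model`, `send`, `response`) and the message
fields are hypothetical names used only for this example:

```ts
// Runs inside the Webview. acquireVsCodeApi() is injected by VS Code into Webview scripts.
declare function acquireVsCodeApi(): { postMessage(message: unknown): void };

const vscodeApi = acquireVsCodeApi();

// Send the typed query and the model chosen in the dropdown to the extension host.
document.getElementById('send')?.addEventListener('click', () => {
  const query = (document.getElementById('query') as HTMLTextAreaElement).value;
  const model = (document.getElementById('model') as HTMLSelectElement).value;
  vscodeApi.postMessage({ query, model });
});

// Render the extension's reply inside a code block for easy copying and pasting.
window.addEventListener('message', (event: MessageEvent<{ snippet: string }>) => {
  const responseBlock = document.getElementById('response');
  if (responseBlock) {
    responseBlock.textContent = event.data.snippet;
  }
});
```

On the extension side, `panel.webview.onDidReceiveMessage` receives this message, calls the
API function, and sends the result back with `panel.webview.postMessage`.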

8. Features:
------------

- Model Integration: Supports switching between Llama 3.1, GPT-3.5-turbo, and Gemini models.
- Code Generation: Generates code snippets, solves LeetCode & DSA problems, and supports autocompletion.
- User-Friendly Interface: Simplified design for ease of use, allowing users to interact with the AI seamlessly.

Sample chat and prompt response demonstration:
-----------------------------------------
![sample_chat_interface](https://github.com/user-attachments/assets/43cab310-e93e-4040-bc4c-4390c99684f6)

Personalized chat interface to match your needs:
------------------------------------------------
![Personalized_response](https://github.com/user-attachments/assets/30071652-86ed-4eae-9da3-a59df866635c)

@@ -0,0 +1 @@

75 changes: 75 additions & 0 deletions Generative AI & LLMs/LLM Copilot Ext for VS Code IDE/extension.js
@@ -0,0 +1,75 @@
const vscode = require('vscode');
const axios = require('axios');

// Provider API keys: replace these placeholders with real keys before running.
const TOGETHER_AI_API_KEY = 'your_actual_api_key_here';
const GROQ_AI_API_KEY = 'your_actual_api_key_here';
const LLAMA_API_KEY = 'your_actual_api_key_here';

// Send the prompt to the Together AI completion endpoint and return the generated text.
async function getTogetherAIResponse(prompt) {
const response = await axios.post('https://api.together.ai/v1/text/completion', {
prompt: prompt
}, {
headers: {
'Authorization': `Bearer ${TOGETHER_AI_API_KEY}`,
'Content-Type': 'application/json'
}
});
return response.data.text;
}

// Send the prompt to the Groq completion endpoint and return the generated text.
async function getGroqAIResponse(prompt) {
const response = await axios.post('https://api.groq.com/v1/complete', {
prompt: prompt
}, {
headers: {
'Authorization': `Bearer ${GROQ_AI_API_KEY}`,
'Content-Type': 'application/json'
}
});
return response.data.text;
}

// Send the prompt to the Llama completion endpoint and return the generated text.
async function getLlamaResponse(prompt) {
const response = await axios.post('https://api.llama.com/v1/complete', {
prompt: prompt
}, {
headers: {
'Authorization': `Bearer ${LLAMA_API_KEY}`,
'Content-Type': 'application/json'
}
});
return response.data.text;
}

// Activation entry point: registers a command that prompts the user for input
// and shows each provider's response as an information message.
function activate(context) {
let disposable = vscode.commands.registerCommand('my-ext.helloWorld', async () => {
const prompt = await vscode.window.showInputBox({ prompt: 'Enter your prompt' });

if (prompt) {
try {
const togetherAIResponse = await getTogetherAIResponse(prompt);
const groqAIResponse = await getGroqAIResponse(prompt);
const llamaResponse = await getLlamaResponse(prompt);

vscode.window.showInformationMessage(`TogetherAI: ${togetherAIResponse}`);
vscode.window.showInformationMessage(`GroqAI: ${groqAIResponse}`);
vscode.window.showInformationMessage(`LLaMA: ${llamaResponse}`);
} catch (error) {
vscode.window.showErrorMessage(`Error fetching response from LLMs: ${error.message}`);
}
}
});

context.subscriptions.push(disposable);
}

// Called when the extension is deactivated; nothing to clean up.
function deactivate() {}

module.exports = {
activate,
deactivate
};
13 changes: 13 additions & 0 deletions Generative AI & LLMs/LLM Copilot Ext for VS Code IDE/jsconfig.json
@@ -0,0 +1,13 @@
{
"compilerOptions": {
"module": "Node16",
"target": "ES2022",
"checkJs": true, /* Typecheck .js files. */
"lib": [
"ES2022"
]
},
"exclude": [
"node_modules"
]
}