---
title: AI Endpoints - Create a code assistant with Continue
excerpt: Build your own code assistant directly in VSCode or JetBrains IDEs using the Continue plugin
updated: 2025-04-15
---

> [!primary]
>
> AI Endpoints is currently in **Beta**. Although we aim to offer a production-ready product even in this testing phase, service availability may not be guaranteed. Please exercise caution if you use these endpoints in production, as the Beta phase is not yet complete.
>
> AI Endpoints is covered by the **[OVHcloud AI Endpoints Conditions](https://storage.gra.cloud.ovh.net/v1/AUTH_325716a587c64897acbef9a4a4726e38/contracts/48743bf-AI_Endpoints-ALL-1.1.pdf)** and the **[OVHcloud Public Cloud Special Conditions](https://storage.gra.cloud.ovh.net/v1/AUTH_325716a587c64897acbef9a4a4726e38/contracts/d2a208c-Conditions_particulieres_OVH_Stack-WE-9.0.pdf)**.

## Introduction

Want more control over your code assistant? Looking to integrate your own LLM configuration and use models hosted on **[AI Endpoints](https://endpoints.ai.cloud.ovh.net/)**?

This guide shows you how to build your own developer assistant using **[Continue](https://www.continue.dev/)**, an open-source IDE plugin that works with both VSCode and JetBrains IDEs, in combination with OVHcloud.

Continue lets you plug in your own LLMs, enabling full control over which models you use and how they interact with your code.

## Requirements

- A [Public Cloud project](/links/public-cloud/public-cloud) in your OVHcloud account
- An access token for **OVHcloud AI Endpoints**. To create an API token, follow the instructions in the [AI Endpoints - Getting Started](/pages/public_cloud/ai_machine_learning/endpoints_guide_01_getting_started) guide.
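
Before configuring the plugin, you may want to check that your access token actually works against an endpoint. Below is a minimal sketch using only the Python standard library; the endpoint URL and model name are taken from the Continue configuration shown later in this guide, and the `OVH_AI_ENDPOINTS_ACCESS_TOKEN` environment variable name is just an illustrative choice:

```python
import json
import os
import urllib.request

# Endpoint base URL as used in the Continue configuration later in this guide.
API_BASE = "https://llama-3-3-70b-instruct.endpoints.kepler.ai.cloud.ovh.net/api/openai_compat/v1"


def build_chat_request(api_base: str, token: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for AI Endpoints."""
    payload = {
        "model": "Meta-Llama-3_3-70B-Instruct",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 32,
    }
    return urllib.request.Request(
        url=f"{api_base}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


if __name__ == "__main__" and "OVH_AI_ENDPOINTS_ACCESS_TOKEN" in os.environ:
    # Requires network access and a valid token.
    req = build_chat_request(API_BASE, os.environ["OVH_AI_ENDPOINTS_ACCESS_TOKEN"], "Say hello")
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

If the call returns a completion, your token is valid and you can move on to configuring Continue.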

## Instructions

### Install Continue

Continue is distributed as an IDE plugin and supports:

- Visual Studio Code
- JetBrains IDEs (e.g. IntelliJ, PyCharm)

Follow the [official Continue installation instructions](https://docs.continue.dev/docs/getting-started/install) for your IDE.

Once installed, Continue will share the same configuration across your IDEs.

### Configure Continue with AI Endpoints

Continue uses a JSON-based configuration file to manage:

- Chatbot tool models
- Tab autocomplete models

You can customize this configuration file (`config.json`, typically located in the `.continue` folder of your home directory) to connect the plugin to AI Endpoints:

```json
{
  "tabAutocompleteModel": {
    "title": "Qwen2.5-Coder-32B-Instruct",
    "model": "Qwen2.5-Coder-32B-Instruct",
    "apiBase": "https://qwen-2-5-coder-32b-instruct.endpoints.kepler.ai.cloud.ovh.net/api/openai_compat/v1",
    "provider": "openai",
    "useLegacyCompletionsEndpoint": true,
    "apiKey": "<your API key>"
  },
  "models": [
    {
      "title": "Meta-Llama-3_3-70B-Instruct",
      "model": "Meta-Llama-3_3-70B-Instruct",
      "apiBase": "https://llama-3-3-70b-instruct.endpoints.kepler.ai.cloud.ovh.net/api/openai_compat/v1",
      "provider": "openai",
      "useLegacyCompletionsEndpoint": false,
      "apiKey": "<your API key>"
    },
    {
      "title": "Qwen2.5-Coder-32B-Instruct",
      "model": "Qwen2.5-Coder-32B-Instruct",
      "apiBase": "https://qwen-2-5-coder-32b-instruct.endpoints.kepler.ai.cloud.ovh.net/api/openai_compat/v1",
      "provider": "openai",
      "useLegacyCompletionsEndpoint": false,
      "apiKey": "<your API key>"
    }
  ]
  // ...
}
```

### Tab Completion Configuration

You can define only one model for tab autocomplete. Choose any model from the Code LLM category in AI Endpoints. Here's a quick example:

```json
{
  "tabAutocompleteModel": {
    "title": "Qwen2.5-Coder-32B-Instruct",
    "model": "Qwen2.5-Coder-32B-Instruct",
    "apiBase": "https://qwen-2-5-coder-32b-instruct.endpoints.kepler.ai.cloud.ovh.net/api/openai_compat/v1",
    "provider": "openai",
    "useLegacyCompletionsEndpoint": true,
    "apiKey": "<your API key>"
  }
}
```
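
Note the `"useLegacyCompletionsEndpoint": true` flag: with the `openai` provider, it tells Continue to send autocomplete requests to the legacy `/completions` route (a plain text `prompt`) rather than the chat-style `/chat/completions` route (a `messages` list). As a rough sketch of the request shape involved, using only the Python standard library (the helper name is ours, not part of Continue):

```python
import json
import urllib.request


def build_completion_request(api_base: str, token: str, prompt: str) -> urllib.request.Request:
    """Build a legacy /completions request: raw text prompt in, raw text out,
    which is the shape tab-autocomplete models expect."""
    payload = {
        "model": "Qwen2.5-Coder-32B-Instruct",
        "prompt": prompt,  # plain text, unlike the chat endpoint's "messages" list
        "max_tokens": 64,
    }
    return urllib.request.Request(
        url=f"{api_base}/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Continue builds and sends these requests for you as you type; the sketch is only meant to show why the legacy flag matters for code-completion models.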

### Chatbot Configuration

For the chatbot tool, you can define multiple models. Try out different LLMs and choose the one that best fits your use case. You can switch between them easily in the IDE UI.

### Try It Out

Once Continue is configured with your AI Endpoints, you're ready to test both features:

**Chatbot Tool**

Use the chatbot sidebar to ask for help, generate code, or refactor logic with any of your configured models.

**Tab Completion Tool**

Just start typing in your editor. The autocomplete model will complete code as you go, powered by your custom-configured model from AI Endpoints.

## Conclusion

By using Continue with AI Endpoints, you now have:

- A fully customizable code assistant
- Support for cutting-edge open-source large language models such as Qwen, Mixtral, and Llama 3
- The ability to manage your own configuration and resources on AI Endpoints

If you need training or technical assistance to implement our solutions, contact your sales representative or click on [this link](/links/professional-services) to get a quote and ask our Professional Services experts for a custom analysis of your project.

## Feedback

Please feel free to send us your questions, feedback, and suggestions regarding AI Endpoints and its features:

- In the #ai-endpoints channel of the OVHcloud [Discord server](https://discord.gg/ovhcloud), where you can engage with the community and OVHcloud team members.