Proxy Feature

The proxy feature provides a flexible way to communicate with existing models through third-party hosts or proxies, giving you control over the data flow between your applications and the models.

OpenAI proxies

Azure AI Service

To route all OpenAI communication through your Azure tenant, call the following function at the start of your application; every subsequent OpenAI call then goes through Azure.

const { ProxyHelper } = require('intellinode');
// resourceName is the name of your Azure OpenAI resource
ProxyHelper.getInstance().setAzureOpenai(resourceName);
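
After this call, any client created later inherits the Azure routing with no extra argument. A minimal sketch, assuming azureApiKey holds the key for your Azure resource (the variable name is illustrative):

const { Chatbot, SupportedChatModels } = require('intellinode');

// no proxy argument needed; the global Azure setting applies
const chatbot = new Chatbot(azureApiKey, SupportedChatModels.OPENAI);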

If you want to route only specific connections through Azure, pass a ProxyHelper instance to the individual calls.
For instance:

const azureProxyHelper = new ProxyHelper();
azureProxyHelper.setAzureOpenai(resourceName);
const chatbot = new Chatbot(apiKey, SupportedChatModels.OPENAI, azureProxyHelper);
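
Only this chatbot instance goes through Azure; other clients keep the default OpenAI endpoint. A short usage sketch, assuming the documented ChatGPTInput class from the same package (the prompt text is illustrative):

const { ChatGPTInput } = require('intellinode');

const input = new ChatGPTInput('You are a helpful assistant.');
input.addUserMessage('Hello from the Azure-proxied chatbot!');

// responses is expected to be an array of model replies
chatbot.chat(input).then((responses) => console.log(responses[0]));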

Custom proxies

You can use custom proxies to avoid regional restrictions or connect through free services.

  1. Update the URL in the following JSON:

const openaiProxyJson = {
   "url": "https://api.openai.com",
   "completions": "/v1/completions",
   "chatgpt": "/v1/chat/completions",
   "imagegenerate": "/v1/images/generations",
   "embeddings": "/v1/embeddings"
};

  2. Create a ProxyHelper object with the custom JSON:

const proxyHelper = new ProxyHelper();
proxyHelper.setOpenaiProxyValues(openaiProxyJson);

  3. Send the custom proxy with every call.

Example 1:

const chatbot = new Chatbot(apiKey, SupportedChatModels.OPENAI, proxyHelper);

Example 2:

const htmlCode = await Gen.generate_html_page(prompt, openaiKey, modelName, proxyHelper);
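
Putting the three steps together, a minimal end-to-end sketch, assuming a hypothetical mirror at https://my-openai-mirror.example.com that exposes the same paths as the official API (the URL and prompt text are illustrative):

const { Chatbot, SupportedChatModels, ChatGPTInput, ProxyHelper } = require('intellinode');

// step 1: point the JSON at the custom host (hypothetical URL)
const openaiProxyJson = {
   "url": "https://my-openai-mirror.example.com",
   "completions": "/v1/completions",
   "chatgpt": "/v1/chat/completions",
   "imagegenerate": "/v1/images/generations",
   "embeddings": "/v1/embeddings"
};

// step 2: load the JSON into a ProxyHelper
const proxyHelper = new ProxyHelper();
proxyHelper.setOpenaiProxyValues(openaiProxyJson);

// step 3: pass the helper with the call
const chatbot = new Chatbot(apiKey, SupportedChatModels.OPENAI, proxyHelper);
const input = new ChatGPTInput('You are a helpful assistant.');
input.addUserMessage('Test the custom proxy connection.');
chatbot.chat(input).then((responses) => console.log(responses[0]));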