Proxy Feature
The proxy feature provides a flexible way to communicate with existing models through third-party hosts or proxies, giving you control over the data flow between your applications and the models.
To direct all OpenAI communication through your Azure tenant, use the following function at the beginning of your application. This ensures that all OpenAI calls go through Azure.
```js
const { ProxyHelper } = require('intellinode');

ProxyHelper.getInstance().setAzureOpenai(resourceName);
```

If you want to direct only specific connections through Azure, you can pass a ProxyHelper instance to the relevant calls.
For instance:
```js
const azureProxyHelper = new ProxyHelper();
azureProxyHelper.setAzureOpenai(resourceName);

const chatbot = new Chatbot(apiKey, SupportedChatModels.OPENAI, azureProxyHelper);
```

You can use custom proxies to avoid regional restrictions or to connect through free services.
- Update the URL in the following JSON:

```js
const openaiProxyJson = {
  "url": "https://api.openai.com",
  "completions": "/v1/completions",
  "chatgpt": "/v1/chat/completions",
  "imagegenerate": "/v1/images/generations",
  "embeddings": "/v1/embeddings"
};
```

- Create a ProxyHelper object with the custom JSON:

```js
const proxyHelper = new ProxyHelper();
proxyHelper.setOpenaiProxyValues(openaiProxyJson);
```

- Send the custom proxy with every call.
Example 1:

```js
const chatbot = new Chatbot(apiKey, SupportedChatModels.OPENAI, proxyHelper);
```

Example 2:

```js
const htmlCode = await Gen.generate_html_page(prompt, openaiKey, modelName, proxyHelper);
```
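As a rough sketch of how the values in the proxy JSON fit together: each key maps a service to a path that is appended to the base `url`, so pointing `url` at your own proxy re-routes every endpoint at once. The `resolveEndpoint` helper below is hypothetical, written only to illustrate this mapping; it is not part of the IntelliNode API.

```javascript
// The same JSON shape used with setOpenaiProxyValues above.
const openaiProxyJson = {
  "url": "https://api.openai.com",
  "completions": "/v1/completions",
  "chatgpt": "/v1/chat/completions",
  "imagegenerate": "/v1/images/generations",
  "embeddings": "/v1/embeddings"
};

// Hypothetical helper: join the base URL with the path registered
// for a given service key, tolerating a trailing slash on the base.
function resolveEndpoint(proxyJson, service) {
  const base = proxyJson.url.replace(/\/+$/, '');
  return base + proxyJson[service];
}

console.log(resolveEndpoint(openaiProxyJson, 'chatgpt'));
// https://api.openai.com/v1/chat/completions
```

Swapping `"url"` for a proxy address such as `"https://my-proxy.example.com"` (a placeholder, not a real service) changes the destination of every call while the endpoint paths stay the same.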