Labels: type: feature-request (New feature or request)
Description
🔖 Feature description
I’ve noticed that the project already includes the configuration option OPENAI_API_KEY. In addition, I believe it would be worth adding more OpenAI-related configuration options (OPENAI_BASE_URL, SMART_LLM, FAST_LLM), which would allow integration with other OpenAI-compatible endpoints (such as OpenRouter).
🎤 Why is this feature needed?
For OPENAI_BASE_URL
There’s no need to tie the AI functionality to a single provider (in this case, OpenAI). Many providers expose OpenAI-compatible APIs, so with a configurable base URL it would become possible to switch to other models such as Claude, Gemini, or Qwen.
For SMART_LLM & FAST_LLM
Distinguishing between SMART_LLM and FAST_LLM provides a simple way to manage AI costs: a high-quality model for complex tasks and a cheaper model for routine ones.
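As a minimal sketch of how the two tiers could be resolved (the helper name `resolveModel` and the tier labels are hypothetical, not part of the codebase; only the SMART_LLM/FAST_LLM variables come from this proposal):

```typescript
// Hypothetical helper: pick a model ID per cost tier from the proposed
// SMART_LLM / FAST_LLM environment variables.
type Env = Record<string, string | undefined>;

function resolveModel(tier: 'smart' | 'fast', env: Env): string {
  // Defaults here are assumptions mirroring the example .env below.
  return tier === 'smart'
    ? env.SMART_LLM ?? 'gpt-4o'
    : env.FAST_LLM ?? 'gpt-4o-mini';
}

// Usage sketch: resolveModel('fast', process.env) for cheap tasks such as
// summaries or tags; resolveModel('smart', process.env) for generation.
```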
✌️ How do you aim to achieve this?
An example of using OpenRouter:
```
# .env
OPENAI_BASE_URL="https://openrouter.ai/api/v1"
SMART_LLM="openai/gpt-4o"
FAST_LLM="openai/gpt-4o-mini"
```
```typescript
// libraries/nestjs-libraries/src/openai/openai.service.ts
import OpenAI from 'openai';

const openai = new OpenAI({
  // Defaults to api.openai.com when OPENAI_BASE_URL is unset.
  baseURL: process.env.OPENAI_BASE_URL,
  apiKey: process.env.OPENAI_API_KEY || 'sk-proj-',
});

const completion = await openai.chat.completions.create({
  // FAST_LLM would be used the same way for cheaper tasks.
  model: process.env.SMART_LLM || 'gpt-4o',
  messages: [
    {
      role: 'user',
      content: [{ type: 'text', text: 'Who are you?' }],
    },
  ],
});

console.log(completion.choices[0].message);
```
🔄️ Additional Information
No response
👀 Have you spent some time to check if this feature request has been raised before?
- I checked and didn't find a similar issue
Are you willing to submit a PR?
None