Azure OpenAI Error, ZodError Version #7208
Unanswered
mohsraj asked this question in Troubleshooting
Hi, I'm trying to connect to Azure OpenAI and I'm getting the following error:
LibreChat | 2025-05-04 00:55:03 error: Invalid custom config file at /app/librechat.yaml:
LibreChat | {
LibreChat |   "issues": [
LibreChat |     {
LibreChat |       "code": "invalid_type",
LibreChat |       "expected": "string",
LibreChat |       "received": "undefined",
LibreChat |       "path": [
LibreChat |         "version"
LibreChat |       ],
LibreChat |       "message": "Required"
LibreChat |     }
LibreChat |   ],
LibreChat |   "name": "ZodError"
LibreChat | }
LibreChat | 2025-05-04 00:55:03 warn: Default value for CREDS_KEY is being used.
LibreChat | 2025-05-04 00:55:03 warn: Default value for CREDS_IV is being used.
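From what I can tell, the issue path is ["version"], which points at the root of librechat.yaml rather than the version values inside groups or models, so I think the schema wants a top-level version string. Something like this sketch (1.2.1 is just my guess from the config docs, not a value I've confirmed):

# Guessed sketch: root-level schema version for librechat.yaml itself,
# separate from the Azure API "version" values used further down.
version: 1.2.1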
My librechat.yaml:

endpoints:
  azureOpenAI:
    titleModel: "gpt-4-vision-preview"
    plugins: true
    assistants: true
    groups:
      - group: "my-resource-westus"
        apiKey: "7it9XNxlr"
        instanceName: "cus3"
        version: "2024-03-01-preview"
        models:
          gpt-4-vision-preview:
            deploymentName: "gpt-4-vision-preview"
            version: "2024-03-01-preview"
My .env:
#=====================================================================#
# LibreChat Configuration
#=====================================================================#
# Please refer to the reference documentation for assistance
# with configuring your LibreChat environment.
# https://www.librechat.ai/docs/configuration/dotenv
#=====================================================================#
#==================================================#
# Server Configuration
#==================================================#
HOST=localhost
PORT=3080
MONGO_URI=mongodb://mongo:27017/LibreChat
DOMAIN_CLIENT=http://localhost:3080
DOMAIN_SERVER=http://localhost:3080
NO_INDEX=true
# Use the address that is at most n number of hops away from the Express application.
# req.socket.remoteAddress is the first hop, and the rest are looked for in the X-Forwarded-For header from right to left.
# A value of 0 means that the first untrusted address would be req.socket.remoteAddress, i.e. there is no reverse proxy.
# Defaulted to 1.
TRUST_PROXY=1
#===============#
# JSON Logging
#===============#
# Use when processing console logs in cloud deployments like GCP/AWS
CONSOLE_JSON=false
#===============#
# Debug Logging
#===============#
DEBUG_LOGGING=true
DEBUG_CONSOLE=false
#=============#
# Permissions
#=============#
UID=1000
GID=1000
#===============#
# Configuration
#===============#
# Use an absolute path, a relative path, or a URL
# CONFIG_PATH="/alternative/path/to/librechat.yaml"
#===================================================#
# Endpoints
#===================================================#
ENDPOINTS=openAI,assistants,azureOpenAI,google,gptPlugins,anthropic
PROXY=
#============#
# Azure
#============#
# Note: these variables are DEPRECATED
# Use the librechat.yaml configuration for azureOpenAI instead
# You may also continue to use them if you opt out of using the librechat.yaml configuration

# AZURE_OPENAI_DEFAULT_MODEL=gpt-3.5-turbo # Deprecated
# AZURE_OPENAI_MODELS=gpt-3.5-turbo,gpt-4 # Deprecated
# AZURE_USE_MODEL_AS_DEPLOYMENT_NAME=TRUE # Deprecated
# AZURE_API_KEY= # Deprecated
# AZURE_OPENAI_API_INSTANCE_NAME= # Deprecated
# AZURE_OPENAI_API_DEPLOYMENT_NAME= # Deprecated
# AZURE_OPENAI_API_VERSION= # Deprecated
# AZURE_OPENAI_API_COMPLETIONS_DEPLOYMENT_NAME= # Deprecated
# AZURE_OPENAI_API_EMBEDDINGS_DEPLOYMENT_NAME= # Deprecated
# PLUGINS_USE_AZURE="true" # Deprecated
#====================#
# Assistants API
#====================#
ASSISTANTS_API_KEY=user_provided
ASSISTANTS_BASE_URL=
ASSISTANTS_MODELS=gpt-4o,gpt-4o-mini,gpt-3.5-turbo-0125,gpt-3.5-turbo-16k-0613,gpt-3.5-turbo-16k,gpt-3.5-turbo,gpt-4,gpt-4-0314,gpt-4-32k-0314,gpt-4-0613,gpt-3.5-turbo-0613,gpt-3.5-turbo-1106,gpt-4-0125-preview,gpt-4-turbo-preview,gpt-4-1106-preview
#==========================#
# Azure Assistants API
#==========================#
# Note: You should map your credentials with custom variables according to your Azure OpenAI Configuration
# The models for Azure Assistants are also determined by your Azure OpenAI configuration.
AZURE_OPENAI_API_KEY="7it9Xr"
AZURE_OPENAI_ENDPOINT="https://xxxx.cognitiveservices.azure.com"
AZURE_OPENAI_DEPLOYMENT_NAME="gpt-4o"
AZURE_OPENAI_API_VERSION="2024-08-01-preview"
AZURE_OPENAI_CHAT_ASSISTANT_ID="zzzz"
# More info, including how to enable use of Assistants with Azure here:
# https://www.librechat.ai/docs/configuration/librechat_yaml/ai_endpoints/azure#using-assistants-with-azure
#==================================================#
# Speech to Text & Text to Speech
#==================================================#
STT_API_KEY=
TTS_API_KEY=
#==================================================#
# RAG
#==================================================#
# More info: https://www.librechat.ai/docs/configuration/rag_api
RAG_OPENAI_BASEURL=
RAG_OPENAI_API_KEY=
RAG_USE_FULL_CONTEXT=
EMBEDDINGS_PROVIDER=openai
EMBEDDINGS_MODEL=text-embedding-3-small
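Side note: once the yaml validates, I'd rather not keep the raw apiKey in librechat.yaml. My understanding from the Azure endpoint docs is that the group's apiKey can reference a custom .env variable via ${VAR} substitution, roughly like this (WESTUS_API_KEY is just a placeholder name I made up):

# .env (placeholder variable name)
WESTUS_API_KEY=7it9XNxlr

# librechat.yaml, inside the azureOpenAI groups entry
- group: "my-resource-westus"
  apiKey: "${WESTUS_API_KEY}"
  instanceName: "cus3"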