[Question]: How can I add Mistral to the list of endpoints? #1559
Hi @berceanu Thanks for checking out the project! I'm happy to say it will work for you. For Docker, there is an extra step to have it configured: https://docs.librechat.ai/install/configuration/custom_config.html#docker-setup
I will copy/paste the instructions below.
Make sure you are on the latest commit.
I noticed the safe_mode field in your config. The example has been updated to remove it, as the Mistral API has deprecated safe_mode in favor of safe_prompt and will throw an error if you use the former. Once on the latest commit, you can use the most up-to-date librechat.examp…
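A minimal sketch of that change, assuming safe_mode was set under the endpoint's addParams (the exact placement in your config may differ):

# Remove the deprecated field from your Mistral endpoint entry:
# addParams:
#   safe_mode: true    # no longer accepted by the Mistral API
# and use the replacement field instead, as in the updated example below:
addParams:
  safe_prompt: true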
Docker Setup
For Docker, you need to make use of an override file:

# For more details on the override file, see the Docker Override Guide:
# https://docs.librechat.ai/install/configuration/docker_override.html
version: '3.4'

services:
  api:
    volumes:
      - ./librechat.yaml:/app/librechat.yaml

docker-compose up # no need to rebuild
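For orientation, this is the assumed layout on the host. The override filename is not given above, so docker-compose.override.yml here is just the conventional name that Docker Compose picks up automatically; the bind mount ./librechat.yaml means the config must sit next to docker-compose.yml:

LibreChat/                        # your compose project root
├── docker-compose.yml
├── docker-compose.override.yml   # the volumes override shown above (assumed filename)
└── librechat.yaml                # the custom config from the updated example below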
Updated example

# Configuration version (required)
version: 1.0.1

# Cache settings: Set to true to enable caching
cache: true

# Definition of custom endpoints
endpoints:
  custom:
    # Mistral AI API
    - name: "Mistral" # Unique name for the endpoint
      # For `apiKey` and `baseURL`, you can use environment variables that you define.
      # recommended environment variables:
      apiKey: "${MISTRAL_API_KEY}"
      baseURL: "https://api.mistral.ai/v1"

      # Models configuration
      models:
        # List of default models to use. At least one value is required.
        default: ["mistral-tiny", "mistral-small", "mistral-medium"]
        # Fetch option: Set to true to fetch models from the API.
        fetch: true # Defaults to false.

      # Optional configurations

      # Title Conversation setting
      titleConvo: true # Set to true to enable title conversation

      # Title Method: Choose between "completion" or "functions".
      titleMethod: "completion" # Defaults to "completion" if omitted.

      # Title Model: Specify the model to use for titles.
      titleModel: "mistral-tiny" # Defaults to "gpt-3.5-turbo" if omitted.

      # Summarize setting: Set to true to enable summarization.
      summarize: false

      # Summary Model: Specify the model to use if summarization is enabled.
      summaryModel: "mistral-tiny" # Defaults to "gpt-3.5-turbo" if omitted.

      # Force Prompt setting: If true, sends a `prompt` parameter instead of `messages`.
      forcePrompt: false

      # The label displayed for the AI model in messages.
      modelDisplayLabel: "Mistral" # Default is "AI" when not set.

      # Add additional parameters to the request. Default params will be overwritten.
      addParams:
        safe_prompt: true # This field is specific to Mistral AI: https://docs.mistral.ai/api/

      # Drop default parameters from the request. See the default params in the guide linked below.
      # NOTE: For Mistral, it is necessary to drop the following parameters or you will encounter a 422 Error:
      dropParams: ["stop", "user", "frequency_penalty", "presence_penalty"]

    # OpenRouter.ai Example
    - name: "OpenRouter"
      # For `apiKey` and `baseURL`, you can use environment variables that you define.
      # recommended environment variables:
      # Known issue: you should not use `OPENROUTER_API_KEY` as it will then override the `openAI` endpoint to use OpenRouter as well.
      apiKey: "${OPENROUTER_KEY}"
      baseURL: "https://openrouter.ai/api/v1"
      models:
        default: ["gpt-3.5-turbo"]
        fetch: true
      titleConvo: true
      titleModel: "gpt-3.5-turbo"
      summarize: false
      summaryModel: "gpt-3.5-turbo"
      forcePrompt: false
      modelDisplayLabel: "OpenRouter"

# See the Custom Configuration Guide for more information:
# https://docs.librechat.ai/install/configuration/custom_config.html
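The ${MISTRAL_API_KEY} and ${OPENROUTER_KEY} placeholders above only resolve if you define those variables yourself. A minimal sketch, assuming you keep them in the project's .env file (the key names follow the recommendations in the comments above; the values are placeholders):

# .env
MISTRAL_API_KEY=your-mistral-api-key
OPENROUTER_KEY=your-openrouter-key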