VertexAI Integration: Configuration of region / endpoint #3275
Closed

schnaker85 started this conversation in Feature Requests & Suggestions

Is there a possibility to configure the region / endpoint while using the Vertex AI integration in LibreChat, either through an environment variable or through the model options? It seems that it always takes the hardcoded region "us-central1", set here: https://github.com/danny-avila/LibreChat/blob/main/api/app/clients/GoogleClient.js#L29. Or is there any way to configure this right now?

Replies: 2 comments
-
To clarify the above answer: changing just line 29 will not do the trick, as the client will still fall back to the default us-central1. This is because the langchain GoogleVertexAI module has its own default set to us-central1. As a quick fix, we hardcoded the location and endpoint into the clientOptions in GoogleClient.js as follows:

```js
createLLM(clientOptions) {
  const model = clientOptions.modelName ?? clientOptions.model;
  // Quick fix: pin the Vertex AI region and regional endpoint so the
  // langchain default of us-central1 is never used.
  clientOptions.location = 'europe-west6';
  clientOptions.endpoint = 'europe-west6-aiplatform.googleapis.com';
  if (this.project_id && this.isTextModel) {
    return new GoogleVertexAI(clientOptions);
  } else if (this.project_id && this.isChatModel) {
    return new ChatGoogleVertexAI(clientOptions);
  } else if (this.project_id) {
    return new ChatVertexAI(clientOptions);
  } else if (model.includes('1.5')) {
    return new GenAI(this.apiKey).getGenerativeModel(
      {
        ...clientOptions,
        model,
      },
      { apiVersion: 'v1beta' },
    );
  }
}
```

This will work at least for all langchain integrations for Vertex AI.
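If you would rather not hardcode the region, a minimal sketch of the same fix that reads the region from an environment variable could replace the two hardcoded lines above (GOOGLE_LOC is an illustrative variable name here, not necessarily what LibreChat uses):

```js
// Sketch only: derive the region from an environment variable instead of
// hardcoding it. GOOGLE_LOC is an assumed/illustrative variable name.
const location = process.env.GOOGLE_LOC ?? 'us-central1';
clientOptions.location = location;
// Vertex AI regional endpoints follow the <region>-aiplatform.googleapis.com pattern.
clientOptions.endpoint = `${location}-aiplatform.googleapis.com`;
```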
-
No longer required, as there is now an environment variable for this.
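For reference, the region would then be set in LibreChat's .env file; a minimal sketch, assuming the variable is named GOOGLE_LOC (check the .env.example shipped with your LibreChat version for the exact name):

```
# Assumed variable name; verify against your version's .env.example
GOOGLE_LOC=europe-west6
```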