Error connecting to local llm #8856
The SSL certificate error you're experiencing ("unable to verify the first certificate") occurs when connecting to HTTPS endpoints with self-signed or custom certificates. Here are solutions for Continue:

**Solution 1: Configure a Certificate Bundle Path (Recommended)**

Add a `caBundlePath` under `requestOptions`:

```yaml
name: Config
version: 1.0.0
schema: v1
assistants:
  - name: default
    model: CodeLlama
models:
  - name: CodeLlama
    provider: ollama
    model: codellama/codellama-13b-instruct
    apiBase: https://ashish-code-llama-1.project-user-ashish-kumar.serving.adt-alto01-ingress.us.rdlabs.hpecorp.net/v1
    apiKey: '**************'
    title: codellama:13b
    requestOptions:
      caBundlePath: /path/to/your/certificate.pem  # Add this line
    roles:
      - chat
      - edit
      - autocomplete
```

**Solution 2: Client Certificate Authentication**

If your endpoint requires client certificate authentication:

```yaml
models:
  - name: CodeLlama
    provider: ollama
    model: codellama/codellama-13b-instruct
    apiBase: https://ashish-code-llama-1.project-user-ashish-kumar.serving.adt-alto01-ingress.us.rdlabs.hpecorp.net/v1
    apiKey: '**************'
    requestOptions:
      clientCertificate:
        cert: /path/to/client.pem
        key: /path/to/client.key
        passphrase: your_passphrase  # Optional if key is encrypted
```

**Solution 3: Windows VS Code Users**

If you're on Windows, installing the win-ca extension in VS Code may automatically resolve certificate issues.

**Troubleshooting Steps:**
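As a purely diagnostic step, if your Continue version supports the `verifySsl` flag under `requestOptions` (an assumption — check the Continue docs for your version), temporarily disabling verification can confirm that certificate trust is the only problem. Don't leave this in place; switch back to `caBundlePath` once you have the right CA bundle:

```yaml
models:
  - name: CodeLlama
    provider: ollama
    requestOptions:
      verifySsl: false  # Diagnostic only; re-enable verification once caBundlePath works
```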
You can find more details in our FAQ documentation about configuring certificates. Let me know if this resolves your issue or if you need further assistance!
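For background: in Node-based clients, "unable to verify the first certificate" usually means the server sent only its leaf certificate without the intermediate CA, so the file you point `caBundlePath` at needs to contain the missing issuer certificate(s). Below is a minimal offline sanity check that a bundle file actually contains PEM certificate blocks — a sketch only; `count_pem_certs` is a hypothetical helper and the PEM bodies are placeholders:

```python
import re

def count_pem_certs(pem_text: str) -> int:
    """Count the BEGIN/END CERTIFICATE blocks in a PEM bundle."""
    return len(re.findall(
        r"-----BEGIN CERTIFICATE-----.*?-----END CERTIFICATE-----",
        pem_text,
        flags=re.DOTALL,
    ))

# Placeholder bundle: a real one holds base64-encoded DER bodies.
sample_bundle = """\
-----BEGIN CERTIFICATE-----
(intermediate CA certificate body)
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
(root CA certificate body)
-----END CERTIFICATE-----
"""

# A bundle that fixes "unable to verify the first certificate"
# generally needs the intermediate(s), not just the root.
print(count_pem_certs(sample_bundle))  # prints 2
```

If the count is 1 and it is only the server's own certificate, ask whoever operates the ingress for the full chain.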
Hi,
I have configured my local VS Code Continue extension to point to a local NVIDIA NIM CodeLlama model, and my configuration looks like this:
```yaml
name: Config
version: 1.0.0
schema: v1
assistants:
  - model: CodeLlama
models:
  - provider: ollama
    model: codellama/codellama-13b-instruct
    apiBase: https://ashish-code-llama-1.project-user-ashish-kumar.serving.adt-alto01-ingress.us.rdlabs.hpecorp.net/v1
    apiKey: '**************'
    title: codellama:13b
    roles:
```
I am getting the following error while connecting to the LLM:
```
request to https://ashish-code-llama-1.project-user-ashish-kumar.serving.adt-alto01-ingress.us.rdlabs.hpecorp.net/v1/api/chat failed, reason: unable to verify the first certificate
```
My question is: how can I configure my SSL certificate, since my endpoint is HTTPS?