Commit ba5d310: Add changelog for 2024-12-05 (#84)

Co-authored-by: quantstruct[bot] <190039098+quantstruct[bot]@users.noreply.github.com>

1 parent a894c8c

1 file changed: 16 additions, 0 deletions

changelog/2024-12-05.mdx

1. **OAuth2 Support for Custom LLM Credentials and Webhooks**: You can now secure access to your [custom LLMs](https://docs.vapi.ai/customization/custom-llm/using-your-server#step-2-configuring-vapi-with-custom-llm) and [server URLs (aka webhooks)](https://docs.vapi.ai/server-url) using OAuth2 (RFC 6749). Create a webhook credential with `CreateWebhookCredentialDTO` and specify the following information:

   ```json
   {
     "provider": "webhook",
     "authenticationPlan": {
       "type": "oauth2",
       "url": "https://your-url.com/your/path/token",
       "clientId": "your-client-id",
       "clientSecret": "your-client-secret"
     },
     "name": "your-credential-name-between-1-and-40-characters"
   }
   ```
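   Since the credential names OAuth2 (RFC 6749), the natural reading is the client-credentials grant: the `clientId` and `clientSecret` are exchanged at the configured `url` for a bearer token. The changelog does not spell out the wire format, so the following is a minimal sketch assuming the standard form-encoded request from RFC 6749 §4.4; the helper name is hypothetical.

   ```python
   import urllib.parse


   def build_token_request(client_id: str, client_secret: str) -> tuple[dict, str]:
       """Build a form-encoded OAuth2 client_credentials token request
       (RFC 6749, section 4.4).

       Assumption: the clientId/clientSecret from the credential are
       exchanged for a bearer token at the `url` in authenticationPlan.
       """
       headers = {"Content-Type": "application/x-www-form-urlencoded"}
       body = urllib.parse.urlencode({
           "grant_type": "client_credentials",
           "client_id": client_id,
           "client_secret": client_secret,
       })
       return headers, body


   headers, body = build_token_request("your-client-id", "your-client-secret")
   # A standard token endpoint replies with JSON along the lines of
   # {"access_token": "...", "token_type": "Bearer", "expires_in": 3600},
   # which is then presented on subsequent webhook/custom-LLM requests.
   ```

   If you host the token endpoint yourself, it only needs to validate the client credentials and return such a JSON token response.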
2. **Removal of Canonical Knowledge Base**: The ability to create, update, and use canonical knowledge bases in your assistant has been removed from the API, as custom knowledge bases and the Trieve integration support a superset of this functionality. Please update your implementation: endpoints and models referencing canonical knowledge base schemas are no longer available.
