From 39fea20cff3ef60306e96a2595cada0baee6c871 Mon Sep 17 00:00:00 2001
From: Rajashekar Chintalapati
Date: Wed, 16 Jul 2025 14:52:19 -0700
Subject: [PATCH] docs: add six new LLM plugins to directory

Add support for Cerebras, GitHub Models, Requesty, DigitalOcean, LiteLLM,
and Chutes AI model providers to the plugin directory documentation.
---
 docs/plugins/directory.md | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/docs/plugins/directory.md b/docs/plugins/directory.md
index d925f3177..c19242a9e 100644
--- a/docs/plugins/directory.md
+++ b/docs/plugins/directory.md
@@ -39,6 +39,12 @@ These plugins can be used to interact with remotely hosted models via their API:
 - **[llm-deepseek](https://github.com/abrasumente233/llm-deepseek)** adds support for the [DeepSeek](https://deepseek.com)'s DeepSeek-Chat and DeepSeek-Coder models.
 - **[llm-lambda-labs](https://github.com/simonw/llm-lambda-labs)** provides access to models hosted by [Lambda Labs](https://docs.lambdalabs.com/public-cloud/lambda-chat-api/), including the Nous Hermes 3 series.
 - **[llm-venice](https://github.com/ar-jan/llm-venice)** provides access to uncensored models hosted by privacy-focused [Venice AI](https://docs.venice.ai/), including Llama 3.1 405B.
+- **[llm-cerebras](https://github.com/irthomasthomas/llm-cerebras)** provides access to models hosted by [Cerebras](https://cloud.cerebras.ai/).
+- **[llm-github-models](https://github.com/tonybaloney/llm-github-models)** provides access to models hosted by [GitHub Models via Azure AI](https://github.com/marketplace/models).
+- **[llm-requesty](https://github.com/rajashekar/llm-requesty)** provides access to models hosted by [Requesty](https://www.requesty.ai/).
+- **[llm-digitalocean](https://github.com/rajashekar/llm-digitalocean)** provides access to models hosted by [DigitalOcean](https://docs.digitalocean.com/products/gradientai-platform/details/pricing/).
+- **[llm-litellm](https://github.com/rajashekar/llm-litellm)** provides access to models proxied through [LiteLLM](https://docs.litellm.ai/docs/proxy/docker_quick_start).
+- **[llm-chutes](https://github.com/rajashekar/llm-chutes)** provides access to models hosted by [Chutes AI](https://chutes.ai/).
 If an API model host provides an OpenAI-compatible API you can also [configure LLM to talk to it](https://llm.datasette.io/en/stable/other-models.html#openai-compatible-models) without needing an extra plugin.
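
Each entry added above is an installable plugin for the LLM CLI, and all follow the same install-and-use pattern as the rest of the directory. A minimal usage sketch using llm-cerebras as the example; the key name `cerebras` and the model ID shown are assumptions, not taken from this patch, so check the plugin's README for the actual values:

```bash
# Install one of the newly listed plugins into the LLM CLI
llm install llm-cerebras

# Store the provider's API key (key name is an assumption; see the plugin README)
llm keys set cerebras

# Confirm the plugin registered its models
llm models

# Run a prompt against one of them (model ID is illustrative only)
llm -m cerebras-llama3.3-70b 'Three reasons to use fast inference'
```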