LLMLean supports various cloud LLM providers beyond OpenAI. This document covers how to configure them.
To use Anthropic's Claude models:
- Get an Anthropic API key.
- Set configuration variables in `~/.config/llmlean/config.toml`:

```toml
api = "anthropic"
model = "claude-3-5-sonnet-20241022"
apiKey = "<your-anthropic-api-key>"
```

To use Together.AI:
- Get a Together.AI API key.
- Set configuration variables in `~/.config/llmlean/config.toml`:

```toml
api = "together"
model = "Qwen/Qwen2.5-72B-Instruct-Turbo"
apiKey = "<your-together-api-key>"
```

Many providers offer OpenAI-compatible APIs. For these providers:
- Get an API key from your provider.
- Set configuration variables in `~/.config/llmlean/config.toml`:

```toml
api = "openai"
endpoint = "<provider-endpoint-url>"
model = "<model-name>"
apiKey = "<your-api-key>"
```

For any provider, you can also set configuration using environment variables:
```bash
export LLMLEAN_API=<api-type>
export LLMLEAN_ENDPOINT=<endpoint-url>
export LLMLEAN_MODEL=<model-name>
export LLMLEAN_API_KEY=<your-api-key>
```

You can also set configuration directly in your Lean files:
```lean
set_option llmlean.api "openai"
set_option llmlean.model "gpt-4o"
set_option llmlean.apiKey "<your-api-key>"
```

Note: Be careful not to commit API keys to version control when using this method.