Description
Is your feature request related to a problem? Please describe.
Currently, self-operating-computer only supports direct API connections to specific AI providers. This limits flexibility for users who want to use OpenRouter as a unified interface that exposes multiple AI models through a single API endpoint. I'm frustrated by having to switch between different API keys and configurations just to test different models.
Describe the solution you'd like
Add native support for the OpenRouter API (a rough sketch of one possible integration follows the list below). This would allow users to:
- Configure OpenRouter API key in the settings
- Select from various AI models available through OpenRouter (GPT-4, Claude, Gemini, etc.)
- Use a single API endpoint to access multiple providers
- Take advantage of OpenRouter's automatic fallbacks and load balancing
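As a rough sketch of how little glue this might need: OpenRouter exposes an OpenAI-compatible endpoint at https://openrouter.ai/api/v1, so the `openai` Python client the project already uses could simply be pointed at it. The `OPENROUTER_API_KEY` environment variable and the `call_openrouter` helper are illustrative names, not existing parts of the codebase:

```python
# Sketch only: point the standard openai client (v1+) at OpenRouter's
# OpenAI-compatible endpoint. OPENROUTER_API_KEY and call_openrouter
# are hypothetical names used for illustration.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",   # OpenRouter's unified endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],  # one key covers every provider
)

def call_openrouter(model: str, prompt: str) -> str:
    """Send a chat completion through OpenRouter and return the reply text."""
    response = client.chat.completions.create(
        model=model,  # e.g. "openai/gpt-4o" or "anthropic/claude-3.5-sonnet"
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```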
Describe alternatives you've considered
- Manually patching the code to add OpenRouter endpoints - this works, but the patches have to be re-applied after every upstream update
- Running a proxy server that rewrites API calls - adds unnecessary operational complexity
- Writing a wrapper script around the tool - awkward to integrate with the existing architecture
Additional context
OpenRouter provides a unified API that supports models from OpenAI, Anthropic, Google, Meta, and others. This would significantly enhance the flexibility of self-operating-computer by allowing users to easily switch between models without changing their entire configuration. Many similar projects already support OpenRouter, making it a common choice in the AI development community.
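To make the "switch models without changing configuration" point concrete, here is a hypothetical usage of the `call_openrouter` sketch above; the model ids follow OpenRouter's `<provider>/<model>` naming convention and are examples only:

```python
# Reuses the hypothetical call_openrouter helper from the sketch above.
# Switching providers is just a different model string; no new keys
# or endpoints are needed.
for model in (
    "openai/gpt-4o",
    "anthropic/claude-3.5-sonnet",
    "google/gemini-pro-1.5",
):
    print(model, "->", call_openrouter(model, "Describe the active window in one sentence."))
```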