OpenRouter Bot

English (🇺🇸) | Русский (🇷🇺)

This project lets you launch your own Telegram bot in a few minutes to communicate with free and paid AI models via OpenRouter, or with local LLMs (for example, via LM Studio).

Note

This repository is a fork of the openrouter-gpt-telegram-bot project that adds new features (such as switching the current model and Markdown formatting in bot responses) and optimizes the container startup process.

Example

Preparation

Tip

When you launch the bot, you will see the IDs of other users in the log; you can grant them access to the bot later.

Installation

To run locally on a Windows or Linux system, download the pre-built binary (which has no external dependencies) from the releases page.
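For example, on Linux (the asset name below is illustrative; check the releases page for the actual file name, and it is assumed the binary reads its settings from a .env file in the current directory, as described in the next section):

curl -L -o openrouter-bot https://github.com/Lifailon/openrouter-bot/releases/latest/download/openrouter-bot-linux-amd64
chmod +x openrouter-bot
./openrouter-bot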

Running in Docker

  • Create a working directory:
mkdir openrouter-bot
cd openrouter-bot
  • Create a .env file and fill in the basic parameters:
# OpenRouter API key
API_KEY=
# Free models: https://openrouter.ai/models?max_price=0
MODEL=deepseek/deepseek-r1:free
# Telegram bot API token
TELEGRAM_BOT_TOKEN=
# Your Telegram ID
ADMIN_IDS=
# List of user IDs allowed to access the bot, separated by commas
ALLOWED_USER_IDS=
# Set to 0 to disable guest access (enabled by default)
GUEST_BUDGET=0
# Language used for bot responses (supported: EN/RU)
LANG=EN
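For example, to allow two users in addition to the admin, list their IDs separated by commas (the IDs below are hypothetical):

ADMIN_IDS=123456789
ALLOWED_USER_IDS=123456789,987654321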

All available parameters are listed in the .env.example file.

  • Run a container using the image from Docker Hub:
docker run -d --name OpenRouter-Bot \
    -v ./.env:/openrouter-bot/.env \
    --restart unless-stopped \
    lifailon/openrouter-bot:latest
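After the container starts, you can follow the bot log (this is also where the user IDs mentioned in the tip above appear):

docker logs -f OpenRouter-Bot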

The image is built for the amd64 and arm64 (Raspberry Pi) platforms using docker buildx.

Build

git clone https://github.com/Lifailon/openrouter-bot
cd openrouter-bot
docker-compose up -d --build
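To update a bot built this way, pull the latest changes and rebuild the container (assuming no local modifications to the repository):

git pull
docker-compose up -d --build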
