Commit e8904bc: Update README.md (1 parent bba54b2)

1 file changed: README.md (156 additions, 2 deletions)
# Any LLM Chat

A versatile Streamlit application for seamless interaction with a variety of Language Model (LLM) providers. Whether you're using an Ollama instance (local or remote) or a cloud-based API such as OpenAI, this app provides a unified chat experience, complete with chat history, model selection, and response management.

-----
## ✨ Features

* **Universal Compatibility**: Connects to any LLM provider that exposes an OpenAI-compatible API, including local or remote Ollama instances.
* **Dynamic Model Discovery**: Automatically fetches and lists the models available at your specified API endpoint.
* **Persistent Chat History**: Saves and loads your conversations locally, ensuring you never lose your progress.
* **Interactive Chat Interface**: A familiar chat UI for intuitive conversations.
* **Response Management**:
  * **"Read More/Show Less"**: Truncates long responses for readability, with an option to expand.
  * **Copy to Clipboard**: Copy assistant responses with a single click.
  * **Delete Messages**: Remove individual user/assistant message pairs from the history.
* **Streaming Responses**: Real-time text generation as the LLM responds.
* **Stop Generation**: Halt an ongoing response at any time, saving token usage and cost.

-----
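The "Read More/Show Less" behaviour can be pictured as a simple truncation helper. This is a minimal sketch, not the app's actual code: the 500-character threshold and the `preview` name are illustrative assumptions.

```python
# Minimal sketch of "Read More / Show Less" truncation.
# PREVIEW_CHARS and the function name are illustrative assumptions.
PREVIEW_CHARS = 500

def preview(text: str, expanded: bool, limit: int = PREVIEW_CHARS) -> str:
    """Return the full text when expanded, else a truncated preview."""
    if expanded or len(text) <= limit:
        return text
    return text[:limit].rstrip() + "…"
```

In a Streamlit app, `expanded` would typically be toggled per message by a button and remembered in `st.session_state`.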
## 🚀 Getting Started

Follow these steps to get Any LLM Chat up and running.

### Prerequisites

* Python 3.8+
* The `pip` package manager

### Installation
1. **Clone the repository (or save the script):**

   If you already have the script as a single file, save it as `app.py`. Otherwise, clone the repo:

   ```bash
   git clone https://github.com/ChayScripts/Any-LLM-Chat.git
   cd Any-LLM-Chat
   ```

2. **Install dependencies:**

   Create a Python virtual environment (recommended) or install the packages directly:

   ```bash
   pip install streamlit openai httpx requests pyperclip
   ```
### Running the Application

To start the Streamlit application, run the following command in your terminal:

```bash
streamlit run app.py
```

This will open the application in your default web browser.

-----
## 💡 Usage

1. **Enter Base URL**: In the sidebar, provide the base URL of your LLM provider.

   * For **Ollama (local)**: typically `http://localhost:11434`
   * For **Ollama (remote)**: typically `http://Ollama_Server_FQDN:11434`
   * For **OpenAI (cloud)** or other compatible APIs: e.g., `https://api.openai.com`
   * **Important**: Do not include a trailing slash.

   *For remote Ollama servers, use the server's FQDN or IP address in the API URL. Refer to the [Ollama guide for Windows](https://www.techwithchay.com/posts/ollama-guide-for-windows/#remote-deployment) to configure `OLLAMA_HOST` for remote connections.*

2. **API Key**: If your provider requires an API key (e.g., OpenAI), enter it in the "API Key" field. Local Ollama instances do not need one.

3. **List / Refresh Models**: Click the "List / Refresh Models" button. The application fetches and displays the models available at your endpoint.

4. **Select a Model**: Choose your desired model from the "Select a model:" dropdown.

5. **Start Chatting**: Type your message in the input box at the bottom and press Enter.

Your chat history is automatically saved to and loaded from a `chat_history.json` file in the same directory as the script.
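Model listing on OpenAI-compatible servers goes through the standard `GET /v1/models` endpoint. A minimal sketch of how such a request could be formed from the sidebar inputs (the `models_request` helper is hypothetical, not the app's actual function; note how a trailing slash is stripped, matching the note in step 1):

```python
from typing import Optional

def models_request(base_url: str, api_key: Optional[str] = None):
    """Return (url, headers) for GET /v1/models on an OpenAI-compatible API."""
    url = base_url.rstrip("/") + "/v1/models"  # tolerate a trailing slash
    headers = {"Authorization": f"Bearer {api_key}"} if api_key else {}
    return url, headers
```

For example, `models_request("http://localhost:11434")` yields `http://localhost:11434/v1/models` with no auth header, which is why local Ollama needs no API key.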
-----

## 🔒 Privacy & Security

This application is designed with your privacy in mind:

**API Key Handling:** Your API key is entered directly into your browser session and is never saved, stored persistently, or forwarded to any remote server by the application. It resides in memory only for the duration of your active session.

**Local Data Storage:** All chat history is saved locally in a `chat_history.json` file on your machine. No chat data is transmitted to or stored on any external server, so your conversations remain private.

-----
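The local storage described above needs nothing beyond the standard library. A minimal sketch (the OpenAI-style `{"role": ..., "content": ...}` message shape is an assumption about the app's internals):

```python
# Minimal sketch of local chat-history persistence in chat_history.json.
import json
from pathlib import Path

HISTORY_FILE = Path("chat_history.json")

def load_history():
    """Return the saved conversation, or an empty list on first run."""
    if HISTORY_FILE.exists():
        return json.loads(HISTORY_FILE.read_text(encoding="utf-8"))
    return []

def save_history(messages):
    """Write the whole conversation to disk; nothing leaves the machine."""
    HISTORY_FILE.write_text(json.dumps(messages, indent=2), encoding="utf-8")
```

Because the file is plain JSON on your own disk, you can inspect, back up, or delete your history at any time.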
## ▶️ Run without terminal

To run the Streamlit app from a Python virtual environment without opening a terminal, create a shortcut or script that activates the virtual environment and starts the app. Here's how on each platform:

---
**Windows (.bat file):**

1. Create a `run_streamlit.bat` file with the following content:

   ```bat
   @echo off
   call C:\path\to\venv\Scripts\activate.bat
   streamlit run C:\path\to\your_app.py
   ```

   Next, create a `.vbs` file (e.g., `launch_app.vbs`) in the same folder with the content below. Running it opens the browser directly, without showing a terminal window:

   ```vbscript
   Set WshShell = CreateObject("WScript.Shell")
   WshShell.Run chr(34) & "C:\path\to\run_streamlit.bat" & chr(34), 0
   Set WshShell = Nothing
   ```

2. Double-click the `.vbs` file to launch the app. If closing the browser does not also end the `python` and `streamlit.exe` processes, kill them manually, or they will accumulate each time you launch the app.

---
**macOS/Linux (.sh file):**

1. Create a `run_streamlit.sh` script:

   ```bash
   #!/bin/bash
   source /path/to/venv/bin/activate
   streamlit run /path/to/your_app.py
   ```

2. Make it executable:

   ```bash
   chmod +x run_streamlit.sh
   ```

3. Run it via double-click or from a launcher, depending on your desktop environment.

---
## ✍️ Authors

* **Chay** - [ChayScripts](https://github.com com/ChayScripts)

-----

## 🤝 Contributing

Contributions are welcome! If you have suggestions for improvements, bug fixes, or new features, please feel free to:

* Open an issue to discuss your ideas.
* Fork the repository and submit a pull request.

-----

## 📄 License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.