Khogao/lmstudio-openwebui

🤖 LM Studio + OpenWebUI Integration

Purpose: Scripts to connect LM Studio (a local LLM runtime) with Open WebUI for a ChatGPT-like interface
Platform: Windows, Mac, Linux
Components: LM Studio + Open WebUI


🎯 What This Is

LM Studio provides local LLM inference (run models like Llama, Mistral on your machine).
Open WebUI provides a beautiful ChatGPT-like web interface.

This repo contains connection scripts to bridge them together.


📁 Files

  • start_lmstudio_connection.bat - Windows batch script
  • start_lmstudio_connection.ps1 - Windows PowerShell script
  • start_lmstudio_connection.sh - Mac/Linux shell script
  • README_WINDOWS.md - Windows-specific setup guide

🚀 Quick Start

Prerequisites

  1. LM Studio installed and running

    • Download: https://lmstudio.ai/
    • Load a model (e.g., Llama 3, Mistral 7B)
    • Start local server (default: http://localhost:1234)
  2. Open WebUI installed

    • Download: https://github.com/open-webui/open-webui (Docker image, or pip install open-webui)
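With both prerequisites in place, you can verify that the LM Studio server is reachable before wiring anything together (1234 is LM Studio's default port; adjust if you changed it):

```shell
# Check whether LM Studio's OpenAI-compatible server is up on the default port.
# -s silences progress output, -f makes curl fail on HTTP error responses.
if curl -sf http://localhost:1234/v1/models > /dev/null 2>&1; then
  STATUS="up"
else
  STATUS="down"
fi
echo "LM Studio server is $STATUS"
```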


🔧 Usage

Windows

```powershell
# Option 1: PowerShell (recommended)
.\start_lmstudio_connection.ps1

# Option 2: Batch file
.\start_lmstudio_connection.bat
```

Mac/Linux

```shell
chmod +x start_lmstudio_connection.sh
./start_lmstudio_connection.sh
```

🌐 Access

After starting:

  1. LM Studio: http://localhost:1234 (LLM server)
  2. Open WebUI: http://localhost:3000 (Chat interface)
  3. Configure: In Open WebUI settings, set API endpoint to http://localhost:1234/v1
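To confirm the endpoint works end to end, you can send a chat completion request directly. A sketch, assuming the default port: LM Studio routes requests to whichever model is currently loaded, so the `model` field here is largely informational.

```shell
# Send a one-off chat completion to LM Studio's OpenAI-compatible endpoint;
# fall back to a short message if the server is not running.
RESPONSE=$(curl -s http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "local-model",
       "messages": [{"role": "user", "content": "Say hello in one sentence."}]}' \
  || echo "LM Studio server not reachable on port 1234")
echo "$RESPONSE"
```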

⚙️ Configuration

LM Studio Settings

  • Port: 1234 (default)
  • API Type: OpenAI-compatible
  • CORS: Enable if needed

Open WebUI Settings

  1. Go to Settings → Connections
  2. Set "OpenAI API Base URL" to http://localhost:1234/v1
  3. API Key: not needed for a local server; enter a dummy value such as lm-studio
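If you run Open WebUI via Docker, the same settings can be passed as environment variables instead of through the UI. A sketch, with two assumptions worth noting: `OPENAI_API_BASE_URL` and `OPENAI_API_KEY` are Open WebUI's documented variables, and `host.docker.internal` lets the container reach LM Studio on the host on Windows/Mac (on Linux, use `--network=host` or the host's IP instead).

```shell
# Start Open WebUI in Docker, pre-configured to talk to LM Studio on the host.
LMSTUDIO_URL="http://host.docker.internal:1234/v1"

if command -v docker > /dev/null 2>&1; then
  docker run -d -p 3000:8080 \
    -e OPENAI_API_BASE_URL="$LMSTUDIO_URL" \
    -e OPENAI_API_KEY=lm-studio \
    -v open-webui:/app/backend/data \
    --name open-webui \
    ghcr.io/open-webui/open-webui:main \
    || echo "docker run failed (is the Docker daemon running?)"
else
  echo "docker not found; install Docker or run Open WebUI another way"
fi
```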

💡 Why Use This?

Benefits:

  • Privacy: All data stays local (no cloud)
  • Free: No API costs
  • Fast: No internet latency
  • Offline: Works without internet
  • Customizable: Use any model you want

Use Cases:

  • Private coding assistant
  • Sensitive document analysis
  • Learning/experimenting with LLMs
  • Offline work environments

🔧 Troubleshooting

LM Studio not responding?

  • Check if model is loaded
  • Verify server is running (green light in LM Studio)
  • Test: curl http://localhost:1234/v1/models

Open WebUI can't connect?

  • Check API URL is exactly http://localhost:1234/v1
  • Enable CORS in LM Studio's server settings if needed
  • Check firewall/antivirus not blocking
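A quick way to narrow down which side is failing is to probe both services, assuming the default ports 1234 and 3000:

```shell
# Probe both endpoints; an "unreachable" line tells you which service to debug.
for url in http://localhost:1234/v1/models http://localhost:3000; do
  if curl -sf "$url" > /dev/null 2>&1; then
    echo "OK          $url"
  else
    echo "unreachable $url"
  fi
done
```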

Slow responses?

  • Use smaller models (7B instead of 70B)
  • Enable GPU acceleration in LM Studio
  • Reduce context length

🔄 Recommended Models

For coding:

  • DeepSeek Coder 6.7B
  • CodeLlama 7B/13B
  • Phind CodeLlama 34B

For general chat:

  • Llama 3 8B/70B
  • Mistral 7B
  • Mixtral 8x7B

Vietnamese support:

  • Vistral 7B (Vietnamese-optimized)
  • SeaLLM 7B (Southeast Asian languages)

Last Updated: 2025-10-13
Maintained By: Khogao
License: Private
