Ollama Chat App

A simple web-based interface for chatting with local LLMs using Ollama.

Setup

  1. Clone the repository

  2. Install dependencies:

    pip install -r requirements.txt
  3. Run Ollama locally (this downloads the llama3 model on first use):

    ollama run llama3
  4. Start the server:

    python app.py
  5. Open http://localhost:5000
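If the page at http://localhost:5000 loads but chatting fails, the usual culprit is that Ollama itself is not running. A minimal sketch for checking this from Python (not part of the repo; it assumes Ollama's standard `/api/version` endpoint and default port):

```python
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default Ollama address

def ollama_is_up(base_url: str = OLLAMA_URL, timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers at base_url."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/version", timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # connection refused, timeout, DNS failure, ...
        return False
```

Run it before starting the web app; if it returns False, start Ollama first.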

Notes

  • The app assumes Ollama is available at http://localhost:11434 (Ollama's default port).
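The repository's `app.py` is not reproduced here, but an app like this presumably forwards each prompt to Ollama's HTTP API. A hedged sketch of how a request body for Ollama's `/api/generate` endpoint can be built (the helper name and defaults are illustrative, not taken from the repo):

```python
import json

def build_generate_request(prompt: str, model: str = "llama3") -> bytes:
    """Encode a JSON body for Ollama's POST /api/generate endpoint.

    stream=False asks Ollama for a single JSON response instead of a
    stream of newline-delimited chunks.
    """
    payload = {"model": model, "prompt": prompt, "stream": False}
    return json.dumps(payload).encode("utf-8")
```

POSTing these bytes to `http://localhost:11434/api/generate` with a `Content-Type: application/json` header returns the model's completion in the `response` field of the reply.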
