LLM Workflow Engine (LWE) Chat Ollama Provider plugin

Chat Ollama Provider plugin for LLM Workflow Engine

Provides access to Ollama chat models.

Installation

Ollama local server

Follow the installation instructions for Ollama, and make sure the server is running on its default port, 11434.
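Before enabling the plugin, you can confirm the server is reachable. A minimal sketch using only the Python standard library, assuming the default host and port; `GET /api/tags` is Ollama's endpoint for listing locally pulled models:

```python
# Check that an Ollama server is reachable on the default port.
# GET /api/tags lists the models that have been pulled locally.
import json
import urllib.request
import urllib.error


def ollama_running(host: str = "http://localhost:11434") -> bool:
    """Return True if an Ollama server responds at the given host."""
    try:
        with urllib.request.urlopen(f"{host}/api/tags", timeout=2) as resp:
            models = json.load(resp).get("models", [])
            print(f"Ollama is up; {len(models)} model(s) available")
            return True
    except (urllib.error.URLError, OSError):
        print("No Ollama server reachable -- is `ollama serve` running?")
        return False


if __name__ == "__main__":
    ollama_running()
```

If this reports no server, start one with `ollama serve` (or the desktop app) before proceeding.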

Plugin

From packages

Install the latest version of this software directly from GitHub with pip:

pip install git+https://github.com/llm-workflow-engine/lwe-plugin-provider-chat-ollama

From source (recommended for development)

Clone the latest version of this software directly from Git:

git clone https://github.com/llm-workflow-engine/lwe-plugin-provider-chat-ollama.git

Install the development package:

cd lwe-plugin-provider-chat-ollama
pip install -e .

Configuration

Enable the plugin by adding it to config.yaml in your profile:

plugins:
  enabled:
    - provider_chat_ollama
    # Any other plugins you want enabled...

Usage

From a running LWE shell:

/provider chat_ollama
/model model llama2
