[](https://github.com/ZOSOpenTools/llama.cppport/actions/workflows/bump.yml)
# llama.cpp
Enable AI inferencing on z/OS.
# Installation and Usage
See the [zopen porting guide](https://zopen.community/#/Guides/Porting) for more details.
# Troubleshooting
TBD
# Contributing
Contributions are welcome! Please follow the [zopen contribution guidelines](https://github.com/zopencommunity/meta/blob/main/CONTRIBUTING.md).
# zOpen Chat

**zOpen Chat** is a web-based interface that enables natural language interaction with AI models (such as LLaMA 3.2 and Granite 3), designed for exploring and working with open-source tools in the z/OS ecosystem.
This project features:
- A **web-based interface** for natural interactions.
- Integration with **llamacpp** backend for inferencing.
- Built-in tools to **search GitHub repositories** relevant to z/OS and files present in the **local file system**.
## Use cases covered
1. **Chat**: A conversational Q&A system where users can ask questions and receive clear, concise answers.
2. **Explain Code**: Request short, contextual explanations of specific code files or components.
3. **Generate Tests**: Ask the system to generate unit tests or test cases for specific files using natural language prompts.
The files can be sourced from repositories in the `zopencommunity` organization or from the local file system.
> For usage examples and UI walkthroughs, see `docs/WEBUI.md`
<!-- > Video Demonstration in `docs/videos/Final Use Cases Demo.mp4` -->
## Prerequisites
Before running zOpen Chat, ensure that the following are set up:
- **llamacpp**: from the llamacpp port for z/OS. [Repository Link](https://github.com/zopencommunity/llamacppport)
- **Node.js (LTS)**: [Download Node.js supported on z/OS](https://www.ibm.com/products/sdk-nodejs-compiler-zos)
- **npm**: included with Node.js
## Workflow
The **Model Context Protocol (MCP)** is used here to orchestrate routing in the system: it handles user inputs, tool management, and communication between the client interface and the underlying LLM infrastructure.
- The client UI allows users to interact with the system via options such as:
  - Chat
  - Explain Code
  - Generate Test Cases
- These requests are sent to the MCP Server, which serves as the orchestrator.
- The MCP Server forwards the request to a Llama Server hosting the Large Language Model (LLM).
- The LLM processes the input and returns a response to the MCP Server.
- The response is routed back to the client UI, completing the workflow loop.
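The request/response loop above can be sketched in a few lines of Python. This is a minimal illustration, not the project's actual code: the endpoint URL, the `{"prompt": ...}` payload shape, and the function names are all assumptions made for the sketch.

```python
import json
from urllib import request as urlrequest

# Hypothetical Llama server address; the real host/port are set at deployment.
LLAMA_SERVER_URL = "http://127.0.0.1:8080/completion"

def default_post(url, body):
    """POST a JSON body to the Llama server and decode the JSON reply."""
    req = urlrequest.Request(url, data=body,
                             headers={"Content-Type": "application/json"})
    with urlrequest.urlopen(req) as resp:
        return json.load(resp)

def route_chat_request(prompt, post=default_post):
    """MCP-server-style orchestration: forward the client's prompt to the
    Llama server, then hand the model's reply back to the client UI."""
    body = json.dumps({"prompt": prompt}).encode()
    reply = post(LLAMA_SERVER_URL, body)
    return reply.get("content", "")

def fake_llama_post(url, body):
    """Stand-in for the Llama server so the loop can be exercised offline."""
    prompt = json.loads(body)["prompt"]
    return {"content": "echo: " + prompt}

print(route_chat_request("hello", post=fake_llama_post))  # → echo: hello
```

Injecting `post` makes the orchestration testable without a running model, which mirrors how the MCP Server sits between the UI and the LLM infrastructure.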
> Folder structure details are explained in `docs/STRUCTURE.md`
This command starts the LLaMA inference engine. Let it run in the background.
### 3. Python Backend
Create a virtual environment and activate it:
```bash
python -m venv venv
source venv/bin/activate
```
Install the dependencies:

```bash
cd backend
pip install -r requirements.txt
```
Once the dependencies are installed, run the Flask app:

```bash
python app.py
```
This starts the Flask server on `http://127.0.0.1:21098`. To change the port, update the port configuration in `frontend/src/config.js`.
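For orientation, a backend entry point of this shape would serve on that port. This is only a hedged sketch of what an `app.py` might look like, not the repository's actual file: the `/chat` route name and payload are assumptions.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/chat", methods=["POST"])  # route name is an assumption
def chat():
    """Accept a JSON prompt from the frontend and return a JSON reply.
    A real backend would forward the prompt to the Llama server here."""
    prompt = request.get_json().get("prompt", "")
    return jsonify({"content": "received: " + prompt})

# To serve standalone, the port must match frontend/src/config.js:
# app.run(host="127.0.0.1", port=21098)
```

During development the route can be exercised without a browser via Flask's built-in `test_client()`.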
### 4. Frontend (Website)
Install the node modules:

```bash
cd frontend
npm install
```
Start the website:

```bash
npm start
```
The website can then be accessed at `http://127.0.0.1:21097` in your web browser.