

# Multi-Agent Business System

A web-based application that uses multiple AI agents to process business-related queries, powered by FastAPI and Ollama's `mistral` model.

## Overview
This project implements an interactive multi-agent system to handle tasks such as data collection, analysis, visualization, and reporting. Agents collaborate to process user queries and deliver aggregated results via a browser interface.

## Features
- **Planner Agent**: Breaks down queries into manageable subtasks.
- **Data Collector Agent**: Generates or mocks relevant data.
- **Analyzer Agent**: Extracts insights from data.
- **Visualizer Agent**: Creates simple charts (if data supports it).
- **Report Generator Agent**: Compiles final summaries.
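
Under the hood, each agent amounts to a prompt-specialised call to the local Ollama server, with one agent's output feeding the next. The sketch below is illustrative only, assuming Ollama's standard `/api/generate` HTTP endpoint and the `requests` package; the function names and prompts are hypothetical, not taken from `main.py`.

```python
import requests

OLLAMA_URL = "http://127.0.0.1:11434/api/generate"  # Ollama's default generate endpoint


def ask_ollama(prompt: str, timeout: int = 60) -> str:
    """Send one prompt to the local mistral model and return its text reply."""
    resp = requests.post(
        OLLAMA_URL,
        json={"model": "mistral", "prompt": prompt, "stream": False},
        timeout=timeout,
    )
    resp.raise_for_status()
    return resp.json()["response"]


def run_pipeline(query: str) -> str:
    """Chain the agents: planner -> data collector -> analyzer -> report generator."""
    plan = ask_ollama(f"Break this business query into subtasks: {query}")
    data = ask_ollama(f"Generate plausible mock data for these subtasks:\n{plan}")
    insights = ask_ollama(f"Extract the key insights from this data:\n{data}")
    # The visualizer step is omitted here for brevity.
    return ask_ollama(f"Write a short report from these insights:\n{insights}")


if __name__ == "__main__":
    print(run_pipeline("Summarize sales trends for Q1"))
```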

## Prerequisites
- **Python 3.8+**
- **Ollama** (for AI model hosting)
- **Windows, macOS, or Linux**

## Installation

### 1. Clone the Repository
```bash
git clone <your-repo-url>  # Replace with your repository URL if applicable
cd multiagent
```

### 2. Set Up a Virtual Environment
```bash
python -m venv multiagent_venv
multiagent_venv\Scripts\activate  # On Windows
# Or: source multiagent_venv/bin/activate  # On macOS/Linux
```

### 3. Install Dependencies
Install the required Python packages:
```bash
pip install -r requirements.txt
```
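
The repository's `requirements.txt` is authoritative; for reference, a minimal set for this stack would be the web framework, the ASGI server, and an HTTP client for talking to Ollama. The first two are named in the Acknowledgments; `requests` (or an equivalent) is an assumption here.

```text
fastapi
uvicorn
requests
```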

### 4. Install and Configure Ollama
- Download and install Ollama from [ollama.com](https://ollama.com).
- Pull the `mistral` model:
  ```bash
  ollama pull mistral
  ```
- Start the Ollama server:
  ```bash
  ollama serve
  ```

## Usage
1. Start the application:
   ```bash
   python main.py
   ```
2. Open your browser and go to http://127.0.0.1:8000/.
3. Enter a query (e.g., "Summarize sales trends for Q1") in the textarea and click "Process Request".
4. View the agent progress and final result on the page.
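
For orientation, the browser interface follows the usual FastAPI pattern: a GET route that serves the query form and a POST route that runs the agents and returns the result. The sketch below is a minimal stand-in, not the project's actual code; the route paths, form field name, and `run_pipeline` stub are hypothetical.

```python
import uvicorn
from fastapi import FastAPI, Form
from fastapi.responses import HTMLResponse

app = FastAPI()


def run_pipeline(query: str) -> str:
    # Placeholder for the agent chain sketched in the Features section.
    return f"(agents would process: {query})"


@app.get("/", response_class=HTMLResponse)
def index() -> str:
    # Minimal query form; the real page and styling will differ.
    return (
        '<form method="post" action="/process">'
        '<textarea name="query"></textarea>'
        '<button type="submit">Process Request</button>'
        "</form>"
    )


@app.post("/process", response_class=HTMLResponse)
def process(query: str = Form(...)) -> str:
    # Form parsing requires the python-multipart package to be installed.
    return f"<pre>{run_pipeline(query)}</pre>"


if __name__ == "__main__":
    uvicorn.run(app, host="127.0.0.1", port=8000)
```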

## Example Queries
- `Say hello` - Tests basic connectivity.
- `Summarize the sales data for January` - Generates a simple sales summary.
- `Analyze customer feedback: 'Great product, poor support'` - Performs sentiment analysis.
- `Show a pie chart of sales by region: North $5000, South $3000, East $2000` - Creates a pie chart.
- `Generate a detailed report on quarterly revenue trends and customer satisfaction for Q1` - Produces a comprehensive report.

## Troubleshooting
- **Connection Errors**: Ensure Ollama is running and listening on `127.0.0.1:11434`. Check with `curl -X POST http://localhost:11434/api/generate -d '{"model": "mistral", "prompt": "Hello"}'`; a Python equivalent is sketched after this list.
- **500 Errors**: Free up memory (close other apps) or increase the timeout in `main.py` (line 40) to 120 seconds.
- **Port Conflicts**: On Windows, run `netstat -aon | findstr :11434` to identify the conflicting process and stop it with `taskkill /PID <PID> /F`.
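
As a Python alternative to the `curl` check above (assuming the `requests` package is installed; the endpoint and payload are Ollama's standard generate API):

```python
import requests

try:
    resp = requests.post(
        "http://127.0.0.1:11434/api/generate",
        json={"model": "mistral", "prompt": "Hello", "stream": False},
        timeout=30,
    )
    resp.raise_for_status()
    # Print the start of the model's reply to confirm the server is healthy.
    print("Ollama responded:", resp.json().get("response", "")[:80])
except requests.RequestException as exc:
    print("Ollama is not reachable or returned an error:", exc)
```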

## Contributing
Feel free to fork this repository, make improvements, and submit pull requests. Report issues on the GitHub page if applicable.

## License
[Specify your license, e.g., MIT] - Add a license file if needed.

## Acknowledgments
- Built with FastAPI, Uvicorn, and Ollama.
- Inspired by xAI's multi-agent concepts.

