# LLM-Guided Volatility3 Triage Tool

A memory forensics triage assistant that combines Volatility3 with Large Language Model analysis to help analysts quickly identify and prioritize suspicious artifacts in Windows memory dumps.

## Features

- **Automated Volatility3 Execution**: Runs 10 forensic plugins to extract processes, network connections, DLLs, handles, and code injection indicators
- **Two-Stage LLM Analysis**: Uses Google Gemini or Anthropic Claude to analyze raw plugin output and provide scored findings with explanations
- **Interactive Chat**: Ask follow-up questions about findings with full forensic context
- **Hunting Checklist**: Automatically generates prioritized investigation steps
- **Web Interface**: Simple browser-based UI for submitting dumps and reviewing results

## Quick Start

### Using Docker Run

```bash
# Create directories for dumps and outputs
mkdir -p dumps runs

# Run with Google Gemini (recommended)
docker run -d \
  -p 8000:8000 \
  -v $(pwd)/dumps:/dumps:ro \
  -v $(pwd)/runs:/app/runs \
  -e LLM_PROVIDER=google \
  -e GOOGLE_API_KEY=your_google_api_key \
  --name vol3-triage \
  therealpotus/vol3-triage:latest

# Or run with Anthropic Claude
docker run -d \
  -p 8000:8000 \
  -v $(pwd)/dumps:/dumps:ro \
  -v $(pwd)/runs:/app/runs \
  -e LLM_PROVIDER=anthropic \
  -e ANTHROPIC_API_KEY=your_anthropic_api_key \
  --name vol3-triage \
  therealpotus/vol3-triage:latest
```
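
The container takes a few seconds to start serving. A small polling helper can confirm the UI is reachable before you open it in a browser (a sketch; `wait_for_ui` is a hypothetical helper, not part of the tool):

```shell
# wait_for_ui: poll a URL until it responds, up to a bounded number of tries
wait_for_ui() {
  url="$1"
  tries="${2:-30}"
  i=0
  while [ "$i" -lt "$tries" ]; do
    # -f makes curl fail on HTTP errors; -sS keeps output quiet but shows errors
    if curl -fsS "$url" >/dev/null 2>&1; then
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  return 1
}

wait_for_ui http://localhost:8000/app 10 && echo "vol3-triage UI is up"
```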

### Using Docker Compose

1. Download the compose file:
   ```bash
   wget https://raw.githubusercontent.com/vermi/vol3-triage/main/docker-compose.yml
   ```

2. Create a `.env` file:
   ```bash
   echo "LLM_PROVIDER=google" > .env
   echo "GOOGLE_API_KEY=your_key_here" >> .env
   ```

3. Start the container:
   ```bash
   docker-compose up -d
   ```

## Usage

1. Place memory dump files in your `./dumps/` directory
2. Open `http://localhost:8000/app` in your browser
3. Enter the path `/dumps/your-file.dmp`, a case name, and a scenario description
4. Click **Run Triage** and wait for the analysis to complete (5-15 minutes, depending on dump size)
5. Review the results in the Summary, Checklist, LLM Analysis, and Chat tabs
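
Because the host `./dumps/` directory is mounted at `/dumps` inside the container, the path entered in step 3 is simply the file name prefixed with `/dumps`. A quick sketch of the mapping:

```shell
# A file at ./dumps/your-file.dmp on the host is seen by the container at /dumps/your-file.dmp
host_file="./dumps/your-file.dmp"
container_path="/dumps/$(basename "$host_file")"
echo "$container_path"   # → /dumps/your-file.dmp
```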

## Environment Variables

| Variable | Description | Default |
|----------|-------------|---------|
| `LLM_PROVIDER` | LLM provider: `google` or `anthropic` | `google` |
| `GOOGLE_API_KEY` | Google Gemini API key | - |
| `ANTHROPIC_API_KEY` | Anthropic Claude API key | - |

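
A small pre-flight check can catch a missing API key before the container starts (a sketch; `check_llm_env` is a hypothetical helper, not part of the tool):

```shell
# check_llm_env: succeed only if the selected provider's API key is set
check_llm_env() {
  case "${LLM_PROVIDER:-google}" in
    google)    [ -n "${GOOGLE_API_KEY:-}" ] ;;
    anthropic) [ -n "${ANTHROPIC_API_KEY:-}" ] ;;
    *)         return 2 ;;  # unknown provider value
  esac
}

# Example: validate before running docker
LLM_PROVIDER=google GOOGLE_API_KEY=your_key_here check_llm_env && echo "config ok"
```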
## API Keys

- **Google Gemini** (recommended, 1M token context): https://aistudio.google.com/apikey
- **Anthropic Claude** (200K token context): https://console.anthropic.com/

## Volume Mounts

| Path | Description |
|------|-------------|
| `/dumps` | Memory dump files (read-only) |
| `/app/runs` | Analysis output directory (persisted) |

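
Since `/app/runs` is persisted to the host, analysis outputs survive container restarts. The newest output can be located with a one-liner (a sketch, assuming each triage run writes its own entry under `./runs`):

```shell
# Show the most recently modified entry under ./runs
latest_run=$(ls -1t runs/ 2>/dev/null | head -n 1)
echo "Latest run: ${latest_run:-none yet}"
```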

## Links

- **GitHub**: https://github.com/vermi/vol3-triage
- **Documentation**: See the full README on GitHub