4 changes: 2 additions & 2 deletions .github/workflows/ci.yml
@@ -13,8 +13,8 @@ jobs:

strategy:
matrix:
-node-version: [14.x, 16.x]
-python-version: [3.8, 3.9, 3.10]
+node-version: [18.x, 20.x, 22.x]
+python-version: [3.8, 3.10, 3.12]

steps:
- uses: actions/checkout@v3
1 change: 1 addition & 0 deletions pyproject.toml
@@ -36,6 +36,7 @@ dependencies = [
"python-Levenshtein==0.23.0",
"pathspec>=0.11.0",
"aider-chat>=0.69.1",
"python-dotenv>=1.0.0",
"ripgrepy>=0.1.0"
]

5 changes: 3 additions & 2 deletions requirements-dev.txt
@@ -1,4 +1,5 @@
-aider-chat==0.69.*
-playwright==1.49.*
+aider-chat>=0.69.0
+playwright>=1.49.0
pytest-timeout>=2.2.0
pytest>=7.0.0
+python-dotenv>=0.19.0
5 changes: 5 additions & 0 deletions requirements.txt
@@ -0,0 +1,5 @@
aider-chat>=0.72.0
playwright>=1.49.0
pytest-timeout>=2.2.0
pytest>=7.0.0
python-dotenv>=0.19.0
36 changes: 17 additions & 19 deletions sparc_cli/README.md
@@ -84,28 +84,26 @@ pip install -e .

### Environment Setup

-Create a `.env` file in your project root with the following variables:
+Environment variables are automatically managed in `$HOME/.sparc_exports` and sourced in your shell's RC file.

+Required API Keys:
+- ANTHROPIC_API_KEY - Get it from: https://console.anthropic.com/account/keys
+
+Optional API Keys:
+- OPENAI_API_KEY - Get it from: https://platform.openai.com/api-keys
+- OPENROUTER_KEY - Get it from: https://openrouter.ai/keys
+- GEMINI_API_KEY - Get it from: https://makersuite.google.com/app/apikey
+- VERTEXAI_PROJECT and VERTEXAI_LOCATION - Configure in Google Cloud Console
+
+The installation script supports both local Python and Docker-based installations. Choose the installation method that best suits your needs:

```bash
-# Required: At least one of these LLM provider API keys
-ANTHROPIC_API_KEY=your_anthropic_key # Required for Claude models
-OPENAI_API_KEY=your_openai_key # Required for GPT models
-OPENROUTER_API_KEY=your_openrouter_key # Required for OpenRouter
-
-# Optional: Expert knowledge configuration
-EXPERT_PROVIDER=openai # Default provider for expert queries (anthropic|openai|openrouter)
-EXPERT_MODEL=o1-preview # Model to use for expert knowledge queries
-
-# Optional: Default provider settings
-DEFAULT_PROVIDER=anthropic # Default LLM provider (anthropic|openai|openrouter)
-DEFAULT_MODEL=claude-3-opus-20240229 # Default model name
-
-# Optional: Development settings
-DEBUG=false # Enable debug logging
-COWBOY_MODE=false # Skip command approval prompts
-```
-Note: At least one provider API key must be configured for SPARC to function.
+# Local Python Installation
+./install.sh
+
+# Docker Installation
+./install.sh --docker
+```
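A quick sanity check after running the installer is to read the variables back from a new shell, one that has sourced `$HOME/.sparc_exports`; a minimal sketch in Python, assuming only the variable names listed above:

```python
import os

# The one required key: SPARC cannot talk to Claude models without it.
if not os.environ.get("ANTHROPIC_API_KEY"):
    raise SystemExit("ANTHROPIC_API_KEY is not set; re-run ./install.sh")

# Optional keys: report which extra providers are configured.
for key in ("OPENAI_API_KEY", "OPENROUTER_KEY", "GEMINI_API_KEY",
            "VERTEXAI_PROJECT", "VERTEXAI_LOCATION"):
    print(f"{key}: {'set' if os.environ.get(key) else 'not set'}")
```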

## Usage

21 changes: 21 additions & 0 deletions sparc_cli/docs/README.md
@@ -46,3 +46,24 @@ sparc scrape --url "https://example.com" --selector "article.content" --format j
- [Architecture](architecture.md) - System design and components

For installation and getting started, please refer to the [Usage Guide](usage.md).

### Environment Setup

Environment variables are automatically managed in `$HOME/.sparc_exports` and sourced in your shell's RC file.

Required API Keys:
- ANTHROPIC_API_KEY - Get it from: https://console.anthropic.com/account/keys

Optional API Keys:
- OPENAI_API_KEY - Get it from: https://platform.openai.com/api-keys
- OPENROUTER_KEY - Get it from: https://openrouter.ai/keys
- GEMINI_API_KEY - Get it from: https://makersuite.google.com/app/apikey
- VERTEXAI_PROJECT and VERTEXAI_LOCATION - Configure in Google Cloud Console

The installation script supports both local Python and Docker-based installations. Choose the installation method that best suits your needs:

```bash
# Local Python Installation
./install.sh

# Docker Installation
./install.sh --docker
```
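If you prefer a project-local `.env` file over the managed exports, the same variables can be loaded with `python-dotenv` (now a declared dependency); a minimal sketch, assuming a `.env` file in the current working directory:

```python
import os

from dotenv import load_dotenv

# Read key=value pairs from ./.env into the process environment.
load_dotenv()

print("Anthropic configured:", bool(os.getenv("ANTHROPIC_API_KEY")))
print("Gemini configured:", bool(os.getenv("GEMINI_API_KEY")))
```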
21 changes: 21 additions & 0 deletions sparc_cli/env.py
@@ -4,8 +4,26 @@
import sys
from dataclasses import dataclass
from typing import Tuple, List
from pathlib import Path

from sparc_cli.console.formatting import print_error
from dotenv import load_dotenv
import shutil

def load_environment() -> bool:
    """Load environment variables from .env file.

    Returns:
        bool: True if .env loaded successfully, False otherwise
    """
    env_path = Path('.env')
    if not env_path.exists():
        print_error(".env file not found in current directory")
        print_error("Please copy sample.env to .env and configure your API keys")
        return False

    load_dotenv()
    return True

@dataclass
class ProviderConfig:
@@ -36,6 +54,9 @@ def validate_environment(args) -> Tuple[bool, List[str]]:
    Raises:
        SystemExit: If required base environment variables are missing
    """
    # Load environment variables from .env file
    load_environment()

    missing = []
    provider = args.provider
    expert_provider = args.expert_provider
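A rough sketch of how these helpers might be wired together at startup; the `Namespace` fields are illustrative and based only on the attributes read in the hunk above (`provider`, `expert_provider`):

```python
from argparse import Namespace

from sparc_cli.env import load_environment, validate_environment

# Load .env if present; load_environment prints an error and returns False otherwise.
if load_environment():
    args = Namespace(provider="anthropic", expert_provider="openai")  # illustrative values
    ok, missing = validate_environment(args)
    if not ok:
        print("Missing configuration:", ", ".join(missing))
```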
37 changes: 36 additions & 1 deletion sparc_cli/install.sh
@@ -43,21 +43,56 @@ setup_environment() {
# Create exports file
local exports_file="$HOME/.sparc_exports"

echo "The following API keys are used to access different AI models:"
echo

# Always require ANTHROPIC_API_KEY
echo "ANTHROPIC_API_KEY (Required)"
echo "- Used for Claude models from Anthropic"
echo "- Get it from: https://console.anthropic.com/account/keys"
while true; do
ANTHROPIC_API_KEY=$(prompt_secret "ANTHROPIC_API_KEY" "true")
if [ $? -eq 0 ] && [ -n "$ANTHROPIC_API_KEY" ]; then
break
fi
echo "ANTHROPIC_API_KEY is required. Please try again."
done
echo

-# Optional keys
+# Optional keys with descriptions
echo "OPENAI_API_KEY (Optional)"
echo "- Used for GPT models from OpenAI"
echo "- Get it from: https://platform.openai.com/api-keys"
OPENAI_API_KEY=$(prompt_secret "OPENAI_API_KEY" "false")
echo

echo "OPENROUTER_KEY (Optional)"
echo "- Used to access multiple AI models through a single API"
echo "- Get it from: https://openrouter.ai/keys"
OPENROUTER_KEY=$(prompt_secret "OPENROUTER_KEY" "false")
echo

echo "ENCRYPTION_KEY (Optional)"
echo "- Used for secure storage of sensitive data"
echo "- Can be any string you choose"
ENCRYPTION_KEY=$(prompt_secret "ENCRYPTION_KEY" "false")
echo

echo "GEMINI_API_KEY (Optional)"
echo "- Used for Google's Gemini models"
echo "- Get it from: https://makersuite.google.com/app/apikey"
GEMINI_API_KEY=$(prompt_secret "GEMINI_API_KEY" "false")
echo

echo "VERTEXAI_PROJECT (Optional)"
echo "- Your Google Cloud project ID for Vertex AI"
echo "- Find it in your Google Cloud Console"
VERTEXAI_PROJECT=$(prompt_secret "VERTEXAI_PROJECT" "false")
echo

echo "VERTEXAI_LOCATION (Optional)"
echo "- Geographic location for Vertex AI services (e.g., us-central1)"
echo "- See: https://cloud.google.com/vertex-ai/docs/general/locations"
VERTEXAI_LOCATION=$(prompt_secret "VERTEXAI_LOCATION" "false")

# Create exports file
10 changes: 10 additions & 0 deletions sparc_cli/providers.py
@@ -0,0 +1,10 @@
from ollama import Ollama


class OllamaProvider:
    """Minimal provider wrapper around a local Ollama server."""

    def __init__(self):
        # Instantiate the client class imported above.
        self.ollama = Ollama()

    def get_response(self, prompt):
        return self.ollama.get_response(prompt)

    def default_host(self):
        return "localhost:11434"
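For context, a minimal usage sketch of the class above; it assumes the `ollama` package really does expose an `Ollama` client with a `get_response` method, as this file presumes, and that an Ollama server is listening on the default host:

```python
from sparc_cli.providers import OllamaProvider

# Hypothetical smoke test against a locally running Ollama server.
provider = OllamaProvider()
print("Default host:", provider.default_host())
print(provider.get_response("Say hello in one sentence."))
```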
6 changes: 5 additions & 1 deletion sparc_cli/setup.py
@@ -1,5 +1,9 @@
-from setuptools import setup, find_packages
+import dotenv
+from pathlib import Path
+from setuptools import setup, find_packages

+# load .env
+dotenv.load_dotenv()

# Read the contents of README.md
this_directory = Path(__file__).parent
20 changes: 20 additions & 0 deletions sparc_mcp/Dockerfile
@@ -0,0 +1,20 @@
# Use an official Python runtime as a parent image
FROM python:3.12-slim

# Set the working directory in the container
WORKDIR /app

# Copy the current directory contents into the container at /app
COPY . /app

# Install any needed packages specified in requirements.txt
RUN pip install --no-cache-dir flask

# Make port 5000 available to the world outside this container
EXPOSE 5000

# Define environment variable
ENV NAME=SparcMCPServer

# Run app.py when the container launches
CMD ["python", "app.py"]
35 changes: 35 additions & 0 deletions sparc_mcp/README.md
@@ -0,0 +1,35 @@
# Sparc MCP Server

## Overview
This is a simple Flask-based MCP (Model Context Protocol) server that responds to POST requests at the `/process` endpoint.

## Prerequisites
- Docker
- Docker Compose (optional)

## Building the Docker Image
To build the Docker image, run the following command in the project directory:

```bash
docker build -t sparc_mcp .
```

## Running the Docker Container
To run the Docker container:

```bash
docker run -p 5000:5000 sparc_mcp
```

## Testing the Endpoint
You can test the `/process` endpoint using curl:

```bash
curl -X POST -H "Content-Type: application/json" -d '{"key": "value"}' http://localhost:5000/process
```
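The same check can be scripted with the `requests` package (assuming the container from the previous step is listening on port 5000):

```python
import requests

# POST a small JSON payload to /process and print the echoed response.
resp = requests.post(
    "http://localhost:5000/process",
    json={"key": "value"},
    timeout=10,
)
resp.raise_for_status()
print(resp.json())  # e.g. {'data': {'key': 'value'}, 'message': 'Request received'}
```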

## Development
To modify the server, edit `app.py` and rebuild the Docker image.

## License
[Add your license information here]
12 changes: 12 additions & 0 deletions sparc_mcp/app.py
@@ -0,0 +1,12 @@
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route('/process', methods=['POST'])
def process():
    data = request.get_json()
    # Process data here
    return jsonify({'message': 'Request received', 'data': data}), 200

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)
5 changes: 5 additions & 0 deletions sparc_mcp/mcp.json
@@ -0,0 +1,5 @@
{
"command": "python",
"args": ["app.py"],
"env": {}
}
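For illustration, this is roughly how a host process could consume `mcp.json` to launch the server; the file layout comes from above, while the launcher itself is an assumption:

```python
import json
import os
import subprocess

# Read the launch spec and start the server as a child process,
# merging any declared variables into the inherited environment.
with open("mcp.json") as f:
    spec = json.load(f)

env = {**os.environ, **spec.get("env", {})}
proc = subprocess.Popen([spec["command"], *spec["args"]], env=env)
print("Started MCP server with PID", proc.pid)
```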
1 change: 1 addition & 0 deletions sparc_mcp/requirements.txt
@@ -0,0 +1 @@
flask==2.1.0
8 changes: 8 additions & 0 deletions sparc_mcp/test.txt
@@ -0,0 +1,8 @@
curl https://api.openai.com/v1/chat/completions -H "Content-Type: application/json" -H "Authorization: Bearer <token>" -d '{
"model": "gpt-4o-mini",
"store": true,
"messages": [
{"role": "user", "content": "write a haiku about ai"}
]
}'