SAFLA is now configured to work with the Model Context Protocol (MCP), allowing AI assistants to interact with the remote SAFLA instance deployed on Fly.io.
The MCP configuration is stored in `.roo/mcp.json`:
```json
{
  "mcpServers": {
    "safla": {
      "command": "python3",
      "args": [
        "/workspaces/SAFLA/safla_mcp_simple.py"
      ],
      "env": {
        "SAFLA_REMOTE_URL": "https://safla.fly.dev"
      }
    }
  }
}
```

The SAFLA MCP server provides the following tools:
- Generate embeddings using SAFLA's extreme-optimized engine (1.75M+ ops/sec)
  - Input: `texts` (array of strings) - Texts to embed
  - Performance: Utilizes the 178,146% optimized engine
- Store information in SAFLA's hybrid memory system
  - Input: `content` (string) - Content to store
  - Input: `memory_type` (string) - Type: "episodic", "semantic", or "procedural"
- Search and retrieve from SAFLA's memory system
  - Input: `query` (string) - Search query
  - Input: `limit` (integer, default: 5) - Maximum results
- Get SAFLA performance metrics
  - Input: None required
  - Output: Current performance statistics
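Once discovered, any of these tools is invoked through a JSON-RPC `tools/call` request. The sketch below builds one for the memory-storage tool; the tool name `store_memory` and the helper function are assumptions for illustration, so check the server's `tools/list` response for the actual registered names:

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    """Build a JSON-RPC 2.0 tools/call request as used by MCP."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# "store_memory" is a hypothetical tool name; the real server may expose
# a different identifier for the memory-storage tool described above.
req = make_tool_call(3, "store_memory", {
    "content": "Deployment notes for safla.fly.dev",
    "memory_type": "episodic",
})
```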
The MCP server (`safla_mcp_simple.py`) implements the Model Context Protocol by:
- Reading JSON-RPC messages from stdin
- Proxying requests to the remote SAFLA API
- Returning formatted responses to stdout
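That read-proxy-respond loop can be sketched roughly as follows. The handshake response fields, the one-message-per-line framing, and the remote request format are assumptions; the real `safla_mcp_simple.py` may differ:

```python
import json
import os
import sys
import urllib.request

# Remote SAFLA deployment; falls back to the Fly.io URL from the config.
REMOTE_URL = os.environ.get("SAFLA_REMOTE_URL", "https://safla.fly.dev")

def proxy(payload):
    """Forward a request payload to the remote SAFLA API endpoint."""
    req = urllib.request.Request(
        f"{REMOTE_URL}/api/safla",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def handle(message):
    """Answer the protocol handshake locally; proxy everything else."""
    if message["method"] == "initialize":
        result = {
            "protocolVersion": "2024-11-05",
            "serverInfo": {"name": "safla", "version": "1.0"},
        }
    else:
        result = proxy(message.get("params", {}))
    return {"jsonrpc": "2.0", "id": message["id"], "result": result}

def main():
    # One JSON-RPC message per line on stdin; responses go to stdout.
    for line in sys.stdin:
        if line.strip():
            print(json.dumps(handle(json.loads(line))), flush=True)
```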
Embedding engine benchmark results:
- Baseline: 985.38 ops/sec
- Optimized: 1,755,595.48 ops/sec
- Improvement: 178,146.95% (1,781x faster)
- Cache: Enabled for maximum performance
- Batch Size: 256 (optimal for performance)
To test the MCP server manually:

```shell
# Start the server
python3 /workspaces/SAFLA/safla_mcp_simple.py

# Send test requests (JSON-RPC format) on the server's stdin
{"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {}}
{"jsonrpc": "2.0", "id": 2, "method": "tools/list", "params": {}}
```

The MCP server connects to the SAFLA instance deployed at:
- URL: https://safla.fly.dev
- API Endpoint: https://safla.fly.dev/api/safla
- Health Check: https://safla.fly.dev/health
If you encounter connection issues:
- Verify the SAFLA instance is running: `curl https://safla.fly.dev/health`
- Check the MCP server logs for errors
- Ensure the `SAFLA_REMOTE_URL` environment variable is set correctly
- Verify Python 3 is available in your environment
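The first check can also be scripted. A minimal sketch (the helper name is hypothetical) that treats anything but HTTP 200, including timeouts and connection errors, as unhealthy:

```python
import urllib.error
import urllib.request

def check_health(base_url, timeout=5):
    """Return True if the SAFLA /health endpoint answers with HTTP 200."""
    try:
        with urllib.request.urlopen(f"{base_url}/health", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

# check_health("https://safla.fly.dev") should return True while the
# Fly.io deployment is up, and False on timeouts or connection errors.
```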
The overall request path:

```
AI Assistant <-> MCP Protocol <-> safla_mcp_simple.py <-> HTTPS <-> SAFLA on Fly.io
```
The MCP server acts as a bridge between the Model Context Protocol and the remote SAFLA API, enabling seamless integration with AI assistants while leveraging the extreme performance optimizations achieved (1.75M+ ops/sec).