Commit 2e64ba8

Merge pull request #19 from bomanaps/impl/mcp_server_foundation: feature mcp server
2 parents c6ea323 + 7bf2f64

29 files changed: +5002 −2725 lines

apps/mcp-server/README.md

Lines changed: 320 additions & 0 deletions
# Lighthouse MCP Server

A Model Context Protocol (MCP) server implementation that bridges AI agents with Lighthouse decentralized storage. The server exposes AI-accessible tools for file storage, dataset management, and IPFS operations through a standardized protocol.

## 🎯 Features

- **MCP Protocol Compliance**: Full implementation of the MCP specification
- **Mock Lighthouse Operations**: Realistic simulations of file upload, download, and pinning
- **Dataset Management**: Create and manage file collections with metadata
- **Tool Registry**: Dynamic tool registration and discovery system
- **Request Validation**: Comprehensive input validation and sanitization
- **Structured Logging**: Detailed operation logging for debugging and monitoring
- **Performance Optimized**: Meets strict performance targets (<2s startup, <500ms per operation)
## 📦 Installation

```bash
# Install dependencies
pnpm install

# Build the server
pnpm run build

# Run tests
pnpm test

# Run tests with coverage
pnpm run test:coverage
```
## 🚀 Quick Start

### Running the Server

```bash
# Start with default configuration
node dist/index.js

# Start with a custom log level
node dist/index.js --log-level debug

# Start with a custom storage limit
node dist/index.js --max-storage 2147483648

# View help
node dist/index.js --help
```
### Programmatic Usage

```typescript
import { LighthouseMCPServer } from '@lighthouse-tooling/mcp-server';

// Create server instance
const server = new LighthouseMCPServer({
  name: 'lighthouse-storage',
  version: '0.1.0',
  logLevel: 'info',
  maxStorageSize: 1024 * 1024 * 1024, // 1GB
  enableMetrics: true,
});

// Start the server
await server.start();

// Get server statistics
const stats = server.getStats();
console.log('Server stats:', stats);

// Graceful shutdown
await server.stop();
```
## 🛠️ Available MCP Tools

### 1. lighthouse_upload_file

Uploads a file to IPFS via Lighthouse, with optional encryption.

**Parameters:**
- `filePath` (required): Path to the file to upload
- `encrypt` (optional): Whether to encrypt the file
- `accessConditions` (optional): Access control conditions
- `tags` (optional): Tags for organization

**Example:**
```json
{
  "filePath": "/path/to/file.txt",
  "encrypt": true,
  "tags": ["dataset", "ml-model"]
}
```
### 2. lighthouse_create_dataset

Creates a managed dataset collection with metadata.

**Parameters:**
- `name` (required): Dataset name
- `description` (optional): Dataset description
- `files` (required): Array of file paths to include
- `metadata` (optional): Additional metadata
- `encrypt` (optional): Whether to encrypt the dataset

**Example:**
```json
{
  "name": "Training Dataset",
  "description": "ML training data",
  "files": ["/data/train.csv", "/data/validate.csv"],
  "metadata": {
    "author": "Data Team",
    "version": "1.0.0"
  }
}
```
### 3. lighthouse_fetch_file

Downloads and optionally decrypts a file from Lighthouse.

**Parameters:**
- `cid` (required): IPFS CID of the file
- `outputPath` (optional): Local path to save the file
- `decrypt` (optional): Whether to decrypt the file

**Example:**
```json
{
  "cid": "QmYwAPJzv5CZsnA...",
  "outputPath": "/local/path/file.txt",
  "decrypt": true
}
```
## 🏗️ Architecture

```
LighthouseMCPServer
├── ToolRegistry              # Tool management and execution
├── MockLighthouseService     # File operations (upload, fetch, pin)
├── MockDatasetService        # Dataset management
├── Handlers
│   ├── ListToolsHandler      # Handle tools/list
│   ├── CallToolHandler       # Handle tools/call
│   ├── ListResourcesHandler  # Handle resources/list
│   └── InitializeHandler     # Handle initialize
└── Utilities
    ├── RequestValidator      # Input validation
    ├── ResponseBuilder       # Response formatting
    └── CIDGenerator          # Mock CID generation
```
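The `CIDGenerator` utility in the diagram above can be pictured with a small sketch. This is a hypothetical stand-in, not the server's actual implementation: it produces CIDv0-*shaped* strings ("Qm" plus 44 characters) by hashing content, rather than performing real multihash/base58btc encoding.

```typescript
import { createHash } from "node:crypto";

// Hypothetical mock CID generator: deterministic, CID-shaped output for
// mock services. NOT a real IPFS CID (no multihash, no base58btc encoding);
// it only mimics the familiar 46-character "Qm..." format.
export function generateMockCid(content: string | Buffer): string {
  const digest = createHash("sha256").update(content).digest("hex");
  // Take 44 hex characters so the total length matches a CIDv0 string.
  return "Qm" + digest.slice(0, 44);
}
```

Because the output is derived from the content, the same input always yields the same mock CID, which makes uploads reproducible in tests.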
## 📊 Performance Metrics

The server meets the following performance requirements:

- **Server Startup**: < 2 seconds
- **Tool Registration**: < 100ms per tool
- **Mock Operations**: < 500ms per operation
- **Memory Usage**: < 50MB
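One way these budgets could be enforced in a test suite is a small timing helper. This is an illustrative sketch, not the project's actual performance harness; in a real test you would wrap `server.start()` or a tool call in `assertUnderBudget`.

```typescript
// Measure an async operation's wall-clock duration.
export async function timed<T>(fn: () => Promise<T>): Promise<{ result: T; ms: number }> {
  const start = performance.now();
  const result = await fn();
  return { result, ms: performance.now() - start };
}

// Fail loudly if the operation exceeds its budget (e.g. 2000ms for startup,
// 500ms for a mock operation).
export async function assertUnderBudget<T>(fn: () => Promise<T>, budgetMs: number): Promise<T> {
  const { result, ms } = await timed(fn);
  if (ms >= budgetMs) {
    throw new Error(`Exceeded budget: ${ms.toFixed(1)}ms >= ${budgetMs}ms`);
  }
  return result;
}
```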
## 🧪 Testing

```bash
# Run all tests
pnpm test

# Run with coverage
pnpm run test:coverage

# Run in watch mode
pnpm run test:watch
```

### Test Coverage

The test suite includes:
- **Unit Tests**: Individual component testing
- **Integration Tests**: End-to-end workflow testing
- **Performance Tests**: Metric validation

Target: **>90% code coverage**
## 📝 Configuration

### Server Config Options

```typescript
interface ServerConfig {
  name: string;            // Server name
  version: string;         // Server version
  logLevel: 'debug' | 'info' | 'warn' | 'error';
  maxStorageSize: number;  // Max storage in bytes
  enableMetrics: boolean;  // Enable metrics collection
  metricsInterval: number; // Metrics collection interval (ms)
}
```
### Environment Variables

```bash
# Log level
LOG_LEVEL=info

# Maximum storage size (bytes)
MAX_STORAGE_SIZE=1073741824

# Enable metrics
ENABLE_METRICS=true
```
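A sketch of how these variables might map onto `ServerConfig` fields. The variable names match the documentation above, but the fallback defaults here are illustrative assumptions, not necessarily the server's actual defaults.

```typescript
type LogLevel = "debug" | "info" | "warn" | "error";

// Hypothetical env-to-config mapping; defaults (info / 1GB / metrics off)
// are assumptions for illustration.
export function configFromEnv(env: Record<string, string | undefined> = process.env) {
  return {
    logLevel: (env.LOG_LEVEL ?? "info") as LogLevel,
    maxStorageSize: Number(env.MAX_STORAGE_SIZE ?? 1024 * 1024 * 1024),
    enableMetrics: env.ENABLE_METRICS === "true",
  };
}
```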
## 🔍 Logging

The server uses structured logging with different log levels:

```typescript
// Log levels: debug, info, warn, error
logger.info('Operation started', { operationId: '123' });
logger.error('Operation failed', error, { context: {...} });
```
## 🤝 Integration with AI Agents

The server follows the MCP specification, making it compatible with:

- **Cursor AI**: Direct integration via MCP
- **Claude Desktop**: MCP server connection
- **Custom AI Agents**: Any MCP-compliant client

### Example Client Configuration

```json
{
  "mcpServers": {
    "lighthouse-storage": {
      "command": "node",
      "args": ["/path/to/lighthouse-mcp-server/dist/index.js"],
      "env": {
        "LOG_LEVEL": "info"
      }
    }
  }
}
```
## 🛡️ Error Handling

The server implements comprehensive error handling:

- **Validation Errors**: Invalid or missing input parameters
- **Not Found Errors**: Missing files or datasets
- **Operation Errors**: Failed uploads or downloads
- **System Errors**: Resource exhaustion

All errors follow MCP error code standards.
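The error categories above can be sketched as a discriminated union. The exact codes and result shape used by this server are assumptions for illustration; the MCP specification defines the normative error codes.

```typescript
// Hypothetical structured error result; codes mirror the categories listed
// above, not necessarily the server's actual identifiers.
type ToolError =
  | { code: "VALIDATION_ERROR"; message: string }  // invalid input parameters
  | { code: "NOT_FOUND"; message: string }         // missing file or dataset
  | { code: "OPERATION_FAILED"; message: string }  // failed upload/download
  | { code: "INTERNAL_ERROR"; message: string };   // resource exhaustion, etc.

export function errorResult(
  code: ToolError["code"],
  message: string
): { success: false; error: ToolError } {
  return { success: false, error: { code, message } };
}
```

A union like this lets callers switch on `error.code` instead of parsing message strings.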
## 📚 API Documentation

### Tool Registry API

```typescript
// Register a tool
registry.register(toolDefinition, executor);

// Execute a tool
const result = await registry.executeTool(toolName, args);

// List all tools
const tools = registry.listTools();

// Get metrics
const metrics = registry.getMetrics();
```
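The pattern behind this API can be pictured with a minimal sketch: a Map from tool name to executor, which is what makes dynamic registration and discovery possible. This is a simplification; the real `ToolRegistry` also validates input and tracks metrics.

```typescript
type ToolExecutor = (args: Record<string, unknown>) => Promise<unknown>;

// Minimal, hypothetical registry showing the core register/list/execute flow.
export class MiniToolRegistry {
  private tools = new Map<string, { definition: { name: string }; executor: ToolExecutor }>();

  register(definition: { name: string }, executor: ToolExecutor): void {
    this.tools.set(definition.name, { definition, executor });
  }

  listTools(): string[] {
    return [...this.tools.keys()];
  }

  async executeTool(name: string, args: Record<string, unknown>): Promise<unknown> {
    const entry = this.tools.get(name);
    if (!entry) throw new Error(`Unknown tool: ${name}`);
    return entry.executor(args);
  }
}
```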
### Mock Service API

```typescript
// Upload a file
const result = await lighthouseService.uploadFile({
  filePath: '/path/to/file',
  encrypt: true
});

// Fetch a file
const file = await lighthouseService.fetchFile({
  cid: 'QmYwAPJzv5CZsnA...'
});

// Get storage stats
const stats = lighthouseService.getStorageStats();
```
## 🔮 Future Enhancements

- [ ] Integration with the real Lighthouse SDK (Issue #11)
- [ ] WebSocket support for real-time updates
- [ ] Caching layer for improved performance
- [ ] Batch operation support
- [ ] Advanced access control management
## 📄 License

MIT License - see the LICENSE file for details.

## 🤝 Contributing

Contributions are welcome! Please read CONTRIBUTING.md for guidelines.

## 📞 Support

For issues and questions:
- GitHub Issues: [lighthouse-agent-tooling/issues](https://github.com/Patrick-Ehimen/lighthouse-agent-tooling/issues)
- Documentation: See `/apps/docs/TECHNICAL_PROPOSAL.md`

---

**Built with ❤️ for the Lighthouse ecosystem**

apps/mcp-server/demo-test.js

Lines changed: 64 additions & 0 deletions
const { LighthouseMCPServer } = require('./dist/server.js');
const fs = require('fs');

async function demo() {
  const testFile = './demo-test.txt';
  fs.writeFileSync(testFile, 'Hello from MCP Server!');

  console.log('\n=== MCP Server Demo ===\n');

  const server = new LighthouseMCPServer({
    logLevel: 'error',
    enableMetrics: false
  });

  const registry = server.getRegistry();
  console.log('Server initialized with', registry.listTools().length, 'tools\n');

  // Test 1: Upload
  console.log('1️⃣ Testing lighthouse_upload_file...');
  const uploadResult = await registry.executeTool('lighthouse_upload_file', {
    filePath: testFile,
    encrypt: true,
    tags: ['demo', 'test']
  });

  console.log('  Success:', uploadResult.success);
  console.log('  Execution time:', uploadResult.executionTime + 'ms');
  if (uploadResult.data) {
    console.log('  CID:', uploadResult.data.cid?.substring(0, 20) + '...');
    console.log('  Size:', uploadResult.data.size, 'bytes');
    console.log('  Encrypted:', uploadResult.data.encrypted);
  }

  // Test 2: Dataset
  console.log('\n2️⃣ Testing lighthouse_create_dataset...');
  const datasetResult = await registry.executeTool('lighthouse_create_dataset', {
    name: 'Demo Dataset',
    description: 'Test dataset',
    files: [testFile]
  });

  console.log('  Success:', datasetResult.success);
  console.log('  Execution time:', datasetResult.executionTime + 'ms');
  if (datasetResult.data) {
    console.log('  Dataset ID:', datasetResult.data.id?.substring(0, 20) + '...');
    console.log('  Files count:', datasetResult.data.files?.length);
  }

  // Stats
  console.log('\n📊 Server Statistics:');
  const stats = server.getStats();
  console.log('  Tools available:', stats.registry.totalTools);
  console.log('  Total operations:', stats.registry.totalCalls);
  console.log('  Files stored:', stats.storage.fileCount);
  console.log('  Datasets created:', stats.datasets.totalDatasets);

  fs.unlinkSync(testFile);
  console.log('\n✅ All operations completed successfully!\n');
}

demo().catch(err => {
  console.error('\n❌ Error:', err.message);
  process.exit(1);
});