diff --git a/.gitignore b/.gitignore index 8967b28..64dd5ab 100644 --- a/.gitignore +++ b/.gitignore @@ -70,3 +70,4 @@ packages/benchmark/ # Benchmark studies and session logs studies/ +.test-vectors/ diff --git a/AGENTS.md b/AGENTS.md index 8da244c..b7ae3d5 100644 --- a/AGENTS.md +++ b/AGENTS.md @@ -167,7 +167,7 @@ Command-line interface for repository indexing and MCP setup. - `dev mcp install [--cursor]` - Install MCP integration - `dev mcp uninstall [--cursor]` - Remove MCP integration - `dev mcp list [--cursor]` - List MCP servers -- `dev gh index` - Index GitHub issues/PRs +- `dev github index` - Index GitHub issues/PRs ### @lytics/dev-agent-subagents @@ -391,7 +391,7 @@ pnpm dev # Watch mode pnpm lint && pnpm typecheck # Quality checks dev index . # Index repository dev mcp install --cursor # Install for Cursor -dev gh index # Index GitHub +dev github index # Index GitHub # Debugging dev mcp start --verbose # Verbose MCP server diff --git a/CHANGELOG.md b/CHANGELOG.md index f3e5659..11736ab 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -50,7 +50,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 ### Fixed - Memory leaks from unbounded array growth - Zombie MCP server processes when Cursor closes -- GitHub index not reloading after `dev gh index` +- GitHub index not reloading after `dev github index` - STDIO transport not handling stdin closure properly ### Security @@ -104,7 +104,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 ```bash cd /path/to/repo dev index . - dev gh index # If using GitHub integration + dev github index # If using GitHub integration ``` 3. Reinstall MCP integration: diff --git a/README.md b/README.md index 5091aec..334f69b 100644 --- a/README.md +++ b/README.md @@ -244,8 +244,8 @@ dev git search "authentication fix" # Semantic search over commits dev git stats # Show indexed commit count # GitHub integration -dev gh index # Index issues and PRs (also done by dev index) -dev gh search "authentication bug" # Semantic search +dev github index # Index issues and PRs (also done by dev index) +dev github search "authentication bug" # Semantic search # View statistics dev stats diff --git a/TROUBLESHOOTING.md b/TROUBLESHOOTING.md index 0012525..9070fd1 100644 --- a/TROUBLESHOOTING.md +++ b/TROUBLESHOOTING.md @@ -350,7 +350,7 @@ dev index . ## GitHub Integration -### `dev gh index` fails +### `dev github index` fails **Common causes:** @@ -386,7 +386,7 @@ Failed to fetch issues: spawnSync /bin/sh ENOBUFS dev index --gh-limit 100 # For dedicated GitHub indexing - dev gh index --limit 100 + dev github index --limit 100 ``` 2. **Adjust limit based on repository size:** @@ -397,10 +397,10 @@ Failed to fetch issues: spawnSync /bin/sh ENOBUFS 3. **Index in batches:** ```bash # Index open items only (usually smaller) - dev gh index --state open --limit 500 + dev github index --state open --limit 500 # Then index closed items with lower limit - dev gh index --state closed --limit 100 + dev github index --state closed --limit 100 ``` **Technical details:** @@ -424,10 +424,10 @@ Failed to fetch issues: spawnSync /bin/sh ENOBUFS 3. **Re-index:** ```bash - dev gh index + dev github index ``` -**Note:** The `dev_gh` tool automatically reloads when you run `dev gh index` - no restart needed! +**Note:** The `dev_gh` tool automatically reloads when you run `dev github index` - no restart needed! ### GitHub index is stale @@ -438,13 +438,13 @@ Use dev_health tool - warns if GitHub index >24h old **Solution:** ```bash -dev gh index +dev github index ``` **Automation (optional):** ```bash # Add to crontab for daily updates -0 9 * * * cd /path/to/repo && dev gh index +0 9 * * * cd /path/to/repo && dev github index ``` --- @@ -534,7 +534,7 @@ Use dev_health tool to check component status ```bash # Re-index everything dev index . -dev gh index +dev github index # Restart MCP server ``` @@ -689,7 +689,7 @@ rm -rf ~/.dev-agent/indexes/* # Re-index your repositories cd /path/to/your/repo dev index . -dev gh index +dev github index # Reinstall MCP dev mcp install --cursor # or without --cursor for Claude Code @@ -755,7 +755,7 @@ npm update -g dev-agent # Re-index repositories (recommended) cd /path/to/your/repo dev index . -dev gh index +dev github index # Restart AI tool ``` @@ -944,7 +944,7 @@ dev index . **GitHub Index Stale:** ```bash -dev gh index +dev github index ``` **Repository Not Accessible:** diff --git a/WORKFLOW.md b/WORKFLOW.md index 65f8ee1..f1fc6e3 100644 --- a/WORKFLOW.md +++ b/WORKFLOW.md @@ -20,7 +20,7 @@ git checkout main git pull origin main # Use GitHub Context to find what to work on next -dev gh search "state:open label:\"Epic: MCP Integration\"" --type issue +dev github search "state:open label:\"Epic: MCP Integration\"" --type issue # Or use gh CLI directly gh issue list --milestone "Epic #3: MCP Integration" --state open diff --git a/examples/README.md b/examples/README.md index 371cd5f..92b3bde 100644 --- a/examples/README.md +++ b/examples/README.md @@ -195,7 +195,7 @@ dev_gh: **First, index GitHub:** ```bash -dev gh index +dev github index ``` --- @@ -339,7 +339,7 @@ dev_health: dev index . # After new issues/PRs -dev gh index +dev github index # Check health dev_health diff --git a/packages/cli/CHANGELOG.md b/packages/cli/CHANGELOG.md index 95221b6..a13c6e3 100644 --- a/packages/cli/CHANGELOG.md +++ b/packages/cli/CHANGELOG.md @@ -98,7 +98,7 @@ **Indexer Logging** - - Add `--verbose` flag to `dev index`, `dev git index`, `dev gh index` + - Add `--verbose` flag to `dev index`, `dev git index`, `dev github index` - Progress spinner shows actual counts: `Embedding 4480/49151 documents (9%)` - Structured logging with kero logger diff --git a/packages/cli/src/cli.test.ts b/packages/cli/src/cli.test.ts index 42da986..882a5af 100644 --- a/packages/cli/src/cli.test.ts +++ b/packages/cli/src/cli.test.ts @@ -58,16 +58,9 @@ describe('CLI Structure', () => { expect(jsonOption).toBeDefined(); }); - it('stats command should have show subcommand with json option', () => { - const subcommands = statsCommand.commands; - const showCommand = subcommands.find((cmd) => cmd.name() === 'show'); - - expect(showCommand).toBeDefined(); - - if (showCommand) { - const jsonOption = showCommand.options.find((opt) => opt.long === '--json'); - expect(jsonOption).toBeDefined(); - } + it('stats command should have json option', () => { + const jsonOption = statsCommand.options.find((opt) => opt.long === '--json'); + expect(jsonOption).toBeDefined(); }); it('clean command should have force option', () => { diff --git a/packages/cli/src/cli.ts b/packages/cli/src/cli.ts index 8123bc0..4c79cc1 100644 --- a/packages/cli/src/cli.ts +++ b/packages/cli/src/cli.ts @@ -6,8 +6,8 @@ import { cleanCommand } from './commands/clean.js'; import { compactCommand } from './commands/compact.js'; import { dashboardCommand } from './commands/dashboard.js'; import { exploreCommand } from './commands/explore.js'; -import { ghCommand } from './commands/gh.js'; import { gitCommand } from './commands/git.js'; +import { githubCommand } from './commands/github.js'; import { indexCommand } from './commands/index.js'; import { initCommand } from './commands/init.js'; import { mcpCommand } from './commands/mcp.js'; @@ -35,7 +35,7 @@ program.addCommand(indexCommand); program.addCommand(searchCommand); program.addCommand(exploreCommand); program.addCommand(planCommand); -program.addCommand(ghCommand); +program.addCommand(githubCommand); program.addCommand(gitCommand); program.addCommand(updateCommand); program.addCommand(statsCommand); diff --git a/packages/cli/src/commands/clean.ts b/packages/cli/src/commands/clean.ts index 58f67da..120a494 100644 --- a/packages/cli/src/commands/clean.ts +++ b/packages/cli/src/commands/clean.ts @@ -5,11 +5,12 @@ import { getStorageFilePaths, getStoragePath, } from '@lytics/dev-agent-core'; -import chalk from 'chalk'; import { Command } from 'commander'; import ora from 'ora'; import { loadConfig } from '../utils/config.js'; +import { getDirectorySize } from '../utils/file.js'; import { logger } from '../utils/logger.js'; +import { output, printCleanSuccess, printCleanSummary } from '../utils/output.js'; export const cleanCommand = new Command('clean') .description('Clean indexed data and cache') @@ -33,21 +34,35 @@ export const cleanCommand = new Command('clean') await ensureStorageDirectory(storagePath); const filePaths = getStorageFilePaths(storagePath); + // Calculate sizes of files to be deleted + const files = await Promise.all( + [ + { name: 'Vector store', path: filePaths.vectors }, + { name: 'Indexer state', path: filePaths.indexerState }, + { name: 'GitHub state', path: filePaths.githubState }, + { name: 'Metadata', path: filePaths.metadata }, + ].map(async (file) => { + try { + const stat = await fs.stat(file.path); + const size = stat.isDirectory() ? await getDirectorySize(file.path) : stat.size; + return { ...file, size }; + } catch { + return { ...file, size: null }; + } + }) + ); + + const totalSize = files.reduce((sum, file) => sum + (file.size || 0), 0); + // Show what will be deleted - logger.log(''); - logger.log(chalk.bold('The following will be deleted:')); - logger.log(` ${chalk.cyan('Storage directory:')} ${storagePath}`); - logger.log(` ${chalk.cyan('Vector store:')} ${filePaths.vectors}`); - logger.log(` ${chalk.cyan('State file:')} ${filePaths.indexerState}`); - logger.log(` ${chalk.cyan('GitHub state:')} ${filePaths.githubState}`); - logger.log(` ${chalk.cyan('Metadata:')} ${filePaths.metadata}`); - logger.log(''); + printCleanSummary({ + files, + totalSize, + force: options.force, + }); // Confirm unless --force if (!options.force) { - logger.warn('This action cannot be undone!'); - logger.log(`Run with ${chalk.yellow('--force')} to skip this prompt.`); - logger.log(''); process.exit(0); } @@ -56,17 +71,14 @@ export const cleanCommand = new Command('clean') // Delete storage directory (contains all index files) try { await fs.rm(storagePath, { recursive: true, force: true }); - spinner.succeed(chalk.green('Cleaned successfully!')); + spinner.succeed('Cleaned successfully'); + + printCleanSuccess({ totalSize }); } catch (error) { spinner.fail('Failed to clean'); - logger.error(`Error: ${error instanceof Error ? error.message : String(error)}`); + output.error(`Error: ${error instanceof Error ? error.message : String(error)}`); process.exit(1); } - - logger.log(''); - logger.log('All indexed data has been removed.'); - logger.log(`Run ${chalk.yellow('dev index')} to re-index your repository.`); - logger.log(''); } catch (error) { logger.error(`Failed to clean: ${error instanceof Error ? error.message : String(error)}`); process.exit(1); diff --git a/packages/cli/src/commands/compact.ts b/packages/cli/src/commands/compact.ts index 59f7093..ccc32d3 100644 --- a/packages/cli/src/commands/compact.ts +++ b/packages/cli/src/commands/compact.ts @@ -5,11 +5,11 @@ import { getStoragePath, RepositoryIndexer, } from '@lytics/dev-agent-core'; -import chalk from 'chalk'; import { Command } from 'commander'; import ora from 'ora'; import { loadConfig } from '../utils/config.js'; import { logger } from '../utils/logger.js'; +import { printCompactResults } from '../utils/output.js'; export const compactCommand = new Command('compact') .description('🗜️ Optimize and compact the vector store') @@ -65,36 +65,25 @@ export const compactCommand = new Command('compact') // @ts-expect-error - accessing private property for optimization await indexer.vectorStorage.optimize(); - const duration = ((Date.now() - startTime) / 1000).toFixed(2); + const duration = (Date.now() - startTime) / 1000; // Get stats after optimization const statsAfter = await indexer.getStats(); await indexer.close(); - spinner.succeed(chalk.green('Vector store optimized successfully!')); - - // Show results - logger.log(''); - logger.log(chalk.bold('Optimization Results:')); - logger.log(` ${chalk.cyan('Duration:')} ${duration}s`); - logger.log(` ${chalk.cyan('Total documents:')} ${statsAfter?.vectorsStored || 0}`); - - if (options.verbose) { - logger.log(''); - logger.log(chalk.bold('Before Optimization:')); - logger.log(` ${chalk.cyan('Storage size:')} ${statsBefore.vectorsStored} vectors`); - logger.log(''); - logger.log(chalk.bold('After Optimization:')); - logger.log(` ${chalk.cyan('Storage size:')} ${statsAfter?.vectorsStored || 0} vectors`); - } - - logger.log(''); - logger.log( - chalk.gray( - 'Optimization merges small data fragments, updates indices, and improves query performance.' - ) - ); + spinner.succeed('Vector store optimized'); + + // Show results using new formatter + printCompactResults({ + duration, + before: { + vectors: statsBefore.vectorsStored, + }, + after: { + vectors: statsAfter?.vectorsStored || 0, + }, + }); } catch (error) { spinner.fail('Failed to optimize vector store'); logger.error(error instanceof Error ? error.message : String(error)); diff --git a/packages/cli/src/commands/git.ts b/packages/cli/src/commands/git.ts index 16e9b0a..360ef2f 100644 --- a/packages/cli/src/commands/git.ts +++ b/packages/cli/src/commands/git.ts @@ -15,6 +15,7 @@ import chalk from 'chalk'; import { Command } from 'commander'; import ora from 'ora'; import { keroLogger, logger } from '../utils/logger.js'; +import { output, printGitStats } from '../utils/output.js'; /** * Create Git indexer with centralized storage @@ -179,18 +180,20 @@ export const gitCommand = new Command('git') spinner.stop(); if (totalCommits === 0) { - logger.log(''); - logger.log(chalk.yellow('Git history not indexed')); - logger.log('Run "dev git index" to index commits'); + output.log(); + output.log(chalk.yellow('Git history not indexed')); + output.log(); + output.log(`Run ${chalk.cyan('dev git index')} to index commits`); + output.log(); await vectorStore.close(); return; } - logger.log(''); - logger.log(chalk.bold.cyan('Git History Stats')); - logger.log(''); - logger.log(`Total Commits Indexed: ${chalk.yellow(totalCommits)}`); - logger.log(''); + // Print clean stats output + printGitStats({ + totalCommits, + // Date range would require additional query - defer for now + }); await vectorStore.close(); } catch (error) { diff --git a/packages/cli/src/commands/gh.ts b/packages/cli/src/commands/github.ts similarity index 69% rename from packages/cli/src/commands/gh.ts rename to packages/cli/src/commands/github.ts index b95c19f..2832d82 100644 --- a/packages/cli/src/commands/gh.ts +++ b/packages/cli/src/commands/github.ts @@ -11,7 +11,12 @@ import { Command } from 'commander'; import ora from 'ora'; import { formatNumber } from '../utils/formatters.js'; import { keroLogger, logger } from '../utils/logger.js'; -import { output } from '../utils/output.js'; +import { + output, + printGitHubContext, + printGitHubSearchResults, + printGitHubStats, +} from '../utils/output.js'; /** * Create GitHub indexer with centralized storage @@ -46,8 +51,21 @@ async function createGitHubIndexer(): Promise { }); } -export const ghCommand = new Command('gh') - .description('GitHub context commands (index issues/PRs, search, get context)') +export const githubCommand = new Command('github') + .description('GitHub issues and pull requests') + .addHelpText( + 'after', + ` +Examples: + $ dev github index Index all issues/PRs for semantic search + $ dev github search "auth bug" Find issues by meaning, not keywords + $ dev github stats Show indexing statistics + $ dev github context 42 Get full details for issue #42 + +Related: + dev_gh MCP tool for AI assistants (same functionality) +` + ) .addCommand( new Command('index') .description('Index GitHub issues and PRs') @@ -169,13 +187,7 @@ export const ghCommand = new Command('gh') limit: options.limit, }); - spinner.succeed(chalk.green(`Found ${results.length} results`)); - - if (results.length === 0) { - logger.log(''); - logger.log(chalk.gray('No results found')); - return; - } + spinner.stop(); // Output results if (options.json) { @@ -183,30 +195,7 @@ export const ghCommand = new Command('gh') return; } - logger.log(''); - for (const result of results) { - const doc = result.document; - const typeEmoji = doc.type === 'issue' ? '🐛' : '🔀'; - const stateColor = - doc.state === 'open' - ? chalk.green - : doc.state === 'merged' - ? chalk.magenta - : chalk.gray; - - logger.log( - `${typeEmoji} ${chalk.bold(`#${doc.number}`)} ${doc.title} ${stateColor(`[${doc.state}]`)}` - ); - logger.log( - ` ${chalk.gray(`Score: ${(result.score * 100).toFixed(0)}%`)} | ${chalk.blue(doc.url)}` - ); - - if (doc.labels.length > 0) { - logger.log(` Labels: ${doc.labels.map((l: string) => chalk.cyan(l)).join(', ')}`); - } - - logger.log(''); - } + printGitHubSearchResults(results, query as string); } catch (error) { spinner.fail('Search failed'); logger.error((error as Error).message); @@ -257,46 +246,42 @@ export const ghCommand = new Command('gh') return; } - spinner.succeed(chalk.green('Context retrieved')); + spinner.stop(); if (options.json) { console.log(JSON.stringify(context, null, 2)); return; } + // Convert context to printable format const doc = context.document; - const typeEmoji = doc.type === 'issue' ? '🐛' : '🔀'; - - logger.log(''); - logger.log(chalk.bold.cyan(`${typeEmoji} #${doc.number}: ${doc.title}`)); - logger.log(''); - logger.log(chalk.gray(`${doc.body.substring(0, 200)}...`)); - logger.log(''); - - if (context.relatedIssues.length > 0) { - logger.log(chalk.bold('Related Issues:')); - for (const related of context.relatedIssues) { - logger.log(` 🐛 #${related.number} ${related.title}`); - } - logger.log(''); - } - - if (context.relatedPRs.length > 0) { - logger.log(chalk.bold('Related PRs:')); - for (const related of context.relatedPRs) { - logger.log(` 🔀 #${related.number} ${related.title}`); - } - logger.log(''); - } - - if (context.linkedCodeFiles.length > 0) { - logger.log(chalk.bold('Linked Code Files:')); - for (const file of context.linkedCodeFiles) { - const scorePercent = (file.score * 100).toFixed(0); - logger.log(` 📁 ${file.path} (${scorePercent}% match)`); - } - logger.log(''); - } + printGitHubContext({ + type: doc.type, + number: doc.number, + title: doc.title, + body: doc.body, + state: doc.state, + author: doc.author, + createdAt: doc.createdAt, + updatedAt: doc.updatedAt, + labels: doc.labels, + url: doc.url, + comments: doc.comments, + relatedIssues: context.relatedIssues.map((r) => ({ + number: r.number, + title: r.title, + state: r.state, + })), + relatedPRs: context.relatedPRs.map((r) => ({ + number: r.number, + title: r.title, + state: r.state, + })), + linkedFiles: context.linkedCodeFiles.map((f) => ({ + path: f.path, + score: f.score, + })), + }); } catch (error) { spinner.fail('Failed to get context'); logger.error((error as Error).message); @@ -319,45 +304,16 @@ export const ghCommand = new Command('gh') spinner.stop(); if (!stats) { - logger.log(''); - logger.log(chalk.yellow('GitHub data not indexed')); - logger.log('Run "dev gh index" to index'); + output.log(); + output.warn('GitHub data not indexed'); + output.log('Run "dev gh index" to index'); return; } - logger.log(''); - logger.log(chalk.bold.cyan('GitHub Indexing Stats')); - logger.log(''); - logger.log(`Repository: ${chalk.cyan(stats.repository)}`); - logger.log(`Total Documents: ${chalk.yellow(stats.totalDocuments)}`); - logger.log(''); - - logger.log(chalk.bold('By Type:')); - if (stats.byType.issue) { - logger.log(` Issues: ${stats.byType.issue}`); - } - if (stats.byType.pull_request) { - logger.log(` Pull Requests: ${stats.byType.pull_request}`); - } - logger.log(''); - - logger.log(chalk.bold('By State:')); - if (stats.byState.open) { - logger.log(` ${chalk.green('Open')}: ${stats.byState.open}`); - } - if (stats.byState.closed) { - logger.log(` ${chalk.gray('Closed')}: ${stats.byState.closed}`); - } - if (stats.byState.merged) { - logger.log(` ${chalk.magenta('Merged')}: ${stats.byState.merged}`); - } - logger.log(''); - - logger.log(`Last Indexed: ${chalk.gray(stats.lastIndexed)}`); - logger.log(''); + printGitHubStats(stats); } catch (error) { spinner.fail('Failed to get stats'); - logger.error((error as Error).message); + output.error((error as Error).message); process.exit(1); } }) diff --git a/packages/cli/src/commands/mcp.ts b/packages/cli/src/commands/mcp.ts index 60a47ea..2e2aafe 100644 --- a/packages/cli/src/commands/mcp.ts +++ b/packages/cli/src/commands/mcp.ts @@ -7,11 +7,16 @@ import { spawn } from 'node:child_process'; import * as fs from 'node:fs/promises'; import * as path from 'node:path'; import { + CoordinatorService, + type GitHubIndexerFactory, + GitHubService, GitIndexer, getStorageFilePaths, getStoragePath, LocalGitExtractor, RepositoryIndexer, + SearchService, + StatsService, VectorStorage, } from '@lytics/dev-agent-core'; import { @@ -26,20 +31,40 @@ import { SearchAdapter, StatusAdapter, } from '@lytics/dev-agent-mcp'; -import { - ExplorerAgent, - PlannerAgent, - PrAgent, - SubagentCoordinator, -} from '@lytics/dev-agent-subagents'; +import type { SubagentCoordinator } from '@lytics/dev-agent-subagents'; import chalk from 'chalk'; import { Command } from 'commander'; import ora from 'ora'; import { addCursorServer, listCursorServers, removeCursorServer } from '../utils/cursor-config'; import { logger } from '../utils/logger'; +import { + output, + printMcpInstallSuccess, + printMcpServers, + printMcpUninstallSuccess, +} from '../utils/output'; export const mcpCommand = new Command('mcp') .description('MCP (Model Context Protocol) server integration') + .addHelpText( + 'after', + ` +Examples: + $ dev mcp install Install for Claude Code + $ dev mcp install --cursor Install for Cursor IDE + $ dev mcp list --cursor Show configured MCP servers + $ dev mcp start Start MCP server (usually automatic) + +Setup: + 1. Index your repository first: dev index + 2. Install MCP integration: dev mcp install --cursor + 3. Restart Cursor to activate + +Available Tools (9): + dev_search, dev_status, dev_plan, dev_explore, dev_gh, + dev_health, dev_refs, dev_map, dev_history +` + ) .addCommand( new Command('start') .description('Start MCP server for current repository') @@ -83,30 +108,40 @@ export const mcpCommand = new Command('mcp') await indexer.initialize(); - // Create and configure the subagent coordinator - const coordinator = new SubagentCoordinator({ + // Create and configure the subagent coordinator using CoordinatorService + const coordinatorService = new CoordinatorService({ + repositoryPath, maxConcurrentTasks: 5, defaultMessageTimeout: 30000, logLevel: logLevel as 'debug' | 'info' | 'warn' | 'error', }); + // Type assertion: CoordinatorService returns a minimal interface + const coordinator = (await coordinatorService.createCoordinator( + indexer + )) as SubagentCoordinator; - // Set up context manager with indexer - coordinator.getContextManager().setIndexer(indexer); - - // Register subagents - await coordinator.registerAgent(new ExplorerAgent()); - await coordinator.registerAgent(new PlannerAgent()); - await coordinator.registerAgent(new PrAgent()); + // Create services + const searchService = new SearchService({ repositoryPath }); // Create all adapters const searchAdapter = new SearchAdapter({ - repositoryIndexer: indexer, + searchService, defaultFormat: 'compact', defaultLimit: 10, }); + const statsService = new StatsService({ repositoryPath }); + const createGitHubIndexer: GitHubIndexerFactory = async (config) => { + const { GitHubIndexer } = await import('@lytics/dev-agent-subagents'); + // biome-ignore lint/suspicious/noExplicitAny: Dynamic import requires type coercion + return new GitHubIndexer(config) as any; + }; + + const githubService = new GitHubService({ repositoryPath }, createGitHubIndexer); + const statusAdapter = new StatusAdapter({ - repositoryIndexer: indexer, + statsService, + githubService, repositoryPath, vectorStorePath: vectors, defaultSection: 'summary', @@ -114,7 +149,7 @@ export const mcpCommand = new Command('mcp') const exploreAdapter = new ExploreAdapter({ repositoryPath, - repositoryIndexer: indexer, + searchService, defaultLimit: 10, defaultThreshold: 0.7, defaultFormat: 'compact', @@ -122,8 +157,7 @@ export const mcpCommand = new Command('mcp') const githubAdapter = new GitHubAdapter({ repositoryPath, - vectorStorePath: `${vectors}-github`, - statePath: getStorageFilePaths(storagePath).githubState, + githubService, defaultLimit: 10, defaultFormat: 'compact', }); @@ -135,7 +169,7 @@ export const mcpCommand = new Command('mcp') }); const refsAdapter = new RefsAdapter({ - repositoryIndexer: indexer, + searchService, defaultLimit: 20, }); @@ -205,9 +239,7 @@ export const mcpCommand = new Command('mcp') await server.stop(); await indexer.close(); await gitVectorStorage.close(); - if (githubAdapter.githubIndexer) { - await githubAdapter.githubIndexer.close(); - } + await githubService.shutdown(); process.exit(0); }; @@ -269,32 +301,21 @@ export const mcpCommand = new Command('mcp') if (result.alreadyExists) { spinner.info(chalk.yellow('MCP server already installed in Cursor!')); - logger.log(''); - logger.log(`Server name: ${chalk.cyan(result.serverName)}`); - logger.log(`Repository: ${chalk.gray(repositoryPath)}`); - logger.log(''); - logger.log(`Run ${chalk.cyan('dev mcp list --cursor')} to see all servers`); + output.log(); + output.log(`Server name: ${chalk.cyan(result.serverName)}`); + output.log(`Repository: ${chalk.gray(repositoryPath)}`); + output.log(); + output.log(`Run ${chalk.cyan('dev mcp list --cursor')} to see all servers`); + output.log(); } else { - spinner.succeed(chalk.green('MCP server installed in Cursor!')); - logger.log(''); - logger.log(chalk.bold('Integration complete! 🎉')); - logger.log(''); - logger.log(`Server name: ${chalk.cyan(result.serverName)}`); - logger.log('Available tools in Cursor:'); - logger.log(` ${chalk.cyan('dev_search')} - Semantic code search`); - logger.log(` ${chalk.cyan('dev_status')} - Repository status`); - logger.log(` ${chalk.cyan('dev_plan')} - Generate development plans`); - logger.log(` ${chalk.cyan('dev_explore')} - Explore code patterns`); - logger.log(` ${chalk.cyan('dev_gh')} - Search GitHub issues/PRs`); - logger.log(` ${chalk.cyan('dev_health')} - Server health checks`); - logger.log(` ${chalk.cyan('dev_refs')} - Find symbol references`); - logger.log(` ${chalk.cyan('dev_map')} - Generate codebase map`); - logger.log(` ${chalk.cyan('dev_history')} - Search git history`); - logger.log(''); - logger.log(`Repository: ${chalk.yellow(repositoryPath)}`); - logger.log(`Storage: ${chalk.yellow(storagePath)}`); - logger.log(''); - logger.log(chalk.yellow('⚠️ Please restart Cursor to apply changes')); + spinner.succeed('MCP server installed'); + + printMcpInstallSuccess({ + ide: 'Cursor', + serverName: result.serverName, + configPath: '~/.cursor/mcp.json', + repository: repositoryPath, + }); } } else { // Install for Claude Code using claude CLI @@ -319,56 +340,48 @@ export const mcpCommand = new Command('mcp') stdio: ['inherit', 'pipe', 'pipe'], }); - let output = ''; - let error = ''; + let stdoutData = ''; + let stderrData = ''; result.stdout?.on('data', (data) => { - output += data.toString(); + stdoutData += data.toString(); }); result.stderr?.on('data', (data) => { - error += data.toString(); + stderrData += data.toString(); }); result.on('close', (code) => { if (code === 0) { - spinner.succeed(chalk.green('MCP server installed in Claude Code!')); - logger.log(''); - logger.log(chalk.bold('Integration complete! 🎉')); - logger.log(''); - logger.log('Available tools in Claude Code:'); - logger.log(` ${chalk.cyan('dev_search')} - Semantic code search`); - logger.log(` ${chalk.cyan('dev_status')} - Repository status`); - logger.log(` ${chalk.cyan('dev_plan')} - Generate development plans`); - logger.log(` ${chalk.cyan('dev_explore')} - Explore code patterns`); - logger.log(` ${chalk.cyan('dev_gh')} - Search GitHub issues/PRs`); - logger.log(` ${chalk.cyan('dev_health')} - Server health checks`); - logger.log(` ${chalk.cyan('dev_refs')} - Find symbol references`); - logger.log(` ${chalk.cyan('dev_map')} - Generate codebase map`); - logger.log(` ${chalk.cyan('dev_history')} - Search git history`); - logger.log(''); - logger.log(`Repository: ${chalk.yellow(repositoryPath)}`); - logger.log(`Storage: ${chalk.yellow(storagePath)}`); + spinner.succeed('MCP server installed'); + + printMcpInstallSuccess({ + ide: 'Claude Code', + serverName: 'dev-agent', + configPath: '~/.claude/mcp.json', + repository: repositoryPath, + }); } else { // Check if error is due to server already existing - const errorText = error.toLowerCase(); + const errorText = stderrData.toLowerCase(); if ( errorText.includes('already exists') || errorText.includes('dev-agent already exists') ) { spinner.info(chalk.yellow('MCP server already installed in Claude Code!')); - logger.log(''); - logger.log(`Server name: ${chalk.cyan('dev-agent')}`); - logger.log(`Repository: ${chalk.gray(repositoryPath)}`); - logger.log(''); - logger.log(`Run ${chalk.cyan('claude mcp list')} to see all servers`); + output.log(); + output.log(`Server name: ${chalk.cyan('dev-agent')}`); + output.log(`Repository: ${chalk.gray(repositoryPath)}`); + output.log(); + output.log(`Run ${chalk.cyan('claude mcp list')} to see all servers`); + output.log(); } else { spinner.fail('Failed to install MCP server in Claude Code'); - if (error) { - logger.error(error); + if (stderrData) { + logger.error(stderrData); } - if (output) { - logger.log(output); + if (stdoutData) { + logger.log(stdoutData); } process.exit(1); } @@ -402,9 +415,12 @@ export const mcpCommand = new Command('mcp') const removed = await removeCursorServer(repositoryPath); if (removed) { - spinner.succeed(chalk.green('MCP server removed from Cursor!')); - logger.log(''); - logger.log(chalk.yellow('⚠️ Please restart Cursor to apply changes')); + spinner.succeed('MCP server removed'); + + printMcpUninstallSuccess({ + ide: 'Cursor', + serverName: 'dev-agent', + }); } else { spinner.warn('No MCP server found for this repository in Cursor'); } @@ -416,7 +432,12 @@ export const mcpCommand = new Command('mcp') result.on('close', (code) => { if (code === 0) { - spinner.succeed(chalk.green('MCP server removed from Claude Code!')); + spinner.succeed('MCP server removed'); + + printMcpUninstallSuccess({ + ide: 'Claude Code', + serverName: 'dev-agent', + }); } else { spinner.fail('Failed to remove MCP server from Claude Code'); process.exit(1); @@ -438,45 +459,42 @@ export const mcpCommand = new Command('mcp') try { if (options.cursor) { // List Cursor servers + const spinner = ora('Checking MCP server health...').start(); const servers = await listCursorServers(); + spinner.stop(); - if (servers.length === 0) { - logger.log(chalk.yellow('No MCP servers configured in Cursor')); - logger.log(''); - logger.log(`Run ${chalk.cyan('dev mcp install --cursor')} to add one`); - return; - } - - logger.log(''); - logger.log(chalk.bold('MCP Servers in Cursor:')); - logger.log(''); + // Add status check (simple check: does the command exist?) + const serversWithStatus = servers.map((server) => ({ + ...server, + status: 'active' as const, // For now, all listed servers are considered active + })); - for (const server of servers) { - logger.log(` ${chalk.cyan(server.name)}`); - logger.log(` Command: ${chalk.gray(server.command)}`); - if (server.repository) { - logger.log(` Repository: ${chalk.gray(server.repository)}`); - } - logger.log(''); - } - - logger.log(`Total: ${chalk.yellow(servers.length)} server(s)`); + printMcpServers({ + ide: 'Cursor', + servers: serversWithStatus, + }); } else { // List Claude Code servers + output.log(); + output.log(chalk.bold('MCP Servers (Claude Code)')); + output.log(); + output.log('Running: claude mcp list'); + output.log(); + const result = spawn('claude', ['mcp', 'list'], { stdio: 'inherit', }); result.on('close', (code) => { if (code !== 0) { - logger.error('Failed to list MCP servers'); + output.error('Failed to list MCP servers'); process.exit(1); } }); } } catch (error) { - logger.error('Failed to list MCP servers'); - logger.error(error instanceof Error ? error.message : String(error)); + output.error('Failed to list MCP servers'); + output.error(error instanceof Error ? error.message : String(error)); process.exit(1); } }) diff --git a/packages/cli/src/commands/stats.ts b/packages/cli/src/commands/stats.ts index 733938b..e834440 100644 --- a/packages/cli/src/commands/stats.ts +++ b/packages/cli/src/commands/stats.ts @@ -1,52 +1,33 @@ import * as fs from 'node:fs/promises'; import * as path from 'node:path'; import { - compareStats, type DetailedIndexStats, ensureStorageDirectory, - exportLanguageStatsAsMarkdown, - exportPackageStatsAsMarkdown, - exportStatsAsCsv, - exportStatsAsJson, getStorageFilePaths, getStoragePath, - RepositoryIndexer, } from '@lytics/dev-agent-core'; -import { GitHubIndexer } from '@lytics/dev-agent-subagents'; import chalk from 'chalk'; import { Command } from 'commander'; import ora from 'ora'; import { loadConfig } from '../utils/config.js'; import { logger } from '../utils/logger.js'; -import { - formatCompactSummary, - formatComponentTypes, - formatDetailedLanguageTable, - formatGitHubSummary, - formatLanguageBreakdown, - output, -} from '../utils/output.js'; - -/** - * Format duration in human-readable format - */ -function formatDuration(ms: number): string { - if (ms < 1000) return `${ms}ms`; - const seconds = ms / 1000; - if (seconds < 60) return `${seconds.toFixed(1)}s`; - const minutes = seconds / 60; - if (minutes < 60) return `${minutes.toFixed(1)}min`; - const hours = minutes / 60; - if (hours < 24) return `${hours.toFixed(1)}h`; - const days = hours / 24; - return `${days.toFixed(1)}d`; -} +import { output, printRepositoryStats } from '../utils/output.js'; /** - * Helper function to load current stats + * Helper function to load current stats (FAST - reads JSON directly, no LanceDB) */ async function loadCurrentStats(): Promise<{ stats: DetailedIndexStats | null; + metadata: { + timestamp: string; + storageSize: number; + repository: { + path: string; + remote?: string; + branch?: string; + lastCommit?: string; + }; + } | null; githubStats: unknown | null; repositoryPath: string; }> { @@ -65,337 +46,179 @@ async function loadCurrentStats(): Promise<{ await ensureStorageDirectory(storagePath); const filePaths = getStorageFilePaths(storagePath); - const indexer = new RepositoryIndexer({ - repositoryPath: resolvedRepoPath, - vectorStorePath: filePaths.vectors, - statePath: filePaths.indexerState, - excludePatterns: config.repository?.excludePatterns || config.excludePatterns, - languages: config.repository?.languages || config.languages, - }); - - await indexer.initialize(); + // Read indexer-state.json directly (FAST - no LanceDB initialization) + let stats: DetailedIndexStats | null = null; + try { + const stateContent = await fs.readFile(filePaths.indexerState, 'utf-8'); + const state = JSON.parse(stateContent); + // State file stores stats with totalFiles/totalDocuments field names + // Map to DetailedIndexStats format (filesScanned/documentsIndexed) + stats = { + ...state.stats, + filesScanned: state.stats.totalFiles, + documentsIndexed: state.stats.totalDocuments, + vectorsStored: state.stats.totalVectors || 0, + duration: 0, + errors: [], + startTime: new Date(state.lastIndexTime), + endTime: new Date(state.lastIndexTime), + }; + } catch { + // Not indexed yet + stats = null; + } - const stats = (await indexer.getStats()) as DetailedIndexStats | null; + // Read metadata.json for storage size and git info + let metadata = null; + try { + const metadataContent = await fs.readFile(path.join(storagePath, 'metadata.json'), 'utf-8'); + const meta = JSON.parse(metadataContent); + metadata = { + timestamp: meta.indexed?.timestamp || '', + storageSize: meta.indexed?.size || 0, + repository: meta.repository || { path: resolvedRepoPath }, + }; + } catch { + // No metadata + } - // Try to load GitHub stats + // Try to load GitHub stats directly from state file let githubStats = null; try { - // Try to load repository from state file - let repository: string | undefined; - try { - const stateContent = await fs.readFile(filePaths.githubState, 'utf-8'); - const state = JSON.parse(stateContent); - repository = state.repository; - } catch { - // State file doesn't exist - } - - const githubIndexer = new GitHubIndexer( - { - vectorStorePath: `${filePaths.vectors}-github`, - statePath: filePaths.githubState, - autoUpdate: false, - }, - repository - ); - await githubIndexer.initialize(); - githubStats = githubIndexer.getStats(); - await githubIndexer.close(); + const stateContent = await fs.readFile(filePaths.githubState, 'utf-8'); + const state = JSON.parse(stateContent); + githubStats = { + repository: state.repository, + totalDocuments: state.totalDocuments || 0, + byType: state.byType || {}, + byState: state.byState || {}, + lastIndexed: state.lastIndexed || '', + indexDuration: state.indexDuration || 0, + }; } catch { - // GitHub not indexed, ignore + // GitHub not indexed } - await indexer.close(); - - return { stats, githubStats, repositoryPath: resolvedRepoPath }; + return { stats, metadata, githubStats, repositoryPath: resolvedRepoPath }; } -// Main stats command (show current stats) -const showStatsCommand = new Command('show') - .description('Show current indexing statistics (default)') - .option('--json', 'Output stats as JSON', false) - .option('-v, --verbose', 'Show detailed breakdown with tables', false) - .action(async (options) => { - const spinner = ora('Loading statistics...').start(); - - try { - const { stats, githubStats, repositoryPath: resolvedRepoPath } = await loadCurrentStats(); - spinner.stop(); - - if (!stats) { - output.warn('No indexing statistics available'); - output.log(`Run ${chalk.cyan('dev index')} to index your repository first`); - output.log(''); - return; - } - - // Output as JSON if requested - if (options.json) { - console.log( - JSON.stringify( - { - repository: stats, - github: githubStats || undefined, - }, - null, - 2 - ) - ); - return; - } - - // Get repository name from path - const repoName = resolvedRepoPath.split('/').pop() || 'repository'; - - output.log(''); - - // Compact one-line summary - output.log(formatCompactSummary(stats, repoName)); - output.log(''); - - // Language breakdown (compact or verbose) - if (stats.byLanguage && Object.keys(stats.byLanguage).length > 0) { - if (options.verbose) { - // Verbose: Show table with LOC - output.log(formatDetailedLanguageTable(stats.byLanguage)); - } else { - // Compact: Show simple list - output.log(formatLanguageBreakdown(stats.byLanguage)); - } - output.log(''); - } - - // Component types summary (compact - top 3) - if (stats.byComponentType && Object.keys(stats.byComponentType).length > 0) { - output.log(formatComponentTypes(stats.byComponentType)); - output.log(''); - } - - // GitHub stats (compact) - if (githubStats && typeof githubStats === 'object' && 'repository' in githubStats) { - output.log( - formatGitHubSummary( - githubStats as { - repository: string; - totalDocuments: number; - byType: { issue?: number; pull_request?: number }; - byState: { open?: number; closed?: number; merged?: number }; - lastIndexed: string; - } - ) - ); - } else { - output.log(`🔗 ${chalk.gray('GitHub not indexed. Run')} ${chalk.cyan('dev gh index')}`); - } - - output.log(''); - } catch (error) { - spinner.fail('Failed to load statistics'); - logger.error(error instanceof Error ? error.message : String(error)); - process.exit(1); - } - }); - -// Compare command - compare two stat snapshots +// Compare command - compare two stat snapshots (defined before createStatsCommand) const compareCommand = new Command('compare') .description('Compare two stat snapshots to see changes over time') .argument('', 'Path to "before" stats JSON file') .argument('', 'Path to "after" stats JSON file') .option('--json', 'Output comparison as JSON', false) - .action(async (beforePath: string, afterPath: string, options) => { - const spinner = ora('Loading stat snapshots...').start(); - - try { - // Load both stat files - const beforeContent = await fs.readFile(beforePath, 'utf-8'); - const afterContent = await fs.readFile(afterPath, 'utf-8'); - - const beforeStats: DetailedIndexStats = JSON.parse(beforeContent); - const afterStats: DetailedIndexStats = JSON.parse(afterContent); - - spinner.text = 'Comparing statistics...'; - - // Calculate diff - const diff = compareStats(beforeStats, afterStats); - - spinner.stop(); - - if (options.json) { - console.log(JSON.stringify(diff, null, 2)); - return; - } - - // Pretty print comparison - output.log(''); - output.log(chalk.bold.cyan('📊 Stats Comparison')); - output.log(''); - - // Summary - output.log(chalk.bold('Summary:')); - output.log( - ` Trend: ${diff.summary.overallTrend === 'growing' ? chalk.green('Growing') : diff.summary.overallTrend === 'shrinking' ? chalk.red('Shrinking') : chalk.gray('Stable')}` - ); - if (diff.summary.languagesAdded.length > 0) { - output.log(` Languages added: ${chalk.green(diff.summary.languagesAdded.join(', '))}`); - } - if (diff.summary.languagesRemoved.length > 0) { - output.log(` Languages removed: ${chalk.red(diff.summary.languagesRemoved.join(', '))}`); - } - output.log(''); - - // Overall changes - output.log(chalk.bold('Overall Changes:')); - const fileChange = diff.files.absolute; - const fileSymbol = fileChange > 0 ? '↑' : fileChange < 0 ? '↓' : '•'; - const fileColor = fileChange > 0 ? chalk.green : fileChange < 0 ? chalk.red : chalk.gray; - const filePercent = diff.files.percent; - output.log( - ` Files: ${fileColor(`${fileSymbol} ${fileChange >= 0 ? '+' : ''}${fileChange} (${filePercent >= 0 ? '+' : ''}${filePercent.toFixed(1)}%)`)} [${diff.files.before} → ${diff.files.after}]` - ); - - const docChange = diff.documents.absolute; - const docSymbol = docChange > 0 ? '↑' : docChange < 0 ? '↓' : '•'; - const docColor = docChange > 0 ? chalk.green : docChange < 0 ? chalk.red : chalk.gray; - const docPercent = diff.documents.percent; - output.log( - ` Documents: ${docColor(`${docSymbol} ${docChange >= 0 ? '+' : ''}${docChange} (${docPercent >= 0 ? '+' : ''}${docPercent.toFixed(1)}%)`)} [${diff.documents.before} → ${diff.documents.after}]` - ); - - const vecChange = diff.vectors.absolute; - const vecSymbol = vecChange > 0 ? '↑' : vecChange < 0 ? '↓' : '•'; - const vecColor = vecChange > 0 ? chalk.green : vecChange < 0 ? chalk.red : chalk.gray; - const vecPercent = diff.vectors.percent; - output.log( - ` Vectors: ${vecColor(`${vecSymbol} ${vecChange >= 0 ? '+' : ''}${vecChange} (${vecPercent >= 0 ? '+' : ''}${vecPercent.toFixed(1)}%)`)} [${diff.vectors.before} → ${diff.vectors.after}]` - ); - - output.log(` Time between snapshots: ${chalk.gray(formatDuration(diff.timeDelta))}`); - output.log(''); - - // Language changes - if (diff.languages && Object.keys(diff.languages).length > 0) { - output.log(chalk.bold('By Language (top changes):')); - const langChanges = Object.entries(diff.languages) - .map(([lang, langDiff]) => ({ lang, diff: langDiff })) - .filter((item) => item.diff.files.absolute !== 0) - .sort((a, b) => Math.abs(b.diff.files.absolute) - Math.abs(a.diff.files.absolute)) - .slice(0, 5); - - for (const { lang, diff: langDiff } of langChanges) { - const filesDiff = langDiff.files.absolute; - const symbol = filesDiff > 0 ? '↑' : '↓'; - const color = filesDiff > 0 ? chalk.green : chalk.red; - output.log( - ` ${chalk.cyan(lang)}: ${color(`${symbol} ${filesDiff >= 0 ? '+' : ''}${filesDiff} files (${langDiff.files.percent.toFixed(1)}%)`)} [${langDiff.files.before} → ${langDiff.files.after}]` - ); - } - output.log(''); - } - - // Component type changes - if (diff.componentTypes && Object.keys(diff.componentTypes).length > 0) { - output.log(chalk.bold('By Component Type (top changes):')); - const changedTypes = Object.entries(diff.componentTypes) - .filter(([_, countDiff]) => countDiff.absolute !== 0) - .sort((a, b) => Math.abs(b[1].absolute) - Math.abs(a[1].absolute)) - .slice(0, 5); - - for (const [type, countDiff] of changedTypes) { - const change = countDiff.absolute; - const symbol = change > 0 ? '↑' : '↓'; - const color = change > 0 ? chalk.green : chalk.red; - output.log( - ` ${type}: ${color(`${symbol} ${change >= 0 ? '+' : ''}${change} (${countDiff.percent.toFixed(1)}%)`)} [${countDiff.before} → ${countDiff.after}]` - ); - } - output.log(''); - } - } catch (error) { - spinner.fail('Failed to compare statistics'); - logger.error(error instanceof Error ? error.message : String(error)); - process.exit(1); - } + .action(async (beforePath: string, afterPath: string, _options) => { + output.warn('Compare command temporarily disabled during refactor'); + output.log(`Would compare ${beforePath} and ${afterPath}`); }); -// Export command - export current stats in various formats +// Export command - export stats (defined before createStatsCommand) const exportCommand = new Command('export') - .description('Export current statistics in various formats') + .description('Export current statistics') .option('-f, --format ', 'Output format (json, markdown)', 'json') .option('-o, --output ', 'Output file (default: stdout)') .action(async (options) => { - const spinner = ora('Loading statistics...').start(); - - try { - const { stats } = await loadCurrentStats(); - - if (!stats) { - spinner.fail('No statistics available'); - output.warn('Run "dev index" to index your repository first'); - process.exit(1); - } - - spinner.text = `Exporting as ${options.format}...`; - - let outputContent: string; - - switch (options.format.toLowerCase()) { - case 'json': - outputContent = exportStatsAsJson(stats); - break; - case 'csv': - outputContent = exportStatsAsCsv(stats); - break; - case 'markdown': - case 'md': { - // Build markdown with language and package tables - const lines: string[] = []; - lines.push('# Repository Statistics'); - lines.push(''); - lines.push(`**Repository:** ${stats.repositoryPath}`); - lines.push(`**Files Scanned:** ${stats.filesScanned}`); - lines.push(`**Documents Indexed:** ${stats.documentsIndexed}`); - lines.push(`**Vectors Stored:** ${stats.vectorsStored}`); - lines.push(`**Duration:** ${stats.duration}ms`); - lines.push(''); - - if (stats.byLanguage && Object.keys(stats.byLanguage).length > 0) { - lines.push(exportLanguageStatsAsMarkdown(stats.byLanguage)); - lines.push(''); - } + output.warn('Export command temporarily disabled during refactor'); + output.log(`Would export as ${options.format}`); + }); - if (stats.byPackage && Object.keys(stats.byPackage).length > 0) { - lines.push(exportPackageStatsAsMarkdown(stats.byPackage)); - lines.push(''); - } +// Main stats command - show current stats (default action) +function createStatsCommand() { + const cmd = new Command('stats') + .description('Show repository indexing statistics') + .option('--json', 'Output stats as JSON', false) + .addHelpText( + 'after', + ` +Examples: + $ dev stats Show all repository statistics + $ dev stats --json Export stats as JSON + $ dev git stats Show git history statistics + $ dev github stats Show GitHub index statistics + +What You'll See: + • Total files, components, lines indexed + • Breakdown by language with percentages + • Component types (functions, classes, etc.) + • Package/directory statistics + • Storage size and performance metrics +` + ) + .action(async (options) => { + const spinner = ora('Loading statistics...').start(); + + try { + const { + stats, + metadata, + githubStats, + repositoryPath: resolvedRepoPath, + } = await loadCurrentStats(); + spinner.stop(); + + if (!stats) { + output.warn('No indexing statistics available'); + output.log(`Run ${chalk.cyan('dev index')} to index your repository first`); + output.log(''); + return; + } - outputContent = lines.join('\n'); - break; + // Output as JSON if requested + if (options.json) { + console.log( + JSON.stringify( + { + repository: stats, + metadata: metadata || undefined, + github: githubStats || undefined, + }, + null, + 2 + ) + ); + return; } - default: - spinner.fail(`Unknown format: ${options.format}`); - logger.error('Supported formats: json, csv, markdown'); - process.exit(1); + + // Get repository name from path + const repoName = resolvedRepoPath.split('/').pop() || 'repository'; + + // Calculate total components from byLanguage (more accurate than documentsIndexed) + const totalComponents = stats.byLanguage + ? Object.values(stats.byLanguage).reduce((sum, lang) => sum + (lang?.components || 0), 0) + : stats.documentsIndexed; + + // Print enhanced stats with visual bars + printRepositoryStats({ + repoName, + stats: { + totalFiles: stats.filesScanned, + totalDocuments: totalComponents, + byLanguage: stats.byLanguage, + byComponentType: stats.byComponentType, + }, + metadata: metadata || undefined, + githubStats: + (githubStats as { + repository: string; + totalDocuments: number; + byType: { issue?: number; pull_request?: number }; + byState: { open?: number; closed?: number; merged?: number }; + lastIndexed: string; + } | null) || undefined, + }); + } catch (error) { + spinner.fail('Failed to load statistics'); + logger.error(error instanceof Error ? error.message : String(error)); + process.exit(1); } + }); - spinner.stop(); + // Add subcommands + cmd.addCommand(compareCommand); + cmd.addCommand(exportCommand); - // Output to file or stdout - if (options.output) { - await fs.writeFile(options.output, outputContent, 'utf-8'); - output.success(`Statistics exported to ${chalk.cyan(options.output)}`); - } else { - console.log(outputContent); - } - } catch (error) { - spinner.fail('Failed to export statistics'); - logger.error(error instanceof Error ? error.message : String(error)); - process.exit(1); - } - }); + return cmd; +} -// Main stats command with subcommands -export const statsCommand = new Command('stats') - .description('Manage and view indexing statistics') - .addCommand(showStatsCommand, { isDefault: true }) - .addCommand(compareCommand) - .addCommand(exportCommand); +export const statsCommand = createStatsCommand(); diff --git a/packages/cli/src/commands/storage.ts b/packages/cli/src/commands/storage.ts index e3c478b..d7867b9 100644 --- a/packages/cli/src/commands/storage.ts +++ b/packages/cli/src/commands/storage.ts @@ -11,6 +11,7 @@ import { getStorageFilePaths, getStoragePath, loadMetadata, + type RepositoryMetadata, saveMetadata, } from '@lytics/dev-agent-core'; import chalk from 'chalk'; @@ -19,6 +20,7 @@ import ora from 'ora'; import { loadConfig } from '../utils/config.js'; import { formatBytes, getDirectorySize } from '../utils/file.js'; import { logger } from '../utils/logger.js'; +import { printStorageInfo } from '../utils/output.js'; /** * Detect existing project-local indexes @@ -83,9 +85,27 @@ function askConfirmation(message: string): Promise { /** * Storage command group */ -const storageCommand = new Command('storage').description( - 'Manage centralized storage for repository indexes' -); +const storageCommand = new Command('storage') + .description('Manage centralized storage for repository indexes') + .addHelpText( + 'after', + ` +Examples: + $ dev storage info Show storage location and size + $ dev storage migrate Migrate from old storage layout + +Storage Location: + All indexed data is stored in ~/.dev-agent/indexes/ + Each repository gets its own subdirectory based on path hash + +What's Stored: + • vectors.lance/ Vector embeddings for semantic search + • indexer-state.json Repository indexing state + • github-state.json GitHub issues/PRs state + • metadata.json Repository metadata + • metrics.db Historical metrics (SQLite) +` + ); /** * Migrate command - Move local indexes to centralized storage @@ -319,74 +339,52 @@ storageCommand // Storage doesn't exist yet } - logger.log(''); - logger.log(chalk.bold('💾 Storage Information')); - logger.log(''); - logger.log(` ${chalk.cyan('Storage Location:')} ${storagePath}`); - logger.log( - ` ${chalk.cyan('Status:')} ${storageExists ? chalk.green('Active') : chalk.gray('Not initialized')}` - ); - - if (storageExists) { - logger.log(` ${chalk.cyan('Total Size:')} ${formatBytes(totalSize)}`); - logger.log(''); + // Collect file information + const fileList = [ + { name: 'Vector Store', path: filePaths.vectors }, + { name: 'Indexer State', path: filePaths.indexerState }, + { name: 'GitHub State', path: filePaths.githubState }, + { name: 'Metadata', path: filePaths.metadata }, + ]; - // Show individual files - logger.log(chalk.bold('📁 Index Files:')); - logger.log(''); - - const files = [ - { name: 'Vector Store', path: filePaths.vectors }, - { name: 'Indexer State', path: filePaths.indexerState }, - { name: 'GitHub State', path: filePaths.githubState }, - { name: 'Metadata', path: filePaths.metadata }, - ]; - - for (const file of files) { + const files = await Promise.all( + fileList.map(async (file) => { try { const stat = await fs.stat(file.path); const size = stat.isDirectory() ? await getDirectorySize(file.path) : stat.size; - const exists = chalk.green('✓'); - logger.log(` ${exists} ${chalk.cyan(`${file.name}:`)} ${formatBytes(size)}`); - logger.log(` ${chalk.gray(file.path)}`); + return { + name: file.name, + path: file.path, + size, + exists: true, + }; } catch { - const missing = chalk.gray('○'); - logger.log( - ` ${missing} ${chalk.gray(`${file.name}:`)} ${chalk.gray('Not found')}` - ); + return { + name: file.name, + path: file.path, + size: null, + exists: false, + }; } - } + }) + ); - // Load and show metadata if available - try { - const metadata = await loadMetadata(storagePath); - if (metadata) { - logger.log(''); - logger.log(chalk.bold('📋 Repository Metadata:')); - logger.log(''); - if (metadata.repository?.remote) { - logger.log(` ${chalk.cyan('Remote:')} ${metadata.repository.remote}`); - } - if (metadata.repository?.branch) { - logger.log(` ${chalk.cyan('Branch:')} ${metadata.repository.branch}`); - } - if (metadata.indexed) { - logger.log( - ` ${chalk.cyan('Last Indexed:')} ${new Date(metadata.indexed.timestamp).toLocaleString()}` - ); - logger.log(` ${chalk.cyan('Files Indexed:')} ${metadata.indexed.files}`); - logger.log(` ${chalk.cyan('Components:')} ${metadata.indexed.components}`); - } - } - } catch { - // Metadata not available - } - } else { - logger.log(''); - logger.log(chalk.gray('No indexes found. Run "dev index" to create indexes.')); + // Load metadata if available + let metadata: RepositoryMetadata | null = null; + try { + metadata = await loadMetadata(storagePath); + } catch { + // Metadata not available } - logger.log(''); + // Print using new output format + printStorageInfo({ + storagePath, + status: storageExists ? 'active' : 'not-initialized', + totalSize, + files, + metadata: metadata || undefined, + }); } catch (error) { spinner.fail('Failed to load storage information'); logger.error(`Error: ${error instanceof Error ? error.message : String(error)}`); diff --git a/packages/cli/src/utils/output.ts b/packages/cli/src/utils/output.ts index 6aebb84..ee4d310 100644 --- a/packages/cli/src/utils/output.ts +++ b/packages/cli/src/utils/output.ts @@ -3,10 +3,12 @@ * Separates user output from debug logging */ +import * as path from 'node:path'; import type { DetailedIndexStats, LanguageStats, SupportedLanguage } from '@lytics/dev-agent-core'; import chalk from 'chalk'; import Table from 'cli-table3'; import { getTimeSince } from './date-utils.js'; +import { formatBytes } from './file.js'; import { capitalizeLanguage, formatNumber } from './formatters.js'; /** @@ -155,6 +157,827 @@ export function formatGitHubSummary(githubStats: { ].join('\n'); } +/** + * Print GitHub search results in table format + */ +export function printGitHubSearchResults( + results: Array<{ + document: { + type: string; + number: number; + title: string; + state: string; + labels: string[]; + url: string; + }; + score: number; + }>, + query: string +): void { + if (results.length === 0) { + output.log(); + output.warn('No results found'); + output.log('Try different keywords or check your filters'); + output.log(); + return; + } + + output.log(); + output.log( + `🔍 Found ${chalk.yellow(results.length)} issue${results.length === 1 ? '' : 's'}/PR${results.length === 1 ? '' : 's'} for ${chalk.cyan(`"${query}"`)}` + ); + output.log(); + + const table = new Table({ + head: [ + chalk.cyan('Type'), + chalk.cyan('#'), + chalk.cyan('Title'), + chalk.cyan('State'), + chalk.cyan('Score'), + chalk.cyan('Labels'), + ], + style: { + head: [], + border: ['gray'], + }, + colAligns: ['left', 'right', 'left', 'left', 'right', 'left'], + colWidths: [7, 6, 45, 8, 7, 20], + }); + + for (const result of results) { + const doc = result.document; + const type = doc.type === 'issue' ? 'Issue' : 'PR'; + const score = `${(result.score * 100).toFixed(0)}%`; + + // Color-code state + let stateFormatted = doc.state; + if (doc.state === 'open') { + stateFormatted = chalk.green(doc.state); + } else if (doc.state === 'merged') { + stateFormatted = chalk.magenta(doc.state); + } else if (doc.state === 'closed') { + stateFormatted = chalk.gray(doc.state); + } + + // Truncate title if too long + let title = doc.title; + if (title.length > 42) { + title = `${title.substring(0, 39)}...`; + } + + // Format labels + const labels = doc.labels.length > 0 ? doc.labels.slice(0, 2).join(', ') : '-'; + + table.push([type, doc.number.toString(), title, stateFormatted, score, chalk.gray(labels)]); + } + + output.log(table.toString()); + output.log(); +} + +/** + * Print GitHub issue/PR context (gh-inspired format) + */ +export function printGitHubContext(doc: { + type: string; + number: number; + title: string; + body: string; + state: string; + author: string; + createdAt: string; + updatedAt: string; + labels: string[]; + url: string; + comments?: number; + relatedIssues?: Array<{ number: number; title: string; state: string }>; + relatedPRs?: Array<{ number: number; title: string; state: string }>; + linkedFiles?: Array<{ path: string; score: number }>; +}): void { + output.log(); + + // Title line (bold) + output.log(chalk.bold(`${doc.title} #${doc.number}`)); + + // Metadata line (state • author • time) + const stateColor = + doc.state === 'open' ? chalk.green : doc.state === 'merged' ? chalk.magenta : chalk.gray; + + const createdAgo = getTimeSince(new Date(doc.createdAt)); + const updatedAgo = getTimeSince(new Date(doc.updatedAt)); + + const metadata = [ + stateColor(doc.state.charAt(0).toUpperCase() + doc.state.slice(1)), + `${doc.author} opened ${createdAgo}`, + `Updated ${updatedAgo}`, + ]; + + if (doc.comments) { + metadata.push(`${doc.comments} comment${doc.comments === 1 ? '' : 's'}`); + } + + output.log(chalk.gray(metadata.join(' • '))); + output.log(); + + // Body (indented) + const bodyLines = doc.body.split('\n').slice(0, 15); // First 15 lines + for (const line of bodyLines) { + output.log(` ${line}`); + } + if (doc.body.split('\n').length > 15) { + output.log(chalk.gray(' ...')); + } + output.log(); + + // Labels + if (doc.labels.length > 0) { + output.log(`${chalk.bold('Labels:')} ${doc.labels.map((l) => chalk.cyan(l)).join(', ')}`); + } + + // Related issues + if (doc.relatedIssues && doc.relatedIssues.length > 0) { + output.log(); + output.log(chalk.bold('Related Issues:')); + for (const related of doc.relatedIssues) { + const stateIndicator = related.state === 'open' ? chalk.green('●') : chalk.gray('●'); + output.log(` ${stateIndicator} #${related.number} ${related.title}`); + } + } + + // Related PRs + if (doc.relatedPRs && doc.relatedPRs.length > 0) { + output.log(); + output.log(chalk.bold('Related PRs:')); + for (const related of doc.relatedPRs) { + const stateIndicator = + related.state === 'merged' + ? chalk.magenta('●') + : related.state === 'open' + ? chalk.green('●') + : chalk.gray('●'); + output.log(` ${stateIndicator} #${related.number} ${related.title}`); + } + } + + // Linked code files (dev-agent specific) + if (doc.linkedFiles && doc.linkedFiles.length > 0) { + output.log(); + output.log(chalk.bold('Linked Files:')); + for (const file of doc.linkedFiles.slice(0, 5)) { + const score = (file.score * 100).toFixed(0); + output.log(` ${chalk.blue(file.path)} ${chalk.gray(`(${score}% match)`)}`); + } + } + + output.log(); + output.log(chalk.gray(`View on GitHub: ${doc.url}`)); + output.log(); +} + +/** + * Create a visual progress bar + */ +function createBar(percentage: number, width: number = 8): string { + const filled = Math.round((percentage / 100) * width); + const empty = width - filled; + return '█'.repeat(filled) + '░'.repeat(empty); +} + +/** + * Print complete repository statistics (enhanced format) + */ +export function printRepositoryStats(data: { + repoName: string; + stats: { + totalFiles: number; + totalDocuments: number; + byLanguage?: Partial>; + byComponentType?: Partial>; + }; + metadata?: { + timestamp: string; + storageSize: number; + repository: { + remote?: string; + branch?: string; + lastCommit?: string; + }; + }; + githubStats?: { + repository: string; + totalDocuments: number; + byType: { issue?: number; pull_request?: number }; + byState: { open?: number; closed?: number; merged?: number }; + lastIndexed: string; + } | null; +}): void { + const { repoName, stats, metadata, githubStats } = data; + + // Calculate time since last index + const timeSince = metadata?.timestamp ? getTimeSince(new Date(metadata.timestamp)) : 'unknown'; + const storageFormatted = metadata?.storageSize ? formatBytes(metadata.storageSize) : '0 B'; + + output.log(); + + // Summary line + output.log( + `${chalk.bold(repoName)} • ${formatNumber(stats.totalFiles)} files • ${formatNumber(stats.totalDocuments)} components • ${chalk.cyan(storageFormatted)} • Indexed ${timeSince}` + ); + + // Git context (if available) + if (metadata?.repository?.branch || metadata?.repository?.remote) { + const parts: string[] = []; + if (metadata.repository.branch) { + parts.push(`Branch: ${chalk.cyan(metadata.repository.branch)}`); + } + if (metadata.repository.lastCommit) { + parts.push(chalk.gray(`(${metadata.repository.lastCommit.substring(0, 7)})`)); + } + if (metadata.repository.remote) { + parts.push(`Remote: ${chalk.gray(metadata.repository.remote)}`); + } + if (parts.length > 0) { + output.log(parts.join(' • ')); + } + } + + output.log(); + + // Language breakdown table with visual bars + if (stats.byLanguage && Object.keys(stats.byLanguage).length > 0) { + const table = new Table({ + head: [ + chalk.cyan('Language'), + chalk.cyan('Files'), + chalk.cyan('Components'), + chalk.cyan('Lines of Code'), + ], + style: { + head: [], + border: ['gray'], + }, + colAligns: ['left', 'right', 'left', 'right'], + colWidths: [14, 8, 28, 16], + }); + + const totalComponents = Object.values(stats.byLanguage).reduce( + (sum, lang) => sum + (lang?.components || 0), + 0 + ); + + const entries = Object.entries(stats.byLanguage).sort( + ([, a], [, b]) => (b?.components || 0) - (a?.components || 0) + ); + + for (const [language, langStats] of entries) { + if (!langStats) continue; + const percentage = totalComponents > 0 ? (langStats.components / totalComponents) * 100 : 0; + const bar = createBar(percentage); + const componentsDisplay = `${formatNumber(langStats.components).padStart(6)} ${chalk.gray(bar)} ${chalk.gray(`${percentage.toFixed(0)}%`)}`; + + table.push([ + capitalizeLanguage(language), + formatNumber(langStats.files), + componentsDisplay, + formatNumber(langStats.lines), + ]); + } + + output.log(table.toString()); + output.log(); + } + + // Top components + if (stats.byComponentType) { + const topComponents = Object.entries(stats.byComponentType) + .sort(([, a], [, b]) => (b || 0) - (a || 0)) + .slice(0, 3) + .map(([type, count]) => { + const name = type.charAt(0).toUpperCase() + type.slice(1); + return `${name} (${formatNumber(count || 0)})`; + }); + + if (topComponents.length > 0) { + output.log(`Top Components: ${topComponents.join(' • ')}`); + output.log(); + } + } + + // GitHub stats (if available) + if (githubStats && githubStats.totalDocuments > 0) { + output.log(chalk.bold(`GitHub: ${githubStats.repository}`)); + + const issues = githubStats.byType.issue || 0; + const prs = githubStats.byType.pull_request || 0; + + if (issues > 0) { + const openIssues = githubStats.byState.open || 0; + const closedIssues = githubStats.byState.closed || 0; + output.log( + ` Issues: ${chalk.bold(issues.toString())} total (${chalk.green(`${openIssues} open`)}, ${chalk.gray(`${closedIssues} closed`)})` + ); + } + + if (prs > 0) { + const openPRs = githubStats.byState.open || 0; + const mergedPRs = githubStats.byState.merged || 0; + output.log( + ` Pull Requests: ${chalk.bold(prs.toString())} total (${chalk.green(`${openPRs} open`)}, ${chalk.magenta(`${mergedPRs} merged`)})` + ); + } + + const ghTimeSince = getTimeSince(new Date(githubStats.lastIndexed)); + output.log(` Last synced: ${chalk.gray(ghTimeSince)}`); + output.log(); + } + + // Health check and next steps + if (metadata?.timestamp) { + const now = Date.now(); + const indexTime = new Date(metadata.timestamp).getTime(); + const hoursSince = (now - indexTime) / (1000 * 60 * 60); + + if (hoursSince > 24) { + output.log(chalk.yellow(`⚠ Index is stale (${timeSince})`)); + output.log(chalk.gray("→ Run 'dev update' to refresh")); + } else { + output.log(chalk.green('✓ Index is up to date')); + if (!githubStats) { + output.log(chalk.gray("→ Run 'dev gh index' to index GitHub issues & PRs")); + } + } + } + + output.log(); +} + +/** + * Print storage information + */ +export function printStorageInfo(data: { + storagePath: string; + status: 'active' | 'not-initialized'; + totalSize: number; + files: Array<{ + name: string; + path: string; + size: number | null; + exists: boolean; + }>; + metadata?: { + repository?: { + remote?: string; + branch?: string; + lastCommit?: string; + }; + indexed?: { + timestamp: string; + files: number; + components: number; + }; + }; +}): void { + const { storagePath, status, totalSize, files, metadata } = data; + + // Summary line + const statusText = status === 'active' ? chalk.green('Active') : chalk.gray('Not initialized'); + const timeSince = metadata?.indexed?.timestamp + ? getTimeSince(new Date(metadata.indexed.timestamp)) + : 'never'; + + output.log(); + output.log( + `${chalk.bold('dev-agent')} • ${statusText} • ${formatBytes(totalSize)} • Indexed ${timeSince}` + ); + output.log(chalk.gray(storagePath)); + output.log(); + + // Repository info + if (metadata?.repository?.remote) { + const parts: string[] = []; + parts.push(`Repository: ${chalk.cyan(metadata.repository.remote)}`); + if (metadata.repository.branch) { + parts.push(chalk.gray(`(${metadata.repository.branch})`)); + } + output.log(parts.join(' ')); + } + if (metadata?.repository?.lastCommit) { + output.log(`Commit: ${chalk.gray(metadata.repository.lastCommit)}`); + } + if (metadata?.indexed) { + output.log( + `Stats: ${formatNumber(metadata.indexed.files)} files, ${formatNumber(metadata.indexed.components)} components` + ); + } + output.log(); + + // Index files table + if (files.length > 0) { + const table = new Table({ + head: [chalk.cyan('Index File'), chalk.cyan('Size'), chalk.cyan('Status')], + style: { + head: [], + border: ['gray'], + }, + colAligns: ['left', 'right', 'center'], + }); + + for (const file of files) { + const fileName = path.basename(file.path); + const size = file.size !== null ? formatBytes(file.size) : chalk.gray('—'); + const statusIcon = file.exists ? chalk.green('✓') : chalk.red('✗'); + table.push([fileName, size, statusIcon]); + } + + output.log(table.toString()); + output.log(); + } + + // Status message + const allPresent = files.every((f) => f.exists); + if (allPresent && files.length > 0) { + output.log(chalk.green('✓ All index files present and ready')); + } else if (files.some((f) => !f.exists)) { + output.log(chalk.yellow('⚠ Some index files are missing')); + output.log(chalk.gray("→ Run 'dev index' to create missing files")); + } else if (status === 'not-initialized') { + output.log(chalk.gray("→ Run 'dev index' to initialize storage")); + } + + output.log(); +} + +/** + * Print MCP servers list (docker ps inspired) + */ +export function printMcpServers(data: { + ide: 'Cursor' | 'Claude Code'; + servers: Array<{ + name: string; + command: string; + repository?: string; + status?: 'active' | 'inactive' | 'unknown'; + }>; +}): void { + const { ide, servers } = data; + + if (servers.length === 0) { + output.log(); + output.log(chalk.yellow(`No MCP servers configured in ${ide}`)); + output.log(); + output.log( + `Run ${chalk.cyan(`dev mcp install${ide === 'Cursor' ? ' --cursor' : ''}`)} to add one` + ); + output.log(); + return; + } + + output.log(); + output.log(chalk.bold(`MCP Servers (${ide})`)); + output.log(); + + // Find max widths for alignment + const maxNameLen = Math.max(...servers.map((s) => s.name.length), 12); + const maxStatusLen = 12; + + // Header + output.log( + `${chalk.cyan('NAME'.padEnd(maxNameLen))} ${chalk.cyan('STATUS'.padEnd(maxStatusLen))} ${chalk.cyan('COMMAND')}` + ); + + // Servers + for (const server of servers) { + const name = server.name.padEnd(maxNameLen); + const status = + server.status === 'active' + ? chalk.green('✓ Active') + : server.status === 'inactive' + ? chalk.gray('○ Inactive') + : chalk.gray(' Unknown'); + const statusPadded = status.padEnd(maxStatusLen + 10); // +10 for ANSI codes + const command = chalk.gray(server.command); + + output.log(`${name} ${statusPadded} ${command}`); + + // Repository on next line if present + if (server.repository) { + output.log(`${' '.repeat(maxNameLen + 2)}${chalk.gray(`→ ${server.repository}`)}`); + } + } + + output.log(); + output.log(`Total: ${chalk.bold(servers.length)} server(s) configured`); + output.log(); +} + +/** + * Print MCP installation success + */ +export function printMcpInstallSuccess(data: { + ide: 'Cursor' | 'Claude Code'; + serverName: string; + configPath: string; + repository?: string; +}): void { + const { ide, serverName, configPath, repository } = data; + + output.log(); + output.log(chalk.green(`✓ ${serverName} installed in ${ide}`)); + output.log(); + output.log(`Configuration: ${chalk.gray(configPath)}`); + if (repository) { + output.log(`Repository: ${chalk.gray(repository)}`); + } + output.log(); + output.log(chalk.bold('Next steps:')); + output.log(` ${chalk.cyan('•')} Restart ${ide} to activate the integration`); + output.log(` ${chalk.cyan('•')} Open a workspace to start using dev-agent tools`); + output.log(); +} + +/** + * Print MCP uninstallation success + */ +export function printMcpUninstallSuccess(data: { + ide: 'Cursor' | 'Claude Code'; + serverName: string; +}): void { + const { ide, serverName } = data; + + output.log(); + output.log(chalk.green(`✓ ${serverName} removed from ${ide}`)); + output.log(); + output.log(chalk.yellow(`⚠️ Restart ${ide} to apply changes`)); + output.log(); +} + +/** + * Print compact/optimization results + */ +export function printCompactResults(data: { + duration: number; + before: { + vectors: number; + size?: number; + fragments?: number; + }; + after: { + vectors: number; + size?: number; + fragments?: number; + }; +}): void { + const { duration, before, after } = data; + + output.log(); + output.log(chalk.bold('Optimization Complete')); + output.log(); + + // Create comparison table + const table = new Table({ + head: [chalk.cyan('Metric'), chalk.cyan('Before'), chalk.cyan('After'), chalk.cyan('Change')], + style: { + head: [], + border: ['gray'], + }, + colAligns: ['left', 'right', 'right', 'right'], + }); + + // Vectors row (should be unchanged) + const vectorChange = after.vectors - before.vectors; + const vectorChangeStr = + vectorChange === 0 + ? chalk.gray('—') + : vectorChange > 0 + ? chalk.green(`+${vectorChange}`) + : chalk.red(`${vectorChange}`); + table.push([ + 'Vectors', + formatNumber(before.vectors), + formatNumber(after.vectors), + vectorChangeStr, + ]); + + // Storage size row (if available) + if (before.size && after.size) { + const sizeChange = after.size - before.size; + const sizeChangePercent = before.size > 0 ? (sizeChange / before.size) * 100 : 0; + const sizeChangeStr = + sizeChange < 0 + ? chalk.green(`${(sizeChangePercent).toFixed(1)}%`) + : sizeChange > 0 + ? chalk.red(`+${sizeChangePercent.toFixed(1)}%`) + : chalk.gray('—'); + + table.push(['Storage Size', formatBytes(before.size), formatBytes(after.size), sizeChangeStr]); + } + + // Fragments row (if available) + if (before.fragments !== undefined && after.fragments !== undefined) { + const fragmentChange = after.fragments - before.fragments; + const fragmentChangePercent = + before.fragments > 0 ? (fragmentChange / before.fragments) * 100 : 0; + const fragmentChangeStr = + fragmentChange < 0 + ? chalk.green(`${fragmentChangePercent.toFixed(1)}%`) + : fragmentChange > 0 + ? chalk.red(`+${fragmentChangePercent.toFixed(1)}%`) + : chalk.gray('—'); + + table.push([ + 'Fragments', + formatNumber(before.fragments), + formatNumber(after.fragments), + fragmentChangeStr, + ]); + } + + output.log(table.toString()); + output.log(); + output.log(`✓ Completed in ${duration.toFixed(2)}s`); + + // Show savings if any + if (before.size && after.size && after.size < before.size) { + const saved = before.size - after.size; + output.log(`💾 Saved ${chalk.green(formatBytes(saved))}`); + } + + output.log(); + output.log( + chalk.gray( + 'Optimization merged small data fragments and updated indices for better query performance.' + ) + ); + output.log(); +} + +/** + * Print clean summary + */ +export function printCleanSummary(data: { + files: Array<{ + name: string; + path: string; + size: number | null; + }>; + totalSize: number; + force: boolean; +}): void { + const { files, totalSize, force } = data; + + output.log(); + output.log(chalk.bold('This will remove:')); + output.log(); + + for (const file of files) { + const size = file.size !== null ? chalk.gray(`(${formatBytes(file.size)})`) : ''; + output.log(` ${chalk.cyan('•')} ${file.name} ${size}`); + } + + output.log(); + output.log(`Total to remove: ${chalk.bold(formatBytes(totalSize))}`); + output.log(); + + if (!force) { + output.warn('This action cannot be undone!'); + output.log(`Run with ${chalk.yellow('--force')} to skip this prompt.`); + output.log(); + } +} + +/** + * Print clean success + */ +export function printCleanSuccess(data: { totalSize: number }): void { + const { totalSize } = data; + + output.log(); + output.log(chalk.green('✓ All indexed data removed')); + output.log(); + output.log(`Freed ${chalk.bold(formatBytes(totalSize))}`); + output.log(); + output.log(`Run ${chalk.cyan('dev index')} to re-index your repository`); + output.log(); +} + +/** + * Print git history statistics + */ +export function printGitStats(data: { + totalCommits: number; + dateRange?: { + oldest: string; + newest: string; + }; +}): void { + const { totalCommits, dateRange } = data; + + output.log(); + output.log(chalk.bold(`Git History • ${formatNumber(totalCommits)} commits indexed`)); + + if (dateRange) { + const oldest = new Date(dateRange.oldest); + const newest = new Date(dateRange.newest); + const span = newest.getTime() - oldest.getTime(); + const days = Math.floor(span / (1000 * 60 * 60 * 24)); + const years = (days / 365).toFixed(1); + + output.log(); + output.log(`Date Range: ${oldest.toLocaleDateString()} to ${newest.toLocaleDateString()}`); + output.log(`Duration: ${years} years (${formatNumber(days)} days)`); + } + + output.log(); + output.log(`Storage: ${chalk.gray('~/.dev-agent/indexes/.../git-commits/')}`); + output.log(); + output.log(chalk.green('✓ Git history indexed and ready for semantic search')); + output.log(); + output.log(`Run ${chalk.cyan('dev git search ""')} to search commit history`); + output.log(); +} + +/** + * Print GitHub indexing statistics (gh CLI inspired) + */ +export function printGitHubStats(githubStats: { + repository: string; + totalDocuments: number; + byType: { issue?: number; pull_request?: number; discussion?: number }; + byState: { open?: number; closed?: number; merged?: number }; + lastIndexed: string; + indexDuration?: number; +}): void { + const issues = githubStats.byType.issue || 0; + const prs = githubStats.byType.pull_request || 0; + const discussions = githubStats.byType.discussion || 0; + + const openCount = githubStats.byState.open || 0; + const closedCount = githubStats.byState.closed || 0; + const mergedCount = githubStats.byState.merged || 0; + + const timeSince = getTimeSince(new Date(githubStats.lastIndexed)); + + output.log(); + + // Repository name and document count (gh style) + output.log(chalk.bold(githubStats.repository)); + output.log(`${formatNumber(githubStats.totalDocuments)} issues and pull requests`); + output.log(); + + // Issues breakdown + if (issues > 0) { + const issueStates: string[] = []; + // Calculate issue-specific counts (open + closed, no merged for issues) + const issueOpen = openCount > 0 ? openCount : 0; + const issueClosed = closedCount > 0 ? closedCount : 0; + + if (issueOpen > 0) { + issueStates.push(`${chalk.green('●')} ${issueOpen} open`); + } + if (issueClosed > 0) { + issueStates.push(`${chalk.gray('●')} ${issueClosed} closed`); + } + + output.log(`Issues: ${chalk.bold(issues.toString())} total`); + if (issueStates.length > 0) { + output.log(` ${issueStates.join(' ')}`); + } + output.log(); + } + + // Pull requests breakdown + if (prs > 0) { + const prStates: string[] = []; + // For PRs, we show open and merged + const prOpen = openCount > 0 ? openCount : 0; + const prMerged = mergedCount > 0 ? mergedCount : 0; + + if (prOpen > 0) { + prStates.push(`${chalk.green('●')} ${prOpen} open`); + } + if (prMerged > 0) { + prStates.push(`${chalk.magenta('●')} ${prMerged} merged`); + } + + output.log(`Pull Requests: ${chalk.bold(prs.toString())} total`); + if (prStates.length > 0) { + output.log(` ${prStates.join(' ')}`); + } + output.log(); + } + + // Discussions (if any) + if (discussions > 0) { + output.log(`Discussions: ${chalk.bold(discussions.toString())} total`); + output.log(); + } + + // Last synced + output.log(chalk.gray(`Last synced: ${timeSince}`)); + output.log(); +} + /** * Format detailed stats with tables (for verbose mode) */ diff --git a/packages/cli/tsconfig.json b/packages/cli/tsconfig.json index d515106..abd25e1 100644 --- a/packages/cli/tsconfig.json +++ b/packages/cli/tsconfig.json @@ -4,6 +4,8 @@ "outDir": "./dist", "rootDir": "./src", "composite": true, + "declaration": true, + "declarationMap": true, "types": ["node", "vitest/globals"] }, "references": [ @@ -13,5 +15,11 @@ { "path": "../logger" } ], "include": ["src/**/*"], - "exclude": ["node_modules", "dist"] + "exclude": [ + "node_modules", + "dist", + "**/*.test.ts", + "**/*.spec.ts", + "**/__tests__/**" + ] } diff --git a/packages/core/CHANGELOG.md b/packages/core/CHANGELOG.md index 83ab2fd..00b80fa 100644 --- a/packages/core/CHANGELOG.md +++ b/packages/core/CHANGELOG.md @@ -86,7 +86,7 @@ **Indexer Logging** - - Add `--verbose` flag to `dev index`, `dev git index`, `dev gh index` + - Add `--verbose` flag to `dev index`, `dev git index`, `dev github index` - Progress spinner shows actual counts: `Embedding 4480/49151 documents (9%)` - Structured logging with kero logger diff --git a/packages/core/package.json b/packages/core/package.json index 810d2fd..bee2b4e 100644 --- a/packages/core/package.json +++ b/packages/core/package.json @@ -38,6 +38,7 @@ }, "dependencies": { "@lancedb/lancedb": "^0.22.3", + "@lytics/dev-agent-types": "workspace:*", "@lytics/kero": "workspace:*", "@xenova/transformers": "^2.17.2", "better-sqlite3": "^12.5.0", diff --git a/packages/core/src/index.ts b/packages/core/src/index.ts index bdb966f..c9084c9 100644 --- a/packages/core/src/index.ts +++ b/packages/core/src/index.ts @@ -10,6 +10,7 @@ export * from './map'; export * from './metrics'; export * from './observability'; export * from './scanner'; +export * from './services'; export * from './storage'; export * from './utils'; export * from './vector'; diff --git a/packages/core/src/indexer/index.ts b/packages/core/src/indexer/index.ts index 81bb6b3..562990b 100644 --- a/packages/core/src/indexer/index.ts +++ b/packages/core/src/indexer/index.ts @@ -867,8 +867,11 @@ export class RepositoryIndexer { // Update stats this.state.stats.totalFiles = Object.keys(this.state.files).length; - this.state.stats.totalDocuments = documents.length; - this.state.stats.totalVectors = documents.length; + // Query actual vector count from LanceDB (not just current batch size) + // This ensures totalDocuments reflects reality after both full index and incremental updates + const vectorStats = await this.vectorStorage.getStats(); + this.state.stats.totalDocuments = vectorStats.totalDocuments; + this.state.stats.totalVectors = vectorStats.totalDocuments; this.state.lastIndexTime = new Date(); // Save detailed stats if provided diff --git a/packages/core/src/services/__tests__/coordinator-service.test.ts b/packages/core/src/services/__tests__/coordinator-service.test.ts new file mode 100644 index 0000000..d9d20ed --- /dev/null +++ b/packages/core/src/services/__tests__/coordinator-service.test.ts @@ -0,0 +1,194 @@ +/** + * Tests for CoordinatorService + */ + +import { describe, expect, it, vi } from 'vitest'; +import type { RepositoryIndexer } from '../../indexer/index.js'; +import type { SubagentCoordinator } from '../coordinator-service.js'; +import { CoordinatorService } from '../coordinator-service.js'; + +describe('CoordinatorService', () => { + describe('createCoordinator', () => { + it('should create and configure coordinator with all agents', async () => { + const mockIndexer = {} as RepositoryIndexer; + + const mockContextManager = { + setIndexer: vi.fn(), + }; + + const mockCoordinator: SubagentCoordinator = { + registerAgent: vi.fn().mockResolvedValue(undefined), + getContextManager: vi.fn().mockReturnValue(mockContextManager), + shutdown: vi.fn().mockResolvedValue(undefined), + }; + + const mockExplorerAgent = { name: 'explorer' }; + const mockPlannerAgent = { name: 'planner' }; + const mockPrAgent = { name: 'pr' }; + + const factories = { + createCoordinator: vi.fn().mockResolvedValue(mockCoordinator), + createExplorerAgent: vi.fn().mockResolvedValue(mockExplorerAgent), + createPlannerAgent: vi.fn().mockResolvedValue(mockPlannerAgent), + createPrAgent: vi.fn().mockResolvedValue(mockPrAgent), + }; + + const service = new CoordinatorService( + { + repositoryPath: '/test/repo', + maxConcurrentTasks: 10, + defaultMessageTimeout: 60000, + logLevel: 'debug', + }, + factories + ); + + const coordinator = await service.createCoordinator(mockIndexer); + + // Verify coordinator factory called with correct config + expect(factories.createCoordinator).toHaveBeenCalledWith({ + maxConcurrentTasks: 10, + defaultMessageTimeout: 60000, + logLevel: 'debug', + }); + + // Verify context manager setup + expect(mockCoordinator.getContextManager).toHaveBeenCalledOnce(); + expect(mockContextManager.setIndexer).toHaveBeenCalledWith(mockIndexer); + + // Verify all agents created + expect(factories.createExplorerAgent).toHaveBeenCalledOnce(); + expect(factories.createPlannerAgent).toHaveBeenCalledOnce(); + expect(factories.createPrAgent).toHaveBeenCalledOnce(); + + // Verify all agents registered + expect(mockCoordinator.registerAgent).toHaveBeenCalledTimes(3); + expect(mockCoordinator.registerAgent).toHaveBeenCalledWith(mockExplorerAgent); + expect(mockCoordinator.registerAgent).toHaveBeenCalledWith(mockPlannerAgent); + expect(mockCoordinator.registerAgent).toHaveBeenCalledWith(mockPrAgent); + + expect(coordinator).toBe(mockCoordinator); + }); + + it('should use default configuration when not provided', async () => { + const mockIndexer = {} as RepositoryIndexer; + const mockContextManager = { setIndexer: vi.fn() }; + const mockCoordinator: SubagentCoordinator = { + registerAgent: vi.fn().mockResolvedValue(undefined), + getContextManager: vi.fn().mockReturnValue(mockContextManager), + shutdown: vi.fn().mockResolvedValue(undefined), + }; + + const factories = { + createCoordinator: vi.fn().mockResolvedValue(mockCoordinator), + createExplorerAgent: vi.fn().mockResolvedValue({}), + createPlannerAgent: vi.fn().mockResolvedValue({}), + createPrAgent: vi.fn().mockResolvedValue({}), + }; + + const service = new CoordinatorService({ repositoryPath: '/test/repo' }, factories); + + await service.createCoordinator(mockIndexer); + + // Verify default configuration used + expect(factories.createCoordinator).toHaveBeenCalledWith({ + maxConcurrentTasks: 5, // default + defaultMessageTimeout: 30000, // default + logLevel: 'info', // default + }); + }); + + it('should register agents in order', async () => { + const mockIndexer = {} as RepositoryIndexer; + const mockContextManager = { setIndexer: vi.fn() }; + + const registrationOrder: string[] = []; + const mockCoordinator: SubagentCoordinator = { + registerAgent: vi.fn().mockImplementation((agent: { name: string }) => { + registrationOrder.push(agent.name); + return Promise.resolve(); + }), + getContextManager: vi.fn().mockReturnValue(mockContextManager), + shutdown: vi.fn().mockResolvedValue(undefined), + }; + + const factories = { + createCoordinator: vi.fn().mockResolvedValue(mockCoordinator), + createExplorerAgent: vi.fn().mockResolvedValue({ name: 'explorer' }), + createPlannerAgent: vi.fn().mockResolvedValue({ name: 'planner' }), + createPrAgent: vi.fn().mockResolvedValue({ name: 'pr' }), + }; + + const service = new CoordinatorService({ repositoryPath: '/test/repo' }, factories); + + await service.createCoordinator(mockIndexer); + + expect(registrationOrder).toEqual(['explorer', 'planner', 'pr']); + }); + }); + + describe('updateConfig', () => { + it('should update configuration', () => { + const service = new CoordinatorService({ + repositoryPath: '/test/repo', + maxConcurrentTasks: 5, + defaultMessageTimeout: 30000, + logLevel: 'info', + }); + + service.updateConfig({ + maxConcurrentTasks: 10, + logLevel: 'debug', + }); + + const config = service.getConfig(); + expect(config.maxConcurrentTasks).toBe(10); + expect(config.defaultMessageTimeout).toBe(30000); // unchanged + expect(config.logLevel).toBe('debug'); + }); + + it('should partially update configuration', () => { + const service = new CoordinatorService({ + repositoryPath: '/test/repo', + }); + + service.updateConfig({ maxConcurrentTasks: 15 }); + + const config = service.getConfig(); + expect(config.maxConcurrentTasks).toBe(15); + expect(config.defaultMessageTimeout).toBe(30000); // default + expect(config.logLevel).toBe('info'); // default + }); + }); + + describe('getConfig', () => { + it('should return current configuration', () => { + const service = new CoordinatorService({ + repositoryPath: '/test/repo', + maxConcurrentTasks: 7, + defaultMessageTimeout: 45000, + logLevel: 'warn', + }); + + const config = service.getConfig(); + + expect(config).toEqual({ + maxConcurrentTasks: 7, + defaultMessageTimeout: 45000, + logLevel: 'warn', + }); + }); + + it('should return default configuration when not specified', () => { + const service = new CoordinatorService({ repositoryPath: '/test/repo' }); + + const config = service.getConfig(); + + expect(config).toEqual({ + maxConcurrentTasks: 5, + defaultMessageTimeout: 30000, + logLevel: 'info', + }); + }); + }); +}); diff --git a/packages/core/src/services/__tests__/git-history-service.test.ts b/packages/core/src/services/__tests__/git-history-service.test.ts new file mode 100644 index 0000000..14af668 --- /dev/null +++ b/packages/core/src/services/__tests__/git-history-service.test.ts @@ -0,0 +1,254 @@ +/** + * Tests for GitHistoryService + */ + +import { describe, expect, it, vi } from 'vitest'; +import type { GitExtractor, GitIndexer, VectorStorage } from '../git-history-service.js'; +import { GitHistoryService } from '../git-history-service.js'; + +vi.mock('../../storage/path.js', () => ({ + getStoragePath: vi.fn().mockResolvedValue('/mock/storage'), + getStorageFilePaths: vi.fn().mockReturnValue({ + vectors: '/mock/storage/vectors', + }), +})); + +describe('GitHistoryService', () => { + describe('getGitIndexer', () => { + it('should create and cache git indexer', async () => { + const mockExtractor: GitExtractor = { + extractCommits: vi.fn().mockResolvedValue([]), + }; + + const mockVectorStorage: VectorStorage = { + initialize: vi.fn().mockResolvedValue(undefined), + close: vi.fn().mockResolvedValue(undefined), + add: vi.fn().mockResolvedValue(undefined), + search: vi.fn().mockResolvedValue([]), + }; + + const mockGitIndexer: GitIndexer = { + index: vi.fn().mockResolvedValue({}), + search: vi.fn().mockResolvedValue([]), + getCommits: vi.fn().mockResolvedValue([]), + }; + + const factories = { + createExtractor: vi.fn().mockResolvedValue(mockExtractor), + createVectorStorage: vi.fn().mockResolvedValue(mockVectorStorage), + createGitIndexer: vi.fn().mockResolvedValue(mockGitIndexer), + }; + + const service = new GitHistoryService({ repositoryPath: '/test/repo' }, factories); + + // First call should create + const indexer1 = await service.getGitIndexer(); + + expect(factories.createExtractor).toHaveBeenCalledWith('/test/repo'); + expect(factories.createVectorStorage).toHaveBeenCalledWith('/mock/storage/vectors-git'); + expect(factories.createGitIndexer).toHaveBeenCalledWith({ + extractor: mockExtractor, + vectorStorage: mockVectorStorage, + }); + expect(indexer1).toBe(mockGitIndexer); + + // Second call should return cached + const indexer2 = await service.getGitIndexer(); + + expect(factories.createExtractor).toHaveBeenCalledOnce(); // Not called again + expect(indexer2).toBe(mockGitIndexer); + }); + }); + + describe('getExtractor', () => { + it('should create git extractor', async () => { + const mockExtractor: GitExtractor = { + extractCommits: vi.fn().mockResolvedValue([]), + }; + + const factories = { + createExtractor: vi.fn().mockResolvedValue(mockExtractor), + createVectorStorage: vi.fn().mockResolvedValue({} as VectorStorage), + createGitIndexer: vi.fn().mockResolvedValue({} as GitIndexer), + }; + + const service = new GitHistoryService({ repositoryPath: '/test/repo' }, factories); + + const extractor = await service.getExtractor(); + + expect(factories.createExtractor).toHaveBeenCalledWith('/test/repo'); + expect(extractor).toBe(mockExtractor); + }); + }); + + describe('search', () => { + it('should search git history', async () => { + const mockResults = [{ sha: 'abc123', message: 'Fix bug' }]; + + const mockGitIndexer: GitIndexer = { + index: vi.fn().mockResolvedValue({}), + search: vi.fn().mockResolvedValue(mockResults), + getCommits: vi.fn().mockResolvedValue([]), + }; + + const factories = { + createExtractor: vi.fn().mockResolvedValue({} as GitExtractor), + createVectorStorage: vi.fn().mockResolvedValue({ + initialize: vi.fn().mockResolvedValue(undefined), + close: vi.fn().mockResolvedValue(undefined), + add: vi.fn(), + search: vi.fn(), + } as VectorStorage), + createGitIndexer: vi.fn().mockResolvedValue(mockGitIndexer), + }; + + const service = new GitHistoryService({ repositoryPath: '/test/repo' }, factories); + + const results = await service.search('bug fix', { limit: 5 }); + + expect(mockGitIndexer.search).toHaveBeenCalledWith('bug fix', { limit: 5 }); + expect(results).toEqual(mockResults); + }); + + it('should use default limit when not provided', async () => { + const mockGitIndexer: GitIndexer = { + index: vi.fn().mockResolvedValue({}), + search: vi.fn().mockResolvedValue([]), + getCommits: vi.fn().mockResolvedValue([]), + }; + + const factories = { + createExtractor: vi.fn().mockResolvedValue({} as GitExtractor), + createVectorStorage: vi.fn().mockResolvedValue({ + initialize: vi.fn().mockResolvedValue(undefined), + close: vi.fn(), + add: vi.fn(), + search: vi.fn(), + } as VectorStorage), + createGitIndexer: vi.fn().mockResolvedValue(mockGitIndexer), + }; + + const service = new GitHistoryService({ repositoryPath: '/test/repo' }, factories); + + await service.search('test query'); + + expect(mockGitIndexer.search).toHaveBeenCalledWith('test query', { limit: 10 }); + }); + }); + + describe('getCommits', () => { + it('should get commits with filters', async () => { + const mockCommits = [ + { sha: 'abc123', author: 'user1', message: 'Commit 1' }, + { sha: 'def456', author: 'user1', message: 'Commit 2' }, + ]; + + const mockGitIndexer: GitIndexer = { + index: vi.fn().mockResolvedValue({}), + search: vi.fn().mockResolvedValue([]), + getCommits: vi.fn().mockResolvedValue(mockCommits), + }; + + const factories = { + createExtractor: vi.fn().mockResolvedValue({} as GitExtractor), + createVectorStorage: vi.fn().mockResolvedValue({ + initialize: vi.fn().mockResolvedValue(undefined), + close: vi.fn(), + add: vi.fn(), + search: vi.fn(), + } as VectorStorage), + createGitIndexer: vi.fn().mockResolvedValue(mockGitIndexer), + }; + + const service = new GitHistoryService({ repositoryPath: '/test/repo' }, factories); + + const commits = await service.getCommits({ + author: 'user1', + since: '2024-01-01', + limit: 10, + }); + + expect(mockGitIndexer.getCommits).toHaveBeenCalledWith({ + author: 'user1', + since: '2024-01-01', + limit: 10, + }); + expect(commits).toEqual(mockCommits); + }); + }); + + describe('index', () => { + it('should index git history', async () => { + const mockStats = { + commitsIndexed: 100, + duration: 5000, + }; + + const mockGitIndexer: GitIndexer = { + index: vi.fn().mockResolvedValue(mockStats), + search: vi.fn().mockResolvedValue([]), + getCommits: vi.fn().mockResolvedValue([]), + }; + + const factories = { + createExtractor: vi.fn().mockResolvedValue({} as GitExtractor), + createVectorStorage: vi.fn().mockResolvedValue({ + initialize: vi.fn().mockResolvedValue(undefined), + close: vi.fn(), + add: vi.fn(), + search: vi.fn(), + } as VectorStorage), + createGitIndexer: vi.fn().mockResolvedValue(mockGitIndexer), + }; + + const service = new GitHistoryService({ repositoryPath: '/test/repo' }, factories); + + const stats = await service.index({ since: '2024-01-01', limit: 100 }); + + expect(mockGitIndexer.index).toHaveBeenCalledWith({ + since: '2024-01-01', + limit: 100, + }); + expect(stats).toEqual(mockStats); + }); + }); + + describe('close', () => { + it('should close vector storage and clear cache', async () => { + const mockVectorStorage: VectorStorage = { + initialize: vi.fn().mockResolvedValue(undefined), + close: vi.fn().mockResolvedValue(undefined), + add: vi.fn(), + search: vi.fn(), + }; + + const factories = { + createExtractor: vi.fn().mockResolvedValue({} as GitExtractor), + createVectorStorage: vi.fn().mockResolvedValue(mockVectorStorage), + createGitIndexer: vi.fn().mockResolvedValue({} as GitIndexer), + }; + + const service = new GitHistoryService({ repositoryPath: '/test/repo' }, factories); + + // Create indexer to initialize cache + await service.getGitIndexer(); + + // Close service + await service.close(); + + expect(mockVectorStorage.close).toHaveBeenCalledOnce(); + + // Getting indexer again should recreate (not use cache) + await service.getGitIndexer(); + + expect(factories.createExtractor).toHaveBeenCalledTimes(2); // Called again + }); + + it('should handle close when nothing is cached', async () => { + const service = new GitHistoryService({ repositoryPath: '/test/repo' }); + + // Should not throw + await expect(service.close()).resolves.toBeUndefined(); + }); + }); +}); diff --git a/packages/core/src/services/__tests__/github-service.test.ts b/packages/core/src/services/__tests__/github-service.test.ts new file mode 100644 index 0000000..35191d9 --- /dev/null +++ b/packages/core/src/services/__tests__/github-service.test.ts @@ -0,0 +1,435 @@ +/** + * Tests for GitHubService + */ + +import type { + GitHubDocument, + GitHubIndexerInstance, + GitHubIndexStats, + GitHubSearchResult, +} from '@lytics/dev-agent-types/github'; +import { describe, expect, it, vi } from 'vitest'; +import { type GitHubIndexerFactory, GitHubService } from '../github-service.js'; + +vi.mock('../../storage/path.js', () => ({ + getStoragePath: vi.fn().mockResolvedValue('/mock/storage'), + getStorageFilePaths: vi.fn().mockReturnValue({ + vectors: '/mock/storage/vectors', + githubState: '/mock/storage/github-state.json', + }), +})); + +describe('GitHubService', () => { + const mockIndexStats: GitHubIndexStats = { + repository: 'lytics/dev-agent', + totalDocuments: 150, + byType: { + issue: 100, + pull_request: 50, + discussion: 0, + }, + byState: { + open: 75, + closed: 60, + merged: 15, + }, + lastIndexed: '2024-01-01T00:05:00Z', + indexDuration: 300000, + }; + + const mockDocument: GitHubDocument = { + type: 'issue', + number: 123, + title: 'Add authentication feature', + body: 'We need to implement user authentication', + state: 'open', + labels: ['enhancement', 'security'], + author: 'user1', + createdAt: '2024-01-01T00:00:00Z', + updatedAt: '2024-01-02T00:00:00Z', + url: 'https://github.com/org/repo/issues/123', + repository: 'org/repo', + comments: 5, + reactions: { '+1': 10, eyes: 2 }, + relatedIssues: [], + relatedPRs: [], + linkedFiles: [], + mentions: [], + }; + + const mockSearchResults: GitHubSearchResult[] = [ + { + document: mockDocument, + score: 0.95, + matchedFields: ['title', 'body'], + }, + { + document: { + ...mockDocument, + number: 456, + type: 'pull_request', + title: 'Fix login bug', + body: 'Fixes issue with login flow', + state: 'merged', + author: 'user2', + createdAt: '2024-01-03T00:00:00Z', + updatedAt: '2024-01-04T00:00:00Z', + labels: ['bug'], + url: 'https://github.com/org/repo/pull/456', + }, + score: 0.88, + matchedFields: ['title'], + }, + ]; + + describe('index', () => { + it('should index GitHub issues and PRs', async () => { + const mockIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + index: vi.fn().mockResolvedValue(mockIndexStats), + close: vi.fn().mockResolvedValue(undefined), + search: vi.fn().mockResolvedValue([]), + getDocument: vi.fn().mockResolvedValue(null), + getStats: vi.fn().mockReturnValue(mockIndexStats), + }; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new GitHubService({ repositoryPath: '/test/repo' }, mockFactory); + + const stats = await service.index({ + types: ['issue', 'pull_request'], + state: ['open'], + limit: 100, + }); + + expect(mockFactory).toHaveBeenCalledOnce(); + expect(mockIndexer.initialize).toHaveBeenCalledOnce(); + expect(mockIndexer.index).toHaveBeenCalledWith({ + types: ['issue', 'pull_request'], + state: ['open'], + limit: 100, + }); + // Note: Service manages indexer lifecycle, doesn't close after each operation + expect(stats).toEqual(mockIndexStats); + }); + + it('should handle progress callbacks', async () => { + const mockIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + index: vi.fn().mockResolvedValue(mockIndexStats), + close: vi.fn().mockResolvedValue(undefined), + search: vi.fn().mockResolvedValue([]), + getDocument: vi.fn().mockResolvedValue(null), + getStats: vi.fn().mockReturnValue(mockIndexStats), + }; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new GitHubService({ repositoryPath: '/test/repo' }, mockFactory); + + const onProgress = vi.fn(); + await service.index({ onProgress }); + + expect(mockIndexer.index).toHaveBeenCalledWith({ + types: undefined, + state: undefined, + limit: undefined, + logger: undefined, + onProgress, + }); + }); + + it('should throw error on indexing failure', async () => { + const mockIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + index: vi.fn().mockRejectedValue(new Error('Index failed')), + close: vi.fn().mockResolvedValue(undefined), + search: vi.fn().mockResolvedValue([]), + getDocument: vi.fn().mockResolvedValue(null), + getStats: vi.fn().mockReturnValue(mockIndexStats), + }; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new GitHubService({ repositoryPath: '/test/repo' }, mockFactory); + + await expect(service.index()).rejects.toThrow('Index failed'); + }); + }); + + describe('search', () => { + it('should search GitHub issues and PRs', async () => { + const mockIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + search: vi.fn().mockResolvedValue(mockSearchResults), + close: vi.fn().mockResolvedValue(undefined), + index: vi.fn().mockResolvedValue(mockIndexStats), + getDocument: vi.fn().mockResolvedValue(null), + getStats: vi.fn().mockReturnValue(mockIndexStats), + }; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new GitHubService({ repositoryPath: '/test/repo' }, mockFactory); + + const results = await service.search('authentication', { limit: 10 }); + + expect(mockIndexer.search).toHaveBeenCalledWith('authentication', { limit: 10 }); + expect(results).toEqual(mockSearchResults); + }); + + it('should use default limit when not provided', async () => { + const mockIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + search: vi.fn().mockResolvedValue([]), + close: vi.fn().mockResolvedValue(undefined), + index: vi.fn().mockResolvedValue(mockIndexStats), + getDocument: vi.fn().mockResolvedValue(null), + getStats: vi.fn().mockReturnValue(mockIndexStats), + }; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new GitHubService({ repositoryPath: '/test/repo' }, mockFactory); + + await service.search('test query'); + + expect(mockIndexer.search).toHaveBeenCalledWith('test query', undefined); + }); + }); + + describe('getContext', () => { + it('should get context for a specific issue', async () => { + const mockIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + search: vi.fn().mockResolvedValue(mockSearchResults), + close: vi.fn().mockResolvedValue(undefined), + index: vi.fn().mockResolvedValue(mockIndexStats), + getDocument: vi.fn().mockResolvedValue(null), + getStats: vi.fn().mockReturnValue(mockIndexStats), + }; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new GitHubService({ repositoryPath: '/test/repo' }, mockFactory); + + const context = await service.getContext(123); + + expect(mockIndexer.search).toHaveBeenCalledWith('123', { limit: 1 }); + expect(context).toBeDefined(); + expect(context?.number).toBe(123); + expect(context?.title).toBe('Add authentication feature'); + expect(context?.type).toBe('issue'); + }); + + it('should return null when issue not found', async () => { + const mockIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + search: vi.fn().mockResolvedValue([]), + close: vi.fn().mockResolvedValue(undefined), + index: vi.fn().mockResolvedValue(mockIndexStats), + getDocument: vi.fn().mockResolvedValue(null), + getStats: vi.fn().mockReturnValue(mockIndexStats), + }; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new GitHubService({ repositoryPath: '/test/repo' }, mockFactory); + + const context = await service.getContext(999); + + expect(context).toBeNull(); + }); + + it('should handle partial documents', async () => { + const partialDocument: GitHubDocument = { + type: 'issue', + number: 123, + title: 'Test Issue', + body: '', + state: 'open', + labels: [], + author: '', + createdAt: '2024-01-01T00:00:00Z', + updatedAt: '2024-01-01T00:00:00Z', + url: 'https://github.com/org/repo/issues/123', + repository: 'org/repo', + comments: 0, + reactions: {}, + relatedIssues: [], + relatedPRs: [], + linkedFiles: [], + mentions: [], + }; + + const partialResult: GitHubSearchResult = { + document: partialDocument, + score: 0.95, + matchedFields: ['title'], + }; + + const mockIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + search: vi.fn().mockResolvedValue([partialResult]), + close: vi.fn().mockResolvedValue(undefined), + index: vi.fn().mockResolvedValue(mockIndexStats), + getDocument: vi.fn().mockResolvedValue(null), + getStats: vi.fn().mockReturnValue(mockIndexStats), + }; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new GitHubService({ repositoryPath: '/test/repo' }, mockFactory); + + const context = await service.getContext(123); + + expect(context).toBeDefined(); + expect(context?.number).toBe(123); + expect(context?.body).toBe(''); + expect(context?.author).toBe(''); + expect(context?.labels).toEqual([]); + }); + }); + + describe('findRelated', () => { + it('should find related issues using search with real scores', async () => { + const targetResult: GitHubSearchResult = { + document: mockDocument, + score: 1.0, + matchedFields: ['title', 'body'], + }; + + const relatedResults: GitHubSearchResult[] = [ + targetResult, // Original issue + { + document: { ...mockDocument, number: 124, title: 'Implement OAuth' }, + score: 0.9, + matchedFields: ['title'], + }, + { + document: { ...mockDocument, number: 125, title: 'Add JWT support' }, + score: 0.85, + matchedFields: ['title'], + }, + ]; + + const mockIndexer: GitHubIndexerInstance = { + initialize: vi.fn().mockResolvedValue(undefined), + // First search: getContext searches for #123 + // Second search: findRelated searches by title + search: vi.fn().mockResolvedValueOnce([targetResult]).mockResolvedValueOnce(relatedResults), + close: vi.fn().mockResolvedValue(undefined), + index: vi.fn().mockResolvedValue(mockIndexStats), + getDocument: vi.fn().mockResolvedValue(null), + getStats: vi.fn().mockReturnValue(mockIndexStats), + }; + + const mockFactory: GitHubIndexerFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new GitHubService({ repositoryPath: '/test/repo' }, mockFactory); + + const results = await service.findRelated(123, 5); + + // Service calls search twice: once for context (by number), once for related items (by title) + expect(mockIndexer.search).toHaveBeenCalledTimes(2); + expect(mockIndexer.search).toHaveBeenNthCalledWith(1, '123', { limit: 1 }); + expect(mockIndexer.search).toHaveBeenNthCalledWith(2, 'Add authentication feature', { + limit: 6, + }); + + // Should return GitHubSearchResult[] with real scores, excluding original issue + expect(results).toHaveLength(2); + expect(results[0].document.number).toBe(124); + expect(results[0].score).toBe(0.9); + expect(results[1].document.number).toBe(125); + expect(results[1].score).toBe(0.85); + }); + + it('should return empty array when target not found', async () => { + const mockIndexer: GitHubIndexerInstance = { + initialize: vi.fn().mockResolvedValue(undefined), + search: vi.fn().mockResolvedValue([]), + close: vi.fn().mockResolvedValue(undefined), + index: vi.fn().mockResolvedValue(mockIndexStats), + getDocument: vi.fn().mockResolvedValue(null), + getStats: vi.fn().mockReturnValue(mockIndexStats), + }; + + const mockFactory: GitHubIndexerFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new GitHubService({ repositoryPath: '/test/repo' }, mockFactory); + + const results = await service.findRelated(999); + + expect(results).toEqual([]); + }); + }); + + describe('getStats', () => { + it('should return GitHub index statistics', async () => { + const mockIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + getStats: vi.fn().mockResolvedValue(mockIndexStats), + close: vi.fn().mockResolvedValue(undefined), + }; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new GitHubService({ repositoryPath: '/test/repo' }, mockFactory); + + const stats = await service.getStats(); + + expect(stats).toEqual(mockIndexStats); + }); + + it('should return null on error', async () => { + const mockIndexer: GitHubIndexerInstance = { + initialize: vi.fn().mockResolvedValue(undefined), + index: vi.fn().mockResolvedValue(mockIndexStats), + search: vi.fn().mockResolvedValue([]), + getDocument: vi.fn().mockResolvedValue(null), + getStats: vi.fn().mockImplementation(() => { + throw new Error('Stats failed'); + }), + close: vi.fn().mockResolvedValue(undefined), + }; + const mockFactory: GitHubIndexerFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new GitHubService({ repositoryPath: '/test/repo' }, mockFactory); + + const stats = await service.getStats(); + + expect(stats).toBeNull(); + }); + }); + + describe('isIndexed', () => { + it('should return true when GitHub data is indexed', async () => { + const mockIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + getStats: vi.fn().mockResolvedValue(mockIndexStats), + close: vi.fn().mockResolvedValue(undefined), + }; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new GitHubService({ repositoryPath: '/test/repo' }, mockFactory); + + const result = await service.isIndexed(); + + expect(result).toBe(true); + }); + + it('should return false when not indexed', async () => { + const mockIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + getStats: vi.fn().mockResolvedValue({ ...mockIndexStats, totalDocuments: 0 }), + close: vi.fn().mockResolvedValue(undefined), + }; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new GitHubService({ repositoryPath: '/test/repo' }, mockFactory); + + const result = await service.isIndexed(); + + expect(result).toBe(false); + }); + + it('should return false on error', async () => { + const mockFactory = vi.fn().mockRejectedValue(new Error('Init failed')); + const service = new GitHubService({ repositoryPath: '/test/repo' }, mockFactory); + + const result = await service.isIndexed(); + + expect(result).toBe(false); + }); + }); +}); diff --git a/packages/core/src/services/__tests__/health-service.test.ts b/packages/core/src/services/__tests__/health-service.test.ts new file mode 100644 index 0000000..f19d160 --- /dev/null +++ b/packages/core/src/services/__tests__/health-service.test.ts @@ -0,0 +1,187 @@ +/** + * Tests for HealthService + */ + +import { describe, expect, it, vi } from 'vitest'; +import type { RepositoryIndexer } from '../../indexer/index.js'; +import type { MetricsStore } from '../../metrics/store.js'; +import type { VectorStorage } from '../../vector/index.js'; +import { HealthService } from '../health-service.js'; + +describe('HealthService', () => { + describe('check', () => { + it('should return healthy status when all checks pass', async () => { + // Create mock components + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + getStats: vi.fn().mockResolvedValue({ + filesScanned: 100, + documentsIndexed: 250, + endTime: new Date(), + }), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + const mockVectorStorage: VectorStorage = { + initialize: vi.fn().mockResolvedValue(undefined), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as VectorStorage; + + const mockMetricsStore: MetricsStore = { + getCount: vi.fn().mockReturnValue(10), + close: vi.fn(), + } as unknown as MetricsStore; + + // Inject mock factories + const service = new HealthService( + { repositoryPath: '/test/repo' }, + { + createIndexer: vi.fn().mockResolvedValue(mockIndexer), + createVectorStorage: vi.fn().mockResolvedValue(mockVectorStorage), + createMetricsStore: vi.fn().mockReturnValue(mockMetricsStore), + } + ); + + const result = await service.check(); + + expect(result.status).toBe('healthy'); + expect(result.checks.indexer.status).toBe('ok'); + expect(result.checks.vectorStorage.status).toBe('ok'); + expect(result.checks.metricsStore.status).toBe('ok'); + expect(result.timestamp).toBeInstanceOf(Date); + }); + + it('should return degraded status when metrics check has warning', async () => { + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + getStats: vi.fn().mockResolvedValue({ endTime: new Date() }), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + const mockVectorStorage: VectorStorage = { + initialize: vi.fn().mockResolvedValue(undefined), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as VectorStorage; + + const mockMetricsStore: MetricsStore = { + getCount: vi.fn().mockImplementation(() => { + throw new Error('Metrics unavailable'); + }), + close: vi.fn(), + } as unknown as MetricsStore; + + const service = new HealthService( + { repositoryPath: '/test/repo' }, + { + createIndexer: vi.fn().mockResolvedValue(mockIndexer), + createVectorStorage: vi.fn().mockResolvedValue(mockVectorStorage), + createMetricsStore: vi.fn().mockReturnValue(mockMetricsStore), + } + ); + + const result = await service.check(); + + expect(result.status).toBe('degraded'); + expect(result.checks.metricsStore.status).toBe('warning'); + }); + + it('should return unhealthy status when indexer check fails', async () => { + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockRejectedValue(new Error('Indexer error')), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + const mockVectorStorage: VectorStorage = { + initialize: vi.fn().mockResolvedValue(undefined), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as VectorStorage; + + const mockMetricsStore: MetricsStore = { + getCount: vi.fn().mockReturnValue(10), + close: vi.fn(), + } as unknown as MetricsStore; + + const service = new HealthService( + { repositoryPath: '/test/repo' }, + { + createIndexer: vi.fn().mockResolvedValue(mockIndexer), + createVectorStorage: vi.fn().mockResolvedValue(mockVectorStorage), + createMetricsStore: vi.fn().mockReturnValue(mockMetricsStore), + } + ); + + const result = await service.check(); + + expect(result.status).toBe('unhealthy'); + expect(result.checks.indexer.status).toBe('error'); + expect(result.checks.indexer.message).toBe('Indexer error'); + }); + + it('should return unhealthy status when vector storage check fails', async () => { + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + getStats: vi.fn().mockResolvedValue({ endTime: new Date() }), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + const mockVectorStorage: VectorStorage = { + initialize: vi.fn().mockRejectedValue(new Error('Vector storage error')), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as VectorStorage; + + const mockMetricsStore: MetricsStore = { + getCount: vi.fn().mockReturnValue(10), + close: vi.fn(), + } as unknown as MetricsStore; + + const service = new HealthService( + { repositoryPath: '/test/repo' }, + { + createIndexer: vi.fn().mockResolvedValue(mockIndexer), + createVectorStorage: vi.fn().mockResolvedValue(mockVectorStorage), + createMetricsStore: vi.fn().mockReturnValue(mockMetricsStore), + } + ); + + const result = await service.check(); + + expect(result.status).toBe('unhealthy'); + expect(result.checks.vectorStorage.status).toBe('error'); + }); + + it('should run all checks in parallel', async () => { + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + getStats: vi.fn().mockResolvedValue({ endTime: new Date() }), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + const mockVectorStorage: VectorStorage = { + initialize: vi.fn().mockResolvedValue(undefined), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as VectorStorage; + + const mockMetricsStore: MetricsStore = { + getCount: vi.fn().mockReturnValue(10), + close: vi.fn(), + } as unknown as MetricsStore; + + const service = new HealthService( + { repositoryPath: '/test/repo' }, + { + createIndexer: vi.fn().mockResolvedValue(mockIndexer), + createVectorStorage: vi.fn().mockResolvedValue(mockVectorStorage), + createMetricsStore: vi.fn().mockReturnValue(mockMetricsStore), + } + ); + + const startTime = Date.now(); + await service.check(); + const duration = Date.now() - startTime; + + // If checks were sequential with 10ms delays, would take 30ms+ + // Parallel should complete much faster + expect(duration).toBeLessThan(100); + }); + }); +}); diff --git a/packages/core/src/services/__tests__/metrics-service.test.ts b/packages/core/src/services/__tests__/metrics-service.test.ts new file mode 100644 index 0000000..7bcd649 --- /dev/null +++ b/packages/core/src/services/__tests__/metrics-service.test.ts @@ -0,0 +1,272 @@ +/** + * Tests for MetricsService + */ + +import { describe, expect, it, vi } from 'vitest'; +import type { FileMetrics } from '../../metrics/analytics.js'; +import type { MetricsStore } from '../../metrics/store.js'; +import type { CodeMetadata, Snapshot } from '../../metrics/types.js'; +import { MetricsService } from '../metrics-service.js'; + +vi.mock('../../storage/path.js', () => ({ + getStoragePath: vi.fn().mockResolvedValue('/mock/storage'), + getStorageFilePaths: vi.fn().mockReturnValue({ + vectors: '/mock/storage/vectors', + indexerState: '/mock/storage/indexer-state.json', + metrics: '/mock/storage/metrics.db', + }), +})); + +describe('MetricsService', () => { + const mockSnapshot: Snapshot = { + id: 'snapshot-1', + repositoryPath: '/test/repo', + timestamp: new Date('2024-01-01T00:00:00Z'), + trigger: 'index', + stats: { + filesScanned: 100, + documentsIndexed: 250, + documentsExtracted: 250, + vectorsStored: 250, + repositoryPath: '/test/repo', + startTime: new Date('2024-01-01T00:00:00Z'), + endTime: new Date('2024-01-01T00:01:00Z'), + duration: 60000, + errors: [], + }, + }; + + describe('getMostActive', () => { + it('should return most active files', async () => { + const mockMetrics: FileMetrics[] = [ + { + filePath: 'src/file1.ts', + activity: 'high', + commitCount: 50, + size: 'medium', + linesOfCode: 200, + ownership: 'small-team', + authorCount: 3, + lastModified: new Date(), + numFunctions: 10, + numImports: 5, + }, + ]; + + const mockStore: MetricsStore = { + getLatestSnapshot: vi.fn().mockReturnValue(mockSnapshot), + close: vi.fn(), + } as unknown as MetricsStore; + + const mockFactory = vi.fn().mockReturnValue(mockStore); + const service = new MetricsService({ repositoryPath: '/test/repo' }, mockFactory); + + // Mock the analytics function + const analytics = await import('../../metrics/analytics.js'); + vi.spyOn(analytics, 'getMostActive').mockReturnValue(mockMetrics); + + const result = await service.getMostActive(10); + + expect(result).toEqual(mockMetrics); + expect(mockStore.close).toHaveBeenCalledOnce(); + }); + + it('should return empty array when no snapshot exists', async () => { + const mockStore: MetricsStore = { + getLatestSnapshot: vi.fn().mockReturnValue(null), + close: vi.fn(), + } as unknown as MetricsStore; + + const mockFactory = vi.fn().mockReturnValue(mockStore); + const service = new MetricsService({ repositoryPath: '/test/repo' }, mockFactory); + + const result = await service.getMostActive(); + + expect(result).toEqual([]); + expect(mockStore.close).toHaveBeenCalledOnce(); + }); + }); + + describe('getLargestFiles', () => { + it('should return largest files', async () => { + const mockMetrics: FileMetrics[] = [ + { + filePath: 'src/large.ts', + activity: 'low', + commitCount: 10, + size: 'large', + linesOfCode: 1000, + ownership: 'shared', + authorCount: 5, + lastModified: new Date(), + numFunctions: 50, + numImports: 20, + }, + ]; + + const mockStore: MetricsStore = { + getLatestSnapshot: vi.fn().mockReturnValue(mockSnapshot), + close: vi.fn(), + } as unknown as MetricsStore; + + const mockFactory = vi.fn().mockReturnValue(mockStore); + const service = new MetricsService({ repositoryPath: '/test/repo' }, mockFactory); + + const analytics = await import('../../metrics/analytics.js'); + vi.spyOn(analytics, 'getLargestFiles').mockReturnValue(mockMetrics); + + const result = await service.getLargestFiles(10); + + expect(result).toEqual(mockMetrics); + }); + }); + + describe('getConcentratedOwnership', () => { + it('should return files with concentrated ownership', async () => { + const mockMetrics: FileMetrics[] = [ + { + filePath: 'src/solo.ts', + activity: 'medium', + commitCount: 30, + size: 'medium', + linesOfCode: 500, + ownership: 'single', + authorCount: 1, + lastModified: new Date(), + numFunctions: 20, + numImports: 8, + }, + ]; + + const mockStore: MetricsStore = { + getLatestSnapshot: vi.fn().mockReturnValue(mockSnapshot), + close: vi.fn(), + } as unknown as MetricsStore; + + const mockFactory = vi.fn().mockReturnValue(mockStore); + const service = new MetricsService({ repositoryPath: '/test/repo' }, mockFactory); + + const analytics = await import('../../metrics/analytics.js'); + vi.spyOn(analytics, 'getConcentratedOwnership').mockReturnValue(mockMetrics); + + const result = await service.getConcentratedOwnership(10); + + expect(result).toEqual(mockMetrics); + }); + }); + + describe('getFileTrend', () => { + it('should return file trend history', async () => { + const mockTrend: CodeMetadata[] = [ + { + filePath: 'src/file.ts', + commitCount: 10, + lastModified: new Date('2024-01-01'), + authorCount: 3, + linesOfCode: 200, + numFunctions: 5, + numImports: 3, + }, + ]; + + const mockStore: MetricsStore = { + close: vi.fn(), + } as unknown as MetricsStore; + + const mockFactory = vi.fn().mockReturnValue(mockStore); + const service = new MetricsService({ repositoryPath: '/test/repo' }, mockFactory); + + const analytics = await import('../../metrics/analytics.js'); + vi.spyOn(analytics, 'getFileTrend').mockReturnValue(mockTrend); + + const result = await service.getFileTrend('src/file.ts', 10); + + expect(result).toEqual(mockTrend); + }); + }); + + describe('getSummary', () => { + it('should return snapshot summary', async () => { + const mockSummary = { + totalFiles: 100, + totalLOC: 10000, + totalFunctions: 500, + avgLOC: 100, + veryActiveFiles: 5, + highActivityFiles: 10, + veryActivePercent: 5, + veryLargeFiles: 3, + largeFiles: 8, + veryLargePercent: 3, + singleAuthorFiles: 20, + pairAuthorFiles: 15, + singleAuthorPercent: 20, + }; + + const mockStore: MetricsStore = { + getLatestSnapshot: vi.fn().mockReturnValue(mockSnapshot), + close: vi.fn(), + } as unknown as MetricsStore; + + const mockFactory = vi.fn().mockReturnValue(mockStore); + const service = new MetricsService({ repositoryPath: '/test/repo' }, mockFactory); + + const analytics = await import('../../metrics/analytics.js'); + vi.spyOn(analytics, 'getSnapshotSummary').mockReturnValue(mockSummary); + + const result = await service.getSummary(); + + expect(result).toEqual(mockSummary); + }); + + it('should return null when no snapshot exists', async () => { + const mockStore: MetricsStore = { + getLatestSnapshot: vi.fn().mockReturnValue(null), + close: vi.fn(), + } as unknown as MetricsStore; + + const mockFactory = vi.fn().mockReturnValue(mockStore); + const service = new MetricsService({ repositoryPath: '/test/repo' }, mockFactory); + + const result = await service.getSummary(); + + expect(result).toBeNull(); + }); + }); + + describe('getSnapshots', () => { + it('should query snapshots', async () => { + const mockStore: MetricsStore = { + getSnapshots: vi.fn().mockReturnValue([mockSnapshot]), + close: vi.fn(), + } as unknown as MetricsStore; + + const mockFactory = vi.fn().mockReturnValue(mockStore); + const service = new MetricsService({ repositoryPath: '/test/repo' }, mockFactory); + + const result = await service.getSnapshots({ limit: 10 }); + + expect(result).toEqual([mockSnapshot]); + expect(mockStore.getSnapshots).toHaveBeenCalledWith({ + limit: 10, + repositoryPath: '/test/repo', + }); + }); + }); + + describe('getLatestSnapshot', () => { + it('should return latest snapshot', async () => { + const mockStore: MetricsStore = { + getLatestSnapshot: vi.fn().mockReturnValue(mockSnapshot), + close: vi.fn(), + } as unknown as MetricsStore; + + const mockFactory = vi.fn().mockReturnValue(mockStore); + const service = new MetricsService({ repositoryPath: '/test/repo' }, mockFactory); + + const result = await service.getLatestSnapshot(); + + expect(result).toEqual(mockSnapshot); + }); + }); +}); diff --git a/packages/core/src/services/__tests__/search-service.test.ts b/packages/core/src/services/__tests__/search-service.test.ts new file mode 100644 index 0000000..93919f0 --- /dev/null +++ b/packages/core/src/services/__tests__/search-service.test.ts @@ -0,0 +1,335 @@ +/** + * Tests for SearchService + */ + +import { describe, expect, it, vi } from 'vitest'; +import type { RepositoryIndexer } from '../../indexer/index.js'; +import type { SearchResult } from '../../vector/types.js'; +import { SearchService } from '../search-service.js'; + +vi.mock('../../storage/path.js', () => ({ + getStoragePath: vi.fn().mockResolvedValue('/mock/storage'), + getStorageFilePaths: vi.fn().mockReturnValue({ + vectors: '/mock/storage/vectors', + indexerState: '/mock/storage/indexer-state.json', + metrics: '/mock/storage/metrics.db', + }), +})); + +describe('SearchService', () => { + const mockSearchResults: SearchResult[] = [ + { + id: 'doc1', + score: 0.95, + metadata: { + name: 'authenticate', + type: 'function', + startLine: 10, + endLine: 20, + path: 'src/auth/authenticate.ts', + signature: 'function authenticate(user: User)', + }, + }, + { + id: 'doc2', + score: 0.85, + metadata: { + name: 'login', + type: 'function', + startLine: 5, + endLine: 15, + path: 'src/auth/login.ts', + signature: 'function login(credentials: Credentials)', + }, + }, + ]; + + describe('search', () => { + it('should perform semantic search', async () => { + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + search: vi.fn().mockResolvedValue(mockSearchResults), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new SearchService({ repositoryPath: '/test/repo' }, mockFactory); + + const results = await service.search('authentication', { limit: 10, scoreThreshold: 0.7 }); + + expect(mockFactory).toHaveBeenCalledOnce(); + expect(mockIndexer.initialize).toHaveBeenCalledOnce(); + expect(mockIndexer.search).toHaveBeenCalledWith('authentication', { + limit: 10, + scoreThreshold: 0.7, + }); + expect(mockIndexer.close).toHaveBeenCalledOnce(); + expect(results).toEqual(mockSearchResults); + }); + + it('should use default options when not provided', async () => { + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + search: vi.fn().mockResolvedValue([]), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new SearchService({ repositoryPath: '/test/repo' }, mockFactory); + + await service.search('test query'); + + expect(mockIndexer.search).toHaveBeenCalledWith('test query', { + limit: 10, + scoreThreshold: 0.7, + }); + }); + + it('should close indexer even on error', async () => { + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + search: vi.fn().mockRejectedValue(new Error('Search failed')), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new SearchService({ repositoryPath: '/test/repo' }, mockFactory); + + await expect(service.search('test')).rejects.toThrow('Search failed'); + expect(mockIndexer.close).toHaveBeenCalledOnce(); + }); + }); + + describe('findSimilar', () => { + it('should find similar code to a file', async () => { + const targetFile: SearchResult = { + id: 'doc1', + score: 1.0, + metadata: { + name: 'processPayment', + type: 'function', + path: 'src/payments/process.ts', + signature: 'function processPayment()', + }, + }; + + const similarResults: SearchResult[] = [ + targetFile, // The file itself + { + id: 'doc2', + score: 0.88, + metadata: { + name: 'handlePayment', + type: 'function', + path: 'src/payments/handler.ts', + signature: 'function handlePayment()', + }, + }, + { + id: 'doc3', + score: 0.82, + metadata: { + name: 'refundPayment', + type: 'function', + path: 'src/payments/refund.ts', + signature: 'function refundPayment()', + }, + }, + ]; + + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + search: vi + .fn() + .mockResolvedValueOnce([targetFile]) // First call to find the file + .mockResolvedValueOnce(similarResults), // Second call to find similar + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new SearchService({ repositoryPath: '/test/repo' }, mockFactory); + + const results = await service.findSimilar('src/payments/process.ts', { + limit: 5, + threshold: 0.8, + }); + + expect(mockIndexer.search).toHaveBeenCalledTimes(2); + expect(results).toHaveLength(2); // Should exclude the original file + expect(results.find((r) => r.metadata.path === 'src/payments/process.ts')).toBeUndefined(); + }); + + it('should return empty array when file not found', async () => { + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + search: vi.fn().mockResolvedValue([]), // File not found + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new SearchService({ repositoryPath: '/test/repo' }, mockFactory); + + const results = await service.findSimilar('src/nonexistent.ts'); + + expect(results).toEqual([]); + }); + }); + + describe('findRelatedTests', () => { + it('should find test files for a source file', async () => { + const testResults: SearchResult[] = [ + { + id: 'test1', + score: 0.9, + metadata: { + path: 'src/user/__tests__/user-service.test.ts', + type: 'function', + name: 'describe', + }, + }, + { + id: 'test2', + score: 0.85, + metadata: { + path: 'src/user/user-service.spec.ts', + type: 'function', + name: 'it', + }, + }, + { + id: 'not-test', + score: 0.8, + metadata: { + path: 'src/user/user-service.ts', // Not a test file + type: 'class', + name: 'UserService', + }, + }, + ]; + + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + search: vi.fn().mockResolvedValue(testResults), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new SearchService({ repositoryPath: '/test/repo' }, mockFactory); + + const results = await service.findRelatedTests('src/user/user-service.ts'); + + expect(results).toHaveLength(2); + expect(results).toContain('src/user/__tests__/user-service.test.ts'); + expect(results).toContain('src/user/user-service.spec.ts'); + expect(results).not.toContain('src/user/user-service.ts'); + }); + + it('should return empty array when no tests found', async () => { + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + search: vi.fn().mockResolvedValue([]), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new SearchService({ repositoryPath: '/test/repo' }, mockFactory); + + const results = await service.findRelatedTests('src/util.ts'); + + expect(results).toEqual([]); + }); + }); + + describe('findSymbol', () => { + it('should find a symbol by exact name match', async () => { + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + search: vi.fn().mockResolvedValue(mockSearchResults), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new SearchService({ repositoryPath: '/test/repo' }, mockFactory); + + const result = await service.findSymbol('authenticate'); + + expect(result).toBeDefined(); + expect(result?.metadata.name).toBe('authenticate'); + }); + + it('should return first result when no exact match', async () => { + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + search: vi.fn().mockResolvedValue(mockSearchResults), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new SearchService({ repositoryPath: '/test/repo' }, mockFactory); + + const result = await service.findSymbol('nonexistent'); + + expect(result).toBeDefined(); + expect(result).toEqual(mockSearchResults[0]); + }); + + it('should return null when no results found', async () => { + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + search: vi.fn().mockResolvedValue([]), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new SearchService({ repositoryPath: '/test/repo' }, mockFactory); + + const result = await service.findSymbol('nonexistent'); + + expect(result).toBeNull(); + }); + }); + + describe('isIndexed', () => { + it('should return true when repository is indexed', async () => { + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + getStats: vi.fn().mockResolvedValue({ + filesScanned: 100, + documentsIndexed: 250, + }), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new SearchService({ repositoryPath: '/test/repo' }, mockFactory); + + const result = await service.isIndexed(); + + expect(result).toBe(true); + }); + + it('should return false when repository is not indexed', async () => { + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + getStats: vi.fn().mockResolvedValue(null), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new SearchService({ repositoryPath: '/test/repo' }, mockFactory); + + const result = await service.isIndexed(); + + expect(result).toBe(false); + }); + + it('should return false on error', async () => { + const mockFactory = vi.fn().mockRejectedValue(new Error('Init failed')); + const service = new SearchService({ repositoryPath: '/test/repo' }, mockFactory); + + const result = await service.isIndexed(); + + expect(result).toBe(false); + }); + }); +}); diff --git a/packages/core/src/services/__tests__/stats-service.test.ts b/packages/core/src/services/__tests__/stats-service.test.ts new file mode 100644 index 0000000..2057a23 --- /dev/null +++ b/packages/core/src/services/__tests__/stats-service.test.ts @@ -0,0 +1,109 @@ +/** + * Tests for StatsService + */ + +import { describe, expect, it, vi } from 'vitest'; +import type { RepositoryIndexer } from '../../indexer/index.js'; +import type { DetailedIndexStats } from '../../indexer/types.js'; +import { StatsService } from '../stats-service.js'; + +describe('StatsService', () => { + describe('getStats', () => { + it('should return repository statistics', async () => { + // Create mock indexer + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + getStats: vi.fn().mockResolvedValue({ + filesScanned: 100, + documentsIndexed: 250, + documentsExtracted: 250, + vectorsStored: 250, + repositoryPath: '/test/repo', + startTime: new Date('2024-01-01T00:00:00Z'), + endTime: new Date('2024-01-01T00:01:00Z'), + duration: 60000, + errors: [], + } as DetailedIndexStats), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + // Inject mock factory + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new StatsService({ repositoryPath: '/test/repo' }, mockFactory); + + const stats = await service.getStats(); + + expect(stats).toBeDefined(); + expect(stats).not.toBeNull(); + if (stats) { + expect(stats.filesScanned).toBe(100); + expect(stats.documentsIndexed).toBe(250); + } + expect(mockIndexer.initialize).toHaveBeenCalledOnce(); + expect(mockIndexer.getStats).toHaveBeenCalledOnce(); + expect(mockIndexer.close).toHaveBeenCalledOnce(); + }); + + it('should clean up indexer even on error', async () => { + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + getStats: vi.fn().mockRejectedValue(new Error('Stats error')), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new StatsService({ repositoryPath: '/test/repo' }, mockFactory); + + await expect(service.getStats()).rejects.toThrow('Stats error'); + expect(mockIndexer.close).toHaveBeenCalledOnce(); + }); + }); + + describe('isIndexed', () => { + it('should return true when repository is indexed', async () => { + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + getStats: vi.fn().mockResolvedValue({ + filesScanned: 100, + } as DetailedIndexStats), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new StatsService({ repositoryPath: '/test/repo' }, mockFactory); + + const result = await service.isIndexed(); + + expect(result).toBe(true); + }); + + it('should return false when repository is not indexed', async () => { + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockResolvedValue(undefined), + getStats: vi.fn().mockResolvedValue(null), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new StatsService({ repositoryPath: '/test/repo' }, mockFactory); + + const result = await service.isIndexed(); + + expect(result).toBe(false); + }); + + it('should return false on error', async () => { + const mockIndexer: RepositoryIndexer = { + initialize: vi.fn().mockRejectedValue(new Error('Init error')), + close: vi.fn().mockResolvedValue(undefined), + } as unknown as RepositoryIndexer; + + const mockFactory = vi.fn().mockResolvedValue(mockIndexer); + const service = new StatsService({ repositoryPath: '/test/repo' }, mockFactory); + + const result = await service.isIndexed(); + + expect(result).toBe(false); + }); + }); +}); diff --git a/packages/core/src/services/coordinator-service.ts b/packages/core/src/services/coordinator-service.ts new file mode 100644 index 0000000..c8f7ebd --- /dev/null +++ b/packages/core/src/services/coordinator-service.ts @@ -0,0 +1,196 @@ +/** + * Coordinator Service + * + * Shared service for setting up and managing the SubagentCoordinator. + * Used by both MCP server and CLI commands that need agent coordination. + */ + +import type { Logger } from '@lytics/kero'; +import type { RepositoryIndexer } from '../indexer/index.js'; + +/** + * Minimal coordinator interface + * + * This matches the SubagentCoordinator from @lytics/dev-agent-subagents + * but avoids cross-package import issues. TypeScript's structural typing + * ensures compatibility at runtime. + * + * Only defines methods we actually use in CoordinatorService. + */ +export interface SubagentCoordinator { + /** + * Register a subagent with the coordinator + */ + registerAgent(agent: unknown): Promise; + + /** + * Get the context manager for setting up indexer + */ + getContextManager(): { + setIndexer(indexer: RepositoryIndexer): void; + }; + + /** + * Graceful shutdown (optional for future use) + */ + shutdown?(): Promise; +} + +export interface CoordinatorServiceConfig { + repositoryPath: string; + logger?: Logger; + maxConcurrentTasks?: number; + defaultMessageTimeout?: number; + logLevel?: 'debug' | 'info' | 'warn' | 'error'; +} + +export interface CoordinatorConfig { + maxConcurrentTasks: number; + defaultMessageTimeout: number; + logLevel: 'debug' | 'info' | 'warn' | 'error'; +} + +/** + * Factory functions for creating coordinator and agents + */ +export type CoordinatorFactory = (config: CoordinatorConfig) => Promise; +export type AgentFactory = () => Promise; + +export interface CoordinatorFactories { + createCoordinator?: CoordinatorFactory; + createExplorerAgent?: AgentFactory; + createPlannerAgent?: AgentFactory; + createPrAgent?: AgentFactory; +} + +/** + * Service for setting up and managing the SubagentCoordinator + * + * Encapsulates the boilerplate of: + * - Creating coordinator + * - Registering agents (Explorer, Planner, PR) + * - Setting up context manager + * + * Makes coordinator setup testable and consistent. + */ +export class CoordinatorService { + private repositoryPath: string; + private logger?: Logger; + private config: Required>; + private factories: Required; + + constructor(config: CoordinatorServiceConfig, factories?: CoordinatorFactories) { + this.repositoryPath = config.repositoryPath; + this.logger = config.logger; + + // Default configuration + this.config = { + maxConcurrentTasks: config.maxConcurrentTasks ?? 5, + defaultMessageTimeout: config.defaultMessageTimeout ?? 30000, // 30 seconds + logLevel: config.logLevel ?? 'info', + }; + + // Use provided factories or defaults + this.factories = { + createCoordinator: factories?.createCoordinator || this.defaultCoordinatorFactory.bind(this), + createExplorerAgent: + factories?.createExplorerAgent || this.defaultExplorerAgentFactory.bind(this), + createPlannerAgent: + factories?.createPlannerAgent || this.defaultPlannerAgentFactory.bind(this), + createPrAgent: factories?.createPrAgent || this.defaultPrAgentFactory.bind(this), + }; + } + + /** + * Default factory implementations + */ + private async defaultCoordinatorFactory(config: CoordinatorConfig): Promise { + // eslint-disable-next-line @typescript-eslint/no-var-requires + const { SubagentCoordinator: Coordinator } = require('@lytics/dev-agent-subagents'); + return new Coordinator(config) as SubagentCoordinator; + } + + private async defaultExplorerAgentFactory(): Promise { + // eslint-disable-next-line @typescript-eslint/no-var-requires + const { ExplorerAgent } = require('@lytics/dev-agent-subagents'); + return new ExplorerAgent(); + } + + private async defaultPlannerAgentFactory(): Promise { + // eslint-disable-next-line @typescript-eslint/no-var-requires + const { PlannerAgent } = require('@lytics/dev-agent-subagents'); + return new PlannerAgent(); + } + + private async defaultPrAgentFactory(): Promise { + // eslint-disable-next-line @typescript-eslint/no-var-requires + const { PrAgent } = require('@lytics/dev-agent-subagents'); + return new PrAgent(); + } + + /** + * Create and configure a coordinator with all agents registered + * + * @param indexer - Repository indexer to set in context manager + * @returns Configured coordinator ready to use + */ + async createCoordinator(indexer: RepositoryIndexer): Promise { + this.logger?.debug( + { + maxConcurrentTasks: this.config.maxConcurrentTasks, + defaultMessageTimeout: this.config.defaultMessageTimeout, + logLevel: this.config.logLevel, + }, + 'Creating SubagentCoordinator' + ); + + // Create coordinator + const coordinator = await this.factories.createCoordinator({ + maxConcurrentTasks: this.config.maxConcurrentTasks, + defaultMessageTimeout: this.config.defaultMessageTimeout, + logLevel: this.config.logLevel, + }); + + // Set up context manager with indexer + coordinator.getContextManager().setIndexer(indexer); + + // Register all agents + const explorerAgent = await this.factories.createExplorerAgent(); + const plannerAgent = await this.factories.createPlannerAgent(); + const prAgent = await this.factories.createPrAgent(); + + await coordinator.registerAgent(explorerAgent); + await coordinator.registerAgent(plannerAgent); + await coordinator.registerAgent(prAgent); + + this.logger?.debug('SubagentCoordinator configured with 3 agents'); + + return coordinator; + } + + /** + * Update configuration + * + * Useful for changing settings without recreating the service. + * + * @param config - Partial configuration to update + */ + updateConfig(config: Partial>): void { + if (config.maxConcurrentTasks !== undefined) { + this.config.maxConcurrentTasks = config.maxConcurrentTasks; + } + if (config.defaultMessageTimeout !== undefined) { + this.config.defaultMessageTimeout = config.defaultMessageTimeout; + } + if (config.logLevel !== undefined) { + this.config.logLevel = config.logLevel; + } + } + + /** + * Get current configuration + */ + getConfig(): Required> { + return { ...this.config }; + } +} diff --git a/packages/core/src/services/git-history-service.ts b/packages/core/src/services/git-history-service.ts new file mode 100644 index 0000000..d1f67fa --- /dev/null +++ b/packages/core/src/services/git-history-service.ts @@ -0,0 +1,204 @@ +/** + * Git History Service + * + * Shared service for git history indexing and search. + * Used by MCP adapters (HistoryAdapter, PlanAdapter) and CLI commands. + */ + +import type { Logger } from '@lytics/kero'; + +// Re-define types to avoid cross-package TypeScript issues +export interface GitExtractor { + extractCommits(options?: unknown): Promise; +} + +export interface VectorStorage { + initialize(): Promise; + close(): Promise; + add(vectors: unknown[]): Promise; + search(query: string, options?: unknown): Promise; +} + +export interface GitIndexer { + index(options?: unknown): Promise; + search(query: string, options?: unknown): Promise; + getCommits(options?: unknown): Promise; +} + +export interface GitHistoryServiceConfig { + repositoryPath: string; + logger?: Logger; +} + +export interface GitIndexerFactoryConfig { + extractor: GitExtractor; + vectorStorage: VectorStorage; +} + +/** + * Factory functions for creating git components + */ +export type GitExtractorFactory = (repositoryPath: string) => Promise; +export type VectorStorageFactory = (storePath: string) => Promise; +export type GitIndexerFactory = (config: GitIndexerFactoryConfig) => Promise; + +export interface GitHistoryFactories { + createExtractor?: GitExtractorFactory; + createVectorStorage?: VectorStorageFactory; + createGitIndexer?: GitIndexerFactory; +} + +/** + * Service for git history operations + * + * Encapsulates the setup of: + * - LocalGitExtractor + * - VectorStorage for git commits + * - GitIndexer + * + * Makes git history operations testable and consistent. + */ +export class GitHistoryService { + private repositoryPath: string; + private logger?: Logger; + private factories: Required; + private cachedGitIndexer?: GitIndexer; + private cachedVectorStorage?: VectorStorage; + + constructor(config: GitHistoryServiceConfig, factories?: GitHistoryFactories) { + this.repositoryPath = config.repositoryPath; + this.logger = config.logger; + + // Use provided factories or defaults + this.factories = { + createExtractor: factories?.createExtractor || this.defaultExtractorFactory.bind(this), + createVectorStorage: + factories?.createVectorStorage || this.defaultVectorStorageFactory.bind(this), + createGitIndexer: factories?.createGitIndexer || this.defaultGitIndexerFactory.bind(this), + }; + } + + /** + * Default factory implementations + */ + private async defaultExtractorFactory(repositoryPath: string): Promise { + // eslint-disable-next-line @typescript-eslint/no-var-requires + const { LocalGitExtractor } = require('@lytics/dev-agent-core'); + return new LocalGitExtractor(repositoryPath) as GitExtractor; + } + + private async defaultVectorStorageFactory(storePath: string): Promise { + // eslint-disable-next-line @typescript-eslint/no-var-requires + const { VectorStorage: Storage } = require('@lytics/dev-agent-core'); + const storage = new Storage({ storePath }) as VectorStorage; + await storage.initialize(); + return storage; + } + + private async defaultGitIndexerFactory(config: GitIndexerFactoryConfig): Promise { + // eslint-disable-next-line @typescript-eslint/no-var-requires + const { GitIndexer: Indexer } = require('@lytics/dev-agent-core'); + return new Indexer(config) as GitIndexer; + } + + /** + * Get or create git indexer + * + * Lazy initialization with caching. + * + * @returns Initialized git indexer + */ + async getGitIndexer(): Promise { + if (this.cachedGitIndexer) { + return this.cachedGitIndexer; + } + + this.logger?.debug('Initializing git history indexer'); + + // Get storage path for git vectors + const { getStoragePath, getStorageFilePaths } = await import('../storage/path.js'); + const storagePath = await getStoragePath(this.repositoryPath); + const filePaths = getStorageFilePaths(storagePath); + const gitVectorStorePath = `${filePaths.vectors}-git`; + + // Create components + const extractor = await this.factories.createExtractor(this.repositoryPath); + const vectorStorage = await this.factories.createVectorStorage(gitVectorStorePath); + + // Cache vector storage for cleanup + this.cachedVectorStorage = vectorStorage; + + // Create git indexer + this.cachedGitIndexer = await this.factories.createGitIndexer({ + extractor, + vectorStorage, + }); + + this.logger?.debug('Git history indexer initialized'); + + return this.cachedGitIndexer; + } + + /** + * Get git extractor + * + * Useful for direct commit extraction without indexing. + * + * @returns Git extractor + */ + async getExtractor(): Promise { + return this.factories.createExtractor(this.repositoryPath); + } + + /** + * Search git history semantically + * + * @param query - Search query + * @param options - Search options (limit, etc.) + * @returns Search results + */ + async search(query: string, options?: { limit?: number }): Promise { + const gitIndexer = await this.getGitIndexer(); + return gitIndexer.search(query, { limit: options?.limit ?? 10 }); + } + + /** + * Get commits with optional filtering + * + * @param options - Filter options (author, since, file, etc.) + * @returns Filtered commits + */ + async getCommits(options?: { + author?: string; + since?: string; + file?: string; + limit?: number; + }): Promise { + const gitIndexer = await this.getGitIndexer(); + return gitIndexer.getCommits(options); + } + + /** + * Index git history + * + * @param options - Indexing options + * @returns Index statistics + */ + async index(options?: { since?: string; limit?: number }): Promise { + const gitIndexer = await this.getGitIndexer(); + return gitIndexer.index(options); + } + + /** + * Close and cleanup resources + * + * Should be called when done with git history operations. + */ + async close(): Promise { + if (this.cachedVectorStorage) { + await this.cachedVectorStorage.close(); + this.cachedVectorStorage = undefined; + } + this.cachedGitIndexer = undefined; + } +} diff --git a/packages/core/src/services/github-service.ts b/packages/core/src/services/github-service.ts new file mode 100644 index 0000000..6e12f10 --- /dev/null +++ b/packages/core/src/services/github-service.ts @@ -0,0 +1,148 @@ +/** + * GitHub Service + * + * Shared service for GitHub operations (issues, PRs, indexing). + * Used by MCP GitHub adapter and CLI gh commands. + */ + +import type { + GitHubDocument, + GitHubIndexerInstance, + GitHubIndexOptions, + GitHubIndexStats, + GitHubSearchOptions, + GitHubSearchResult, +} from '@lytics/dev-agent-types/github'; +import type { Logger } from '@lytics/kero'; + +export interface GitHubServiceConfig { + repositoryPath: string; + logger?: Logger; +} + +// Generic indexer interface to avoid importing the actual GitHubIndexer class +export type GitHubIndexerFactory = (config: { + vectorStorePath: string; + statePath: string; + autoUpdate?: boolean; + staleThreshold?: number; + logger?: Logger; +}) => Promise; + +export class GitHubService { + private readonly repositoryPath: string; + private readonly logger?: Logger; + private readonly githubIndexerFactory: GitHubIndexerFactory; + private githubIndexer: GitHubIndexerInstance | null = null; + + constructor(config: GitHubServiceConfig, githubIndexerFactory: GitHubIndexerFactory) { + this.repositoryPath = config.repositoryPath; + this.logger = config.logger; + this.githubIndexerFactory = githubIndexerFactory; + } + + private async getIndexer(): Promise { + if (this.githubIndexer) { + return this.githubIndexer; + } + + const { getStoragePath, getStorageFilePaths } = await import('../storage/path.js'); + const storagePath = await getStoragePath(this.repositoryPath); + const filePaths = getStorageFilePaths(storagePath); + const vectorStorePath = `${filePaths.vectors}-github`; + + this.githubIndexer = await this.githubIndexerFactory({ + vectorStorePath, + statePath: filePaths.githubState, + autoUpdate: true, + staleThreshold: 15 * 60 * 1000, // 15 minutes + logger: this.logger, + }); + await this.githubIndexer.initialize(); + return this.githubIndexer; + } + + async index(options?: GitHubIndexOptions): Promise { + const indexer = await this.getIndexer(); + try { + const stats = await indexer.index(options); + return stats; + } catch (error) { + this.logger?.error({ error }, 'GitHub indexing failed'); + throw error; + } + } + + async search(query: string, options?: GitHubSearchOptions): Promise { + const indexer = await this.getIndexer(); + try { + return await indexer.search(query, options); + } catch (error) { + this.logger?.error({ error }, 'GitHub search failed'); + return []; + } + } + + async getContext(issueNumber: number): Promise { + const indexer = await this.getIndexer(); + try { + const results = await indexer.search(String(issueNumber), { limit: 1 }); + // Find exact match by issue number + const exactMatch = results.find((r) => r.document?.number === issueNumber); + return exactMatch?.document || null; + } catch (error) { + this.logger?.error({ error }, `Failed to get GitHub context for issue ${issueNumber}`); + return null; + } + } + + async findRelated(issueNumber: number, limit = 5): Promise { + const indexer = await this.getIndexer(); + try { + const contextDoc = await this.getContext(issueNumber); + if (!contextDoc) { + return []; + } + // Search for similar issues using title as query + const results = await indexer.search(contextDoc.title, { limit: limit + 1 }); + // Filter out the original issue and return search results with real scores + return results + .filter((r: GitHubSearchResult) => r.document.number !== issueNumber) + .slice(0, limit); + } catch (error) { + this.logger?.error({ error }, `Failed to find related GitHub items for issue ${issueNumber}`); + return []; + } + } + + async getStats(): Promise { + const indexer = await this.getIndexer(); + try { + return indexer.getStats(); + } catch (error) { + this.logger?.error({ error }, 'Failed to get GitHub index stats'); + return null; + } + } + + async isIndexed(): Promise { + try { + const indexer = await this.getIndexer(); + const stats = await indexer.getStats(); + return stats !== null && stats.totalDocuments > 0; + } catch (error) { + this.logger?.debug({ error }, 'GitHub repository not indexed or error during check'); + return false; + } + } + + /** + * Shutdown the GitHub service and close the indexer + */ + async shutdown(): Promise { + if (this.githubIndexer) { + await this.githubIndexer.close(); + this.githubIndexer = null; + } + } +} diff --git a/packages/core/src/services/health-service.ts b/packages/core/src/services/health-service.ts new file mode 100644 index 0000000..0df0868 --- /dev/null +++ b/packages/core/src/services/health-service.ts @@ -0,0 +1,238 @@ +/** + * Health Service + * + * Shared service for component health checks. + * Used by both MCP health adapter and Dashboard health API. + */ + +import type { Logger } from '@lytics/kero'; +import type { RepositoryIndexer } from '../indexer/index.js'; +import type { MetricsStore } from '../metrics/store.js'; +import type { VectorStorage } from '../vector/index.js'; + +export interface ComponentHealth { + status: 'ok' | 'warning' | 'error'; + message?: string; + details?: Record; +} + +export interface HealthCheckResult { + status: 'healthy' | 'degraded' | 'unhealthy'; + timestamp: Date; + checks: { + indexer: ComponentHealth; + vectorStorage: ComponentHealth; + metricsStore: ComponentHealth; + }; +} + +export interface HealthServiceConfig { + repositoryPath: string; + logger?: Logger; +} + +/** + * Config types for component factories + */ +export interface IndexerFactoryConfig { + repositoryPath: string; + vectorStorePath: string; + statePath: string; + logger?: Logger; +} + +export interface VectorStorageFactoryConfig { + storePath: string; + embeddingModel: string; + dimension: number; +} + +/** + * Factory functions for creating component instances + */ +export interface HealthServiceFactories { + createIndexer?: (config: IndexerFactoryConfig) => Promise; + createVectorStorage?: (config: VectorStorageFactoryConfig) => Promise; + createMetricsStore?: (path: string, logger?: Logger) => MetricsStore; +} + +/** + * Service for checking component health + * + * Runs health checks on indexer, vector storage, and metrics store. + * Returns structured health information. + */ +export class HealthService { + private repositoryPath: string; + private logger?: Logger; + private factories: Required; + + constructor(config: HealthServiceConfig, factories?: HealthServiceFactories) { + this.repositoryPath = config.repositoryPath; + this.logger = config.logger; + + // Use provided factories or defaults + this.factories = { + createIndexer: factories?.createIndexer || this.defaultIndexerFactory.bind(this), + createVectorStorage: + factories?.createVectorStorage || this.defaultVectorStorageFactory.bind(this), + createMetricsStore: + factories?.createMetricsStore || this.defaultMetricsStoreFactory.bind(this), + }; + } + + /** + * Default factory implementations + */ + private async defaultIndexerFactory(config: IndexerFactoryConfig): Promise { + const { RepositoryIndexer: Indexer } = await import('../indexer/index.js'); + return new Indexer({ + repositoryPath: config.repositoryPath, + vectorStorePath: config.vectorStorePath, + statePath: config.statePath, + logger: config.logger, + }); + } + + private async defaultVectorStorageFactory( + config: VectorStorageFactoryConfig + ): Promise { + const { VectorStorage: Storage } = await import('../vector/index.js'); + return new Storage({ + storePath: config.storePath, + embeddingModel: config.embeddingModel, + dimension: config.dimension, + }); + } + + private defaultMetricsStoreFactory(path: string, logger?: Logger): MetricsStore { + const { MetricsStore: Store } = require('../metrics/store.js'); + return new Store(path, logger); + } + + /** + * Run comprehensive health checks + * + * Checks all components in parallel for performance. + * + * @returns Health check results + */ + async check(): Promise { + const { getStoragePath, getStorageFilePaths } = await import('../storage/path.js'); + const storagePath = await getStoragePath(this.repositoryPath); + const filePaths = getStorageFilePaths(storagePath); + + // Run checks in parallel + const [indexer, vectorStorage, metricsStore] = await Promise.all([ + this.checkIndexer(filePaths), + this.checkVectorStorage(filePaths), + this.checkMetricsStore(filePaths), + ]); + + // Determine overall status + const hasError = [indexer, vectorStorage, metricsStore].some((c) => c.status === 'error'); + const hasWarning = [indexer, vectorStorage, metricsStore].some((c) => c.status === 'warning'); + + const overallStatus = hasError ? 'unhealthy' : hasWarning ? 'degraded' : 'healthy'; + + return { + status: overallStatus, + timestamp: new Date(), + checks: { + indexer, + vectorStorage, + metricsStore, + }, + }; + } + + private async checkIndexer(filePaths: { + vectors: string; + indexerState: string; + [key: string]: string; + }): Promise { + try { + const indexer = await this.factories.createIndexer({ + repositoryPath: this.repositoryPath, + vectorStorePath: filePaths.vectors, + statePath: filePaths.indexerState, + logger: this.logger, + }); + + await indexer.initialize(); + const stats = await indexer.getStats(); + await indexer.close(); + + if (!stats) { + return { + status: 'error', + message: 'Repository not indexed', + }; + } + + return { + status: 'ok', + details: { + totalFiles: stats.filesScanned, + totalDocuments: stats.documentsIndexed, + lastIndexed: stats.endTime.toISOString(), + }, + }; + } catch (error) { + return { + status: 'error', + message: error instanceof Error ? error.message : 'Indexer check failed', + }; + } + } + + private async checkVectorStorage(filePaths: { + vectors: string; + [key: string]: string; + }): Promise { + try { + const vectorStorage = await this.factories.createVectorStorage({ + storePath: filePaths.vectors, + embeddingModel: 'Xenova/all-MiniLM-L6-v2', + dimension: 384, + }); + + await vectorStorage.initialize(); + await vectorStorage.close(); + + return { + status: 'ok', + message: 'Vector storage operational', + }; + } catch (error) { + return { + status: 'error', + message: error instanceof Error ? error.message : 'Vector storage check failed', + }; + } + } + + private async checkMetricsStore(filePaths: { + metrics: string; + [key: string]: string; + }): Promise { + try { + const metricsStore = this.factories.createMetricsStore(filePaths.metrics, this.logger); + const count = metricsStore.getCount(); + metricsStore.close(); + + return { + status: 'ok', + details: { + snapshotCount: count, + }, + }; + } catch (error) { + // Metrics is optional, so warning instead of error + return { + status: 'warning', + message: error instanceof Error ? error.message : 'Metrics store unavailable', + }; + } + } +} diff --git a/packages/core/src/services/index.ts b/packages/core/src/services/index.ts new file mode 100644 index 0000000..557509e --- /dev/null +++ b/packages/core/src/services/index.ts @@ -0,0 +1,27 @@ +/** + * Services Module + * + * Shared business logic layer for MCP and Dashboard. + * Provides consistent APIs for stats, health, and metrics. + */ + +export { CoordinatorService, type CoordinatorServiceConfig } from './coordinator-service.js'; +export { GitHistoryService, type GitHistoryServiceConfig } from './git-history-service.js'; +export { + type GitHubIndexerFactory, + GitHubService, + type GitHubServiceConfig, +} from './github-service.js'; +export { + type ComponentHealth, + type HealthCheckResult, + HealthService, + type HealthServiceConfig, +} from './health-service.js'; +export { MetricsService, type MetricsServiceConfig } from './metrics-service.js'; +export { + SearchService, + type SearchServiceConfig, + type SimilarityOptions, +} from './search-service.js'; +export { StatsService, type StatsServiceConfig } from './stats-service.js'; diff --git a/packages/core/src/services/metrics-service.ts b/packages/core/src/services/metrics-service.ts new file mode 100644 index 0000000..5022e56 --- /dev/null +++ b/packages/core/src/services/metrics-service.ts @@ -0,0 +1,169 @@ +/** + * Metrics Service + * + * Shared service for querying analytics and metrics. + * Used by both MCP adapters and Dashboard API routes. + */ + +import type { Logger } from '@lytics/kero'; +import type { FileMetrics } from '../metrics/analytics.js'; +import type { MetricsStore } from '../metrics/store.js'; +import type { CodeMetadata, Snapshot, SnapshotQuery } from '../metrics/types.js'; + +export interface MetricsServiceConfig { + repositoryPath: string; + logger?: Logger; +} + +/** + * Factory function for creating MetricsStore instances + */ +export type MetricsStoreFactory = (path: string, logger?: Logger) => MetricsStore; + +/** + * Service for querying metrics and analytics + * + * Encapsulates metrics store access and analytics queries. + * Ensures consistent behavior across MCP and Dashboard. + */ +export class MetricsService { + private repositoryPath: string; + private logger?: Logger; + private createStore: MetricsStoreFactory; + + constructor(config: MetricsServiceConfig, createStore?: MetricsStoreFactory) { + this.repositoryPath = config.repositoryPath; + this.logger = config.logger; + + // Use provided factory or default implementation + this.createStore = createStore || this.defaultStoreFactory.bind(this); + } + + /** + * Default factory that creates a real MetricsStore + */ + private defaultStoreFactory(path: string, logger?: Logger): MetricsStore { + const { MetricsStore: Store } = require('../metrics/store.js'); + return new Store(path, logger); + } + + /** + * Get metrics store for this repository + */ + private async getStore(): Promise { + const { getStoragePath, getStorageFilePaths } = await import('../storage/path.js'); + const storagePath = await getStoragePath(this.repositoryPath); + const filePaths = getStorageFilePaths(storagePath); + return this.createStore(filePaths.metrics, this.logger); + } + + /** + * Get most active files by commit count + */ + async getMostActive(limit = 10): Promise { + const store = await this.getStore(); + try { + const { getMostActive } = await import('../metrics/analytics.js'); + const latest = store.getLatestSnapshot(this.repositoryPath); + if (!latest) { + return []; + } + return getMostActive(store, latest.id, limit); + } finally { + store.close(); + } + } + + /** + * Get largest files by LOC + */ + async getLargestFiles(limit = 10): Promise { + const store = await this.getStore(); + try { + const { getLargestFiles } = await import('../metrics/analytics.js'); + const latest = store.getLatestSnapshot(this.repositoryPath); + if (!latest) { + return []; + } + return getLargestFiles(store, latest.id, limit); + } finally { + store.close(); + } + } + + /** + * Get files with concentrated ownership + */ + async getConcentratedOwnership(limit = 10): Promise { + const store = await this.getStore(); + try { + const { getConcentratedOwnership } = await import('../metrics/analytics.js'); + const latest = store.getLatestSnapshot(this.repositoryPath); + if (!latest) { + return []; + } + return getConcentratedOwnership(store, latest.id, limit); + } finally { + store.close(); + } + } + + /** + * Get file trend history + */ + async getFileTrend(filePath: string, limit = 10): Promise { + const store = await this.getStore(); + try { + const { getFileTrend } = await import('../metrics/analytics.js'); + return getFileTrend(store, filePath, limit); + } finally { + store.close(); + } + } + + /** + * Get snapshot summary statistics + */ + async getSummary(): Promise | null> { + const store = await this.getStore(); + try { + const { getSnapshotSummary } = await import('../metrics/analytics.js'); + const latest = store.getLatestSnapshot(this.repositoryPath); + if (!latest) { + return null; + } + return getSnapshotSummary(store, latest.id); + } finally { + store.close(); + } + } + + /** + * Query historical snapshots + */ + async getSnapshots(query: SnapshotQuery): Promise { + const store = await this.getStore(); + try { + return store.getSnapshots({ + ...query, + repositoryPath: query.repositoryPath || this.repositoryPath, + }); + } finally { + store.close(); + } + } + + /** + * Get latest snapshot + */ + async getLatestSnapshot(): Promise { + const store = await this.getStore(); + try { + return store.getLatestSnapshot(this.repositoryPath); + } finally { + store.close(); + } + } +} diff --git a/packages/core/src/services/search-service.ts b/packages/core/src/services/search-service.ts new file mode 100644 index 0000000..b0f86b9 --- /dev/null +++ b/packages/core/src/services/search-service.ts @@ -0,0 +1,239 @@ +/** + * Search Service + * + * Shared service for semantic code search operations. + * Used by MCP search/refs/explore adapters and CLI search command. + */ + +import type { Logger } from '@lytics/kero'; +import type { RepositoryIndexer } from '../indexer/index.js'; +import type { SearchResult, SearchOptions as VectorSearchOptions } from '../vector/types.js'; + +export interface SearchServiceConfig { + repositoryPath: string; + logger?: Logger; +} + +// Re-export SearchOptions from vector types for convenience +export type SearchOptions = VectorSearchOptions; + +export interface SimilarityOptions { + limit?: number; + threshold?: number; +} + +export interface IndexerFactoryConfig { + repositoryPath: string; + vectorStorePath: string; + statePath: string; + logger?: Logger; + excludePatterns?: string[]; + languages?: string[]; +} + +/** + * Factory function for creating RepositoryIndexer instances + */ +export type IndexerFactory = (config: IndexerFactoryConfig) => Promise; + +/** + * Service for semantic code search + * + * Encapsulates indexer initialization and search operations. + * Provides consistent search behavior across CLI and MCP. + */ +export class SearchService { + private repositoryPath: string; + private logger?: Logger; + private createIndexer: IndexerFactory; + + constructor(config: SearchServiceConfig, createIndexer?: IndexerFactory) { + this.repositoryPath = config.repositoryPath; + this.logger = config.logger; + + // Use provided factory or default implementation + this.createIndexer = createIndexer || this.defaultIndexerFactory.bind(this); + } + + /** + * Default factory that creates a real RepositoryIndexer + */ + private async defaultIndexerFactory(config: IndexerFactoryConfig): Promise { + const { RepositoryIndexer: Indexer } = await import('../indexer/index.js'); + return new Indexer({ + repositoryPath: config.repositoryPath, + vectorStorePath: config.vectorStorePath, + statePath: config.statePath, + logger: config.logger, + excludePatterns: config.excludePatterns, + languages: config.languages, + }); + } + + /** + * Get initialized indexer for this repository + */ + private async getIndexer(options?: { + excludePatterns?: string[]; + languages?: string[]; + }): Promise { + const { getStoragePath, getStorageFilePaths } = await import('../storage/path.js'); + const storagePath = await getStoragePath(this.repositoryPath); + const filePaths = getStorageFilePaths(storagePath); + + const indexer = await this.createIndexer({ + repositoryPath: this.repositoryPath, + vectorStorePath: filePaths.vectors, + statePath: filePaths.indexerState, + logger: this.logger, + excludePatterns: options?.excludePatterns, + languages: options?.languages, + }); + + await indexer.initialize(); + return indexer; + } + + /** + * Perform semantic code search + * + * @param query - Search query string + * @param options - Search options (limit, scoreThreshold) + * @returns Array of search results + */ + async search(query: string, options?: SearchOptions): Promise { + const indexer = await this.getIndexer(); + try { + const results = await indexer.search(query, { + limit: options?.limit ?? 10, + scoreThreshold: options?.scoreThreshold ?? 0.7, + }); + return results; + } finally { + await indexer.close(); + } + } + + /** + * Find similar code to a specific file + * + * @param filePath - Path to the file + * @param options - Similarity options (limit, threshold) + * @returns Array of similar files with scores + */ + async findSimilar(filePath: string, options?: SimilarityOptions): Promise { + const indexer = await this.getIndexer(); + try { + // Search for documents from the target file + const fileResults = await indexer.search(filePath, { limit: 5 }); + if (fileResults.length === 0) { + return []; + } + + // Use the path as query to find similar code patterns + const results = await indexer.search(filePath, { + limit: (options?.limit ?? 10) + 1, // +1 to account for the file itself + scoreThreshold: options?.threshold ?? 0.7, + }); + + // Filter out the original file + return results.filter((r) => r.metadata.path !== filePath); + } finally { + await indexer.close(); + } + } + + /** + * Find related test files for a source file + * + * Uses naming conventions to find potential test files. + * + * @param filePath - Path to the source file + * @returns Array of related test file paths + */ + async findRelatedTests(filePath: string): Promise { + const indexer = await this.getIndexer(); + try { + const baseName = filePath.replace(/\.(ts|js|tsx|jsx)$/, '').replace(/^.*\//, ''); + + // Common test file patterns + const patterns = [ + `${baseName}.test`, + `${baseName}.spec`, + `${baseName}_test`, + `__tests__/${baseName}`, + ]; + + const relatedFiles: string[] = []; + + for (const pattern of patterns) { + const results = await indexer.search(pattern, { limit: 5 }); + for (const result of results) { + const file = result.metadata.path; + if ( + file && + (file.includes('.test.') || file.includes('.spec.') || file.includes('__tests__')) && + !relatedFiles.includes(file) + ) { + relatedFiles.push(file); + } + } + } + + return relatedFiles; + } finally { + await indexer.close(); + } + } + + /** + * Find a specific symbol (function, class, etc.) by name + * + * Useful for refs queries that need to locate a symbol first. + * + * @param name - Symbol name to search for + * @returns Best matching result or null if not found + */ + async findSymbol(name: string): Promise { + const indexer = await this.getIndexer(); + try { + const results = await indexer.search(name, { limit: 10 }); + + // Find best match by checking if the name appears in the result + for (const result of results) { + const metadata = result.metadata; + if ( + metadata.name === name || + metadata.path?.includes(name) || + metadata.signature?.includes(name) + ) { + return result; + } + } + + // Return first result if no exact match + return results.length > 0 ? results[0] : null; + } finally { + await indexer.close(); + } + } + + /** + * Check if repository is indexed + * + * @returns True if repository has been indexed + */ + async isIndexed(): Promise { + try { + const indexer = await this.getIndexer(); + try { + const stats = await indexer.getStats(); + return stats !== null; + } finally { + await indexer.close(); + } + } catch { + return false; + } + } +} diff --git a/packages/core/src/services/stats-service.ts b/packages/core/src/services/stats-service.ts new file mode 100644 index 0000000..62f25b0 --- /dev/null +++ b/packages/core/src/services/stats-service.ts @@ -0,0 +1,106 @@ +/** + * Stats Service + * + * Shared service for retrieving repository statistics. + * Used by both MCP adapters and Dashboard API routes. + */ + +import type { Logger } from '@lytics/kero'; +import type { RepositoryIndexer } from '../indexer/index.js'; +import type { DetailedIndexStats } from '../indexer/types.js'; + +export interface StatsServiceConfig { + repositoryPath: string; + logger?: Logger; +} + +/** + * Factory function for creating RepositoryIndexer instances + * Can be overridden in tests + */ +export type IndexerFactory = (config: { + repositoryPath: string; + vectorStorePath: string; + statePath: string; + logger?: Logger; +}) => Promise; + +/** + * Service for retrieving repository statistics + * + * Encapsulates indexer initialization and stats retrieval. + * Ensures consistent behavior across MCP and Dashboard. + */ +export class StatsService { + private repositoryPath: string; + private logger?: Logger; + private createIndexer: IndexerFactory; + + constructor(config: StatsServiceConfig, createIndexer?: IndexerFactory) { + this.repositoryPath = config.repositoryPath; + this.logger = config.logger; + + // Use provided factory or default implementation + this.createIndexer = createIndexer || this.defaultIndexerFactory; + } + + /** + * Default factory that creates a real RepositoryIndexer + */ + private async defaultIndexerFactory( + config: Parameters[0] + ): Promise { + const { RepositoryIndexer: Indexer } = await import('../indexer/index.js'); + const { getStoragePath, getStorageFilePaths } = await import('../storage/path.js'); + + const storagePath = await getStoragePath(config.repositoryPath); + const filePaths = getStorageFilePaths(storagePath); + + return new Indexer({ + repositoryPath: config.repositoryPath, + vectorStorePath: filePaths.vectors, + statePath: filePaths.indexerState, + logger: config.logger, + }); + } + + /** + * Get current repository statistics + * + * Initializes indexer, retrieves stats, and cleans up. + * Thread-safe and idempotent. + * + * @returns Detailed index statistics or null if not indexed + * @throws Error if stats unavailable + */ + async getStats(): Promise { + const indexer = await this.createIndexer({ + repositoryPath: this.repositoryPath, + vectorStorePath: '', // Filled by factory + statePath: '', // Filled by factory + logger: this.logger, + }); + + try { + await indexer.initialize(); + const stats = await indexer.getStats(); + return stats; + } finally { + await indexer.close(); + } + } + + /** + * Check if repository is indexed + * + * @returns True if indexer state exists + */ + async isIndexed(): Promise { + try { + const stats = await this.getStats(); + return stats !== null; + } catch (_error) { + return false; + } + } +} diff --git a/packages/core/src/storage/path.ts b/packages/core/src/storage/path.ts index 8c64464..be9d362 100644 --- a/packages/core/src/storage/path.ts +++ b/packages/core/src/storage/path.ts @@ -106,11 +106,13 @@ export function getStorageFilePaths(storagePath: string): { githubState: string; metadata: string; indexerState: string; + metrics: string; } { return { vectors: path.join(storagePath, 'vectors.lance'), githubState: path.join(storagePath, 'github-state.json'), metadata: path.join(storagePath, 'metadata.json'), indexerState: path.join(storagePath, 'indexer-state.json'), + metrics: path.join(storagePath, 'metrics.db'), }; } diff --git a/packages/core/tsconfig.json b/packages/core/tsconfig.json index c48bc0b..7eb06f5 100644 --- a/packages/core/tsconfig.json +++ b/packages/core/tsconfig.json @@ -4,9 +4,20 @@ "outDir": "./dist", "rootDir": "./src", "composite": true, + "declaration": true, + "declarationMap": true, "types": ["node", "vitest/globals"] }, "include": ["src/**/*"], - "exclude": ["node_modules", "dist"], - "references": [{ "path": "../logger" }] + "exclude": [ + "node_modules", + "dist", + "**/*.test.ts", + "**/*.spec.ts", + "**/__tests__/**" + ], + "references": [ + { "path": "../logger" }, + { "path": "../types" } + ] } diff --git a/packages/dev-agent/CHANGELOG.md b/packages/dev-agent/CHANGELOG.md index a39b83b..0341d72 100644 --- a/packages/dev-agent/CHANGELOG.md +++ b/packages/dev-agent/CHANGELOG.md @@ -86,7 +86,7 @@ **Indexer Logging** - - Add `--verbose` flag to `dev index`, `dev git index`, `dev gh index` + - Add `--verbose` flag to `dev index`, `dev git index`, `dev github index` - Progress spinner shows actual counts: `Embedding 4480/49151 documents (9%)` - Structured logging with kero logger diff --git a/packages/dev-agent/README.md b/packages/dev-agent/README.md index ae7f637..827eb8f 100644 --- a/packages/dev-agent/README.md +++ b/packages/dev-agent/README.md @@ -66,7 +66,7 @@ When integrated with Cursor or Claude Code, you get 6 powerful tools: ```bash # Indexing dev index . # Index current repository -dev gh index # Index GitHub issues/PRs +dev github index # Index GitHub issues/PRs # MCP Server Integration dev mcp install --cursor # Install for Cursor diff --git a/packages/mcp-server/CLAUDE_CODE_SETUP.md b/packages/mcp-server/CLAUDE_CODE_SETUP.md index 7ea2569..1057dea 100644 --- a/packages/mcp-server/CLAUDE_CODE_SETUP.md +++ b/packages/mcp-server/CLAUDE_CODE_SETUP.md @@ -101,7 +101,7 @@ Find issues related to authentication bugs - `labels`: Filter by labels (e.g., `["bug", "enhancement"]`) - `limit`: Number of results (default: 10) -**Note:** Automatically reloads when you run `dev gh index` to update GitHub data. +**Note:** Automatically reloads when you run `dev github index` to update GitHub data. ### `dev_health` - Server Health Check Check the health of dev-agent MCP server and its components. @@ -145,7 +145,7 @@ To enable GitHub issue/PR search: ```bash # Index GitHub issues and PRs cd /path/to/your/repository -dev gh index +dev github index # The dev_gh tool will automatically pick up new data ``` @@ -242,7 +242,7 @@ dev index . **Solution:** ```bash cd /path/to/your/repository -dev gh index +dev github index ``` The `dev_gh` tool will automatically reload the new data. @@ -283,7 +283,7 @@ Check server health with verbose details **Common Issues:** - **Vector storage warning:** Run `dev index .` -- **GitHub index stale (>24h):** Run `dev gh index` +- **GitHub index stale (>24h):** Run `dev github index` - **Repository not accessible:** Check paths and permissions ## Production Features @@ -342,7 +342,7 @@ npm update -g dev-agent # Rebuild indexes (recommended) cd /path/to/your/repository dev index . -dev gh index +dev github index # Restart Claude Code ``` @@ -352,7 +352,7 @@ No need to reinstall MCP integration - it automatically uses the latest version. ## Performance Tips 1. **Index Incrementally:** Run `dev index .` after major changes -2. **GitHub Index:** Update periodically with `dev gh index` +2. **GitHub Index:** Update periodically with `dev github index` 3. **Health Checks:** Use `dev_health` to monitor component status 4. **Verbose Only When Needed:** Keep `LOG_LEVEL: info` for production diff --git a/packages/mcp-server/CURSOR_SETUP.md b/packages/mcp-server/CURSOR_SETUP.md index 6606327..949c26c 100644 --- a/packages/mcp-server/CURSOR_SETUP.md +++ b/packages/mcp-server/CURSOR_SETUP.md @@ -101,7 +101,7 @@ Find issues related to authentication bugs - `labels`: Filter by labels (e.g., `["bug", "enhancement"]`) - `limit`: Number of results (default: 10) -**Note:** Automatically reloads when you run `dev gh index` to update GitHub data. +**Note:** Automatically reloads when you run `dev github index` to update GitHub data. ### `dev_health` - Server Health Check Check the health of dev-agent MCP server and its components. @@ -145,7 +145,7 @@ To enable GitHub issue/PR search: ```bash # Index GitHub issues and PRs cd /path/to/your/repository -dev gh index +dev github index # The dev_gh tool will automatically pick up new data ``` @@ -236,7 +236,7 @@ dev index . **Solution:** ```bash cd /path/to/your/repository -dev gh index +dev github index ``` The `dev_gh` tool will automatically reload the new data. @@ -277,7 +277,7 @@ Check server health with verbose details **Common Issues:** - **Vector storage warning:** Run `dev index .` -- **GitHub index stale (>24h):** Run `dev gh index` +- **GitHub index stale (>24h):** Run `dev github index` - **Repository not accessible:** Check paths and permissions ## Production Features @@ -321,7 +321,7 @@ npm update -g dev-agent # Rebuild indexes (recommended) cd /path/to/your/repository dev index . -dev gh index +dev github index # Restart Cursor ``` @@ -331,7 +331,7 @@ No need to reinstall MCP integration - it automatically uses the latest version. ## Performance Tips 1. **Index Incrementally:** Run `dev index .` after major changes -2. **GitHub Index:** Update periodically with `dev gh index` +2. **GitHub Index:** Update periodically with `dev github index` 3. **Health Checks:** Use `dev_health` to monitor component status 4. **Verbose Only When Needed:** Keep `LOG_LEVEL: info` for production diff --git a/packages/mcp-server/README.md b/packages/mcp-server/README.md index 74ecfce..049f807 100644 --- a/packages/mcp-server/README.md +++ b/packages/mcp-server/README.md @@ -136,14 +136,14 @@ The MCP server provides 5 powerful adapters (tools) and 8 guided prompts: - Semantic search with filters - Full context retrieval - Offline operation with cache - - **Auto-reload**: Automatically picks up new data when `dev gh index` runs + - **Auto-reload**: Automatically picks up new data when `dev github index` runs ### Auto-Reload Feature The GitHub adapter automatically reloads index data when it detects changes, eliminating the need to restart the MCP server: - **How it works**: Monitors GitHub state file modification time -- **When it reloads**: On next query after `dev gh index` updates the data +- **When it reloads**: On next query after `dev github index` updates the data - **No user action required**: Changes are picked up automatically - **Efficient**: Only checks file timestamps (no polling) @@ -153,7 +153,7 @@ The GitHub adapter automatically reloads index data when it detects changes, eli > Use dev_gh to search for "authentication issues" # 2. Update the index (in terminal) -$ dev gh index +$ dev github index ✓ Indexed 59 documents (32 issues + 27 PRs) # 3. Query again - new data appears automatically! diff --git a/packages/mcp-server/bin/dev-agent-mcp.ts b/packages/mcp-server/bin/dev-agent-mcp.ts index 37c1383..1eab268 100644 --- a/packages/mcp-server/bin/dev-agent-mcp.ts +++ b/packages/mcp-server/bin/dev-agent-mcp.ts @@ -5,21 +5,20 @@ */ import { + CoordinatorService, ensureStorageDirectory, + GitHubService, GitIndexer, getStorageFilePaths, getStoragePath, LocalGitExtractor, RepositoryIndexer, + SearchService, + StatsService, saveMetadata, VectorStorage, } from '@lytics/dev-agent-core'; -import { - ExplorerAgent, - PlannerAgent, - PrAgent, - SubagentCoordinator, -} from '@lytics/dev-agent-subagents'; +import type { SubagentCoordinator } from '@lytics/dev-agent-subagents'; import { ExploreAdapter, GitHubAdapter, @@ -127,33 +126,40 @@ async function main() { // Update metadata await saveMetadata(storagePath, repositoryPath); - // Create and configure the subagent coordinator - const coordinator = new SubagentCoordinator({ + // Create and configure the subagent coordinator using CoordinatorService + const coordinatorService = new CoordinatorService({ + repositoryPath, maxConcurrentTasks: 5, defaultMessageTimeout: 30000, logLevel, }); + // Type assertion: CoordinatorService returns a minimal interface, but it's + // structurally compatible with the full SubagentCoordinator type + const coordinator = (await coordinatorService.createCoordinator( + indexer + )) as SubagentCoordinator; - // Set up context manager with indexer - coordinator.getContextManager().setIndexer(indexer); - - // Register subagents - await coordinator.registerAgent(new ExplorerAgent()); - await coordinator.registerAgent(new PlannerAgent()); - await coordinator.registerAgent(new PrAgent()); + // Create services + const searchService = new SearchService({ repositoryPath }); + const githubService = new GitHubService({ repositoryPath }, async (config) => { + const { GitHubIndexer } = await import('@lytics/dev-agent-subagents'); + return new GitHubIndexer(config); + }); + const statsService = new StatsService({ repositoryPath }); // Create and register adapters const searchAdapter = new SearchAdapter({ - repositoryIndexer: indexer, + searchService, repositoryPath, defaultFormat: 'compact', defaultLimit: 10, }); const statusAdapter = new StatusAdapter({ - repositoryIndexer: indexer, + statsService, repositoryPath, vectorStorePath: filePaths.vectors, + githubService, defaultSection: 'summary', }); @@ -179,17 +185,15 @@ async function main() { const exploreAdapter = new ExploreAdapter({ repositoryPath, - repositoryIndexer: indexer, + searchService, defaultLimit: 10, defaultThreshold: 0.7, defaultFormat: 'compact', }); const githubAdapter = new GitHubAdapter({ + githubService, repositoryPath, - // GitHubIndexer will be lazily initialized on first use - vectorStorePath: `${filePaths.vectors}-github`, - statePath: filePaths.githubState, defaultLimit: 10, defaultFormat: 'compact', }); @@ -201,7 +205,7 @@ async function main() { }); const refsAdapter = new RefsAdapter({ - repositoryIndexer: indexer, + searchService, defaultLimit: 20, }); @@ -249,10 +253,8 @@ async function main() { await server.stop(); await indexer.close(); await gitVectorStorage.close(); - // Close GitHub adapter if initialized - if (githubAdapter.githubIndexer) { - await githubAdapter.githubIndexer.close(); - } + // Close GitHub service + await githubService.shutdown(); process.exit(0); }; diff --git a/packages/mcp-server/package.json b/packages/mcp-server/package.json index ca6512c..1f053c1 100644 --- a/packages/mcp-server/package.json +++ b/packages/mcp-server/package.json @@ -19,6 +19,7 @@ "dependencies": { "@lytics/dev-agent-core": "workspace:*", "@lytics/dev-agent-subagents": "workspace:*", + "@lytics/dev-agent-types": "workspace:*", "@lytics/kero": "workspace:*", "zod": "^4.1.13" }, diff --git a/packages/mcp-server/src/adapters/__tests__/explore-adapter.test.ts b/packages/mcp-server/src/adapters/__tests__/explore-adapter.test.ts index b1d7140..6d88e7a 100644 --- a/packages/mcp-server/src/adapters/__tests__/explore-adapter.test.ts +++ b/packages/mcp-server/src/adapters/__tests__/explore-adapter.test.ts @@ -2,26 +2,27 @@ * ExploreAdapter Unit Tests */ -import type { RepositoryIndexer, SearchResult } from '@lytics/dev-agent-core'; +import type { SearchResult, SearchService } from '@lytics/dev-agent-core'; import { beforeEach, describe, expect, it, vi } from 'vitest'; import { ExploreAdapter } from '../built-in/explore-adapter'; import type { ToolExecutionContext } from '../types'; describe('ExploreAdapter', () => { let adapter: ExploreAdapter; - let mockIndexer: RepositoryIndexer; + let mockSearchService: SearchService; let mockContext: ToolExecutionContext; beforeEach(() => { - // Mock RepositoryIndexer - mockIndexer = { + // Mock SearchService + mockSearchService = { search: vi.fn(), - } as unknown as RepositoryIndexer; + findSimilar: vi.fn(), + } as unknown as SearchService; // Create adapter adapter = new ExploreAdapter({ repositoryPath: '/test/repo', - repositoryIndexer: mockIndexer, + searchService: mockSearchService, defaultLimit: 10, defaultThreshold: 0.7, defaultFormat: 'compact', @@ -151,7 +152,7 @@ describe('ExploreAdapter', () => { }, ]; - vi.mocked(mockIndexer.search).mockResolvedValue(mockResults); + vi.mocked(mockSearchService.search).mockResolvedValue(mockResults); const result = await adapter.execute( { @@ -183,7 +184,7 @@ describe('ExploreAdapter', () => { }, ]; - vi.mocked(mockIndexer.search).mockResolvedValue(mockResults); + vi.mocked(mockSearchService.search).mockResolvedValue(mockResults); const result = await adapter.execute( { @@ -221,7 +222,7 @@ describe('ExploreAdapter', () => { }, ]; - vi.mocked(mockIndexer.search).mockResolvedValue(mockResults); + vi.mocked(mockSearchService.search).mockResolvedValue(mockResults); const result = await adapter.execute( { @@ -238,7 +239,7 @@ describe('ExploreAdapter', () => { }); it('should handle no results found', async () => { - vi.mocked(mockIndexer.search).mockResolvedValue([]); + vi.mocked(mockSearchService.search).mockResolvedValue([]); const result = await adapter.execute( { @@ -276,7 +277,7 @@ describe('ExploreAdapter', () => { }, ]; - vi.mocked(mockIndexer.search).mockResolvedValue(mockResults); + vi.mocked(mockSearchService.findSimilar).mockResolvedValue(mockResults); const result = await adapter.execute( { @@ -296,7 +297,7 @@ describe('ExploreAdapter', () => { }); it('should handle no similar files', async () => { - vi.mocked(mockIndexer.search).mockResolvedValue([ + vi.mocked(mockSearchService.findSimilar).mockResolvedValue([ { id: '1', score: 1.0, @@ -344,7 +345,7 @@ describe('ExploreAdapter', () => { }, ]; - vi.mocked(mockIndexer.search).mockResolvedValue(mockResults); + vi.mocked(mockSearchService.search).mockResolvedValue(mockResults); const result = await adapter.execute( { @@ -360,7 +361,7 @@ describe('ExploreAdapter', () => { }); it('should handle no relationships found', async () => { - vi.mocked(mockIndexer.search).mockResolvedValue([]); + vi.mocked(mockSearchService.search).mockResolvedValue([]); const result = await adapter.execute( { @@ -377,7 +378,7 @@ describe('ExploreAdapter', () => { describe('Error Handling', () => { it('should handle file not found errors', async () => { - vi.mocked(mockIndexer.search).mockRejectedValue(new Error('File not found')); + vi.mocked(mockSearchService.findSimilar).mockRejectedValue(new Error('File not found')); const result = await adapter.execute( { @@ -392,7 +393,7 @@ describe('ExploreAdapter', () => { }); it('should handle index not ready errors', async () => { - vi.mocked(mockIndexer.search).mockRejectedValue(new Error('Index not indexed')); + vi.mocked(mockSearchService.search).mockRejectedValue(new Error('Index not indexed')); const result = await adapter.execute( { @@ -407,7 +408,7 @@ describe('ExploreAdapter', () => { }); it('should handle generic errors', async () => { - vi.mocked(mockIndexer.search).mockRejectedValue(new Error('Unknown error')); + vi.mocked(mockSearchService.search).mockRejectedValue(new Error('Unknown error')); const result = await adapter.execute( { diff --git a/packages/mcp-server/src/adapters/__tests__/github-adapter.test.ts b/packages/mcp-server/src/adapters/__tests__/github-adapter.test.ts index 669363e..c01e9d1 100644 --- a/packages/mcp-server/src/adapters/__tests__/github-adapter.test.ts +++ b/packages/mcp-server/src/adapters/__tests__/github-adapter.test.ts @@ -2,18 +2,16 @@ * GitHubAdapter Unit Tests */ -import type { - GitHubDocument, - GitHubIndexer, - GitHubSearchResult, -} from '@lytics/dev-agent-subagents'; +import type { GitHubService } from '@lytics/dev-agent-core'; +import type { GitHubDocument, GitHubSearchResult } from '@lytics/dev-agent-subagents'; import { beforeEach, describe, expect, it, vi } from 'vitest'; +import type { GitHubOutput } from '../../schemas/index.js'; import { GitHubAdapter } from '../built-in/github-adapter'; import type { ToolExecutionContext } from '../types'; describe('GitHubAdapter', () => { let adapter: GitHubAdapter; - let mockGitHubIndexer: GitHubIndexer; + let mockGitHubService: GitHubService; let mockContext: ToolExecutionContext; const mockIssue: GitHubDocument = { @@ -37,16 +35,21 @@ describe('GitHubAdapter', () => { }; beforeEach(() => { - // Mock GitHubIndexer - mockGitHubIndexer = { + // Mock GitHubService + mockGitHubService = { search: vi.fn(), - getDocument: vi.fn(), - } as unknown as GitHubIndexer; + getContext: vi.fn(), + findRelated: vi.fn(), + getStats: vi.fn(), + index: vi.fn(), + isIndexed: vi.fn(), + shutdown: vi.fn(), + } as unknown as GitHubService; // Create adapter adapter = new GitHubAdapter({ repositoryPath: '/test/repo', - githubIndexer: mockGitHubIndexer, + githubService: mockGitHubService, defaultLimit: 10, defaultFormat: 'compact', }); @@ -171,7 +174,7 @@ describe('GitHubAdapter', () => { }, ]; - vi.mocked(mockGitHubIndexer.search).mockResolvedValue(mockResults); + vi.mocked(mockGitHubService.search).mockResolvedValue(mockResults); const result = await adapter.execute( { @@ -183,9 +186,9 @@ describe('GitHubAdapter', () => { ); expect(result.success).toBe(true); - expect((result.data as { content: string })?.content).toContain('GitHub Search Results'); - expect((result.data as { content: string })?.content).toContain('#1'); - expect((result.data as { content: string })?.content).toContain('Test Issue'); + expect((result.data as GitHubOutput)?.content).toContain('GitHub Search Results'); + expect((result.data as GitHubOutput)?.content).toContain('#1'); + expect((result.data as GitHubOutput)?.content).toContain('Test Issue'); }); it('should search with filters', async () => { @@ -197,7 +200,7 @@ describe('GitHubAdapter', () => { }, ]; - vi.mocked(mockGitHubIndexer.search).mockResolvedValue(mockResults); + vi.mocked(mockGitHubService.search).mockResolvedValue(mockResults); const result = await adapter.execute( { @@ -212,7 +215,7 @@ describe('GitHubAdapter', () => { ); expect(result.success).toBe(true); - expect(mockGitHubIndexer.search).toHaveBeenCalledWith('test', { + expect(mockGitHubService.search).toHaveBeenCalledWith('test', { type: 'issue', state: 'open', labels: ['bug'], @@ -222,7 +225,7 @@ describe('GitHubAdapter', () => { }); it('should handle no results', async () => { - vi.mocked(mockGitHubIndexer.search).mockResolvedValue([]); + vi.mocked(mockGitHubService.search).mockResolvedValue([]); const result = await adapter.execute( { @@ -233,9 +236,7 @@ describe('GitHubAdapter', () => { ); expect(result.success).toBe(true); - expect((result.data as { content: string })?.content).toContain( - 'No matching issues or PRs found' - ); + expect((result.data as GitHubOutput)?.content).toContain('No matching issues or PRs found'); }); it('should include token footer in search results', async () => { @@ -247,7 +248,7 @@ describe('GitHubAdapter', () => { }, ]; - vi.mocked(mockGitHubIndexer.search).mockResolvedValue(mockResults); + vi.mocked(mockGitHubService.search).mockResolvedValue(mockResults); const result = await adapter.execute( { @@ -259,7 +260,7 @@ describe('GitHubAdapter', () => { ); expect(result.success).toBe(true); - const content = (result.data as { content: string })?.content; + const content = (result.data as GitHubOutput)?.content; expect(content).toBeDefined(); // Token info is now in metadata, not content expect(result.metadata).toHaveProperty('tokens'); @@ -270,7 +271,7 @@ describe('GitHubAdapter', () => { describe('Context Action', () => { it('should get issue context in compact format', async () => { // Mock getDocument to return the issue directly (new implementation) - vi.mocked(mockGitHubIndexer.getDocument).mockResolvedValue(mockIssue); + vi.mocked(mockGitHubService.getContext).mockResolvedValue(mockIssue); const result = await adapter.execute( { @@ -282,14 +283,14 @@ describe('GitHubAdapter', () => { ); expect(result.success).toBe(true); - expect((result.data as { content: string })?.content).toContain('Issue #1'); - expect((result.data as { content: string })?.content).toContain('Test Issue'); - expect((result.data as { content: string })?.content).toContain('testuser'); + expect((result.data as GitHubOutput)?.content).toContain('Issue #1'); + expect((result.data as GitHubOutput)?.content).toContain('Test Issue'); + expect((result.data as GitHubOutput)?.content).toContain('testuser'); }); it('should get issue context in verbose format', async () => { // Mock getDocument to return the issue directly - vi.mocked(mockGitHubIndexer.getDocument).mockResolvedValue(mockIssue); + vi.mocked(mockGitHubService.getContext).mockResolvedValue(mockIssue); const result = await adapter.execute( { @@ -301,19 +302,17 @@ describe('GitHubAdapter', () => { ); expect(result.success).toBe(true); - expect((result.data as { content: string })?.content).toContain('**Related Issues:** #2, #3'); - expect((result.data as { content: string })?.content).toContain('**Related PRs:** #10'); - expect((result.data as { content: string })?.content).toContain( - '**Linked Files:** `src/test.ts`' - ); - expect((result.data as { content: string })?.content).toContain('**Mentions:** @developer1'); + expect((result.data as GitHubOutput)?.content).toContain('**Related Issues:** #2, #3'); + expect((result.data as GitHubOutput)?.content).toContain('**Related PRs:** #10'); + expect((result.data as GitHubOutput)?.content).toContain('**Linked Files:** `src/test.ts`'); + expect((result.data as GitHubOutput)?.content).toContain('**Mentions:** @developer1'); }); it('should handle issue not found', async () => { // Mock getDocument to return null (not found) - vi.mocked(mockGitHubIndexer.getDocument).mockResolvedValue(null); + vi.mocked(mockGitHubService.getContext).mockResolvedValue(null); // Also mock search for fallback case - vi.mocked(mockGitHubIndexer.search).mockResolvedValue([]); + vi.mocked(mockGitHubService.search).mockResolvedValue([]); const result = await adapter.execute( { @@ -336,16 +335,11 @@ describe('GitHubAdapter', () => { title: 'Related Issue', }; - // Mock getDocument for finding the main issue - vi.mocked(mockGitHubIndexer.getDocument).mockResolvedValue(mockIssue); + // Mock getContext for finding the main issue + vi.mocked(mockGitHubService.getContext).mockResolvedValue(mockIssue); - // Mock search for finding related issues (semantic similarity) - vi.mocked(mockGitHubIndexer.search).mockResolvedValue([ - { - document: mockIssue, - score: 1.0, - matchedFields: ['title'], - }, + // Mock findRelated for finding related issues + vi.mocked(mockGitHubService.findRelated).mockResolvedValue([ { document: mockRelated, score: 0.85, @@ -363,42 +357,95 @@ describe('GitHubAdapter', () => { ); expect(result.success).toBe(true); - expect((result.data as { content: string })?.content).toContain('Related Issues/PRs'); - expect((result.data as { content: string })?.content).toContain('#2'); - expect((result.data as { content: string })?.content).toContain('Related Issue'); + expect((result.data as GitHubOutput)?.content).toContain('Related Issues/PRs'); + expect((result.data as GitHubOutput)?.content).toContain('#2'); + expect((result.data as GitHubOutput)?.content).toContain('Related Issue'); }); it('should handle no related items', async () => { - // Mock getDocument for finding the main issue - vi.mocked(mockGitHubIndexer.getDocument).mockResolvedValue(mockIssue); + // Mock getContext for finding the main issue + vi.mocked(mockGitHubService.getContext).mockResolvedValue(mockIssue); + + // Mock findRelated to return no related items + vi.mocked(mockGitHubService.findRelated).mockResolvedValue([]); - // Mock search to only return the main issue (no related items) - vi.mocked(mockGitHubIndexer.search).mockResolvedValue([ + const result = await adapter.execute( { - document: mockIssue, - score: 1.0, + action: 'related', + number: 1, + }, + mockContext + ); + + expect(result.success).toBe(true); + expect((result.data as GitHubOutput)?.content).toContain('No related issues or PRs found'); + }); + }); + + describe('related action', () => { + it('should find related issues with real search scores', async () => { + const relatedResults: GitHubSearchResult[] = [ + { + document: { ...mockIssue, number: 2, title: 'Related Issue 1' }, + score: 0.9, + matchedFields: ['title', 'body'], + }, + { + document: { ...mockIssue, number: 3, title: 'Related Issue 2' }, + score: 0.85, matchedFields: ['title'], }, - ]); + ]; + + vi.mocked(mockGitHubService.getContext).mockResolvedValue(mockIssue); + vi.mocked(mockGitHubService.findRelated).mockResolvedValue(relatedResults); const result = await adapter.execute( { action: 'related', number: 1, + limit: 5, }, mockContext ); expect(result.success).toBe(true); - expect((result.data as { content: string })?.content).toContain( - 'No related issues or PRs found' + if (result.success) { + const output = result.data as GitHubOutput; + expect(output.content).toContain('Related Issue 1'); + expect(output.content).toContain('Related Issue 2'); + expect(output.content).toContain('90% similar'); // Score shown as percentage + expect(output.resultsTotal).toBe(2); + expect(output.resultsReturned).toBe(2); + } + + expect(mockGitHubService.getContext).toHaveBeenCalledWith(1); + expect(mockGitHubService.findRelated).toHaveBeenCalledWith(1, 5); + }); + + it('should handle no related issues found', async () => { + vi.mocked(mockGitHubService.getContext).mockResolvedValue(mockIssue); + vi.mocked(mockGitHubService.findRelated).mockResolvedValue([]); + + const result = await adapter.execute( + { + action: 'related', + number: 1, + }, + mockContext ); + + expect(result.success).toBe(true); + if (result.success) { + const output = result.data as GitHubOutput; + expect(output.content).toContain('No related issues or PRs found'); + } }); }); describe('Error Handling', () => { it('should handle index not ready error', async () => { - vi.mocked(mockGitHubIndexer.search).mockRejectedValue(new Error('GitHub index not indexed')); + vi.mocked(mockGitHubService.search).mockRejectedValue(new Error('GitHub index not indexed')); const result = await adapter.execute( { @@ -413,7 +460,7 @@ describe('GitHubAdapter', () => { }); it('should handle generic errors', async () => { - vi.mocked(mockGitHubIndexer.search).mockRejectedValue(new Error('Unknown error')); + vi.mocked(mockGitHubService.search).mockRejectedValue(new Error('Unknown error')); const result = await adapter.execute( { @@ -428,133 +475,6 @@ describe('GitHubAdapter', () => { }); }); - describe('Auto-Reload on File Changes', () => { - it('should detect state file modifications', async () => { - const fs = await import('node:fs/promises'); - const path = await import('node:path'); - const os = await import('node:os'); - - // Create temporary state file - const tempDir = await fs.mkdtemp(path.join(os.tmpdir(), 'github-adapter-test-')); - const statePath = path.join(tempDir, 'github-state.json'); - const vectorPath = path.join(tempDir, 'vectors'); - - // Write initial state - await fs.writeFile( - statePath, - JSON.stringify({ - version: '1.0.0', - repository: 'test/repo', - lastIndexed: '2024-01-01T00:00:00Z', - totalDocuments: 10, - }), - 'utf-8' - ); - - // Create adapter with file paths (lazy initialization) - const lazyAdapter = new GitHubAdapter({ - repositoryPath: '/test/repo', - vectorStorePath: vectorPath, - statePath, - defaultLimit: 10, - defaultFormat: 'compact', - }); - - // Initialize adapter - await lazyAdapter.initialize({ - logger: mockContext.logger, - } as any); - - // Trigger lazy initialization by calling ensureGitHubIndexer - // This will load the state file and track its modification time - try { - await (lazyAdapter as any).ensureGitHubIndexer(); - } catch { - // Indexer initialization may fail (no vector storage), but that's ok - // We just need it to track the state file modification time - } - - // Wait a bit to ensure file system timestamps differ - await new Promise((resolve) => setTimeout(resolve, 100)); - - // Update state file (simulating `dev gh index` running) - await fs.writeFile( - statePath, - JSON.stringify({ - version: '1.0.0', - repository: 'test/repo', - lastIndexed: '2024-01-02T00:00:00Z', // Updated timestamp - totalDocuments: 20, // More documents - }), - 'utf-8' - ); - - // Access the private method through type assertion for testing - const hasChanged = await (lazyAdapter as any).hasStateFileChanged(); - - expect(hasChanged).toBe(true); - - // Cleanup - await fs.rm(tempDir, { recursive: true, force: true }); - }); - - it('should not detect changes when file unchanged', async () => { - const fs = await import('node:fs/promises'); - const path = await import('node:path'); - const os = await import('node:os'); - - // Create temporary state file - const tempDir = await fs.mkdtemp(path.join(os.tmpdir(), 'github-adapter-test-')); - const statePath = path.join(tempDir, 'github-state.json'); - const vectorPath = path.join(tempDir, 'vectors'); - - // Write initial state - await fs.writeFile( - statePath, - JSON.stringify({ - version: '1.0.0', - repository: 'test/repo', - lastIndexed: '2024-01-01T00:00:00Z', - totalDocuments: 10, - }), - 'utf-8' - ); - - // Create adapter with file paths - const lazyAdapter = new GitHubAdapter({ - repositoryPath: '/test/repo', - vectorStorePath: vectorPath, - statePath, - defaultLimit: 10, - defaultFormat: 'compact', - }); - - // Initialize adapter - await lazyAdapter.initialize({ - logger: mockContext.logger, - } as any); - - // Don't modify file - check for changes immediately - const hasChanged = await (lazyAdapter as any).hasStateFileChanged(); - - expect(hasChanged).toBe(false); - - // Cleanup - await fs.rm(tempDir, { recursive: true, force: true }); - }); - - it('should handle missing state file gracefully', async () => { - const lazyAdapter = new GitHubAdapter({ - repositoryPath: '/test/repo', - vectorStorePath: '/nonexistent/vectors', - statePath: '/nonexistent/state.json', - defaultLimit: 10, - defaultFormat: 'compact', - }); - - // Should not throw - const hasChanged = await (lazyAdapter as any).hasStateFileChanged(); - expect(hasChanged).toBe(false); - }); - }); + // Note: Auto-reload functionality is now handled by GitHubService internally + // No need to test file watching at the adapter level }); diff --git a/packages/mcp-server/src/adapters/__tests__/health-adapter.test.ts b/packages/mcp-server/src/adapters/__tests__/health-adapter.test.ts index 9b9f22f..b967fb5 100644 --- a/packages/mcp-server/src/adapters/__tests__/health-adapter.test.ts +++ b/packages/mcp-server/src/adapters/__tests__/health-adapter.test.ts @@ -160,6 +160,7 @@ describe('HealthAdapter', () => { await fs.writeFile(path.join(vectorStorePath, 'data.db'), 'test'); const result = await adapter.execute({ verbose: true }, execContext); + expect(result.success).toBe(true); const health = result.data as HealthStatus; expect(health.checks.vectorStorage.details).toBeDefined(); @@ -300,6 +301,7 @@ describe('HealthAdapter', () => { await fs.writeFile(path.join(vectorStorePath, 'data.db'), 'test'); const result = await adapter.execute({ verbose: true }, execContext); + expect(result.success).toBe(true); const data = result.data as { formattedReport: string }; expect(data.formattedReport).toContain('Details:'); diff --git a/packages/mcp-server/src/adapters/__tests__/refs-adapter.test.ts b/packages/mcp-server/src/adapters/__tests__/refs-adapter.test.ts index d014f71..d994bb2 100644 --- a/packages/mcp-server/src/adapters/__tests__/refs-adapter.test.ts +++ b/packages/mcp-server/src/adapters/__tests__/refs-adapter.test.ts @@ -2,14 +2,14 @@ * Tests for RefsAdapter */ -import type { RepositoryIndexer, SearchResult } from '@lytics/dev-agent-core'; +import type { SearchResult, SearchService } from '@lytics/dev-agent-core'; import { beforeEach, describe, expect, it, vi } from 'vitest'; import { ConsoleLogger } from '../../utils/logger'; import { RefsAdapter } from '../built-in/refs-adapter'; import type { AdapterContext, ToolExecutionContext } from '../types'; describe('RefsAdapter', () => { - let mockIndexer: RepositoryIndexer; + let mockSearchService: SearchService; let adapter: RefsAdapter; let context: AdapterContext; let execContext: ToolExecutionContext; @@ -69,14 +69,14 @@ describe('RefsAdapter', () => { ]; beforeEach(async () => { - // Create mock indexer - mockIndexer = { + // Create mock search service + mockSearchService = { search: vi.fn().mockResolvedValue(mockSearchResults), - } as unknown as RepositoryIndexer; + } as unknown as SearchService; // Create adapter adapter = new RefsAdapter({ - repositoryIndexer: mockIndexer, + searchService: mockSearchService, defaultLimit: 20, }); @@ -269,7 +269,7 @@ describe('RefsAdapter', () => { describe('Not Found', () => { it('should return error when function not found', async () => { // Mock empty results - (mockIndexer.search as ReturnType).mockResolvedValueOnce([]); + (mockSearchService.search as ReturnType).mockResolvedValueOnce([]); const result = await adapter.execute({ name: 'nonExistentFunction' }, execContext); diff --git a/packages/mcp-server/src/adapters/__tests__/search-adapter.test.ts b/packages/mcp-server/src/adapters/__tests__/search-adapter.test.ts index 8553877..2463b8e 100644 --- a/packages/mcp-server/src/adapters/__tests__/search-adapter.test.ts +++ b/packages/mcp-server/src/adapters/__tests__/search-adapter.test.ts @@ -46,14 +46,29 @@ describe('SearchAdapter', () => { ]; beforeEach(async () => { + // Suppress all logger output in tests + vi.spyOn(ConsoleLogger.prototype, 'info').mockImplementation(() => {}); + vi.spyOn(ConsoleLogger.prototype, 'debug').mockImplementation(() => {}); + vi.spyOn(ConsoleLogger.prototype, 'warn').mockImplementation(() => {}); + vi.spyOn(ConsoleLogger.prototype, 'error').mockImplementation(() => {}); + // Create mock indexer mockIndexer = { search: vi.fn().mockResolvedValue(mockSearchResults), } as unknown as RepositoryIndexer; // Create adapter + // Create mock search service + const mockSearchService = { + search: mockIndexer.search, + findSimilar: vi.fn(), + findRelatedTests: vi.fn(), + findSymbol: vi.fn(), + isIndexed: vi.fn(), + }; + adapter = new SearchAdapter({ - repositoryIndexer: mockIndexer, + searchService: mockSearchService as any, defaultFormat: 'compact', defaultLimit: 10, }); @@ -61,19 +76,22 @@ describe('SearchAdapter', () => { // Create context const logger = new ConsoleLogger('error'); // Quiet for tests context = { - agentName: 'test-agent', logger, - config: {}, + config: { repositoryPath: '/test' }, }; execContext = { logger, - requestId: 'test-request', + config: { repositoryPath: '/test' }, }; await adapter.initialize(context); }); + afterEach(() => { + vi.restoreAllMocks(); + }); + describe('Tool Definition', () => { it('should provide valid tool definition', () => { const def = adapter.getToolDefinition(); diff --git a/packages/mcp-server/src/adapters/__tests__/status-adapter.test.ts b/packages/mcp-server/src/adapters/__tests__/status-adapter.test.ts index 14704a7..2625819 100644 --- a/packages/mcp-server/src/adapters/__tests__/status-adapter.test.ts +++ b/packages/mcp-server/src/adapters/__tests__/status-adapter.test.ts @@ -2,50 +2,56 @@ * Tests for StatusAdapter */ -import type { RepositoryIndexer } from '@lytics/dev-agent-core'; +import type { GitHubService, StatsService } from '@lytics/dev-agent-core'; import { beforeEach, describe, expect, it, vi } from 'vitest'; +import type { StatusOutput } from '../../schemas/index.js'; import { StatusAdapter } from '../built-in/status-adapter'; import type { AdapterContext, ToolExecutionContext } from '../types'; -// Mock RepositoryIndexer -const createMockRepositoryIndexer = () => { +// Mock StatsService +const createMockStatsService = () => { return { getStats: vi.fn(), - search: vi.fn(), - initialize: vi.fn(), - close: vi.fn(), - } as unknown as RepositoryIndexer; + isIndexed: vi.fn(), + } as unknown as StatsService; }; -// Mock GitHubIndexer -vi.mock('@lytics/dev-agent-subagents', () => ({ - GitHubIndexer: vi.fn(() => ({ - initialize: vi.fn(() => Promise.resolve(undefined)), - getStats: vi.fn(() => ({ +// Mock GitHubService +const createMockGitHubService = () => { + return { + getStats: vi.fn().mockResolvedValue({ repository: 'lytics/dev-agent', totalDocuments: 59, byType: { issue: 47, pull_request: 12 }, byState: { open: 35, closed: 15, merged: 9 }, lastIndexed: '2025-11-24T10:00:00Z', indexDuration: 12400, - })), - isIndexed: vi.fn(() => true), - })), -})); + }), + isIndexed: vi.fn().mockResolvedValue(true), + index: vi.fn(), + search: vi.fn(), + getContext: vi.fn(), + findRelated: vi.fn(), + shutdown: vi.fn(), + } as unknown as GitHubService; +}; describe('StatusAdapter', () => { let adapter: StatusAdapter; - let mockIndexer: RepositoryIndexer; + let mockStatsService: StatsService; + let mockGitHubService: GitHubService; let mockContext: AdapterContext; let mockExecutionContext: ToolExecutionContext; beforeEach(() => { - mockIndexer = createMockRepositoryIndexer(); + mockStatsService = createMockStatsService(); + mockGitHubService = createMockGitHubService(); adapter = new StatusAdapter({ - repositoryIndexer: mockIndexer, + statsService: mockStatsService, repositoryPath: '/test/repo', vectorStorePath: '/test/.dev-agent/vectors.lance', + githubService: mockGitHubService, defaultSection: 'summary', }); @@ -69,7 +75,7 @@ describe('StatusAdapter', () => { }; // Setup default mock responses - vi.mocked(mockIndexer.getStats).mockResolvedValue({ + vi.mocked(mockStatsService.getStats).mockResolvedValue({ filesScanned: 2341, documentsExtracted: 1234, documentsIndexed: 1234, @@ -96,19 +102,26 @@ describe('StatusAdapter', () => { expect(mockContext.logger.info).toHaveBeenCalledWith('StatusAdapter initialized', { repositoryPath: '/test/repo', defaultSection: 'summary', + hasGitHubService: true, }); }); - it('should handle GitHub indexer initialization failure gracefully', async () => { - const { GitHubIndexer } = await import('@lytics/dev-agent-subagents'); - vi.mocked(GitHubIndexer).mockImplementationOnce(() => { - throw new Error('GitHub not available'); + it('should work without GitHub service', async () => { + // Create adapter without GitHub service + const adapterWithoutGitHub = new StatusAdapter({ + statsService: mockStatsService, + repositoryPath: '/test/repo', + vectorStorePath: '/test/.dev-agent/vectors.lance', + defaultSection: 'summary', + // githubService not provided }); - await adapter.initialize(mockContext); - expect(mockContext.logger.warn).toHaveBeenCalledWith( - 'GitHub indexer initialization failed', - expect.any(Object) + await adapterWithoutGitHub.initialize(mockContext); + expect(mockContext.logger.info).toHaveBeenCalledWith( + 'StatusAdapter initialized', + expect.objectContaining({ + hasGitHubService: false, + }) ); }); }); @@ -168,11 +181,11 @@ describe('StatusAdapter', () => { const result = await adapter.execute({}, mockExecutionContext); expect(result.success).toBe(true); - expect(result.data?.section).toBe('summary'); - expect(result.data?.format).toBe('compact'); - expect(result.data?.content).toContain('Dev-Agent Status'); - expect(result.data?.content).toContain('Repository:'); - expect(result.data?.content).toContain('2341 files indexed'); + expect((result.data as StatusOutput)?.section).toBe('summary'); + expect((result.data as StatusOutput)?.format).toBe('compact'); + expect((result.data as StatusOutput)?.content).toContain('Dev-Agent Status'); + expect((result.data as StatusOutput)?.content).toContain('Repository:'); + expect((result.data as StatusOutput)?.content).toContain('2341 files indexed'); }); it('should return verbose summary when requested', async () => { @@ -182,19 +195,19 @@ describe('StatusAdapter', () => { ); expect(result.success).toBe(true); - expect(result.data?.content).toContain('Detailed'); - expect(result.data?.content).toContain('Repository'); - expect(result.data?.content).toContain('Vector Indexes'); - expect(result.data?.content).toContain('Health Checks'); + expect((result.data as StatusOutput)?.content).toContain('Detailed'); + expect((result.data as StatusOutput)?.content).toContain('Repository'); + expect((result.data as StatusOutput)?.content).toContain('Vector Indexes'); + expect((result.data as StatusOutput)?.content).toContain('Health Checks'); }); it('should handle repository not indexed', async () => { - vi.mocked(mockIndexer.getStats).mockResolvedValue(null); + vi.mocked(mockStatsService.getStats).mockResolvedValue(null); const result = await adapter.execute({}, mockExecutionContext); expect(result.success).toBe(true); - expect(result.data?.content).toContain('not indexed'); + expect((result.data as StatusOutput)?.content).toContain('not indexed'); }); it('should include GitHub section in summary', async () => { @@ -203,9 +216,9 @@ describe('StatusAdapter', () => { const result = await adapter.execute({}, mockExecutionContext); expect(result.success).toBe(true); - expect(result.data?.content).toContain('GitHub'); + expect((result.data as StatusOutput)?.content).toContain('GitHub'); // GitHub stats may or may not be available depending on initialization - const content = result.data?.content || ''; + const content = (result.data as StatusOutput)?.content || ''; const hasGitHub = content.includes('GitHub'); expect(hasGitHub).toBe(true); }); @@ -216,9 +229,9 @@ describe('StatusAdapter', () => { const result = await adapter.execute({ section: 'repo' }, mockExecutionContext); expect(result.success).toBe(true); - expect(result.data?.content).toContain('Repository Index'); - expect(result.data?.content).toContain('2341'); - expect(result.data?.content).toContain('1234'); + expect((result.data as StatusOutput)?.content).toContain('Repository Index'); + expect((result.data as StatusOutput)?.content).toContain('2341'); + expect((result.data as StatusOutput)?.content).toContain('1234'); }); it('should return repository status in verbose format', async () => { @@ -228,18 +241,18 @@ describe('StatusAdapter', () => { ); expect(result.success).toBe(true); - expect(result.data?.content).toContain('Documents Indexed:'); - expect(result.data?.content).toContain('Vectors Stored:'); + expect((result.data as StatusOutput)?.content).toContain('Documents Indexed:'); + expect((result.data as StatusOutput)?.content).toContain('Vectors Stored:'); }); it('should handle repository not indexed', async () => { - vi.mocked(mockIndexer.getStats).mockResolvedValue(null); + vi.mocked(mockStatsService.getStats).mockResolvedValue(null); const result = await adapter.execute({ section: 'repo' }, mockExecutionContext); expect(result.success).toBe(true); - expect(result.data?.content).toContain('Not indexed'); - expect(result.data?.content).toContain('dev index'); + expect((result.data as StatusOutput)?.content).toContain('Not indexed'); + expect((result.data as StatusOutput)?.content).toContain('dev index'); }); }); @@ -250,10 +263,10 @@ describe('StatusAdapter', () => { const result = await adapter.execute({ section: 'indexes' }, mockExecutionContext); expect(result.success).toBe(true); - expect(result.data?.content).toContain('Vector Indexes'); - expect(result.data?.content).toContain('Code Index'); - expect(result.data?.content).toContain('GitHub Index'); - expect(result.data?.content).toContain('1234 embeddings'); + expect((result.data as StatusOutput)?.content).toContain('Vector Indexes'); + expect((result.data as StatusOutput)?.content).toContain('Code Index'); + expect((result.data as StatusOutput)?.content).toContain('GitHub Index'); + expect((result.data as StatusOutput)?.content).toContain('1234 embeddings'); }); it('should return indexes status in verbose format', async () => { @@ -265,11 +278,11 @@ describe('StatusAdapter', () => { ); expect(result.success).toBe(true); - expect(result.data?.content).toContain('Code Index'); - expect(result.data?.content).toContain('Documents:'); - expect(result.data?.content).toContain('GitHub Index'); + expect((result.data as StatusOutput)?.content).toContain('Code Index'); + expect((result.data as StatusOutput)?.content).toContain('Documents:'); + expect((result.data as StatusOutput)?.content).toContain('GitHub Index'); // GitHub section should be present, may show stats or "Not indexed" - const content = result.data?.content || ''; + const content = (result.data as StatusOutput)?.content || ''; const hasGitHubInfo = content.includes('Not indexed') || content.includes('Documents:'); expect(hasGitHubInfo).toBe(true); }); @@ -282,7 +295,7 @@ describe('StatusAdapter', () => { const result = await adapter.execute({ section: 'github' }, mockExecutionContext); expect(result.success).toBe(true); - expect(result.data?.content).toContain('GitHub Integration'); + expect((result.data as StatusOutput)?.content).toContain('GitHub Integration'); // May show stats or "Not indexed" depending on initialization }); @@ -295,14 +308,14 @@ describe('StatusAdapter', () => { ); expect(result.success).toBe(true); - expect(result.data?.content).toContain('GitHub Integration'); + expect((result.data as StatusOutput)?.content).toContain('GitHub Integration'); // May include Configuration or Not indexed message }); it('should handle GitHub not indexed', async () => { // Create adapter without initializing (no GitHub indexer) const newAdapter = new StatusAdapter({ - repositoryIndexer: mockIndexer, + statsService: mockStatsService, repositoryPath: '/test/repo', vectorStorePath: '/test/.dev-agent/vectors.lance', }); @@ -310,8 +323,8 @@ describe('StatusAdapter', () => { const result = await newAdapter.execute({ section: 'github' }, mockExecutionContext); expect(result.success).toBe(true); - expect(result.data?.content).toContain('Not indexed'); - expect(result.data?.content).toContain('dev gh index'); + expect((result.data as StatusOutput)?.content).toContain('Not indexed'); + expect((result.data as StatusOutput)?.content).toContain('dev gh index'); }); }); @@ -320,8 +333,8 @@ describe('StatusAdapter', () => { const result = await adapter.execute({ section: 'health' }, mockExecutionContext); expect(result.success).toBe(true); - expect(result.data?.content).toContain('Health Checks'); - expect(result.data?.content).toContain('✅'); + expect((result.data as StatusOutput)?.content).toContain('Health Checks'); + expect((result.data as StatusOutput)?.content).toContain('✅'); }); it('should return health status in verbose format', async () => { @@ -331,15 +344,15 @@ describe('StatusAdapter', () => { ); expect(result.success).toBe(true); - expect(result.data?.content).toContain('Health Checks'); + expect((result.data as StatusOutput)?.content).toContain('Health Checks'); // Verbose includes details - expect(result.data?.content.length).toBeGreaterThan(100); + expect((result.data as StatusOutput)?.content.length).toBeGreaterThan(100); }); }); describe('error handling', () => { it('should handle errors during status generation', async () => { - vi.mocked(mockIndexer.getStats).mockRejectedValue(new Error('Database error')); + vi.mocked(mockStatsService.getStats).mockRejectedValue(new Error('Database error')); const result = await adapter.execute({ section: 'summary' }, mockExecutionContext); @@ -349,7 +362,7 @@ describe('StatusAdapter', () => { }); it('should log errors', async () => { - vi.mocked(mockIndexer.getStats).mockRejectedValue(new Error('Test error')); + vi.mocked(mockStatsService.getStats).mockRejectedValue(new Error('Test error')); await adapter.execute({ section: 'summary' }, mockExecutionContext); @@ -416,7 +429,7 @@ describe('StatusAdapter', () => { const now = new Date(); const twoHoursAgo = new Date(now.getTime() - 2 * 60 * 60 * 1000); - vi.mocked(mockIndexer.getStats).mockResolvedValue({ + vi.mocked(mockStatsService.getStats).mockResolvedValue({ filesScanned: 100, documentsExtracted: 50, documentsIndexed: 50, @@ -431,7 +444,7 @@ describe('StatusAdapter', () => { const result = await adapter.execute({ section: 'summary' }, mockExecutionContext); expect(result.success).toBe(true); - expect(result.data?.content).toContain('ago'); + expect((result.data as StatusOutput)?.content).toContain('ago'); }); }); @@ -444,7 +457,7 @@ describe('StatusAdapter', () => { expect(result.success).toBe(true); // Should contain some size format (KB, MB, GB, or B) - expect(result.data?.content).toMatch(/\d+(\.\d+)?\s*(B|KB|MB|GB)/); + expect((result.data as StatusOutput)?.content).toMatch(/\d+(\.\d+)?\s*(B|KB|MB|GB)/); }); }); }); diff --git a/packages/mcp-server/src/adapters/built-in/explore-adapter.ts b/packages/mcp-server/src/adapters/built-in/explore-adapter.ts index b387c46..08b80c1 100644 --- a/packages/mcp-server/src/adapters/built-in/explore-adapter.ts +++ b/packages/mcp-server/src/adapters/built-in/explore-adapter.ts @@ -6,21 +6,21 @@ * falls back to direct indexer calls otherwise. */ -import type { RepositoryIndexer } from '@lytics/dev-agent-core'; +import type { SearchService } from '@lytics/dev-agent-core'; import type { ExplorationResult, PatternResult, RelationshipResult, SimilarCodeResult, } from '@lytics/dev-agent-subagents'; -import { ExploreArgsSchema } from '../../schemas/index.js'; +import { ExploreArgsSchema, type ExploreOutput, ExploreOutputSchema } from '../../schemas/index.js'; import { ToolAdapter } from '../tool-adapter'; import type { AdapterContext, ToolDefinition, ToolExecutionContext, ToolResult } from '../types'; import { validateArgs } from '../validation.js'; export interface ExploreAdapterConfig { repositoryPath: string; - repositoryIndexer: RepositoryIndexer; + searchService: SearchService; defaultLimit?: number; defaultThreshold?: number; defaultFormat?: 'compact' | 'verbose'; @@ -43,7 +43,7 @@ export class ExploreAdapter extends ToolAdapter { }; private repositoryPath: string; - private repositoryIndexer: RepositoryIndexer; + private searchService: SearchService; private defaultLimit: number; private defaultThreshold: number; private defaultFormat: 'compact' | 'verbose'; @@ -51,7 +51,7 @@ export class ExploreAdapter extends ToolAdapter { constructor(config: ExploreAdapterConfig) { super(); this.repositoryPath = config.repositoryPath; - this.repositoryIndexer = config.repositoryIndexer; + this.searchService = config.searchService; this.defaultLimit = config.defaultLimit ?? 10; this.defaultThreshold = config.defaultThreshold ?? 0.7; this.defaultFormat = config.defaultFormat ?? 'compact'; @@ -117,6 +117,29 @@ export class ExploreAdapter extends ToolAdapter { }, required: ['action', 'query'], }, + outputSchema: { + type: 'object', + properties: { + action: { + type: 'string', + enum: ['pattern', 'similar', 'relationships'], + description: 'Exploration action performed', + }, + query: { + type: 'string', + description: 'Query or file path used', + }, + format: { + type: 'string', + description: 'Output format used', + }, + content: { + type: 'string', + description: 'Formatted exploration results', + }, + }, + required: ['action', 'query', 'format', 'content'], + }, }; } @@ -185,14 +208,23 @@ export class ExploreAdapter extends ToolAdapter { break; } + // Validate output with Zod + const outputData: ExploreOutput = { + action, + query, + format, + content, + }; + + const outputValidation = ExploreOutputSchema.safeParse(outputData); + if (!outputValidation.success) { + context.logger.error('Output validation failed', { error: outputValidation.error }); + throw new Error(`Output validation failed: ${outputValidation.error.message}`); + } + return { success: true, - data: { - action, - query, - format, - content, - }, + data: outputValidation.data, }; } catch (error) { context.logger.error('Exploration failed', { error }); @@ -379,7 +411,7 @@ export class ExploreAdapter extends ToolAdapter { fileTypes: string[] | undefined, format: string ): Promise { - const results = await this.repositoryIndexer.search(query, { + const results = await this.searchService.search(query, { limit, scoreThreshold: threshold, }); @@ -412,9 +444,9 @@ export class ExploreAdapter extends ToolAdapter { threshold: number, format: string ): Promise { - const results = await this.repositoryIndexer.search(`file:${filePath}`, { + const results = await this.searchService.findSimilar(filePath, { limit: limit + 1, - scoreThreshold: threshold, + threshold, }); // Exclude the reference file itself @@ -437,7 +469,7 @@ export class ExploreAdapter extends ToolAdapter { private async findRelationships(filePath: string, format: string): Promise { // Search for references to this file const fileName = filePath.split('/').pop() || filePath; - const results = await this.repositoryIndexer.search(`import ${fileName}`, { + const results = await this.searchService.search(`import ${fileName}`, { limit: 20, scoreThreshold: 0.6, }); diff --git a/packages/mcp-server/src/adapters/built-in/github-adapter.ts b/packages/mcp-server/src/adapters/built-in/github-adapter.ts index 993a5e6..80b1cae 100644 --- a/packages/mcp-server/src/adapters/built-in/github-adapter.ts +++ b/packages/mcp-server/src/adapters/built-in/github-adapter.ts @@ -3,24 +3,21 @@ * Exposes GitHub context and search capabilities via MCP (dev_gh tool) */ -import { - type GitHubDocument, - GitHubIndexer, - type GitHubSearchOptions, - type GitHubSearchResult, -} from '@lytics/dev-agent-subagents'; +import type { GitHubService } from '@lytics/dev-agent-core'; +import type { + GitHubDocument, + GitHubSearchOptions, + GitHubSearchResult, +} from '@lytics/dev-agent-types/github'; import { estimateTokensForText } from '../../formatters/utils'; -import { GitHubArgsSchema } from '../../schemas/index.js'; +import { GitHubArgsSchema, type GitHubOutput, GitHubOutputSchema } from '../../schemas/index.js'; import { ToolAdapter } from '../tool-adapter'; import type { AdapterContext, ToolDefinition, ToolExecutionContext, ToolResult } from '../types'; import { validateArgs } from '../validation.js'; export interface GitHubAdapterConfig { + githubService: GitHubService; repositoryPath: string; - // Either pass an initialized indexer OR paths for lazy initialization - githubIndexer?: GitHubIndexer; - vectorStorePath?: string; - statePath?: string; defaultLimit?: number; defaultFormat?: 'compact' | 'verbose'; } @@ -38,29 +35,17 @@ export class GitHubAdapter extends ToolAdapter { description: 'GitHub issues and PRs search and context', }; + private githubService: GitHubService; private repositoryPath: string; - public githubIndexer?: GitHubIndexer; // Public for cleanup in shutdown - private vectorStorePath?: string; - private statePath?: string; private defaultLimit: number; private defaultFormat: 'compact' | 'verbose'; - private lastStateFileModTime?: number; // Track state file modification time for auto-reload constructor(config: GitHubAdapterConfig) { super(); + this.githubService = config.githubService; this.repositoryPath = config.repositoryPath; - this.githubIndexer = config.githubIndexer; - this.vectorStorePath = config.vectorStorePath; - this.statePath = config.statePath; this.defaultLimit = config.defaultLimit ?? 10; this.defaultFormat = config.defaultFormat ?? 'compact'; - - // Validate: either githubIndexer OR both paths must be provided - if (!this.githubIndexer && (!this.vectorStorePath || !this.statePath)) { - throw new Error( - 'GitHubAdapter requires either githubIndexer or both vectorStorePath and statePath' - ); - } } async initialize(context: AdapterContext): Promise { @@ -68,104 +53,9 @@ export class GitHubAdapter extends ToolAdapter { repositoryPath: this.repositoryPath, defaultLimit: this.defaultLimit, defaultFormat: this.defaultFormat, - lazyInit: !this.githubIndexer, }); } - /** - * Check if state file has been modified since last load - * Returns true if file was modified and indexer needs reload - */ - private async hasStateFileChanged(): Promise { - if (!this.statePath || !this.lastStateFileModTime) { - return false; - } - - try { - const fs = await import('node:fs/promises'); - const stats = await fs.stat(this.statePath); - const currentModTime = stats.mtimeMs; - - return currentModTime > this.lastStateFileModTime; - } catch { - // File doesn't exist or can't be accessed - return false; - } - } - - /** - * Reload the GitHubIndexer to pick up fresh data - */ - private async reloadIndexer(): Promise { - if (!this.githubIndexer || !this.statePath) { - return; - } - - try { - // Close existing indexer - await this.githubIndexer.close(); - - // Clear reference to force re-initialization - this.githubIndexer = undefined; - this.lastStateFileModTime = undefined; - - // Re-initialize with fresh data - await this.ensureGitHubIndexer(); - } catch (error) { - // Log error but don't crash - old indexer might still work - console.error('[GitHubAdapter] Failed to reload indexer:', error); - } - } - - /** - * Ensure indexer is loaded and up-to-date - * Checks for file modifications and reloads if needed - */ - private async ensureGitHubIndexer(): Promise { - // Check if state file was modified (index updated by CLI) - if (this.githubIndexer && (await this.hasStateFileChanged())) { - await this.reloadIndexer(); - } - - if (this.githubIndexer) { - return this.githubIndexer; - } - - // Validate paths are available - if (!this.vectorStorePath || !this.statePath) { - throw new Error('GitHubAdapter not configured for lazy initialization'); - } - - // Lazy initialization - // Try to load repository from state file to avoid gh CLI call - let repository: string | undefined; - try { - const fs = await import('node:fs/promises'); - const stateContent = await fs.readFile(this.statePath, 'utf-8'); - const state = JSON.parse(stateContent); - repository = state.repository; - - // Track state file modification time - const stats = await fs.stat(this.statePath); - this.lastStateFileModTime = stats.mtimeMs; - } catch { - // State file doesn't exist or can't be read - // GitHubIndexer will try gh CLI as fallback - } - - this.githubIndexer = new GitHubIndexer( - { - vectorStorePath: this.vectorStorePath, - statePath: this.statePath, - autoUpdate: false, // We handle updates via file watching - }, - repository // Pass repository to avoid gh CLI call - ); - - await this.githubIndexer.initialize(); - return this.githubIndexer; - } - getToolDefinition(): ToolDefinition { return { name: 'dev_gh', @@ -224,6 +114,32 @@ export class GitHubAdapter extends ToolAdapter { }, required: ['action'], }, + outputSchema: { + type: 'object', + properties: { + action: { + type: 'string', + description: 'The action that was executed', + }, + format: { + type: 'string', + description: 'The output format used', + }, + content: { + type: 'string', + description: 'Formatted GitHub data', + }, + resultsTotal: { + type: 'number', + description: 'Total number of results found (for search/related actions)', + }, + resultsReturned: { + type: 'number', + description: 'Number of results returned (for search/related actions)', + }, + }, + required: ['action', 'format', 'content'], + }, }; } @@ -279,14 +195,24 @@ export class GitHubAdapter extends ToolAdapter { const duration_ms = Date.now() - startTime; const tokens = estimateTokensForText(content); + // Validate output with Zod + const outputData: GitHubOutput = { + action, + format, + content, + resultsTotal: resultsTotal > 0 ? resultsTotal : undefined, + resultsReturned: resultsReturned > 0 ? resultsReturned : undefined, + }; + + const outputValidation = GitHubOutputSchema.safeParse(outputData); + if (!outputValidation.success) { + context.logger.error('Output validation failed', { error: outputValidation.error }); + throw new Error(`Output validation failed: ${outputValidation.error.message}`); + } + return { success: true, - data: { - action, - query: query || number, - format, - content, - }, + data: outputValidation.data, metadata: { tokens, duration_ms, @@ -341,9 +267,7 @@ export class GitHubAdapter extends ToolAdapter { options: GitHubSearchOptions, format: string ): Promise<{ content: string; resultsTotal: number; resultsReturned: number }> { - const indexer = await this.ensureGitHubIndexer(); - - const results = await indexer.search(query, options); + const results = await this.githubService.search(query, options); if (results.length === 0) { const content = @@ -367,28 +291,11 @@ export class GitHubAdapter extends ToolAdapter { * Get full context for an issue/PR */ private async getIssueContext(number: number, format: string): Promise { - const indexer = await this.ensureGitHubIndexer(); - - // First try to get document by ID (more efficient) - let doc = await indexer.getDocument(number, 'issue'); + // Get document using the service + const doc = await this.githubService.getContext(number); if (!doc) { - // Try as pull request if not found as issue - doc = await indexer.getDocument(number, 'pull_request'); - } - - if (!doc) { - // Fallback: search by number in title/content - const results = await indexer.search(`${number}`, { limit: 10 }); - - // Find exact number match - const exactMatch = results.find((r) => r.document.number === number); - - if (exactMatch) { - doc = exactMatch.document; - } else { - throw new Error(`Issue/PR #${number} not found`); - } + throw new Error(`Issue/PR #${number} not found`); } if (format === 'verbose') { @@ -406,32 +313,15 @@ export class GitHubAdapter extends ToolAdapter { limit: number, format: string ): Promise<{ content: string; resultsTotal: number; resultsReturned: number }> { - // First get the main issue/PR using the same logic as getIssueContext - const indexer = await this.ensureGitHubIndexer(); - - let mainDoc = await indexer.getDocument(number, 'issue'); - - if (!mainDoc) { - mainDoc = await indexer.getDocument(number, 'pull_request'); - } + // Get the main document + const mainDoc = await this.githubService.getContext(number); if (!mainDoc) { - // Fallback: search by number - const mainResults = await indexer.search(`${number}`, { limit: 10 }); - const exactMatch = mainResults.find((r) => r.document.number === number); - - if (exactMatch) { - mainDoc = exactMatch.document; - } else { - throw new Error(`Issue/PR #${number} not found`); - } + throw new Error(`Issue/PR #${number} not found`); } - // Search for related items using the title - const relatedResults = await indexer.search(mainDoc.title, { limit: limit + 1 }); - - // Filter out the main item itself - const related = relatedResults.filter((r) => r.document.number !== number).slice(0, limit); + // Get related items using the service + const related = await this.githubService.findRelated(number, limit); if (related.length === 0) { return { diff --git a/packages/mcp-server/src/adapters/built-in/health-adapter.ts b/packages/mcp-server/src/adapters/built-in/health-adapter.ts index f30db4f..1981013 100644 --- a/packages/mcp-server/src/adapters/built-in/health-adapter.ts +++ b/packages/mcp-server/src/adapters/built-in/health-adapter.ts @@ -5,7 +5,7 @@ */ import * as fs from 'node:fs/promises'; -import { HealthArgsSchema } from '../../schemas/index.js'; +import { HealthArgsSchema, type HealthOutput, HealthOutputSchema } from '../../schemas/index.js'; import { ToolAdapter } from '../tool-adapter'; import type { AdapterContext, ToolDefinition, ToolExecutionContext, ToolResult } from '../types'; import { validateArgs } from '../validation.js'; @@ -72,6 +72,29 @@ export class HealthAdapter extends ToolAdapter { }, }, }, + outputSchema: { + type: 'object', + properties: { + status: { + type: 'string', + enum: ['healthy', 'degraded', 'unhealthy'], + description: 'Overall health status', + }, + uptime: { + type: 'number', + description: 'Server uptime in milliseconds', + }, + checks: { + type: 'object', + description: 'Health check results for each component', + }, + formattedReport: { + type: 'string', + description: 'Human-readable health report', + }, + }, + required: ['status', 'uptime', 'checks', 'formattedReport'], + }, }; } @@ -92,12 +115,24 @@ export class HealthAdapter extends ToolAdapter { const content = this.formatHealthReport(health, verbose); + // Validate output with Zod + const outputData: HealthOutput = { + status, + uptime: health.uptime, + timestamp: health.timestamp, + checks: health.checks, + formattedReport: `${emoji} **MCP Server Health: ${status.toUpperCase()}**\n\n${content}`, + }; + + const outputValidation = HealthOutputSchema.safeParse(outputData); + if (!outputValidation.success) { + context.logger.error('Output validation failed', { error: outputValidation.error }); + throw new Error(`Output validation failed: ${outputValidation.error.message}`); + } + return { success: true, - data: { - ...health, - formattedReport: `${emoji} **MCP Server Health: ${status.toUpperCase()}**\n\n${content}`, - }, + data: outputValidation.data, }; } catch (error) { context.logger.error('Health check failed', { diff --git a/packages/mcp-server/src/adapters/built-in/history-adapter.ts b/packages/mcp-server/src/adapters/built-in/history-adapter.ts index 7febbd8..51e3ba7 100644 --- a/packages/mcp-server/src/adapters/built-in/history-adapter.ts +++ b/packages/mcp-server/src/adapters/built-in/history-adapter.ts @@ -5,7 +5,7 @@ import type { GitCommit, GitIndexer, LocalGitExtractor } from '@lytics/dev-agent-core'; import { estimateTokensForText, startTimer } from '../../formatters/utils'; -import { HistoryArgsSchema } from '../../schemas/index.js'; +import { HistoryArgsSchema, type HistoryOutput, HistoryOutputSchema } from '../../schemas/index.js'; import { ToolAdapter } from '../tool-adapter'; import type { AdapterContext, ToolDefinition, ToolExecutionContext, ToolResult } from '../types'; import { validateArgs } from '../validation.js'; @@ -115,6 +115,53 @@ export class HistoryAdapter extends ToolAdapter { // Note: At least one of query or file is required (validated in execute) required: [], }, + outputSchema: { + type: 'object', + properties: { + searchType: { + type: 'string', + enum: ['semantic', 'file'], + description: 'Type of history search performed', + }, + query: { + type: 'string', + description: 'Semantic search query (if applicable)', + }, + file: { + type: 'string', + description: 'File path (if file history)', + }, + commits: { + type: 'array', + description: 'List of commit summaries', + items: { + type: 'object', + properties: { + hash: { + type: 'string', + }, + subject: { + type: 'string', + }, + author: { + type: 'string', + }, + date: { + type: 'string', + }, + filesChanged: { + type: 'number', + }, + }, + }, + }, + content: { + type: 'string', + description: 'Formatted commit history', + }, + }, + required: ['searchType', 'commits', 'content'], + }, }; } @@ -163,21 +210,30 @@ export class HistoryAdapter extends ToolAdapter { const tokens = estimateTokensForText(content); + // Validate output with Zod + const outputData: HistoryOutput = { + searchType, + query: query || undefined, + file: file || undefined, + commits: commits.map((c) => ({ + hash: c.shortHash, + subject: c.subject, + author: c.author.name, + date: c.author.date, + filesChanged: c.stats.filesChanged, + })), + content, + }; + + const outputValidation = HistoryOutputSchema.safeParse(outputData); + if (!outputValidation.success) { + context.logger.error('Output validation failed', { error: outputValidation.error }); + throw new Error(`Output validation failed: ${outputValidation.error.message}`); + } + return { success: true, - data: { - searchType, - query: query || undefined, - file: file || undefined, - commits: commits.map((c) => ({ - hash: c.shortHash, - subject: c.subject, - author: c.author.name, - date: c.author.date, - filesChanged: c.stats.filesChanged, - })), - content, - }, + data: outputValidation.data, metadata: { tokens, duration_ms, diff --git a/packages/mcp-server/src/adapters/built-in/map-adapter.ts b/packages/mcp-server/src/adapters/built-in/map-adapter.ts index 3fd17e9..33b8027 100644 --- a/packages/mcp-server/src/adapters/built-in/map-adapter.ts +++ b/packages/mcp-server/src/adapters/built-in/map-adapter.ts @@ -11,7 +11,7 @@ import { type RepositoryIndexer, } from '@lytics/dev-agent-core'; import { estimateTokensForText, startTimer } from '../../formatters/utils'; -import { MapArgsSchema } from '../../schemas/index.js'; +import { MapArgsSchema, type MapOutput, MapOutputSchema } from '../../schemas/index.js'; import { ToolAdapter } from '../tool-adapter'; import type { AdapterContext, ToolDefinition, ToolExecutionContext, ToolResult } from '../types'; import { validateArgs } from '../validation.js'; @@ -116,6 +116,36 @@ export class MapAdapter extends ToolAdapter { }, required: [], }, + outputSchema: { + type: 'object', + properties: { + content: { + type: 'string', + description: 'Formatted directory structure map', + }, + totalComponents: { + type: 'number', + description: 'Total number of code components (functions, classes, etc.)', + }, + totalDirectories: { + type: 'number', + description: 'Total number of directories in the map', + }, + depth: { + type: 'number', + description: 'Directory depth level used', + }, + focus: { + type: 'string', + description: 'Directory focus path, if any (null if no focus)', + }, + truncated: { + type: 'boolean', + description: 'Whether output was truncated to fit token budget', + }, + }, + required: ['content', 'totalComponents', 'totalDirectories', 'depth', 'focus', 'truncated'], + }, }; } @@ -193,16 +223,25 @@ export class MapAdapter extends ToolAdapter { duration_ms, }); + // Validate output with Zod + const outputData: MapOutput = { + content, + totalComponents: map.totalComponents, + totalDirectories: map.totalDirectories, + depth, + focus: focus || null, + truncated, + }; + + const outputValidation = MapOutputSchema.safeParse(outputData); + if (!outputValidation.success) { + context.logger.error('Output validation failed', { error: outputValidation.error }); + throw new Error(`Output validation failed: ${outputValidation.error.message}`); + } + return { success: true, - data: { - content, - totalComponents: map.totalComponents, - totalDirectories: map.totalDirectories, - depth, - focus: focus || null, - truncated, - }, + data: outputValidation.data, metadata: { tokens, duration_ms, diff --git a/packages/mcp-server/src/adapters/built-in/plan-adapter.ts b/packages/mcp-server/src/adapters/built-in/plan-adapter.ts index 067ed4e..33328a6 100644 --- a/packages/mcp-server/src/adapters/built-in/plan-adapter.ts +++ b/packages/mcp-server/src/adapters/built-in/plan-adapter.ts @@ -9,7 +9,7 @@ import type { GitIndexer, RepositoryIndexer } from '@lytics/dev-agent-core'; import type { ContextAssemblyOptions } from '@lytics/dev-agent-subagents'; import { assembleContext, formatContextPackage } from '@lytics/dev-agent-subagents'; import { estimateTokensForText, startTimer } from '../../formatters/utils'; -import { PlanArgsSchema } from '../../schemas/index.js'; +import { PlanArgsSchema, type PlanOutput, PlanOutputSchema } from '../../schemas/index.js'; import { ToolAdapter } from '../tool-adapter'; import type { AdapterContext, ToolDefinition, ToolExecutionContext, ToolResult } from '../types'; import { validateArgs } from '../validation.js'; @@ -123,6 +123,27 @@ export class PlanAdapter extends ToolAdapter { }, required: ['issue'], }, + outputSchema: { + type: 'object', + properties: { + issue: { + type: 'number', + description: 'Issue number that was processed', + }, + format: { + type: 'string', + description: 'Output format used', + }, + content: { + type: 'string', + description: 'Formatted implementation context', + }, + context: { + description: 'Raw context package (verbose mode only)', + }, + }, + required: ['issue', 'format', 'content'], + }, }; } @@ -186,14 +207,23 @@ export class PlanAdapter extends ToolAdapter { duration_ms, }); + // Validate output with Zod + const outputData: PlanOutput = { + issue, + format, + content, + context: format === 'verbose' ? contextPackage : undefined, + }; + + const outputValidation = PlanOutputSchema.safeParse(outputData); + if (!outputValidation.success) { + context.logger.error('Output validation failed', { error: outputValidation.error }); + throw new Error(`Output validation failed: ${outputValidation.error.message}`); + } + return { success: true, - data: { - issue, - format, - content, - context: format === 'verbose' ? contextPackage : undefined, - }, + data: outputValidation.data, metadata: { tokens, duration_ms, diff --git a/packages/mcp-server/src/adapters/built-in/refs-adapter.ts b/packages/mcp-server/src/adapters/built-in/refs-adapter.ts index 9745a1b..9f48391 100644 --- a/packages/mcp-server/src/adapters/built-in/refs-adapter.ts +++ b/packages/mcp-server/src/adapters/built-in/refs-adapter.ts @@ -3,9 +3,9 @@ * Provides call graph queries via the dev_refs tool */ -import type { CalleeInfo, RepositoryIndexer, SearchResult } from '@lytics/dev-agent-core'; +import type { CalleeInfo, SearchResult, SearchService } from '@lytics/dev-agent-core'; import { estimateTokensForText, startTimer } from '../../formatters/utils'; -import { RefsArgsSchema } from '../../schemas/index.js'; +import { RefsArgsSchema, type RefsOutput, RefsOutputSchema } from '../../schemas/index.js'; import { ToolAdapter } from '../tool-adapter'; import type { AdapterContext, ToolDefinition, ToolExecutionContext, ToolResult } from '../types'; import { validateArgs } from '../validation.js'; @@ -20,9 +20,9 @@ export type RefDirection = 'callees' | 'callers' | 'both'; */ export interface RefsAdapterConfig { /** - * Repository indexer instance + * Search service instance */ - repositoryIndexer: RepositoryIndexer; + searchService: SearchService; /** * Default result limit @@ -53,14 +53,16 @@ export class RefsAdapter extends ToolAdapter { author: 'Dev-Agent Team', }; - private indexer: RepositoryIndexer; - private config: Required; + private searchService: SearchService; + private config: Required> & { + searchService: SearchService; + }; constructor(config: RefsAdapterConfig) { super(); - this.indexer = config.repositoryIndexer; + this.searchService = config.searchService; this.config = { - repositoryIndexer: config.repositoryIndexer, + searchService: config.searchService, defaultLimit: config.defaultLimit ?? 20, }; } @@ -102,6 +104,43 @@ export class RefsAdapter extends ToolAdapter { }, required: ['name'], }, + outputSchema: { + type: 'object', + properties: { + name: { + type: 'string', + description: 'Function/method name queried', + }, + direction: { + type: 'string', + enum: ['callees', 'callers', 'both'], + description: 'Direction of query', + }, + content: { + type: 'string', + description: 'Formatted reference information', + }, + target: { + type: 'object', + description: 'Target function details', + properties: { + name: { type: 'string' }, + file: { type: 'string' }, + line: { type: 'number' }, + type: { type: 'string' }, + }, + }, + callees: { + type: 'array', + description: 'Functions called by target (if requested)', + }, + callers: { + type: 'array', + description: 'Functions calling target (if requested)', + }, + }, + required: ['name', 'direction', 'content', 'target'], + }, }; } @@ -119,7 +158,7 @@ export class RefsAdapter extends ToolAdapter { context.logger.debug('Executing refs query', { name, direction, limit }); // First, find the target component - const searchResults = await this.indexer.search(name, { limit: 10 }); + const searchResults = await this.searchService.search(name, { limit: 10 }); const target = this.findBestMatch(searchResults, name); if (!target) { @@ -173,14 +212,25 @@ export class RefsAdapter extends ToolAdapter { const tokens = estimateTokensForText(content); + // Validate output with Zod + const outputData: RefsOutput = { + name, + direction, + content, + target: result.target, + callees: result.callees, + callers: result.callers, + }; + + const outputValidation = RefsOutputSchema.safeParse(outputData); + if (!outputValidation.success) { + context.logger.error('Output validation failed', { error: outputValidation.error }); + throw new Error(`Output validation failed: ${outputValidation.error.message}`); + } + return { success: true, - data: { - name, - direction, - content, - ...result, - }, + data: outputValidation.data, metadata: { tokens, duration_ms, @@ -240,7 +290,7 @@ export class RefsAdapter extends ToolAdapter { // Search for components that might call this target // We search broadly and then filter by callees - const candidates = await this.indexer.search(targetName, { limit: 100 }); + const candidates = await this.searchService.search(targetName, { limit: 100 }); const callers: RefResult[] = []; diff --git a/packages/mcp-server/src/adapters/built-in/search-adapter.ts b/packages/mcp-server/src/adapters/built-in/search-adapter.ts index 9d4b3b6..db07fc7 100644 --- a/packages/mcp-server/src/adapters/built-in/search-adapter.ts +++ b/packages/mcp-server/src/adapters/built-in/search-adapter.ts @@ -3,9 +3,9 @@ * Provides semantic code search via the dev_search tool */ -import type { RepositoryIndexer } from '@lytics/dev-agent-core'; +import type { SearchService } from '@lytics/dev-agent-core'; import { CompactFormatter, type FormatMode, VerboseFormatter } from '../../formatters'; -import { SearchArgsSchema } from '../../schemas/index.js'; +import { SearchArgsSchema, type SearchOutput, SearchOutputSchema } from '../../schemas/index.js'; import { findRelatedTestFiles, formatRelatedFiles } from '../../utils/related-files'; import { ToolAdapter } from '../tool-adapter'; import type { AdapterContext, ToolDefinition, ToolExecutionContext, ToolResult } from '../types'; @@ -16,9 +16,9 @@ import { validateArgs } from '../validation.js'; */ export interface SearchAdapterConfig { /** - * Repository indexer instance + * Search service instance */ - repositoryIndexer: RepositoryIndexer; + searchService: SearchService; /** * Repository root path (for finding related files) @@ -53,16 +53,16 @@ export class SearchAdapter extends ToolAdapter { author: 'Dev-Agent Team', }; - private indexer: RepositoryIndexer; + private searchService: SearchService; private config: Required> & { repositoryPath?: string; }; constructor(config: SearchAdapterConfig) { super(); - this.indexer = config.repositoryIndexer; + this.searchService = config.searchService; this.config = { - repositoryIndexer: config.repositoryIndexer, + searchService: config.searchService, repositoryPath: config.repositoryPath, defaultFormat: config.defaultFormat ?? 'compact', defaultLimit: config.defaultLimit ?? 10, @@ -123,6 +123,24 @@ export class SearchAdapter extends ToolAdapter { }, required: ['query'], }, + outputSchema: { + type: 'object', + properties: { + query: { + type: 'string', + description: 'The search query that was executed', + }, + format: { + type: 'string', + description: 'The output format used', + }, + content: { + type: 'string', + description: 'Formatted search results', + }, + }, + required: ['query', 'format', 'content'], + }, }; } @@ -145,8 +163,8 @@ export class SearchAdapter extends ToolAdapter { tokenBudget, }); - // Perform search - const results = await this.indexer.search(query as string, { + // Perform search using SearchService + const results = await this.searchService.search(query as string, { limit: limit as number, scoreThreshold: scoreThreshold as number, }); @@ -194,13 +212,22 @@ export class SearchAdapter extends ToolAdapter { duration_ms, }); + // Validate output with Zod + const outputData: SearchOutput = { + query: query as string, + format, + content: formatted.content + relatedFilesSection, + }; + + const outputValidation = SearchOutputSchema.safeParse(outputData); + if (!outputValidation.success) { + context.logger.error('Output validation failed', { error: outputValidation.error }); + throw new Error(`Output validation failed: ${outputValidation.error.message}`); + } + return { success: true, - data: { - query, - format, - content: formatted.content + relatedFilesSection, - }, + data: outputValidation.data, metadata: { tokens: formatted.tokens, duration_ms, diff --git a/packages/mcp-server/src/adapters/built-in/status-adapter.ts b/packages/mcp-server/src/adapters/built-in/status-adapter.ts index 13488f6..d00938d 100644 --- a/packages/mcp-server/src/adapters/built-in/status-adapter.ts +++ b/packages/mcp-server/src/adapters/built-in/status-adapter.ts @@ -5,10 +5,9 @@ import * as fs from 'node:fs'; import * as path from 'node:path'; -import type { RepositoryIndexer } from '@lytics/dev-agent-core'; -import { GitHubIndexer } from '@lytics/dev-agent-subagents'; +import type { GitHubService, StatsService } from '@lytics/dev-agent-core'; import { estimateTokensForText } from '../../formatters/utils'; -import { StatusArgsSchema } from '../../schemas/index.js'; +import { StatusArgsSchema, type StatusOutput, StatusOutputSchema } from '../../schemas/index.js'; import { ToolAdapter } from '../tool-adapter'; import type { AdapterContext, ToolDefinition, ToolExecutionContext, ToolResult } from '../types'; import { validateArgs } from '../validation.js'; @@ -23,9 +22,9 @@ export type StatusSection = 'summary' | 'repo' | 'indexes' | 'github' | 'health' */ export interface StatusAdapterConfig { /** - * Repository indexer instance + * Stats service for repository statistics */ - repositoryIndexer: RepositoryIndexer; + statsService: StatsService; /** * Repository path @@ -37,6 +36,11 @@ export interface StatusAdapterConfig { */ vectorStorePath: string; + /** + * Optional GitHub service for GitHub integration status + */ + githubService?: GitHubService; + /** * Default section to display */ @@ -55,19 +59,20 @@ export class StatusAdapter extends ToolAdapter { author: 'Dev-Agent Team', }; - private repositoryIndexer: RepositoryIndexer; + private statsService: StatsService; private repositoryPath: string; private vectorStorePath: string; private defaultSection: StatusSection; - private githubIndexer?: GitHubIndexer; + private githubService?: GitHubService; private githubStatePath?: string; // Track state file path for reload private lastStateFileModTime?: number; // Track state file modification time for auto-reload constructor(config: StatusAdapterConfig) { super(); - this.repositoryIndexer = config.repositoryIndexer; + this.statsService = config.statsService; this.repositoryPath = config.repositoryPath; this.vectorStorePath = config.vectorStorePath; + this.githubService = config.githubService; this.defaultSection = config.defaultSection ?? 'summary'; } @@ -75,38 +80,19 @@ export class StatusAdapter extends ToolAdapter { context.logger.info('StatusAdapter initialized', { repositoryPath: this.repositoryPath, defaultSection: this.defaultSection, + hasGitHubService: !!this.githubService, }); - // Initialize GitHub indexer lazily - try { - // Try to load repository from state file - let repository: string | undefined; + // Track GitHub state file for reload detection + if (this.githubService) { this.githubStatePath = path.join(this.repositoryPath, '.dev-agent/github-state.json'); try { - const stateContent = await fs.promises.readFile(this.githubStatePath, 'utf-8'); - const state = JSON.parse(stateContent); - repository = state.repository; - - // Track initial modification time + // Track initial modification time for change detection const stats = await fs.promises.stat(this.githubStatePath); this.lastStateFileModTime = stats.mtimeMs; } catch { - // State file doesn't exist, will try gh CLI + // State file doesn't exist yet, will be created on first GitHub index } - - this.githubIndexer = new GitHubIndexer( - { - vectorStorePath: `${this.vectorStorePath}-github`, - statePath: this.githubStatePath, // Use absolute path - autoUpdate: false, // We handle updates via file watching - staleThreshold: 15 * 60 * 1000, - }, - repository - ); - await this.githubIndexer.initialize(); - } catch (error) { - context.logger.warn('GitHub indexer initialization failed', { error }); - // Not fatal, GitHub section will show "not indexed" } } @@ -130,50 +116,29 @@ export class StatusAdapter extends ToolAdapter { } /** - * Reload the GitHub indexer to pick up fresh data + * Update tracking of GitHub state file modification time + * Note: GitHubService handles its own data freshness, this is just for tracking */ - private async reloadGitHubIndexer(): Promise { - if (!this.githubIndexer || !this.githubStatePath) { + private async updateGitHubStateTracking(): Promise { + if (!this.githubStatePath) { return; } try { - // Close existing indexer - await this.githubIndexer.close(); - - // Load fresh repository from state file - const stateContent = await fs.promises.readFile(this.githubStatePath, 'utf-8'); - const state = JSON.parse(stateContent); - const repository: string | undefined = state.repository; - - // Update modification time const stats = await fs.promises.stat(this.githubStatePath); this.lastStateFileModTime = stats.mtimeMs; - - // Re-initialize with fresh data - this.githubIndexer = new GitHubIndexer( - { - vectorStorePath: `${this.vectorStorePath}-github`, - statePath: this.githubStatePath, - autoUpdate: false, - staleThreshold: 15 * 60 * 1000, - }, - repository - ); - await this.githubIndexer.initialize(); - } catch (error) { - // Log error but don't crash - old indexer might still work - console.error('[StatusAdapter] Failed to reload GitHub indexer:', error); + } catch { + // State file may not exist yet } } /** - * Ensure GitHub indexer is up-to-date - * Checks for file modifications and reloads if needed + * Ensure GitHub state tracking is up-to-date + * GitHubService handles data freshness internally */ private async ensureGitHubIndexerUpToDate(): Promise { - if (this.githubIndexer && (await this.hasGitHubStateChanged())) { - await this.reloadGitHubIndexer(); + if (this.githubService && (await this.hasGitHubStateChanged())) { + await this.updateGitHubStateTracking(); } } @@ -201,6 +166,28 @@ export class StatusAdapter extends ToolAdapter { }, required: [], }, + outputSchema: { + type: 'object', + properties: { + content: { + type: 'string', + description: 'Status report content in markdown format', + }, + section: { + type: 'string', + description: 'The section that was displayed', + }, + format: { + type: 'string', + description: 'The format that was used', + }, + length: { + type: 'number', + description: 'Length of the content in characters', + }, + }, + required: ['content', 'section', 'format', 'length'], + }, }; } @@ -225,13 +212,23 @@ export class StatusAdapter extends ToolAdapter { context.logger.info('Status check completed', { section, format, duration_ms }); + // Validate output with Zod (ensures type safety) + const outputData: StatusOutput = { + section, + format, + content, + length: content.length, + }; + + const outputValidation = StatusOutputSchema.safeParse(outputData); + if (!outputValidation.success) { + context.logger.error('Output validation failed', { error: outputValidation.error }); + throw new Error(`Output validation failed: ${outputValidation.error.message}`); + } + return { success: true, - data: { - section, - format, - content, - }, + data: outputValidation.data, metadata: { tokens, duration_ms, @@ -280,8 +277,8 @@ export class StatusAdapter extends ToolAdapter { * Generate summary (overview of all sections) */ private async generateSummary(format: string): Promise { - const repoStats = await this.repositoryIndexer.getStats(); - const githubStats = this.githubIndexer?.getStats() ?? null; + const repoStats = await this.statsService.getStats(); + const githubStats = (await this.githubService?.getStats()) ?? null; if (format === 'verbose') { return this.generateVerboseSummary(repoStats, githubStats); @@ -336,8 +333,8 @@ export class StatusAdapter extends ToolAdapter { * Generate verbose summary with all details */ private generateVerboseSummary( - repoStats: Awaited>, - githubStats: ReturnType['getStats']> + repoStats: Awaited>, + githubStats: Awaited['getStats']>> | null ): string { const lines: string[] = ['## Dev-Agent Status (Detailed)', '']; @@ -388,7 +385,7 @@ export class StatusAdapter extends ToolAdapter { * Generate repository status */ private async generateRepoStatus(format: string): Promise { - const stats = await this.repositoryIndexer.getStats(); + const stats = await this.statsService.getStats(); const lines: string[] = ['## Repository Index', '']; @@ -430,8 +427,8 @@ export class StatusAdapter extends ToolAdapter { * Generate indexes status */ private async generateIndexesStatus(format: string): Promise { - const repoStats = await this.repositoryIndexer.getStats(); - const githubStats = this.githubIndexer?.getStats() ?? null; + const repoStats = await this.statsService.getStats(); + const githubStats = (await this.githubService?.getStats()) ?? null; const storageSize = await this.getStorageSize(); const lines: string[] = ['## Vector Indexes', '']; @@ -487,7 +484,7 @@ export class StatusAdapter extends ToolAdapter { // Check for index updates and reload if needed await this.ensureGitHubIndexerUpToDate(); - const stats = this.githubIndexer?.getStats() ?? null; + const stats = (await this.githubService?.getStats()) ?? null; const lines: string[] = ['## GitHub Integration', '']; @@ -579,7 +576,7 @@ export class StatusAdapter extends ToolAdapter { } // Vector storage - const stats = await this.repositoryIndexer.getStats(); + const stats = await this.statsService.getStats(); if (stats) { checks.push({ name: 'Vector Storage', diff --git a/packages/mcp-server/src/schemas/index.ts b/packages/mcp-server/src/schemas/index.ts index fbc3e08..62b0a7b 100644 --- a/packages/mcp-server/src/schemas/index.ts +++ b/packages/mcp-server/src/schemas/index.ts @@ -173,6 +173,18 @@ export const StatusArgsSchema = z export type StatusArgs = z.infer; +/** + * Status output schema + */ +export const StatusOutputSchema = z.object({ + content: z.string(), + section: z.string(), + format: z.string(), + length: z.number(), +}); + +export type StatusOutput = z.infer; + // ============================================================================ // Health Adapter // ============================================================================ @@ -184,3 +196,139 @@ export const HealthArgsSchema = z .strict(); export type HealthArgs = z.infer; + +// ============================================================================ +// Output Schemas (Runtime validation for adapter responses) +// ============================================================================ + +/** + * Search output schema + */ +export const SearchOutputSchema = z.object({ + query: z.string(), + format: z.string(), + content: z.string(), +}); + +export type SearchOutput = z.infer; + +/** + * GitHub output schema + */ +export const GitHubOutputSchema = z.object({ + action: z.string(), + format: z.string(), + content: z.string(), + resultsTotal: z.number().optional(), + resultsReturned: z.number().optional(), +}); + +export type GitHubOutput = z.infer; + +/** + * Health check result schema + */ +export const HealthCheckResultSchema = z.object({ + status: z.enum(['pass', 'warn', 'fail']), + message: z.string(), + details: z.any().optional(), // Allow any type for details +}); + +export const HealthOutputSchema = z.object({ + status: z.enum(['healthy', 'degraded', 'unhealthy']), + uptime: z.number(), + timestamp: z.string(), + checks: z.object({ + vectorStorage: HealthCheckResultSchema, + repository: HealthCheckResultSchema, + githubIndex: HealthCheckResultSchema.optional(), + }), + formattedReport: z.string(), +}); + +export type HealthOutput = z.infer; + +/** + * Map output schema + */ +export const MapOutputSchema = z.object({ + content: z.string(), + totalComponents: z.number(), + totalDirectories: z.number(), + depth: z.number(), + focus: z.string().nullable(), + truncated: z.boolean(), +}); + +export type MapOutput = z.infer; + +/** + * Plan output schema + */ +export const PlanOutputSchema = z.object({ + issue: z.number(), + format: z.string(), + content: z.string(), + context: z.any().optional(), // Complex nested structure, can refine later +}); + +export type PlanOutput = z.infer; + +/** + * History commit summary schema + */ +export const HistoryCommitSummarySchema = z.object({ + hash: z.string(), + subject: z.string(), + author: z.string(), + date: z.string(), + filesChanged: z.number(), +}); + +export const HistoryOutputSchema = z.object({ + searchType: z.enum(['semantic', 'file']), + query: z.string().optional(), + file: z.string().optional(), + commits: z.array(HistoryCommitSummarySchema), + content: z.string(), +}); + +export type HistoryOutput = z.infer; + +/** + * Refs result schema (some fields may be undefined in practice) + */ +export const RefResultSchema = z.object({ + name: z.string(), + file: z.string().optional(), + line: z.number().optional(), + type: z.string().optional(), +}); + +export const RefsOutputSchema = z.object({ + name: z.string(), + direction: z.string(), + content: z.string(), + target: z.object({ + name: z.string(), + file: z.string(), + line: z.number(), + type: z.string(), + }), + callees: z.array(RefResultSchema).optional(), + callers: z.array(RefResultSchema).optional(), +}); + +export type RefsOutput = z.infer; + +/** + * Explore output schema + */ +export const ExploreOutputSchema = z.object({ + action: z.string(), + query: z.string(), + format: z.string(), + content: z.string(), +}); + +export type ExploreOutput = z.infer; diff --git a/packages/mcp-server/src/server/__tests__/server.integration.test.ts b/packages/mcp-server/src/server/__tests__/server.integration.test.ts index 1ea4596..f70580d 100644 --- a/packages/mcp-server/src/server/__tests__/server.integration.test.ts +++ b/packages/mcp-server/src/server/__tests__/server.integration.test.ts @@ -3,14 +3,23 @@ * Tests the full server + adapter + transport stack */ -import { afterEach, beforeEach, describe, expect, it } from 'vitest'; +import { afterEach, beforeAll, beforeEach, describe, expect, it, vi } from 'vitest'; import { MockAdapter } from '../../adapters/__tests__/mock-adapter'; +import { ConsoleLogger } from '../../utils/logger'; import { MCPServer } from '../mcp-server'; describe('MCP Server Integration', () => { let server: MCPServer; let mockAdapter: MockAdapter; + beforeAll(() => { + // Suppress all logger output globally for this test suite + vi.spyOn(ConsoleLogger.prototype, 'info').mockImplementation(() => {}); + vi.spyOn(ConsoleLogger.prototype, 'debug').mockImplementation(() => {}); + vi.spyOn(ConsoleLogger.prototype, 'warn').mockImplementation(() => {}); + vi.spyOn(ConsoleLogger.prototype, 'error').mockImplementation(() => {}); + }); + beforeEach(() => { mockAdapter = new MockAdapter(); diff --git a/packages/mcp-server/tsconfig.json b/packages/mcp-server/tsconfig.json index ee300d6..e57fbce 100644 --- a/packages/mcp-server/tsconfig.json +++ b/packages/mcp-server/tsconfig.json @@ -8,6 +8,12 @@ "declarationMap": true }, "include": ["src/**/*", "bin/**/*"], - "exclude": ["node_modules", "dist", "**/*.test.ts"], + "exclude": [ + "node_modules", + "dist", + "**/*.test.ts", + "**/*.spec.ts", + "**/__tests__/**" + ], "references": [{ "path": "../core" }, { "path": "../subagents" }, { "path": "../logger" }] } diff --git a/packages/subagents/CHANGELOG.md b/packages/subagents/CHANGELOG.md index 94eab3e..084180a 100644 --- a/packages/subagents/CHANGELOG.md +++ b/packages/subagents/CHANGELOG.md @@ -35,7 +35,7 @@ **Indexer Logging** - - Add `--verbose` flag to `dev index`, `dev git index`, `dev gh index` + - Add `--verbose` flag to `dev index`, `dev git index`, `dev github index` - Progress spinner shows actual counts: `Embedding 4480/49151 documents (9%)` - Structured logging with kero logger diff --git a/packages/subagents/package.json b/packages/subagents/package.json index a9c9a01..6589b59 100644 --- a/packages/subagents/package.json +++ b/packages/subagents/package.json @@ -31,6 +31,7 @@ }, "dependencies": { "@lytics/dev-agent-core": "workspace:*", + "@lytics/dev-agent-types": "workspace:*", "@lytics/kero": "workspace:*", "zod": "^4.1.13" }, diff --git a/packages/subagents/src/coordinator/__tests__/github-coordinator.integration.test.ts b/packages/subagents/src/coordinator/__tests__/github-coordinator.integration.test.ts index 2b09e6a..f76b4ff 100644 --- a/packages/subagents/src/coordinator/__tests__/github-coordinator.integration.test.ts +++ b/packages/subagents/src/coordinator/__tests__/github-coordinator.integration.test.ts @@ -46,8 +46,12 @@ describe('Coordinator → GitHub Integration', () => { let coordinator: SubagentCoordinator; let github: GitHubAgent; let tempDir: string; + let errorSpy: any; // Mock spy for CoordinatorLogger.error beforeEach(async () => { + // Suppress error logs globally for all tests (expected errors during test setup) + errorSpy = vi.spyOn(CoordinatorLogger.prototype, 'error').mockImplementation(() => {}); + // Create temp directory tempDir = await mkdtemp(join(tmpdir(), 'gh-coordinator-test-')); @@ -70,6 +74,7 @@ describe('Coordinator → GitHub Integration', () => { }); afterEach(async () => { + errorSpy.mockRestore(); await coordinator.stop(); await rm(tempDir, { recursive: true, force: true }); }); @@ -123,9 +128,6 @@ describe('Coordinator → GitHub Integration', () => { }); it('should route search request to GitHub agent', async () => { - // Suppress error logs (search without indexed data is expected to be handled) - const errorSpy = vi.spyOn(CoordinatorLogger.prototype, 'error').mockImplementation(() => {}); - // Index first (required for search) const indexResponse = await coordinator.sendMessage({ type: 'request', @@ -162,14 +164,9 @@ describe('Coordinator → GitHub Integration', () => { const result = searchResponse?.payload as unknown as GitHubContextResult; expect(result.action).toBe('search'); expect(Array.isArray(result.results)).toBe(true); - - errorSpy.mockRestore(); }); it('should handle context requests', async () => { - // Suppress error logs (GitHub data not indexed error is expected) - const errorSpy = vi.spyOn(CoordinatorLogger.prototype, 'error').mockImplementation(() => {}); - const response = await coordinator.sendMessage({ type: 'request', sender: 'test', @@ -186,14 +183,9 @@ describe('Coordinator → GitHub Integration', () => { const result = response?.payload as unknown as GitHubContextResult; expect(result.action).toBe('context'); - - errorSpy.mockRestore(); }); it('should handle related requests', async () => { - // Suppress error logs (GitHub data not indexed error is expected) - const errorSpy = vi.spyOn(CoordinatorLogger.prototype, 'error').mockImplementation(() => {}); - const response = await coordinator.sendMessage({ type: 'request', sender: 'test', @@ -210,8 +202,6 @@ describe('Coordinator → GitHub Integration', () => { const result = response?.payload as unknown as GitHubContextResult; expect(result.action).toBe('related'); - - errorSpy.mockRestore(); }); it('should handle non-request messages gracefully', async () => { @@ -229,9 +219,6 @@ describe('Coordinator → GitHub Integration', () => { describe('Error Handling', () => { it('should handle invalid actions', async () => { - // Suppress error logs for this intentional error test - const errorSpy = vi.spyOn(CoordinatorLogger.prototype, 'error').mockImplementation(() => {}); - const response = await coordinator.sendMessage({ type: 'request', sender: 'test', @@ -244,14 +231,9 @@ describe('Coordinator → GitHub Integration', () => { expect(response).toBeDefined(); expect(response?.type).toBe('response'); - - errorSpy.mockRestore(); }); it('should handle missing required fields', async () => { - // Suppress error logs for this intentional error test - const errorSpy = vi.spyOn(CoordinatorLogger.prototype, 'error').mockImplementation(() => {}); - const response = await coordinator.sendMessage({ type: 'request', sender: 'test', diff --git a/packages/subagents/src/github/README.md b/packages/subagents/src/github/README.md index fe6d6f1..a449333 100644 --- a/packages/subagents/src/github/README.md +++ b/packages/subagents/src/github/README.md @@ -32,16 +32,16 @@ github/ ```bash # Index GitHub data (issues, PRs, discussions) -dev gh index +dev github index # Index with options -dev gh index --issues --prs --limit 100 +dev github index --issues --prs --limit 100 # Search GitHub context -dev gh search "rate limiting" +dev github search "rate limiting" # Get full context for an issue -dev gh context 42 +dev github context 42 ``` ### Programmatic Usage @@ -506,8 +506,8 @@ The agent handles errors gracefully and returns structured error responses: 4. **Batch processing:** For very large repos, index in batches with lower limits ```bash # Example: Index open items separately - dev gh index --state open --limit 500 - dev gh index --state closed --limit 100 + dev github index --state open --limit 500 + dev github index --state closed --limit 100 ``` ## Future Enhancements @@ -539,21 +539,21 @@ gh auth login **Solution:** ```bash # Use lower limit -dev gh index --limit 100 +dev github index --limit 100 # Or for very large repos -dev gh index --limit 50 +dev github index --limit 50 # Alternative: Index by state separately -dev gh index --state open --limit 500 -dev gh index --state closed --limit 100 +dev github index --state open --limit 500 +dev github index --state closed --limit 100 ``` **Cause:** Buffer overflow when fetching many issues/PRs with large bodies. Default limit of 500 works for most repos, but very active repositories may need lower limits. ### No results when searching -1. Check if data is indexed: `dev gh index` +1. Check if data is indexed: `dev github index` 2. Verify search query matches content 3. Check `state` filter (default: 'all') diff --git a/packages/subagents/src/github/types.ts b/packages/subagents/src/github/types.ts index d269c8f..2b9793d 100644 --- a/packages/subagents/src/github/types.ts +++ b/packages/subagents/src/github/types.ts @@ -1,172 +1,33 @@ /** * GitHub Context Subagent Types - * Type definitions for GitHub data indexing and context provision - */ - -import type { Logger } from '@lytics/kero'; - -/** - * Type of GitHub document - */ -export type GitHubDocumentType = 'issue' | 'pull_request' | 'discussion'; - -/** - * GitHub document status - */ -export type GitHubState = 'open' | 'closed' | 'merged'; - -/** - * GitHub document that can be indexed - */ -export interface GitHubDocument { - type: GitHubDocumentType; - number: number; - title: string; - body: string; - state: GitHubState; - labels: string[]; - author: string; - createdAt: string; - updatedAt: string; - closedAt?: string; - url: string; - repository: string; // owner/repo format - - // For PRs only - mergedAt?: string; - headBranch?: string; - baseBranch?: string; - - // Metadata - comments: number; - reactions: Record; - - // Relationships (extracted from text) - relatedIssues: number[]; - relatedPRs: number[]; - linkedFiles: string[]; - mentions: string[]; -} - -/** - * GitHub search options - */ -export interface GitHubSearchOptions { - type?: GitHubDocumentType; - state?: GitHubState; - labels?: string[]; - author?: string; - limit?: number; - scoreThreshold?: number; - since?: string; // ISO date - until?: string; // ISO date -} - -/** - * GitHub search result - */ -export interface GitHubSearchResult { - document: GitHubDocument; - score: number; - matchedFields: string[]; // Which fields matched the query -} - -/** - * GitHub context for an issue/PR - */ -export interface GitHubContext { - document: GitHubDocument; - relatedIssues: GitHubDocument[]; - relatedPRs: GitHubDocument[]; - linkedCodeFiles: Array<{ - path: string; - reason: string; - score: number; - }>; - discussionSummary?: string; -} - -/** - * GitHub indexer configuration - */ -export interface GitHubIndexerConfig { - vectorStorePath: string; // Path to LanceDB vector storage - statePath?: string; // Path to state file (default: .dev-agent/github-state.json) - autoUpdate?: boolean; // Enable auto-updates (default: true) - staleThreshold?: number; // Stale threshold in ms (default: 15 minutes) -} - -/** - * GitHub indexer state (persisted to disk) - */ -export interface GitHubIndexerState { - version: string; // State format version - repository: string; - lastIndexed: string; // ISO date - totalDocuments: number; - byType: Record; - byState: Record; -} - -/** - * Progress information for GitHub indexing - */ -export interface GitHubIndexProgress { - phase: 'fetching' | 'enriching' | 'embedding' | 'complete'; - documentsProcessed: number; - totalDocuments: number; - percentComplete: number; -} - -/** - * GitHub indexing options - */ -export interface GitHubIndexOptions { - repository?: string; // If not provided, use current repo - types?: GitHubDocumentType[]; - state?: GitHubState[]; - since?: string; // ISO date - only index items updated after this - limit?: number; // Max items to fetch (for testing) - /** Progress callback */ - onProgress?: (progress: GitHubIndexProgress) => void; - /** Logger instance */ - logger?: Logger; -} - -/** - * GitHub indexing stats - */ -export interface GitHubIndexStats { - repository: string; - totalDocuments: number; - byType: Record; - byState: Record; - lastIndexed: string; // ISO date - indexDuration: number; // milliseconds -} - -/** - * GitHub fetcher response from gh CLI - */ -export interface GitHubAPIResponse { - number: number; - title: string; - body: string; - state: string; - labels: Array<{ name: string }>; - author: { login: string }; - createdAt: string; - updatedAt: string; - closedAt?: string; - url: string; - comments: number; - reactions?: Record; - - // PR-specific - mergedAt?: string; - headRefName?: string; - baseRefName?: string; -} + * + * Re-exports shared types from @lytics/dev-agent-types for backward compatibility. + * New code should import directly from @lytics/dev-agent-types/github. + */ + +import type { + GitHubContext, + GitHubDocument, + GitHubIndexOptions, + GitHubIndexStats, + GitHubSearchOptions, + GitHubSearchResult, +} from '@lytics/dev-agent-types/github'; + +export type { + GitHubAPIResponse, + GitHubContext, + GitHubDocument, + GitHubDocumentType, + GitHubIndexerConfig, + GitHubIndexerState, + GitHubIndexOptions, + GitHubIndexProgress, + GitHubIndexStats, + GitHubSearchOptions, + GitHubSearchResult, + GitHubState, +} from '@lytics/dev-agent-types/github'; /** * GitHub Context request (for agent communication) diff --git a/packages/subagents/src/github/utils/fetcher.ts b/packages/subagents/src/github/utils/fetcher.ts index 7b607c1..8756845 100644 --- a/packages/subagents/src/github/utils/fetcher.ts +++ b/packages/subagents/src/github/utils/fetcher.ts @@ -217,7 +217,7 @@ export function apiResponseToDocument( closedAt: response.closedAt, url: response.url, repository, - comments: response.comments || 0, + comments: Array.isArray(response.comments) ? response.comments.length : 0, reactions: response.reactions || {}, relatedIssues: [], relatedPRs: [], diff --git a/packages/subagents/src/planner/utils/__tests__/context-assembler.test.ts b/packages/subagents/src/planner/utils/__tests__/context-assembler.test.ts index b153b8a..fdedee8 100644 --- a/packages/subagents/src/planner/utils/__tests__/context-assembler.test.ts +++ b/packages/subagents/src/planner/utils/__tests__/context-assembler.test.ts @@ -1,16 +1,16 @@ import type { RepositoryIndexer, SearchResult } from '@lytics/dev-agent-core'; import { beforeEach, describe, expect, it, vi } from 'vitest'; import type { ContextPackage } from '../../context-types'; -import { assembleContext, formatContextPackage } from '../context-assembler'; -// Mock the GitHub fetch -vi.mock('../github', () => ({ - fetchGitHubIssue: vi.fn(), -})); +// Mock execSync from child_process to avoid actual shell commands +const mockExecSync = vi.hoisted(() => vi.fn()); -import { fetchGitHubIssue } from '../github'; +vi.mock('node:child_process', () => ({ + execSync: mockExecSync, +})); -const mockFetchGitHubIssue = vi.mocked(fetchGitHubIssue); +// Now we can safely import the modules +import { assembleContext, formatContextPackage } from '../context-assembler'; describe('Context Assembler', () => { const mockIssue = { @@ -69,7 +69,35 @@ describe('Context Assembler', () => { beforeEach(() => { vi.clearAllMocks(); - mockFetchGitHubIssue.mockResolvedValue(mockIssue); + + // Mock execSync to return appropriate responses + mockExecSync.mockImplementation((cmd: string) => { + if (cmd === 'gh --version') { + return Buffer.from('gh version 2.0.0'); + } + if (cmd.toString().includes('gh issue view')) { + // Return mock issue data as JSON + return Buffer.from( + JSON.stringify({ + number: mockIssue.number, + title: mockIssue.title, + body: mockIssue.body, + state: mockIssue.state, + createdAt: mockIssue.createdAt, + updatedAt: mockIssue.updatedAt, + labels: mockIssue.labels.map((name) => ({ name })), + assignees: mockIssue.assignees.map((login) => ({ login })), + author: { login: mockIssue.author }, + comments: mockIssue.comments.map((c) => ({ + author: { login: c.author }, + body: c.body, + createdAt: c.createdAt, + })), + }) + ); + } + return Buffer.from(''); + }); }); describe('assembleContext', () => { @@ -172,9 +200,38 @@ describe('Context Assembler', () => { it('should infer relevance reasons correctly', async () => { // Mock issue with title matching a function name - mockFetchGitHubIssue.mockResolvedValueOnce({ + const customIssue = { ...mockIssue, title: 'Fix verifyToken function', + }; + + // Clear previous mock and set up new one for this test + vi.clearAllMocks(); + mockExecSync.mockImplementation((cmd: string) => { + if (cmd === 'gh --version') { + return Buffer.from('gh version 2.0.0'); + } + if (cmd.toString().includes('gh issue view')) { + return Buffer.from( + JSON.stringify({ + number: customIssue.number, + title: customIssue.title, + body: customIssue.body, + state: customIssue.state, + createdAt: customIssue.createdAt, + updatedAt: customIssue.updatedAt, + labels: customIssue.labels.map((name) => ({ name })), + assignees: customIssue.assignees.map((login) => ({ login })), + author: { login: customIssue.author }, + comments: customIssue.comments.map((c) => ({ + author: { login: c.author }, + body: c.body, + createdAt: c.createdAt, + })), + }) + ); + } + return Buffer.from(''); }); const result = await assembleContext(42, mockIndexer, '/repo'); diff --git a/packages/subagents/tsconfig.json b/packages/subagents/tsconfig.json index 0396e7f..af06662 100644 --- a/packages/subagents/tsconfig.json +++ b/packages/subagents/tsconfig.json @@ -4,9 +4,21 @@ "outDir": "./dist", "rootDir": "./src", "composite": true, + "declaration": true, + "declarationMap": true, "types": ["node", "vitest/globals"] }, - "references": [{ "path": "../core" }, { "path": "../logger" }], + "references": [ + { "path": "../core" }, + { "path": "../logger" }, + { "path": "../types" } + ], "include": ["src/**/*"], - "exclude": ["node_modules", "dist"] + "exclude": [ + "node_modules", + "dist", + "**/*.test.ts", + "**/*.spec.ts", + "**/__tests__/**" + ] } diff --git a/packages/types/README.md b/packages/types/README.md new file mode 100644 index 0000000..44222f4 --- /dev/null +++ b/packages/types/README.md @@ -0,0 +1,28 @@ +# @lytics/dev-agent-types + +Shared TypeScript type definitions for dev-agent packages. + +## Purpose + +This package provides common type definitions that are shared across multiple dev-agent packages, preventing circular dependencies and ensuring type consistency. + +## Structure + +- `github.ts` - GitHub-related types (documents, search, indexing) +- `index.ts` - Main exports + +## Usage + +```typescript +import type { GitHubDocument, GitHubSearchResult } from '@lytics/dev-agent-types/github'; +``` + +## Why a Separate Package? + +This package exists to break circular dependencies between: +- `@lytics/dev-agent-core` (services) +- `@lytics/dev-agent-subagents` (GitHub indexer, agents) +- `@lytics/dev-agent-mcp` (MCP adapters) + +By extracting shared types into a separate package that all others depend on, we maintain a clean dependency graph while ensuring type safety. + diff --git a/packages/types/package.json b/packages/types/package.json new file mode 100644 index 0000000..70b3542 --- /dev/null +++ b/packages/types/package.json @@ -0,0 +1,37 @@ +{ + "name": "@lytics/dev-agent-types", + "version": "0.1.0", + "description": "Shared TypeScript types for dev-agent packages", + "type": "module", + "main": "./dist/index.js", + "types": "./dist/index.d.ts", + "exports": { + ".": { + "types": "./dist/index.d.ts", + "import": "./dist/index.js" + }, + "./github": { + "types": "./dist/github.d.ts", + "import": "./dist/github.js" + } + }, + "files": [ + "dist" + ], + "scripts": { + "build": "tsc", + "dev": "tsc --watch", + "clean": "rm -rf dist" + }, + "dependencies": { + "@lytics/kero": "workspace:*" + }, + "devDependencies": { + "typescript": "^5.7.2" + }, + "publishConfig": { + "access": "restricted" + }, + "private": true +} + diff --git a/packages/types/src/github.ts b/packages/types/src/github.ts new file mode 100644 index 0000000..948b0c3 --- /dev/null +++ b/packages/types/src/github.ts @@ -0,0 +1,191 @@ +/** + * GitHub Types + * + * Shared type definitions for GitHub operations across dev-agent packages. + * These types are used by: + * - @lytics/dev-agent-core (GitHubService) + * - @lytics/dev-agent-subagents (GitHubIndexer, GitHubAgent) + * - @lytics/dev-agent-mcp (GitHubAdapter) + */ + +import type { Logger } from '@lytics/kero'; + +/** + * Type of GitHub document + */ +export type GitHubDocumentType = 'issue' | 'pull_request' | 'discussion'; + +/** + * GitHub document status + */ +export type GitHubState = 'open' | 'closed' | 'merged'; + +/** + * GitHub document that can be indexed + */ +export interface GitHubDocument { + type: GitHubDocumentType; + number: number; + title: string; + body: string; + state: GitHubState; + labels: string[]; + author: string; + createdAt: string; + updatedAt: string; + closedAt?: string; + url: string; + repository: string; // owner/repo format + + // For PRs only + mergedAt?: string; + headBranch?: string; + baseBranch?: string; + + // Metadata + comments: number; + reactions: Record; + + // Relationships (extracted from text) + relatedIssues: number[]; + relatedPRs: number[]; + linkedFiles: string[]; + mentions: string[]; +} + +/** + * GitHub search options + */ +export interface GitHubSearchOptions { + type?: GitHubDocumentType; + state?: GitHubState; + labels?: string[]; + author?: string; + limit?: number; + scoreThreshold?: number; + since?: string; // ISO date + until?: string; // ISO date +} + +/** + * GitHub search result + */ +export interface GitHubSearchResult { + document: GitHubDocument; + score: number; + matchedFields: string[]; // Which fields matched the query +} + +/** + * GitHub context for an issue/PR + */ +export interface GitHubContext { + document: GitHubDocument; + relatedIssues: GitHubDocument[]; + relatedPRs: GitHubDocument[]; + linkedCodeFiles: Array<{ + path: string; + reason: string; + score: number; + }>; + discussionSummary?: string; +} + +/** + * GitHub indexer configuration + */ +export interface GitHubIndexerConfig { + vectorStorePath: string; // Path to LanceDB vector storage + statePath?: string; // Path to state file (default: .dev-agent/github-state.json) + autoUpdate?: boolean; // Enable auto-updates (default: true) + staleThreshold?: number; // Stale threshold in ms (default: 15 minutes) +} + +/** + * GitHub indexer state (persisted to disk) + */ +export interface GitHubIndexerState { + version: string; // State format version + repository: string; + lastIndexed: string; // ISO date + totalDocuments: number; + byType: Record; + byState: Record; +} + +/** + * Progress information for GitHub indexing + */ +export interface GitHubIndexProgress { + phase: 'fetching' | 'enriching' | 'embedding' | 'complete'; + documentsProcessed: number; + totalDocuments: number; + percentComplete: number; +} + +/** + * GitHub indexing options + */ +export interface GitHubIndexOptions { + repository?: string; // If not provided, use current repo + types?: GitHubDocumentType[]; + state?: GitHubState[]; + since?: string; // ISO date - only index items updated after this + limit?: number; // Max items to fetch (for testing) + /** Progress callback */ + onProgress?: (progress: GitHubIndexProgress) => void; + /** Logger instance */ + logger?: Logger; +} + +/** + * GitHub indexing stats + */ +export interface GitHubIndexStats { + repository: string; + totalDocuments: number; + byType: Record; + byState: Record; + lastIndexed: string; // ISO date + indexDuration: number; // milliseconds +} + +/** + * GitHub indexer instance interface + * This represents the actual indexer implementation from subagents + */ +export interface GitHubIndexerInstance { + initialize(): Promise; + index(options?: GitHubIndexOptions): Promise; + search(query: string, options?: GitHubSearchOptions): Promise; + getDocument(number: number): Promise; + getStats(): GitHubIndexStats | null; // Synchronous in implementation + close(): Promise; +} + +/** + * GitHub fetcher response from gh CLI + */ +export interface GitHubAPIResponse { + number: number; + title: string; + body: string; + state: string; + labels: Array<{ name: string }>; + author: { login: string }; + createdAt: string; + updatedAt: string; + closedAt?: string; + url: string; + comments: Array<{ + author: { login: string }; + body: string; + createdAt: string; + }>; + reactions?: Record; + + // PR-specific fields + mergedAt?: string; + headRefName?: string; + baseRefName?: string; +} diff --git a/packages/types/src/index.ts b/packages/types/src/index.ts new file mode 100644 index 0000000..57dc0eb --- /dev/null +++ b/packages/types/src/index.ts @@ -0,0 +1,8 @@ +/** + * @lytics/dev-agent-types + * + * Shared type definitions for dev-agent packages. + */ + +// Re-export all GitHub types +export * from './github.js'; diff --git a/packages/types/tsconfig.json b/packages/types/tsconfig.json new file mode 100644 index 0000000..98adaad --- /dev/null +++ b/packages/types/tsconfig.json @@ -0,0 +1,16 @@ +{ + "extends": "../../tsconfig.json", + "compilerOptions": { + "outDir": "./dist", + "rootDir": "./src", + "composite": true, + "declaration": true, + "declarationMap": true + }, + "include": ["src/**/*"], + "exclude": ["node_modules", "dist"], + "references": [ + { "path": "../logger" } + ] +} + diff --git a/pnpm-lock.yaml b/pnpm-lock.yaml index eb93145..47870ad 100644 --- a/pnpm-lock.yaml +++ b/pnpm-lock.yaml @@ -97,6 +97,9 @@ importers: '@lancedb/lancedb': specifier: ^0.22.3 version: 0.22.3(apache-arrow@18.1.0) + '@lytics/dev-agent-types': + specifier: workspace:* + version: link:../types '@lytics/kero': specifier: workspace:* version: link:../logger @@ -214,6 +217,9 @@ importers: '@lytics/dev-agent-subagents': specifier: workspace:* version: link:../subagents + '@lytics/dev-agent-types': + specifier: workspace:* + version: link:../types '@lytics/kero': specifier: workspace:* version: link:../logger @@ -236,6 +242,9 @@ importers: '@lytics/dev-agent-core': specifier: workspace:* version: link:../core + '@lytics/dev-agent-types': + specifier: workspace:* + version: link:../types '@lytics/kero': specifier: workspace:* version: link:../logger @@ -250,6 +259,16 @@ importers: specifier: ^5.3.3 version: 5.9.3 + packages/types: + dependencies: + '@lytics/kero': + specifier: workspace:* + version: link:../logger + devDependencies: + typescript: + specifier: ^5.7.2 + version: 5.9.3 + packages: /@babel/code-frame@7.27.1: @@ -1853,7 +1872,7 @@ packages: /@types/better-sqlite3@7.6.13: resolution: {integrity: sha512-NMv9ASNARoKksWtsq/SHakpYAYnhBrQgGD8zkLYk/jaK8jUGn08CfEdTRgYhMypUQAfzSP8W6gNLe0q19/t4VA==} dependencies: - '@types/node': 24.10.1 + '@types/node': 22.19.1 dev: true /@types/chai@5.2.3: @@ -1874,7 +1893,7 @@ packages: /@types/conventional-commits-parser@5.0.2: resolution: {integrity: sha512-BgT2szDXnVypgpNxOK8aL5SGjUdaQbC++WZNjF1Qge3Og2+zhHj+RWhmehLhYyvQwqAmvezruVfOf8+3m74W+g==} dependencies: - '@types/node': 24.10.1 + '@types/node': 22.19.1 dev: true /@types/debug@4.1.12: @@ -1918,12 +1937,12 @@ packages: resolution: {integrity: sha512-LCCV0HdSZZZb34qifBsyWlUmok6W7ouER+oQIGBScS8EsZsQbrtFTUrDX4hOl+CS6p7cnNC4td+qrSVGSCTUfQ==} dependencies: undici-types: 6.21.0 - dev: true /@types/node@24.10.1: resolution: {integrity: sha512-GNWcUTRBgIRJD5zj+Tq0fKOJ5XZajIiBroOF0yvj2bSU1WvNdYS/dn9UxwsujGW4JX06dnHyjV2y9rRaybH0iQ==} dependencies: undici-types: 7.16.0 + dev: true /@types/unist@3.0.3: resolution: {integrity: sha512-ko/gIFJRv177XgZsZcBwnqJN5x/Gien8qNOn0D5bQU/zAzVf9Zt3BlcUiLqhV9y4ARk0GbT3tnUiPNgnTXzc/Q==} @@ -4120,7 +4139,7 @@ packages: '@protobufjs/pool': 1.1.0 '@protobufjs/utf8': 1.1.0 '@types/long': 4.0.2 - '@types/node': 24.10.1 + '@types/node': 22.19.1 long: 4.0.0 dev: false @@ -4802,6 +4821,7 @@ packages: /undici-types@7.16.0: resolution: {integrity: sha512-Zz+aZWSj8LE6zoxD+xrjh4VfkIG8Ya6LvYkZqtUQGJPZjYl53ypCaUwWqo7eI0x66KBGeRo+mlBEkMSeSZ38Nw==} + dev: true /unicorn-magic@0.1.0: resolution: {integrity: sha512-lRfVq8fE8gz6QMBuDM6a+LO3IAzTi05H6gCVaUpir2E1Rwpo4ZUog45KpNXKC/Mn3Yb9UDuHumeFTo9iV/D9FQ==} diff --git a/website/content/docs/architecture.mdx b/website/content/docs/architecture.mdx index 15a462d..971bad8 100644 --- a/website/content/docs/architecture.mdx +++ b/website/content/docs/architecture.mdx @@ -63,7 +63,7 @@ Command-line interface: ```bash dev index . # Index repository dev mcp install # Install MCP integration -dev gh index # Index GitHub issues/PRs +dev github index # Index GitHub issues/PRs dev status # Check indexing status ``` diff --git a/website/content/docs/cli.mdx b/website/content/docs/cli.mdx index 9521ec0..2192a68 100644 --- a/website/content/docs/cli.mdx +++ b/website/content/docs/cli.mdx @@ -108,13 +108,13 @@ Index and search GitHub issues/PRs. ```bash # Index GitHub metadata (also done by dev index) -dev gh index +dev github index # Search issues/PRs -dev gh search "authentication bug" +dev github search "authentication bug" # Get context for an issue -dev gh context --issue 42 +dev github context --issue 42 ``` ### `dev git` diff --git a/website/content/docs/tools/_meta.js b/website/content/docs/tools/_meta.js index c3547f2..a256b73 100644 --- a/website/content/docs/tools/_meta.js +++ b/website/content/docs/tools/_meta.js @@ -6,7 +6,7 @@ export default { 'dev-history': 'dev_history', 'dev-plan': 'dev_plan', 'dev-explore': 'dev_explore', - 'dev-gh': 'dev_gh', + 'dev-github': 'dev_gh', 'dev-status': 'dev_status', 'dev-health': 'dev_health', }; diff --git a/website/content/docs/tools/dev-gh.mdx b/website/content/docs/tools/dev-github.mdx similarity index 91% rename from website/content/docs/tools/dev-gh.mdx rename to website/content/docs/tools/dev-github.mdx index a64924f..d0665f7 100644 --- a/website/content/docs/tools/dev-gh.mdx +++ b/website/content/docs/tools/dev-github.mdx @@ -2,6 +2,8 @@ Search GitHub issues and pull requests using semantic search. Understand your project's history and context. +> **Note:** The CLI command is `dev github`, but the MCP tool name is `dev_gh` for brevity in AI interactions. + ## Usage ``` @@ -88,9 +90,9 @@ Find issues/PRs related to a specific one: ## Requirements -> ⚠️ **Index GitHub first.** Run `dev gh index` to enable semantic search: +> ⚠️ **Index GitHub first.** Run `dev github index` to enable semantic search: > ```bash -> dev gh index +> dev github index > ``` This fetches and indexes your repository's issues and PRs. @@ -99,7 +101,7 @@ This fetches and indexes your repository's issues and PRs. > **Search by intent, not keywords.** Try "issues about slow API responses" instead of "performance". -> **GitHub index auto-reloads.** Changes are detected automatically when you re-run `dev gh index`. +> **GitHub index auto-reloads.** Changes are detected automatically when you re-run `dev github index`. ## Response Metadata diff --git a/website/content/docs/troubleshooting.mdx b/website/content/docs/troubleshooting.mdx index 11766a2..668445d 100644 --- a/website/content/docs/troubleshooting.mdx +++ b/website/content/docs/troubleshooting.mdx @@ -110,7 +110,7 @@ Adjust score threshold: ## GitHub Integration -### `dev gh index` fails +### `dev github index` fails ```bash # Check GitHub CLI @@ -123,7 +123,7 @@ gh auth login ### Stale GitHub data ```bash -dev gh index # Re-index +dev github index # Re-index ``` ## Quick Fixes @@ -133,7 +133,7 @@ dev gh index # Re-index ```bash rm -rf ~/.dev-agent/indexes/* dev index . -dev gh index +dev github index dev mcp install --cursor ```