diff --git a/.github/pull_request_template.md b/.github/pull_request_template.md index b0ca18b..ceed015 100644 --- a/.github/pull_request_template.md +++ b/.github/pull_request_template.md @@ -7,17 +7,22 @@ Closes # -## πŸ§ͺ Testability Checklist +## πŸ§ͺ Code Standards Checklist - + - [ ] Pure functions extracted to `utils/` modules - [ ] Utilities achieve 100% coverage (statements & functions) -- [ ] No `!` non-null assertions (use guard clauses or optional chaining) +- [ ] **No type assertions (`as`, `!`)** without validation +- [ ] **Runtime validation** for external data (Zod/type guards) +- [ ] **Result types** used instead of exceptions - [ ] Modules organized by domain (not generic "utils") -- [ ] Each module < 200 lines +- [ ] Each module < 300 lines, each class < 400 lines +- [ ] Constructor injection for dependencies - [ ] Atomic commits with clear dependencies +**See:** [TypeScript Standards](../docs/TYPESCRIPT_STANDARDS.md) + ## βœ… Testing diff --git a/WORKFLOW.md b/WORKFLOW.md index c414b0f..65f8ee1 100644 --- a/WORKFLOW.md +++ b/WORKFLOW.md @@ -2,6 +2,14 @@ Standard workflow for implementing features in dev-agent. +## πŸ“š Related Documentation + +Before starting development, familiarize yourself with our coding standards: + +- **[TypeScript Standards](./docs/TYPESCRIPT_STANDARDS.md)** ⭐ **START HERE** - Our coding manifesto +- **[Feature Template](./docs/FEATURE_TEMPLATE.md)** - Step-by-step guide for new features +- **[Architecture](./ARCHITECTURE.md)** - System design and package structure + ## The Drillβ„’ ### 1. Find Next Work (Dogfooding! πŸ•πŸ½οΈ) @@ -99,6 +107,7 @@ pnpm typecheck - βœ… 100% function coverage - βœ… No linter errors - βœ… No TypeScript errors +- βœ… **Follows [TypeScript Standards](./docs/TYPESCRIPT_STANDARDS.md)** (no `as`, Result types, pure functions) - βœ… Documentation with examples ### 6. 
Commit & PR @@ -175,6 +184,156 @@ Fix MCP install error handling' > .changeset/fix-name.md - The wrapper package needs to pull in the latest CLI changes - Ensures `npm install -g dev-agent` gets all improvements +## 🎯 Commit Checkpoints (Know When to Commit) + +**Principle:** Commit when you reach a "green state" at a logical boundary. Secure working progress before entering complexity. + +### The Checkpoint Signals + +Commit when you hit **any 2** of these signals: + +#### 1. βœ… Green State +- All tests passing +- Build successful +- No linter/TypeScript errors +- **This is non-negotiable** - never commit broken code + +#### 2. 🎯 Logical Boundary +- Foundation complete (schemas, types, utils) +- Feature partially working (demo-able) +- Pattern proven (1+ examples working) +- Module/component finished + +#### 3. ⚠️ Before Complexity +- About to refactor large file (>500 lines) +- About to change core architecture +- About to touch multiple interconnected systems +- About to migrate/upgrade major dependencies + +#### 4. πŸ“Š Demonstrable Value +- Can show progress in PR review +- Reviewers can understand what changed +- Rollback would still leave useful code +- "X/Y complete" milestones (e.g., "5/9 adapters migrated") + +#### 5. 
🧠 Context Limits +- Approaching 150K+ tokens in AI session +- Been working >2 hours on single task +- About to switch tasks/contexts +- End of work session + +### Examples of Good Checkpoints + +βœ… **Foundation + Pattern Proven** +```bash +git commit -m "feat(mcp): add Zod validation to MCP adapters (5/9 complete) + +- Create schemas for all 9 adapters (247 lines, 33 tests) +- Migrate 5 adapters (eliminates ~150 lines of validation) +- Pattern proven, remaining 4 follow same approach + +All tests passing, build successful" +``` + +βœ… **Before Complexity** +```bash +git commit -m "refactor(indexer): extract pure stat merging functions + +- Extract 6 pure functions from 102-line method +- Add comprehensive tests (17 tests, 100% coverage) +- About to integrate into RepositoryIndexer class + +Foundation secure before complex integration" +``` + +βœ… **Logical Boundary** +```bash +git commit -m "feat(core): add stats metadata tracking + +- Add StatsMetadata interface +- Implement in getStats() method +- Update CLI formatters to display metadata + +Next: Incremental update merging (complex)" +``` + +### Anti-Patterns (Don't Commit) + +❌ **Broken State** +```bash +# NEVER commit this: +git commit -m "WIP: refactoring adapters, tests failing" +git commit -m "fix: half-done, will finish tomorrow" +``` + +❌ **No Value** +```bash +# Don't commit just to save work: +git commit -m "save work" +git commit -m "checkpoint" # (what's done?) +git commit -m "WIP" # (what works?) +``` + +❌ **Debug Code** +```bash +# Don't commit with: +console.log('DEBUG: ...') +// TODO: fix this later +// HACK: temporary workaround +``` + +### Quick Checkpoint Checklist + +Run before every commit: + +```bash +# 1. Quality gates +pnpm build # βœ… Builds without errors? +pnpm test # βœ… All tests pass? +pnpm typecheck # βœ… No TypeScript errors? + +# 2. Review changes +git diff --stat # πŸ“Š Reasonable change size? +git status # πŸ” All intended files staged? + +# 3. If all pass β†’ commit! 
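# Optional extra step (hypothetical guard, not part of the repo's tooling):
# sweep for the debug leftovers named in the anti-patterns section above
# before staging anything. Pattern list is illustrative; adjust to taste.
grep -rnE "console\.log\('DEBUG|// HACK:" --include='*.ts' . || echo "no debug leftovers"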
+git add -A +git commit -m "feat(scope): description..." +``` + +### Why This Matters + +**For Teams:** +- Reduces risk of losing working code +- Makes code review easier (incremental progress) +- Git bisect finds bugs faster +- Enables parallel work (others can pull partial features) + +**For AI Collaboration:** +- Context windows reset - commits are checkpoints +- Recovery is instant (just read git log) +- TODOs + commits = perfect state reconstruction +- Enables long-running refactorings (>1 session) + +**For You:** +- Sleep better (work is secured) +- Switch contexts freely (commit before leaving) +- Experiment safely (can always rollback) +- Build confidence (see progress accumulate) + +### Real Example: Zod Migration + +**Checkpoint Decision:** After migrating 5/9 adapters +- βœ… Green: All tests passing, build successful +- βœ… Logical boundary: Foundation complete, pattern proven +- βœ… Before complexity: Next 4 adapters are 690-724 lines each +- βœ… Demonstrable: "5/9 complete, 150 lines eliminated" +- βœ… Context: At 115K tokens, approaching limit + +**Result:** Committed working state. If next adapters break, we can rollback to this checkpoint. + +--- + ## Commit Message Format ### Structure diff --git a/docs/FEATURE_TEMPLATE.md b/docs/FEATURE_TEMPLATE.md index 4dd3b31..fa6f828 100644 --- a/docs/FEATURE_TEMPLATE.md +++ b/docs/FEATURE_TEMPLATE.md @@ -260,24 +260,16 @@ Before submitting your feature: --- -## πŸ“š Examples +## πŸ“š Examples & Resources -See these implementations: +**Real implementations:** +1. `packages/subagents/src/explorer/` - 99 tests, 100% on utilities +2. `packages/core/src/indexer/stats-merger.ts` - 17 tests, pure functions +3. `packages/cli/src/utils/date-utils.ts` - 18 tests, 100% coverage -1. **Explorer Subagent** - - Path: `packages/subagents/src/explorer/` - - 99 tests, 100% on utilities - - 4 domain modules: metadata, filters, relationships, analysis - -2. 
**Repository Indexer** - - Path: `packages/core/src/indexer/` - - 87 tests on utilities - - 3 domain modules: language, formatting, documents - -3. **Subagent Coordinator** - - Path: `packages/subagents/src/coordinator/` - - Context manager, task queue, message protocol - - High test coverage with mocks where needed +**Documentation:** +- [TYPESCRIPT_STANDARDS.md](./TYPESCRIPT_STANDARDS.md) - Our coding manifesto +- [REFACTORING_SUMMARY.md](./REFACTORING_SUMMARY.md) - Recent refactoring example --- diff --git a/docs/README.md b/docs/README.md index 5f008ed..1058382 100644 --- a/docs/README.md +++ b/docs/README.md @@ -1,20 +1,125 @@ -# Dev-Agent Documentation +# Documentation Index -This directory contains comprehensive documentation for the Dev-Agent project. +Welcome to the dev-agent documentation! This index helps you find the right doc for your needs. -## Contents +--- -- [Getting Started](./getting-started.md) - Installation and basic usage -- [Architecture](./architecture.md) - System architecture and components -- [Configuration](./configuration.md) - Configuration options -- [Integrations](./integrations.md) - Integration with AI tools -- [API Reference](./api-reference.md) - API documentation -- [Subagents](./subagents.md) - Working with subagents +## πŸš€ Getting Started -## Quick Start +- **[AGENTS.md](../AGENTS.md)** - AI agent guidance for working with this codebase +- **[CLAUDE.md](../CLAUDE.md)** - Claude Code specific integration guide +- **[README.md](../README.md)** - Project overview and quick start -See the [Getting Started](./getting-started.md) guide for installation and basic usage instructions. +--- -## Contributing +## πŸ“ Architecture & Design -For contribution guidelines, see the [Contributing](./contributing.md) document. 
\ No newline at end of file +- **[ARCHITECTURE.md](../ARCHITECTURE.md)** - System architecture, design decisions +- **[LANGUAGE_SUPPORT.md](../LANGUAGE_SUPPORT.md)** - Supported languages and parsers + +--- + +## πŸ’» Development + +### Core Standards (Read These First!) + +- **[TYPESCRIPT_STANDARDS.md](./TYPESCRIPT_STANDARDS.md)** ⭐ **START HERE - Our Manifesto** + - Pure function extraction + - Runtime validation (Zod) + - Result types vs exceptions + - Dependency injection + - Testing requirements + - Module organization + +- **[WORKFLOW.md](../WORKFLOW.md)** ⭐ **Essential** + - Branch strategy + - Commit standards + - PR process + - Testing requirements + +### Feature Development + +- **[FEATURE_TEMPLATE.md](./FEATURE_TEMPLATE.md)** + - Step-by-step guide for new features + - Structure and organization + - Testing strategy + - Checklist and examples + +--- + +## πŸ§ͺ Testing & Quality + +- **[TYPESCRIPT_STANDARDS.md](./TYPESCRIPT_STANDARDS.md)** - Testing philosophy, coverage targets, patterns + +--- + +## 🀝 Contributing + +- **[CONTRIBUTING.md](../CONTRIBUTING.md)** - How to contribute +- **[CODE_OF_CONDUCT.md](../CODE_OF_CONDUCT.md)** - Community standards +- **[WORKFLOW.md](../WORKFLOW.md)** - Development workflow + +--- + +## πŸ”§ Operations + +- **[TROUBLESHOOTING.md](../TROUBLESHOOTING.md)** - Common issues and solutions +- **[SECURITY.md](../SECURITY.md)** - Security policy + +--- + +## πŸ“ Other Resources + +- **[CHANGELOG.md](../CHANGELOG.md)** - Version history +- **[PLAN.md](../PLAN.md)** - Project roadmap + +--- + +## 🎯 Quick Reference by Task + +### "I'm implementing a new feature" +1. Read [TYPESCRIPT_STANDARDS.md](./TYPESCRIPT_STANDARDS.md) (our manifesto) +2. Follow [WORKFLOW.md](../WORKFLOW.md) +3. Use [FEATURE_TEMPLATE.md](./FEATURE_TEMPLATE.md) for structure + +### "I'm refactoring existing code" +1. Apply [TYPESCRIPT_STANDARDS.md](./TYPESCRIPT_STANDARDS.md) patterns +2. 
See [REFACTORING_SUMMARY.md](./REFACTORING_SUMMARY.md) for example + +### "I'm fixing a bug" +1. Follow [WORKFLOW.md](../WORKFLOW.md#bug-fixes) +2. Add regression tests +3. Document in [TROUBLESHOOTING.md](../TROUBLESHOOTING.md) if common + +### "I'm reviewing a PR" +1. Check [.github/pull_request_template.md](../.github/pull_request_template.md) +2. Verify [TYPESCRIPT_STANDARDS.md](./TYPESCRIPT_STANDARDS.md) compliance + +### "I'm using AI to help code" +1. Share [AGENTS.md](../AGENTS.md) and [TYPESCRIPT_STANDARDS.md](./TYPESCRIPT_STANDARDS.md) with AI +2. Use [FEATURE_TEMPLATE.md](./FEATURE_TEMPLATE.md) for structure + +--- + +## πŸ“š External Resources + +- [TypeScript Handbook](https://www.typescriptlang.org/docs/handbook/intro.html) +- [Zod Documentation](https://zod.dev/) +- [Vitest Guide](https://vitest.dev/guide/) +- [Turborepo Docs](https://turbo.build/repo/docs) + +--- + +## ❓ Questions? + +If you can't find what you're looking for: + +1. Check this index again +2. Search the repository +3. Ask in GitHub Discussions +4. Create an issue with label `documentation` + +--- + +**Last Updated:** 2024-12-12 +**Maintained By:** dev-agent contributors diff --git a/docs/TESTABILITY.md b/docs/TESTABILITY.md deleted file mode 100644 index 5de7312..0000000 --- a/docs/TESTABILITY.md +++ /dev/null @@ -1,350 +0,0 @@ -# Testability Guidelines - -This document outlines our approach to writing testable, maintainable code in the dev-agent monorepo. - -## Philosophy - -> **"If it's hard to test, it's hard to use."** - -Testability is not just about code coverageβ€”it's about **designing modular, reusable, and understandable code**. - ---- - -## Core Principles - -### 1. 
**Extract Pure Functions** - -❌ **BAD: Inline logic in large classes** -```typescript -class MyService { - private formatData(data: Data): string { - // 50 lines of formatting logic - } - - private validateData(data: Data): boolean { - // 30 lines of validation logic - } -} -``` - -βœ… **GOOD: Extract to testable utility modules** -```typescript -// utils/formatting.ts -export function formatData(data: Data): string { - // 50 lines of formatting logic -} - -// utils/validation.ts -export function validateData(data: Data): boolean { - // 30 lines of validation logic -} - -// service.ts -import { formatData } from './utils/formatting'; -import { validateData } from './utils/validation'; - -class MyService { - // Uses utilities, no private implementation -} -``` - -**Why?** -- βœ… Direct unit tests (no class instantiation needed) -- βœ… Reusable across modules -- βœ… Tree-shakeable for bundlers -- βœ… Easy to understand (SRP) - ---- - -### 2. **Organize by Domain** - -❌ **BAD: Monolithic utils file** -``` -utils.ts (500 lines) -β”œβ”€β”€ String helpers -β”œβ”€β”€ Date helpers -β”œβ”€β”€ Validation helpers -└── Formatting helpers -``` - -βœ… **GOOD: Domain-specific modules** -``` -utils/ -β”œβ”€β”€ strings.ts (50 lines, 10 tests) -β”œβ”€β”€ dates.ts (60 lines, 12 tests) -β”œβ”€β”€ validation.ts (80 lines, 15 tests) -β”œβ”€β”€ formatting.ts (70 lines, 13 tests) -└── index.ts (barrel export) -``` - -**Why?** -- βœ… Clear boundaries (SRP) -- βœ… Easy to navigate -- βœ… Isolated testing -- βœ… Parallel development - ---- - -### 3. **100% Coverage on Utilities** - -Pure utility modules should have **100% coverage** as they: -- Have no side effects -- Are easy to test -- Form the foundation for integration logic - -**Coverage Targets:** -- **Utilities**: 100% statements, 100% functions, >90% branches -- **Integration**: >80% statements, >70% branches -- **CLI/UI**: >60% (harder to test, more mocks) - ---- - -### 4. **No Non-Null Assertions** - -❌ **BAD: Using ! 
assertions** -```typescript -function process(data: Data | undefined) { - const result = data!.value; // Unsafe! -} -``` - -βœ… **GOOD: Guard clauses or optional chaining** -```typescript -function process(data: Data | undefined) { - if (!data) { - throw new Error('Data is required'); - } - return data.value; // Type-safe -} - -// Or use optional chaining -function process(data: Data | undefined): string | undefined { - return data?.value; -} -``` - ---- - -### 5. **Dependency Order in Commits** - -When extracting utilities, commit in dependency order: - -``` -Commit 1: Foundation (no dependencies) - ↓ -Commit 2: Independent utilities - ↓ -Commit 3: Dependent utilities (use foundation) - ↓ -Commit 4: Integration (wire everything together) -``` - -**Example: Indexer Refactoring** -```bash -1. language.ts (foundation - no deps) -2. formatting.ts (independent - no deps) -3. documents.ts (depends on formatting.ts) -4. Integration (update imports, remove old code) -``` - ---- - -## Practical Checklist - -Before merging code, ask: - -### βœ… **Extraction Checklist** - -- [ ] Are private methods >20 lines? β†’ Extract to utils -- [ ] Is logic reusable? β†’ Extract to utils -- [ ] Can I test this directly? β†’ If no, extract -- [ ] Does this have side effects? β†’ Separate pure/impure -- [ ] Is this module >300 lines? 
β†’ Split by domain - -### βœ… **Testing Checklist** - -- [ ] 100% coverage on pure functions -- [ ] No mocks for utility tests -- [ ] Integration tests for side effects -- [ ] Edge cases covered (empty, null, boundary) -- [ ] Error paths tested - -### βœ… **Organization Checklist** - -- [ ] Utils organized by domain (not "misc") -- [ ] Barrel export (`index.ts`) for clean imports -- [ ] Each module <150 lines -- [ ] Each test file <400 lines -- [ ] Clear dependency relationships - ---- - -## Real-World Example: Explorer Subagent - -### **Before Refactoring:** -```typescript -// explorer/index.ts (380 lines) -class ExplorerAgent { - private extractMetadata(result: SearchResult) { /* ... */ } - private matchesFileType(result: SearchResult, types: string[]) { /* ... */ } - private isDuplicate(rels: Rel[], file: string, line: number) { /* ... */ } - // 15+ helper methods inline -} -``` - -**Problems:** -- ❌ Can't test helpers directly -- ❌ 57% function coverage -- ❌ Hard to reuse logic - -### **After Refactoring:** -```typescript -// explorer/utils/ -// β”œβ”€β”€ metadata.ts (54 lines, 8 tests, 100% coverage) -// β”œβ”€β”€ filters.ts (42 lines, 15 tests, 100% coverage) -// β”œβ”€β”€ relationships.ts (63 lines, 16 tests, 100% coverage) -// β”œβ”€β”€ analysis.ts (64 lines, 27 tests, 100% coverage) -// └── index.ts (barrel) - -// explorer/index.ts (now 360 lines, cleaner) -import { extractMetadata, matchesFileType } from './utils'; - -class ExplorerAgent { - // Uses utilities, no inline helpers -} -``` - -**Benefits:** -- βœ… 99 unit tests (vs. 33 integration only) -- βœ… 100% coverage on utilities -- βœ… 80% function coverage overall -- βœ… Logic reusable in CLI - ---- - -## When NOT to Extract - -Don't extract everything blindly. Keep logic inline when: - -1. **Tightly coupled to class state** (uses multiple `this.*`) -2. **Very short** (<10 lines, simple) -3. **Used once** and not complex -4. 
**Side effects required** (file I/O, network, state mutation) - -**Example of OK inline logic:** -```typescript -class Service { - private isInitialized(): boolean { - return this.state !== null; // Simple, uses this.state - } -} -``` - ---- - -## Tooling & Automation - -### **1. Pre-commit Hooks** -```bash -# Already configured in .husky/pre-commit -- Biome linting (catches unused code) -- TypeScript type checking -- Test runs (optional, for speed) -``` - -### **2. CI Coverage Enforcement** -```yaml -# .github/workflows/ci.yml -- name: Check Coverage - run: | - pnpm vitest run --coverage - # Enforce thresholds: - # - Utils: 100% - # - Integration: 80% -``` - -### **3. Code Review Checklist** -Use this in PR descriptions: - -```markdown -## Testability Checklist -- [ ] Utilities extracted where appropriate -- [ ] 100% coverage on pure functions -- [ ] No non-null assertions (!) -- [ ] Domain-specific organization -- [ ] Atomic commits with clear dependencies -``` - ---- - -## Migration Guide - -### **Step 1: Identify Candidates** -```bash -# Find large files with low coverage -pnpm vitest run --coverage - -# Look for: -# - Files >300 lines -# - Coverage <80% -# - Many private methods -``` - -### **Step 2: Extract Utilities** -```bash -# 1. Create utils/ directory -mkdir -p src/myfeature/utils - -# 2. Extract by domain (foundation first) -# 3. Write tests (aim for 100%) -# 4. Update imports -# 5. 
Remove old code -``` - -### **Step 3: Commit Strategy** -```bash -# Commit 1: Foundation utilities -git commit -m "feat(feature): add [foundation] utilities" - -# Commit 2: Dependent utilities -git commit -m "feat(feature): add [dependent] utilities" - -# Commit 3: Integration -git commit -m "refactor(feature): integrate modular utils" -``` - ---- - -## Success Metrics - -Track these over time: - -| Metric | Target | Current | -|--------|--------|---------| -| **Utils Coverage** | 100% | 100% βœ… | -| **Integration Coverage** | >80% | 76% 🟑 | -| **Avg Module Size** | <200 lines | ~180 βœ… | -| **Test/Code Ratio** | >1.5 | 1.7 βœ… | - ---- - -## References - -- **Example:** `packages/subagents/src/explorer/utils/` (99 tests, 100% coverage) -- **Example:** `packages/core/src/indexer/utils/` (87 tests, 100% coverage) -- **Style Guide:** [ARCHITECTURE.md](./ARCHITECTURE.md) -- **Contributing:** [CONTRIBUTING.md](./CONTRIBUTING.md) - ---- - -## Questions? - -- **"Should I extract this?"** β†’ If you're asking, probably yes. -- **"How small is too small?"** β†’ <10 lines inline is OK. -- **"100% coverage is too hard"** β†’ Only for pure utilities. Integration can be 80%. -- **"This feels like over-engineering"** β†’ Testability = usability. If it's easy to test, it's easy to use. - ---- - -**Remember:** Future you (and your teammates) will thank you for writing testable code! πŸ™ - diff --git a/docs/TYPESCRIPT_STANDARDS.md b/docs/TYPESCRIPT_STANDARDS.md new file mode 100644 index 0000000..b1e17d5 --- /dev/null +++ b/docs/TYPESCRIPT_STANDARDS.md @@ -0,0 +1,272 @@ +# TypeScript Standards & Manifesto + +> **Our philosophy:** If it's hard to test, it's hard to use. Write code that's obvious to humans and AI. + +--- + +## The Core Rules + +### 1. Extract Pure Functions First + +**Classes hide complexity. 
Pure functions expose it.** + +❌ **BAD:** +```typescript +class StatsManager { + private mergeStats(current, incremental) { /* 100 lines */ } +} +// Can't test without instantiating class +``` + +✅ **GOOD:** +```typescript +// utils/stats-merger.ts +export function mergeStats(current: Stats, incremental: Stats): Stats { + return { files: current.files + incremental.files, ... }; +} +// Direct test: expect(mergeStats(a, b)).toEqual(expected) +``` + +**When?** If it's >20 lines, pure (no side effects), or reusable → extract it. + +--- + +### 2. No Type Assertions Without Validation + +**TypeScript types vanish at runtime. Validate external data.** + +❌ **BAD (found in old codebase):** +```typescript +const request = message.payload as unknown as ExplorationRequest; +// Runtime bomb waiting to happen +``` + +✅ **GOOD (Real example from MCP adapters):** +```typescript +import { z } from 'zod'; +import { validateArgs } from './validation.js'; + +// Define schema once +const ExploreArgsSchema = z.object({ + action: z.enum(['pattern', 'similar', 'relationships']), + query: z.string().min(1), + limit: z.number().int().min(1).max(100).default(10), + threshold: z.number().min(0).max(1).default(0.7), +}).strict(); // Reject unknown properties + +// Use in adapter +async execute(args: Record<string, unknown>): Promise<CallToolResult> { + const validation = validateArgs(ExploreArgsSchema, args); + if (!validation.success) { + return validation.error; // Detailed error with field paths + } + + // TypeScript knows exact types! Zero assertions needed. + const { action, query, limit, threshold } = validation.data; + // ^'pattern'|'similar'|'relationships' ^string ^number ^number +} +``` + +**Benefits:** +- ✅ Zero type assertions (`as`, `!`) +- ✅ Automatic TypeScript type inference +- ✅ Runtime validation with detailed errors +- ✅ Schemas are testable (see `packages/mcp-server/src/schemas/__tests__`) +- ✅ ~63% less validation code vs.
manual checks + +**Rule:** Never use `as`, `as unknown as`, or `!` without runtime checks. + +--- + +### 3. Result Types, Not Exceptions + +**Exceptions are invisible in type signatures. Result types are explicit.** + +❌ **BAD:** +```typescript +async function fetchUser(id: string): Promise<User> { + if (!valid) throw new Error('Invalid'); + if (!found) throw new Error('Not found'); + return user; +} +// Forces try-catch, unclear what can fail +``` + +✅ **GOOD:** +```typescript +type Result<T, E = Error> = + | { ok: true; value: T } + | { ok: false; error: E }; + +async function fetchUser(id: string): Promise<Result<User, { code: string }>> { + if (!valid) return { ok: false, error: { code: 'INVALID_ID' } }; + if (!found) return { ok: false, error: { code: 'NOT_FOUND' } }; + return { ok: true, value: user }; +} + +// Clean usage +const result = await fetchUser('123'); +if (!result.ok) { + logger.error(result.error); + return; +} +const user = result.value; // Type-safe +``` + +**When to throw:** Only for programmer errors (`throw new Error('INVARIANT: ...')`) + +--- + +### 4.
Inject Dependencies + +**Hard-coded dependencies = untestable code.** + +❌ **BAD:** +```typescript +class PlannerAgent { + async createPlan() { + const github = new GitHubClient(); // Can't mock + } +} +``` + +✅ **GOOD:** +```typescript +interface PlannerDeps { + github: GitHubClient; + indexer: RepositoryIndexer; +} + +class PlannerAgent { + constructor(private deps: PlannerDeps) {} +} + +// Testing is easy +new PlannerAgent({ github: mockGitHub, indexer: mockIndexer }); +``` + +--- + +## Size Limits + +- **Modules:** < 300 lines → Split by domain +- **Classes:** < 400 lines → Use Strategy pattern +- **Functions:** < 50 lines → Extract helpers + +**Current refactoring targets:** +- `explore-adapter.ts` (690 lines) +- `github-adapter.ts` (724 lines) +- `coordinator.ts` (480 lines) + +--- + +## Organization + +``` +src/ +├── utils/ # Pure functions +│ ├── validation.ts (120 lines, 100% coverage) +│ ├── validation.test.ts +│ ├── formatting.ts (150 lines, 100% coverage) +│ ├── formatting.test.ts +│ └── index.ts (barrel export) +└── index.ts (integration) +``` + +**Rules:** +- Organize by domain (not "misc") +- Colocate tests with source +- Barrel exports (`index.ts`) +- Each module < 300 lines + +--- + +## Testing Requirements + +| Type | Coverage | Why | +|------|----------|-----| +| **Pure utilities** | 100% | Easy to test, no excuses | +| **Integration** | 80%+ | Side effects, mocks needed | +| **CLI/UI** | 60%+ | Harder to test | + +**Test factories:** +```typescript +// __tests__/factories.ts +export function createMessage(overrides?: Partial<Message>): Message { + return { id: randomUUID(), type: 'request', ...overrides }; +} + +// Use in tests +const msg = createMessage({ type: 'response' }); +``` + +--- + +## Error Handling + +**Standard error format:** +```typescript +interface AppError { + code: string; // 'NOT_FOUND', 'VALIDATION_ERROR' + message: string; // Human-readable + details?: unknown; // Context + recoverable:
boolean; // Can retry? + suggestion?: string; // What to do +} +``` + +--- + +## Commit Checklist + +Before committing: +- [ ] No `as`, `as unknown as`, or `!` +- [ ] External data validated (Zod) +- [ ] Result types for expected failures +- [ ] Dependencies injected +- [ ] Pure functions extracted +- [ ] 100% coverage on utilities +- [ ] Modules < 300 lines, classes < 400 lines + +--- + +## Real Example: Recent Refactoring + +**Before:** 102-line `mergeIncrementalStats()` method with mutations +**After:** 6 pure functions in `stats-merger.ts` (225 lines, 17 tests, 100% coverage) + +**Impact:** +- Tests run in <1ms (no setup) +- No side effects to track +- Reusable across packages +- AI can understand each function + +See: `docs/REFACTORING_SUMMARY.md` + +--- + +## Tools + +```bash +# Runtime validation +pnpm add zod + +# Result type +pnpm add neverthrow # or implement your own + +# Property-based testing +pnpm add -D fast-check +``` + +--- + +## Resources + +- [Zod Documentation](https://zod.dev/) +- [Result Type Pattern](https://github.com/supermacro/neverthrow) +- [FEATURE_TEMPLATE.md](./FEATURE_TEMPLATE.md) - Implementation guide +- [WORKFLOW.md](../WORKFLOW.md) - Git workflow + +--- + +**Questions?** If you're unsure whether to extract something: **extract it.** diff --git a/packages/cli/package.json b/packages/cli/package.json index 0ca77c8..c704332 100644 --- a/packages/cli/package.json +++ b/packages/cli/package.json @@ -43,8 +43,10 @@ "@lytics/dev-agent-subagents": "workspace:*", "@lytics/kero": "workspace:*", "chalk": "^5.6.2", + "cli-table3": "^0.6.5", "commander": "^12.1.0", - "ora": "^8.0.1" + "ora": "^8.0.1", + "terminal-size": "^4.0.0" }, "devDependencies": { "@types/node": "^22.0.0", diff --git a/packages/cli/src/cli.ts b/packages/cli/src/cli.ts index 51ef6cb..d6b31e9 100644 --- a/packages/cli/src/cli.ts +++ b/packages/cli/src/cli.ts @@ -4,6 +4,7 @@ import chalk from 'chalk'; import { Command } from 'commander'; import { cleanCommand } from 
'./commands/clean.js'; import { compactCommand } from './commands/compact.js'; +import { dashboardCommand } from './commands/dashboard.js'; import { exploreCommand } from './commands/explore.js'; import { ghCommand } from './commands/gh.js'; import { gitCommand } from './commands/git.js'; @@ -37,6 +38,7 @@ program.addCommand(ghCommand); program.addCommand(gitCommand); program.addCommand(updateCommand); program.addCommand(statsCommand); +program.addCommand(dashboardCommand); program.addCommand(compactCommand); program.addCommand(cleanCommand); program.addCommand(storageCommand); diff --git a/packages/cli/src/commands/commands.test.ts b/packages/cli/src/commands/commands.test.ts index e18f34c..5b51460 100644 --- a/packages/cli/src/commands/commands.test.ts +++ b/packages/cli/src/commands/commands.test.ts @@ -100,12 +100,11 @@ export class Calculator { }` ); - // Capture logger output + // Capture console output (used by output.log) const loggedMessages: string[] = []; - const loggerModule = await import('../utils/logger.js'); - const originalLog = loggerModule.logger.log; - vi.spyOn(loggerModule.logger, 'log').mockImplementation((msg: string) => { - loggedMessages.push(msg); + const originalConsoleLog = console.log; + vi.spyOn(console, 'log').mockImplementation((...args: unknown[]) => { + loggedMessages.push(args.join(' ')); }); // Mock process.exit to prevent test termination @@ -119,12 +118,15 @@ export class Calculator { await program.parseAsync(['node', 'cli', 'index', indexDir, '--no-git', '--no-github']); exitSpy.mockRestore(); - loggerModule.logger.log = originalLog; + console.log = originalConsoleLog; - // Verify storage size is in the output - const storageSizeLog = loggedMessages.find((msg) => msg.includes('Storage size:')); + // Verify storage size is in the output (new compact format shows it after duration) + const storageSizeLog = loggedMessages.find( + (msg) => msg.includes('Duration:') || msg.includes('Storage:') + ); 
expect(storageSizeLog).toBeDefined(); - expect(storageSizeLog).toMatch(/Storage size:.*\d+(\.\d+)?\s*(B|KB|MB|GB)/); + // Check for storage size in compact format: "Duration: X β€’ Storage: Y" + expect(loggedMessages.some((msg) => /\d+(\.\d+)?\s*(B|KB|MB|GB)/.test(msg))).toBe(true); }, 30000); // 30s timeout for indexing }); }); diff --git a/packages/cli/src/commands/dashboard.ts b/packages/cli/src/commands/dashboard.ts new file mode 100644 index 0000000..2cbb595 --- /dev/null +++ b/packages/cli/src/commands/dashboard.ts @@ -0,0 +1,20 @@ +/** + * Dashboard command - alias for enhanced stats + */ + +import { Command } from 'commander'; + +/** + * Dashboard command is an alias for stats with enhanced formatting + * Implemented as a simple alias that imports and re-exports stats functionality + */ +export const dashboardCommand = new Command('dashboard') + .description('Display interactive repository dashboard') + .summary('Show enhanced repository statistics and insights') + .option('--json', 'Output stats as JSON', false) + .action(async (options) => { + // Import stats command action dynamically to avoid circular dependency + const { statsCommand } = await import('./stats.js'); + // Execute stats command with the same options + await statsCommand.parseAsync(['node', 'dev', 'stats', ...(options.json ? 
['--json'] : [])]); + }); diff --git a/packages/cli/src/commands/gh.ts b/packages/cli/src/commands/gh.ts index 58e0899..b95c19f 100644 --- a/packages/cli/src/commands/gh.ts +++ b/packages/cli/src/commands/gh.ts @@ -9,7 +9,9 @@ import { createLogger } from '@lytics/kero'; import chalk from 'chalk'; import { Command } from 'commander'; import ora from 'ora'; +import { formatNumber } from '../utils/formatters.js'; import { keroLogger, logger } from '../utils/logger.js'; +import { output } from '../utils/output.js'; /** * Create GitHub indexer with centralized storage @@ -99,23 +101,18 @@ export const ghCommand = new Command('gh') }, }); - spinner.succeed(chalk.green('GitHub data indexed!')); + spinner.stop(); - // Display stats - logger.log(''); - logger.log(chalk.bold('Indexing Stats:')); - logger.log(` Repository: ${chalk.cyan(stats.repository)}`); - logger.log(` Total: ${chalk.yellow(stats.totalDocuments)} documents`); + // Compact summary + const issues = stats.byType.issue || 0; + const prs = stats.byType.pull_request || 0; + const duration = (stats.indexDuration / 1000).toFixed(2); - if (stats.byType.issue) { - logger.log(` Issues: ${stats.byType.issue}`); - } - if (stats.byType.pull_request) { - logger.log(` Pull Requests: ${stats.byType.pull_request}`); - } - - logger.log(` Duration: ${stats.indexDuration}ms`); - logger.log(''); + output.log(''); + output.success(`Indexed ${formatNumber(stats.totalDocuments)} GitHub documents`); + output.log(` ${chalk.gray('Repository:')} ${chalk.bold(stats.repository)}`); + output.log(` ${issues} issues β€’ ${prs} PRs β€’ ${duration}s`); + output.log(''); } catch (error) { spinner.fail('Indexing failed'); logger.error((error as Error).message); diff --git a/packages/cli/src/commands/index.ts b/packages/cli/src/commands/index.ts index c0d98e0..0796355 100644 --- a/packages/cli/src/commands/index.ts +++ b/packages/cli/src/commands/index.ts @@ -18,6 +18,7 @@ import ora from 'ora'; import { getDefaultConfig, loadConfig } from 
'../utils/config.js'; import { formatBytes, getDirectorySize } from '../utils/file.js'; import { createIndexLogger, logger } from '../utils/logger.js'; +import { formatIndexSummary, output } from '../utils/output.js'; /** * Check if a command is available @@ -165,20 +166,10 @@ export const indexCommand = new Command('index') await indexer.close(); - const codeDuration = ((Date.now() - startTime) / 1000).toFixed(2); + const codeDuration = (Date.now() - startTime) / 1000; spinner.succeed(chalk.green('Code indexed successfully!')); - // Show code stats - logger.log(''); - logger.log(chalk.bold('Code Indexing:')); - logger.log(` ${chalk.cyan('Files scanned:')} ${stats.filesScanned}`); - logger.log(` ${chalk.cyan('Documents extracted:')} ${stats.documentsExtracted}`); - logger.log(` ${chalk.cyan('Documents indexed:')} ${stats.documentsIndexed}`); - logger.log(` ${chalk.cyan('Vectors stored:')} ${stats.vectorsStored}`); - logger.log(` ${chalk.cyan('Storage size:')} ${formatBytes(storageSize)}`); - logger.log(` ${chalk.cyan('Duration:')} ${codeDuration}s`); - // Index git history if available let gitStats = { commitsIndexed: 0, durationMs: 0 }; if (canIndexGit) { @@ -206,12 +197,6 @@ export const indexCommand = new Command('index') await gitVectorStore.close(); spinner.succeed(chalk.green('Git history indexed!')); - logger.log(''); - logger.log(chalk.bold('Git History:')); - logger.log(` ${chalk.cyan('Commits indexed:')} ${gitStats.commitsIndexed}`); - logger.log( - ` ${chalk.cyan('Duration:')} ${(gitStats.durationMs / 1000).toFixed(2)}s` - ); } // Index GitHub issues/PRs if available @@ -238,33 +223,50 @@ export const indexCommand = new Command('index') }, }); spinner.succeed(chalk.green('GitHub indexed!')); - logger.log(''); - logger.log(chalk.bold('GitHub:')); - logger.log(` ${chalk.cyan('Issues/PRs indexed:')} ${ghStats.totalDocuments}`); - logger.log( - ` ${chalk.cyan('Duration:')} ${(ghStats.indexDuration / 1000).toFixed(2)}s` - ); } - const totalDuration = 
((Date.now() - startTime) / 1000).toFixed(2); - - logger.log(''); - logger.log(chalk.bold('Summary:')); - logger.log(` ${chalk.cyan('Total duration:')} ${totalDuration}s`); - logger.log(` ${chalk.cyan('Storage:')} ${storagePath}`); + const totalDuration = (Date.now() - startTime) / 1000; + + // Compact summary output + output.log(''); + output.log( + formatIndexSummary({ + code: { + files: stats.filesScanned, + documents: stats.documentsIndexed, + vectors: stats.vectorsStored, + duration: codeDuration, + size: formatBytes(storageSize), + }, + git: canIndexGit + ? { commits: gitStats.commitsIndexed, duration: gitStats.durationMs / 1000 } + : undefined, + github: canIndexGitHub + ? { documents: ghStats.totalDocuments, duration: ghStats.indexDuration / 1000 } + : undefined, + total: { + duration: totalDuration, + storage: storagePath, + }, + }) + ); + // Show errors if any if (stats.errors.length > 0) { - logger.log(''); - logger.warn(`${stats.errors.length} error(s) occurred during indexing`); + output.log(''); + output.warn(`${stats.errors.length} error(s) occurred during indexing`); if (options.verbose) { for (const error of stats.errors) { - logger.error(` ${error.file}: ${error.message}`); + output.log(` ${chalk.gray(error.file)}: ${error.message}`); } + } else { + output.log( + ` ${chalk.gray('Run with')} ${chalk.cyan('--verbose')} ${chalk.gray('to see details')}` + ); } } - logger.log(''); - logger.log(`Now you can search with: ${chalk.yellow('dev search ""')}`); + output.log(''); } catch (error) { spinner.fail('Failed to index repository'); logger.error(error instanceof Error ? 
error.message : String(error)); diff --git a/packages/cli/src/commands/init.ts b/packages/cli/src/commands/init.ts index 53ae6f3..c950eeb 100644 --- a/packages/cli/src/commands/init.ts +++ b/packages/cli/src/commands/init.ts @@ -2,7 +2,7 @@ import chalk from 'chalk'; import { Command } from 'commander'; import ora from 'ora'; import { getDefaultConfig, saveConfig } from '../utils/config.js'; -import { logger } from '../utils/logger.js'; +import { output } from '../utils/output.js'; export const initCommand = new Command('init') .description('Initialize dev-agent in the current directory') @@ -16,19 +16,23 @@ spinner.text = 'Creating configuration file...'; await saveConfig(config, options.path); - spinner.succeed(chalk.green('Dev-agent initialized successfully!')); + spinner.stop(); - logger.log(''); - logger.log(chalk.bold('Next steps:')); - logger.log(` ${chalk.cyan('1.')} Run ${chalk.yellow('dev index')} to index your repository`); - logger.log( - ` ${chalk.cyan('2.')} Run ${chalk.yellow('dev search "<query>"')} to search your code` + // Clean output without timestamps + output.log(''); + output.success('Initialized dev-agent'); + output.log(''); + output.log(chalk.bold('Next steps:')); + output.log(` ${chalk.cyan('1.')} Run ${chalk.cyan('dev index')} to index your repository`); + output.log( + ` ${chalk.cyan('2.')} Run ${chalk.cyan('dev search "<query>"')} to search your code` ); - logger.log(''); - logger.log(`Configuration saved to ${chalk.cyan('.dev-agent.json')}`); + output.log(''); + output.log(` ${chalk.gray('Config saved:')} ${chalk.cyan('.dev-agent.json')}`); + output.log(''); } catch (error) { spinner.fail('Failed to initialize dev-agent'); - logger.error(error instanceof Error ? error.message : String(error)); + output.error(error instanceof Error ?
error.message : String(error)); process.exit(1); } }); diff --git a/packages/cli/src/commands/search.ts b/packages/cli/src/commands/search.ts index 8aa0311..e4e63b1 100644 --- a/packages/cli/src/commands/search.ts +++ b/packages/cli/src/commands/search.ts @@ -10,6 +10,7 @@ import { Command } from 'commander'; import ora from 'ora'; import { loadConfig } from '../utils/config.js'; import { logger } from '../utils/logger.js'; +import { formatSearchResults, output } from '../utils/output.js'; export const searchCommand = new Command('search') .description('Search indexed code semantically') @@ -17,6 +18,7 @@ export const searchCommand = new Command('search') .option('-l, --limit ', 'Maximum number of results', '10') .option('-t, --threshold ', 'Minimum similarity score (0-1)', '0.7') .option('--json', 'Output results as JSON', false) + .option('-v, --verbose', 'Show detailed results with signatures and docs', false) .action(async (query: string, options) => { const spinner = ora('Searching...').start(); @@ -59,14 +61,15 @@ export const searchCommand = new Command('search') await indexer.close(); - spinner.succeed(chalk.green(`Found ${results.length} result(s)`)); + spinner.stop(); if (results.length === 0) { - logger.log(''); - logger.warn('No results found. Try:'); - logger.log(` - Lowering the threshold: ${chalk.yellow('--threshold 0.5')}`); - logger.log(` - Using different keywords`); - logger.log(` - Running ${chalk.yellow('dev update')} to refresh the index`); + output.log(''); + output.warn('No results found. 
Try:'); + output.log(` β€’ Lower threshold: ${chalk.cyan('--threshold 0.5')}`); + output.log(` β€’ Different keywords`); + output.log(` β€’ Refresh index: ${chalk.cyan('dev update')}`); + output.log(''); return; } @@ -76,40 +79,12 @@ export const searchCommand = new Command('search') return; } - // Pretty print results - logger.log(''); - for (let i = 0; i < results.length; i++) { - const result = results[i]; - const metadata = result.metadata; - const score = (result.score * 100).toFixed(1); - - // Extract file info (metadata uses 'path', not 'file') - const filePath = (metadata.path || metadata.file) as string; - const relativePath = filePath ? path.relative(resolvedRepoPath, filePath) : 'unknown'; - const startLine = metadata.startLine as number; - const endLine = metadata.endLine as number; - const name = metadata.name as string; - const type = metadata.type as string; - - logger.log( - chalk.bold(`${i + 1}. ${chalk.cyan(name || type)} ${chalk.gray(`(${score}% match)`)}`) - ); - logger.log(` ${chalk.gray('File:')} ${relativePath}:${startLine}-${endLine}`); - - // Show signature if available - if (metadata.signature) { - logger.log(` ${chalk.gray('Signature:')} ${chalk.yellow(metadata.signature)}`); - } - - // Show docstring if available - if (metadata.docstring) { - const doc = String(metadata.docstring); - const truncated = doc.length > 80 ? `${doc.substring(0, 77)}...` : doc; - logger.log(` ${chalk.gray('Doc:')} ${truncated}`); - } - - logger.log(''); - } + // Pretty print results (compact or verbose) + output.log(''); + output.success(`Found ${results.length} result(s)`); + output.log(''); + output.log(formatSearchResults(results, resolvedRepoPath, { verbose: options.verbose })); + output.log(''); } catch (error) { spinner.fail('Search failed'); logger.error(error instanceof Error ? 
error.message : String(error)); diff --git a/packages/cli/src/commands/stats.ts b/packages/cli/src/commands/stats.ts index a5e611b..249c52c 100644 --- a/packages/cli/src/commands/stats.ts +++ b/packages/cli/src/commands/stats.ts @@ -1,6 +1,7 @@ import * as fs from 'node:fs/promises'; import * as path from 'node:path'; import { + type DetailedIndexStats, ensureStorageDirectory, getStorageFilePaths, getStoragePath, @@ -12,10 +13,19 @@ import { Command } from 'commander'; import ora from 'ora'; import { loadConfig } from '../utils/config.js'; import { logger } from '../utils/logger.js'; +import { + formatCompactSummary, + formatComponentTypes, + formatDetailedLanguageTable, + formatGitHubSummary, + formatLanguageBreakdown, + output, +} from '../utils/output.js'; export const statsCommand = new Command('stats') .description('Show indexing statistics') .option('--json', 'Output stats as JSON', false) + .option('-v, --verbose', 'Show detailed breakdown with tables', false) .action(async (options) => { const spinner = ora('Loading statistics...').start(); @@ -48,7 +58,7 @@ export const statsCommand = new Command('stats') await indexer.initialize(); - const stats = await indexer.getStats(); + const stats = (await indexer.getStats()) as DetailedIndexStats | null; // Try to load GitHub stats let githubStats = null; @@ -83,74 +93,62 @@ export const statsCommand = new Command('stats') spinner.stop(); if (!stats) { - logger.warn('No indexing statistics available'); - logger.log(`Run ${chalk.yellow('dev index')} to index your repository first`); + output.warn('No indexing statistics available'); + output.log(`Run ${chalk.cyan('dev index')} to index your repository first`); + output.log(''); return; } // Output as JSON if requested if (options.json) { - console.log(JSON.stringify(stats, null, 2)); + console.log( + JSON.stringify( + { + repository: stats, + github: githubStats || undefined, + }, + null, + 2 + ) + ); return; } - // Pretty print stats - logger.log(''); - 
logger.log(chalk.bold.cyan('πŸ“Š Indexing Statistics')); - logger.log(''); - logger.log(`${chalk.cyan('Repository:')} ${resolvedRepoPath}`); - logger.log(`${chalk.cyan('Storage:')} ${storagePath}`); - logger.log(`${chalk.cyan('Vector Store:')} ${filePaths.vectors}`); - logger.log(''); - logger.log(`${chalk.cyan('Files Indexed:')} ${stats.filesScanned}`); - logger.log(`${chalk.cyan('Documents Extracted:')} ${stats.documentsExtracted}`); - logger.log(`${chalk.cyan('Vectors Stored:')} ${stats.vectorsStored}`); - logger.log(''); - - if (stats.startTime && stats.endTime) { - const duration = (stats.duration / 1000).toFixed(2); - logger.log( - `${chalk.cyan('Last Indexed:')} ${new Date(stats.startTime).toLocaleString()}` - ); - logger.log(`${chalk.cyan('Duration:')} ${duration}s`); + // Get repository name from path + const repoName = resolvedRepoPath.split('/').pop() || 'repository'; + + output.log(''); + + // Compact one-line summary + output.log(formatCompactSummary(stats, repoName)); + output.log(''); + + // Language breakdown (compact or verbose) + if (stats.byLanguage && Object.keys(stats.byLanguage).length > 0) { + if (options.verbose) { + // Verbose: Show table with LOC + output.log(formatDetailedLanguageTable(stats.byLanguage)); + } else { + // Compact: Show simple list + output.log(formatLanguageBreakdown(stats.byLanguage)); + } + output.log(''); } - if (stats.errors && stats.errors.length > 0) { - logger.log(''); - logger.warn(`${stats.errors.length} error(s) during last indexing`); + // Component types summary (compact - top 3) + if (stats.byComponentType && Object.keys(stats.byComponentType).length > 0) { + output.log(formatComponentTypes(stats.byComponentType)); + output.log(''); } - // Display GitHub stats if available + // GitHub stats (compact) if (githubStats) { - logger.log(''); - logger.log(chalk.bold.cyan('πŸ”— GitHub Integration')); - logger.log(''); - logger.log(`${chalk.cyan('Repository:')} ${githubStats.repository}`); - 
logger.log(`${chalk.cyan('Total Documents:')} ${githubStats.totalDocuments}`); - logger.log(`${chalk.cyan('Issues:')} ${githubStats.byType.issue || 0}`); - logger.log(`${chalk.cyan('Pull Requests:')} ${githubStats.byType.pull_request || 0}`); - logger.log(''); - logger.log(`${chalk.cyan('Open:')} ${githubStats.byState.open || 0}`); - logger.log(`${chalk.cyan('Closed:')} ${githubStats.byState.closed || 0}`); - if (githubStats.byState.merged) { - logger.log(`${chalk.cyan('Merged:')} ${githubStats.byState.merged}`); - } - logger.log(''); - logger.log( - `${chalk.cyan('Last Synced:')} ${new Date(githubStats.lastIndexed).toLocaleString()}` - ); + output.log(formatGitHubSummary(githubStats)); } else { - logger.log(''); - logger.log(chalk.bold.cyan('πŸ”— GitHub Integration')); - logger.log(''); - logger.log( - chalk.gray('Not indexed. Run') + - chalk.yellow(' dev gh index ') + - chalk.gray('to sync GitHub data.') - ); + output.log(`πŸ”— ${chalk.gray('GitHub not indexed. Run')} ${chalk.cyan('dev gh index')}`); } - logger.log(''); + output.log(''); } catch (error) { spinner.fail('Failed to load statistics'); logger.error(error instanceof Error ? 
error.message : String(error)); diff --git a/packages/cli/src/commands/update.ts b/packages/cli/src/commands/update.ts index 3680945..2c7f475 100644 --- a/packages/cli/src/commands/update.ts +++ b/packages/cli/src/commands/update.ts @@ -10,6 +10,7 @@ import { Command } from 'commander'; import ora from 'ora'; import { loadConfig } from '../utils/config.js'; import { logger } from '../utils/logger.js'; +import { formatUpdateSummary, output } from '../utils/output.js'; export const updateCommand = new Command('update') .description('Update index with changed files') @@ -66,34 +67,36 @@ export const updateCommand = new Command('update') await indexer.close(); - const duration = ((Date.now() - startTime) / 1000).toFixed(2); + const duration = (Date.now() - startTime) / 1000; - if (stats.filesScanned === 0) { - spinner.succeed(chalk.green('Index is up to date!')); - logger.log(''); - logger.log('No changes detected since last index.'); - } else { - spinner.succeed(chalk.green('Index updated successfully!')); + spinner.stop(); - // Show stats - logger.log(''); - logger.log(chalk.bold('Update Statistics:')); - logger.log(` ${chalk.cyan('Files updated:')} ${stats.filesScanned}`); - logger.log(` ${chalk.cyan('Documents re-indexed:')} ${stats.documentsIndexed}`); - logger.log(` ${chalk.cyan('Duration:')} ${duration}s`); + // Compact output + output.log(''); + output.log( + formatUpdateSummary({ + filesUpdated: stats.filesScanned, + documentsReindexed: stats.documentsIndexed, + duration: Number.parseFloat(duration.toFixed(2)), + }) + ); - if (stats.errors.length > 0) { - logger.log(''); - logger.warn(`${stats.errors.length} error(s) occurred during update`); - if (options.verbose) { - for (const error of stats.errors) { - logger.error(` ${error.file}: ${error.message}`); - } + // Show errors if any + if (stats.errors.length > 0) { + output.log(''); + output.warn(`${stats.errors.length} error(s) occurred during update`); + if (options.verbose) { + for (const error of 
stats.errors) { + output.log(` ${chalk.gray(error.file)}: ${error.message}`); } + } else { + output.log( + ` ${chalk.gray('Run with')} ${chalk.cyan('--verbose')} ${chalk.gray('to see details')}` + ); } } - logger.log(''); + output.log(''); } catch (error) { spinner.fail('Failed to update index'); logger.error(error instanceof Error ? error.message : String(error)); diff --git a/packages/cli/src/utils/__tests__/date-utils.test.ts b/packages/cli/src/utils/__tests__/date-utils.test.ts new file mode 100644 index 0000000..77f12eb --- /dev/null +++ b/packages/cli/src/utils/__tests__/date-utils.test.ts @@ -0,0 +1,129 @@ +import { describe, expect, it } from 'vitest'; +import { formatDuration, getStaleStatsWarning, getTimeSince, isStatsStale } from '../date-utils'; + +describe('date-utils', () => { + describe('getTimeSince', () => { + it('should return "just now" for very recent dates', () => { + const date = new Date('2024-01-01T12:00:00'); + const now = new Date('2024-01-01T12:00:30'); // 30 seconds later + + expect(getTimeSince(date, now)).toBe('just now'); + }); + + it('should return minutes for recent dates', () => { + const date = new Date('2024-01-01T12:00:00'); + const now = new Date('2024-01-01T12:05:00'); // 5 minutes later + + expect(getTimeSince(date, now)).toBe('5 minutes ago'); + }); + + it('should use singular for 1 minute', () => { + const date = new Date('2024-01-01T12:00:00'); + const now = new Date('2024-01-01T12:01:00'); + + expect(getTimeSince(date, now)).toBe('1 minute ago'); + }); + + it('should return hours for dates within a day', () => { + const date = new Date('2024-01-01T12:00:00'); + const now = new Date('2024-01-01T15:00:00'); // 3 hours later + + expect(getTimeSince(date, now)).toBe('3 hours ago'); + }); + + it('should use singular for 1 hour', () => { + const date = new Date('2024-01-01T12:00:00'); + const now = new Date('2024-01-01T13:00:00'); + + expect(getTimeSince(date, now)).toBe('1 hour ago'); + }); + + it('should return days for dates 
within a week', () => { + const date = new Date('2024-01-01T12:00:00'); + const now = new Date('2024-01-03T12:00:00'); // 2 days later + + expect(getTimeSince(date, now)).toBe('2 days ago'); + }); + + it('should use singular for 1 day', () => { + const date = new Date('2024-01-01T12:00:00'); + const now = new Date('2024-01-02T12:00:00'); + + expect(getTimeSince(date, now)).toBe('1 day ago'); + }); + + it('should return formatted date for dates older than a week', () => { + const date = new Date('2024-01-01T12:00:00'); + const now = new Date('2024-01-10T12:00:00'); // 9 days later + + const result = getTimeSince(date, now); + expect(result).toContain('1/1/2024'); // Format may vary by locale + }); + + it('should use current date when now not provided', () => { + const oneMinuteAgo = new Date(Date.now() - 60 * 1000); + const result = getTimeSince(oneMinuteAgo); + + expect(result).toMatch(/1 minute ago|just now/); + }); + }); + + describe('formatDuration', () => { + it('should format milliseconds for durations under 1 second', () => { + expect(formatDuration(500)).toBe('500ms'); + expect(formatDuration(0)).toBe('0ms'); + expect(formatDuration(999)).toBe('999ms'); + }); + + it('should format seconds for durations under 1 minute', () => { + expect(formatDuration(1000)).toBe('1.00s'); + expect(formatDuration(5500)).toBe('5.50s'); + expect(formatDuration(59999)).toBe('60.00s'); + }); + + it('should format minutes and seconds for longer durations', () => { + expect(formatDuration(60000)).toBe('1m 0s'); + expect(formatDuration(125000)).toBe('2m 5s'); + expect(formatDuration(3600000)).toBe('60m 0s'); + }); + }); + + describe('isStatsStale', () => { + it('should return false for updates below threshold', () => { + expect(isStatsStale(0)).toBe(false); + expect(isStatsStale(5)).toBe(false); + expect(isStatsStale(10)).toBe(false); + }); + + it('should return true for updates above threshold', () => { + expect(isStatsStale(11)).toBe(true); + expect(isStatsStale(20)).toBe(true); + 
expect(isStatsStale(100)).toBe(true); + }); + + it('should respect custom threshold', () => { + expect(isStatsStale(5, 3)).toBe(true); + expect(isStatsStale(3, 3)).toBe(false); + expect(isStatsStale(20, 50)).toBe(false); + }); + }); + + describe('getStaleStatsWarning', () => { + it('should return undefined for fresh stats', () => { + expect(getStaleStatsWarning(0)).toBeUndefined(); + expect(getStaleStatsWarning(5)).toBeUndefined(); + expect(getStaleStatsWarning(10)).toBeUndefined(); + }); + + it('should return warning message for stale stats', () => { + const warning = getStaleStatsWarning(11); + expect(warning).toBeDefined(); + expect(warning).toContain('dev index'); + }); + + it('should respect custom threshold', () => { + expect(getStaleStatsWarning(5, 3)).toBeDefined(); + expect(getStaleStatsWarning(3, 3)).toBeUndefined(); + }); + }); +}); diff --git a/packages/cli/src/utils/__tests__/formatters.test.ts b/packages/cli/src/utils/__tests__/formatters.test.ts new file mode 100644 index 0000000..66519ec --- /dev/null +++ b/packages/cli/src/utils/__tests__/formatters.test.ts @@ -0,0 +1,325 @@ +import type { DetailedIndexStats, LanguageStats } from '@lytics/dev-agent-core'; +import { describe, expect, it } from 'vitest'; +import { + createComponentTypesTable, + createHealthIndicator, + createLanguageTable, + createOverviewSection, + formatBytes, + formatDetailedStats, + formatNumber, + getTerminalWidth, +} from '../formatters'; + +describe('formatters', () => { + describe('formatBytes', () => { + it('should format bytes correctly', () => { + expect(formatBytes(0)).toBe('0 B'); + expect(formatBytes(1024)).toBe('1.00 KB'); + expect(formatBytes(1024 * 1024)).toBe('1.00 MB'); + expect(formatBytes(1024 * 1024 * 1024)).toBe('1.00 GB'); + }); + + it('should handle fractional values', () => { + expect(formatBytes(1536)).toBe('1.50 KB'); + expect(formatBytes(2.5 * 1024 * 1024)).toBe('2.50 MB'); + }); + }); + + describe('formatNumber', () => { + it('should format numbers with 
commas', () => { + expect(formatNumber(1000)).toBe('1,000'); + expect(formatNumber(1000000)).toBe('1,000,000'); + expect(formatNumber(42)).toBe('42'); + }); + }); + + describe('getTerminalWidth', () => { + it('should return a positive number', () => { + const width = getTerminalWidth(); + expect(width).toBeGreaterThan(0); + expect(typeof width).toBe('number'); + }); + + it('should have fallback to 80', () => { + const width = getTerminalWidth(); + // Should be at least 80 (either detected or fallback) + expect(width).toBeGreaterThanOrEqual(80); + }); + }); + + describe('createLanguageTable', () => { + it('should create table with language stats', () => { + const byLanguage: Partial<Record<string, LanguageStats>> = { + typescript: { files: 10, components: 50, lines: 1000 }, + javascript: { files: 5, components: 25, lines: 500 }, + }; + + const table = createLanguageTable(byLanguage); + const output = table.toString(); + + expect(output).toContain('TypeScript'); + expect(output).toContain('JavaScript'); + expect(output).toContain('10'); + expect(output).toContain('50'); + expect(output).toContain('1,000'); + }); + + it('should sort by component count descending', () => { + const byLanguage: Partial<Record<string, LanguageStats>> = { + javascript: { files: 5, components: 100, lines: 500 }, + typescript: { files: 10, components: 50, lines: 1000 }, + }; + + const table = createLanguageTable(byLanguage); + const output = table.toString(); + const jsIndex = output.indexOf('JavaScript'); + const tsIndex = output.indexOf('TypeScript'); + + // JavaScript should appear first (more components) + expect(jsIndex).toBeLessThan(tsIndex); + }); + + it('should include totals row', () => { + const byLanguage: Partial<Record<string, LanguageStats>> = { + typescript: { files: 10, components: 50, lines: 1000 }, + javascript: { files: 5, components: 25, lines: 500 }, + }; + + const table = createLanguageTable(byLanguage); + const output = table.toString(); + + expect(output).toContain('Total'); + expect(output).toContain('15'); // 10 + 5 files +
expect(output).toContain('75'); // 50 + 25 components + }); + }); + + describe('createComponentTypesTable', () => { + it('should create table with component types', () => { + const byComponentType: Record<string, number> = { + function: 50, + class: 25, + interface: 15, + }; + + const table = createComponentTypesTable(byComponentType); + const output = table.toString(); + + expect(output).toContain('Function'); + expect(output).toContain('Class'); + expect(output).toContain('Interface'); + expect(output).toContain('50'); + expect(output).toContain('25'); + expect(output).toContain('15'); + }); + + it('should calculate percentages correctly', () => { + const byComponentType: Record<string, number> = { + function: 50, + class: 50, + }; + + const table = createComponentTypesTable(byComponentType); + const output = table.toString(); + + expect(output).toContain('50.0%'); + }); + + it('should sort by count descending', () => { + const byComponentType: Record<string, number> = { + class: 10, + function: 50, + interface: 5, + }; + + const table = createComponentTypesTable(byComponentType); + const output = table.toString(); + + const funcIndex = output.indexOf('Function'); + const classIndex = output.indexOf('Class'); + const interfaceIndex = output.indexOf('Interface'); + + expect(funcIndex).toBeLessThan(classIndex); + expect(classIndex).toBeLessThan(interfaceIndex); + }); + }); + + describe('createHealthIndicator', () => { + it('should show healthy status for good stats', () => { + const stats: DetailedIndexStats = { + filesScanned: 100, + documentsExtracted: 500, + documentsIndexed: 500, + vectorsStored: 500, + duration: 5000, + errors: [], + startTime: new Date(), + endTime: new Date(), + repositoryPath: '/test', + }; + + const health = createHealthIndicator(stats); + expect(health).toContain('Healthy'); + expect(health).toContain('●'); // Indicator dot + }); + + it('should show error status for no files', () => { + const stats: DetailedIndexStats = { + filesScanned: 0, + documentsExtracted: 0, + documentsIndexed:
0, + vectorsStored: 0, + duration: 0, + errors: [], + startTime: new Date(), + endTime: new Date(), + repositoryPath: '/test', + }; + + const health = createHealthIndicator(stats); + expect(health).toContain('No files indexed'); + }); + + it('should show warning for incomplete index', () => { + const stats: DetailedIndexStats = { + filesScanned: 100, + documentsExtracted: 500, + documentsIndexed: 0, + vectorsStored: 0, + duration: 5000, + errors: [], + startTime: new Date(), + endTime: new Date(), + repositoryPath: '/test', + }; + + const health = createHealthIndicator(stats); + expect(health).toContain('Incomplete index'); + }); + + it('should show warning for high error rate', () => { + const stats: DetailedIndexStats = { + filesScanned: 100, + documentsExtracted: 100, + documentsIndexed: 100, + vectorsStored: 100, + duration: 5000, + errors: Array(20).fill({ type: 'scanner', message: 'error' }), + startTime: new Date(), + endTime: new Date(), + repositoryPath: '/test', + }; + + const health = createHealthIndicator(stats); + expect(health).toContain('High error rate'); + }); + }); + + describe('createOverviewSection', () => { + it('should create overview with all fields', () => { + const stats: DetailedIndexStats = { + filesScanned: 100, + documentsExtracted: 500, + documentsIndexed: 500, + vectorsStored: 500, + duration: 5000, + errors: [], + startTime: new Date('2024-01-01'), + endTime: new Date('2024-01-01'), + repositoryPath: '/test', + }; + + const lines = createOverviewSection(stats, '/test/repo'); + + expect(lines.join('\n')).toContain('Repository Overview'); + expect(lines.join('\n')).toContain('/test/repo'); + expect(lines.join('\n')).toContain('100'); + expect(lines.join('\n')).toContain('500'); + expect(lines.join('\n')).toContain('5.00s'); + }); + }); + + describe('formatDetailedStats', () => { + it('should format complete stats', () => { + const stats: DetailedIndexStats = { + filesScanned: 100, + documentsExtracted: 500, + documentsIndexed: 500, + 
vectorsStored: 500, + duration: 5000, + errors: [], + startTime: new Date(), + endTime: new Date(), + repositoryPath: '/test', + byLanguage: { + typescript: { files: 50, components: 250, lines: 5000 }, + javascript: { files: 50, components: 250, lines: 5000 }, + go: { files: 0, components: 0, lines: 0 }, + markdown: { files: 0, components: 0, lines: 0 }, + }, + byComponentType: { + function: 250, + class: 150, + interface: 100, + }, + }; + + const output = formatDetailedStats(stats, '/test/repo'); + + expect(output).toContain('Repository Overview'); + expect(output).toContain('Language Breakdown'); + expect(output).toContain('Component Types'); + expect(output).toContain('TypeScript'); + expect(output).toContain('Function'); + }); + + it('should handle stats without detailed breakdowns', () => { + const stats: DetailedIndexStats = { + filesScanned: 100, + documentsExtracted: 500, + documentsIndexed: 500, + vectorsStored: 500, + duration: 5000, + errors: [], + startTime: new Date(), + endTime: new Date(), + repositoryPath: '/test', + }; + + const output = formatDetailedStats(stats, '/test/repo'); + + expect(output).toContain('Repository Overview'); + expect(output).not.toContain('Language Breakdown'); + expect(output).not.toContain('Component Types'); + }); + + it('should show packages when requested and available', () => { + const stats: DetailedIndexStats = { + filesScanned: 100, + documentsExtracted: 500, + documentsIndexed: 500, + vectorsStored: 500, + duration: 5000, + errors: [], + startTime: new Date(), + endTime: new Date(), + repositoryPath: '/test', + byPackage: { + 'packages/core': { + name: '@lytics/dev-agent-core', + path: 'packages/core', + files: 50, + components: 250, + languages: {}, + }, + }, + }; + + const output = formatDetailedStats(stats, '/test/repo', { showPackages: true }); + + expect(output).toContain('Packages'); + expect(output).toContain('@lytics/dev-agent-core'); + }); + }); +}); diff --git a/packages/cli/src/utils/config.test.ts 
b/packages/cli/src/utils/config.test.ts index a10f1ed..01db737 100644 --- a/packages/cli/src/utils/config.test.ts +++ b/packages/cli/src/utils/config.test.ts @@ -1,8 +1,9 @@ import * as fs from 'node:fs/promises'; import * as os from 'node:os'; import * as path from 'node:path'; -import { afterAll, beforeAll, describe, expect, it } from 'vitest'; +import { afterAll, beforeAll, describe, expect, it, vi } from 'vitest'; import { findConfigFile, getDefaultConfig, loadConfig, saveConfig } from './config'; +import * as loggerModule from './logger'; describe('Config Utilities', () => { let testDir: string; @@ -43,7 +44,7 @@ describe('Config Utilities', () => { const adapters = config.mcp?.adapters; expect(adapters).toBeDefined(); - expect(Object.keys(adapters!)).toHaveLength(9); + expect(Object.keys(adapters ?? {})).toHaveLength(9); // Verify all adapters are present and enabled by default expect(adapters?.search?.enabled).toBe(true); @@ -136,11 +137,19 @@ describe('Config Utilities', () => { }); it('should return null if config not found', async () => { + // Suppress error logs for this intentional error test + const errorSpy = vi.spyOn(loggerModule.logger, 'error').mockImplementation(() => {}); + const loaded = await loadConfig('/nonexistent/path/.dev-agent/config.json'); expect(loaded).toBeNull(); + + errorSpy.mockRestore(); }); it('should handle invalid JSON gracefully', async () => { + // Suppress error logs for this intentional error test + const errorSpy = vi.spyOn(loggerModule.logger, 'error').mockImplementation(() => {}); + const invalidDir = path.join(testDir, '.dev-agent-invalid'); await fs.mkdir(invalidDir, { recursive: true }); const invalidPath = path.join(invalidDir, 'config.json'); @@ -148,6 +157,8 @@ describe('Config Utilities', () => { const loaded = await loadConfig(invalidPath); expect(loaded).toBeNull(); + + errorSpy.mockRestore(); }); }); }); diff --git a/packages/cli/src/utils/config.ts b/packages/cli/src/utils/config.ts index 8cf5728..fab5976 
100644 --- a/packages/cli/src/utils/config.ts +++ b/packages/cli/src/utils/config.ts @@ -1,6 +1,5 @@ import * as fs from 'node:fs/promises'; import * as path from 'node:path'; -import chalk from 'chalk'; import { logger } from './logger.js'; /** @@ -181,7 +180,7 @@ export async function saveConfig( try { await fs.mkdir(configDir, { recursive: true }); await fs.writeFile(configPath, JSON.stringify(config, null, 2), 'utf-8'); - logger.success(`Config saved to ${chalk.cyan(configPath)}`); + // Silent save - let caller handle user messaging } catch (error) { throw new Error( `Failed to save config: ${error instanceof Error ? error.message : String(error)}` diff --git a/packages/cli/src/utils/date-utils.ts b/packages/cli/src/utils/date-utils.ts new file mode 100644 index 0000000..4b929f6 --- /dev/null +++ b/packages/cli/src/utils/date-utils.ts @@ -0,0 +1,70 @@ +/** + * Date utility functions for formatting and display + * Pure functions for testability + */ + +/** + * Get human-readable time since a date + * Pure function + */ +export function getTimeSince(date: Date, now: Date = new Date()): string { + const diffMs = now.getTime() - date.getTime(); + const diffMinutes = Math.floor(diffMs / (1000 * 60)); + const diffHours = Math.floor(diffMs / (1000 * 60 * 60)); + const diffDays = Math.floor(diffMs / (1000 * 60 * 60 * 24)); + + if (diffMinutes < 1) { + return 'just now'; + } + if (diffMinutes < 60) { + return `${diffMinutes} minute${diffMinutes === 1 ? '' : 's'} ago`; + } + if (diffHours < 24) { + return `${diffHours} hour${diffHours === 1 ? '' : 's'} ago`; + } + if (diffDays < 7) { + return `${diffDays} day${diffDays === 1 ? 
'' : 's'} ago`; + } + return date.toLocaleDateString(); +} + +/** + * Get formatted duration string + * Pure function + */ +export function formatDuration(milliseconds: number): string { + if (milliseconds < 1000) { + return `${milliseconds}ms`; + } + + const seconds = milliseconds / 1000; + if (seconds < 60) { + return `${seconds.toFixed(2)}s`; + } + + const minutes = Math.floor(seconds / 60); + const remainingSeconds = Math.floor(seconds % 60); + return `${minutes}m ${remainingSeconds}s`; +} + +/** + * Check if stats are potentially stale based on update count + * Pure function + */ +export function isStatsStale(incrementalUpdatesSince: number, threshold = 10): boolean { + return incrementalUpdatesSince > threshold; +} + +/** + * Get warning message for stale stats + * Pure function + */ +export function getStaleStatsWarning( + incrementalUpdatesSince: number, + threshold = 10 +): string | undefined { + if (!isStatsStale(incrementalUpdatesSince, threshold)) { + return undefined; + } + return "Consider running 'dev index' for most accurate statistics"; +} diff --git a/packages/cli/src/utils/formatters.ts b/packages/cli/src/utils/formatters.ts new file mode 100644 index 0000000..99b3370 --- /dev/null +++ b/packages/cli/src/utils/formatters.ts @@ -0,0 +1,310 @@ +/** + * Formatting utilities for enhanced CLI output + */ + +import type { DetailedIndexStats, LanguageStats, SupportedLanguage } from '@lytics/dev-agent-core'; +import chalk from 'chalk'; +import Table from 'cli-table3'; +import terminalSize from 'terminal-size'; +import { getTimeSince } from './date-utils'; + +/** + * Format bytes to human-readable string + */ +export function formatBytes(bytes: number): string { + if (bytes === 0) return '0 B'; + const k = 1024; + const sizes = ['B', 'KB', 'MB', 'GB']; + const i = Math.floor(Math.log(bytes) / Math.log(k)); + return `${(bytes / k ** i).toFixed(2)} ${sizes[i]}`; +} + +/** + * Format number with commas + */ +export function formatNumber(num: number): string 
{ + return num.toLocaleString(); +} + +/** + * Get terminal width with fallback + */ +export function getTerminalWidth(): number { + try { + const size = terminalSize(); + return size.columns; + } catch { + return 80; // Fallback to 80 columns + } +} + +/** + * Create a language stats table + */ +export function createLanguageTable( + byLanguage: Partial<Record<SupportedLanguage, LanguageStats>> +): Table.Table { + const table = new Table({ + head: [ + chalk.cyan('Language'), + chalk.cyan('Files'), + chalk.cyan('Components'), + chalk.cyan('Lines of Code'), + ], + style: { + head: [], + border: ['gray'], + }, + colAligns: ['left', 'right', 'right', 'right'], + }); + + // Sort by component count (descending) + const entries = Object.entries(byLanguage).sort(([, a], [, b]) => b.components - a.components); + + for (const [language, stats] of entries) { + table.push([ + capitalizeLanguage(language), + formatNumber(stats.files), + formatNumber(stats.components), + formatNumber(stats.lines), + ]); + } + + // Add totals row + const totals = entries.reduce( + (acc, [, stats]) => ({ + files: acc.files + stats.files, + components: acc.components + stats.components, + lines: acc.lines + stats.lines, + }), + { files: 0, components: 0, lines: 0 } + ); + + table.push([ + chalk.bold('Total'), + chalk.bold(formatNumber(totals.files)), + chalk.bold(formatNumber(totals.components)), + chalk.bold(formatNumber(totals.lines)), + ]); + + return table; +} + +/** + * Create a component types table + */ +export function createComponentTypesTable( + byComponentType: Partial<Record<string, number>> +): Table.Table { + const table = new Table({ + head: [chalk.cyan('Component Type'), chalk.cyan('Count'), chalk.cyan('Percentage')], + style: { + head: [], + border: ['gray'], + }, + colAligns: ['left', 'right', 'right'], + }); + + // Calculate total for percentages + const total = Object.values(byComponentType).reduce((sum: number, count) => { + return sum + (count || 0); + }, 0); + + // Sort by count (descending), filtering out undefined + const entries = 
Object.entries(byComponentType) + .filter((entry): entry is [string, number] => entry[1] !== undefined) + .sort(([, a], [, b]) => b - a); + + for (const [type, count] of entries) { + const percentage = total > 0 ? ((count / total) * 100).toFixed(1) : '0.0'; + table.push([capitalizeType(type), formatNumber(count), `${percentage}%`]); + } + + return table; +} + +/** + * Create health status indicator + */ +export function createHealthIndicator(stats: DetailedIndexStats): string { + const { filesScanned, documentsIndexed, vectorsStored, errors } = stats; + + // Health checks + const hasFiles = filesScanned > 0; + const hasDocuments = documentsIndexed > 0; + const hasVectors = vectorsStored > 0; + const hasErrors = errors && errors.length > 0; + const errorRate = hasErrors && documentsIndexed > 0 ? errors.length / documentsIndexed : 0; + + if (!hasFiles) { + return `${chalk.red('●')} ${chalk.red('No files indexed')}`; + } + + if (!hasDocuments || !hasVectors) { + return `${chalk.yellow('●')} ${chalk.yellow('Incomplete index')}`; + } + + if (hasErrors && errorRate > 0.1) { + return `${chalk.yellow('●')} ${chalk.yellow(`High error rate (${(errorRate * 100).toFixed(1)}%)`)}`; + } + + if (hasErrors) { + return `${chalk.green('●')} ${chalk.green('Healthy')} ${chalk.gray(`(${errors.length} errors)`)}`; + } + + return `${chalk.green('●')} ${chalk.green('Healthy')}`; +} + +/** + * Capitalize language name + */ +export function capitalizeLanguage(lang: string): string { + const map: Record<string, string> = { + typescript: 'TypeScript', + javascript: 'JavaScript', + go: 'Go', + markdown: 'Markdown', + }; + return map[lang.toLowerCase()] || lang.charAt(0).toUpperCase() + lang.slice(1); +} + +/** + * Capitalize component type + */ +function capitalizeType(type: string): string { + const map: Record<string, string> = { + function: 'Function', + class: 'Class', + interface: 'Interface', + type: 'Type', + method: 'Method', + variable: 'Variable', + documentation: 'Documentation', + struct: 'Struct', + }; + return 
map[type.toLowerCase()] || type.charAt(0).toUpperCase() + type.slice(1); +} + +/** + * Create a compact overview section + */ +export function createOverviewSection(stats: DetailedIndexStats, repoPath: string): string[] { + const lines: string[] = []; + + lines.push(chalk.bold.cyan('πŸ“Š Repository Overview')); + lines.push(''); + lines.push(`${chalk.cyan('Repository:')} ${repoPath}`); + lines.push(`${chalk.cyan('Files Indexed:')} ${formatNumber(stats.filesScanned)}`); + lines.push(`${chalk.cyan('Components:')} ${formatNumber(stats.documentsIndexed)}`); + lines.push(`${chalk.cyan('Vectors Stored:')} ${formatNumber(stats.vectorsStored)}`); + + if (stats.startTime) { + const date = new Date(stats.startTime); + lines.push(`${chalk.cyan('Last Indexed:')} ${date.toLocaleString()}`); + } + + if (stats.duration) { + const seconds = (stats.duration / 1000).toFixed(2); + lines.push(`${chalk.cyan('Duration:')} ${seconds}s`); + } + + lines.push(`${chalk.cyan('Health:')} ${createHealthIndicator(stats)}`); + + // Add stats freshness information + if (stats.statsMetadata) { + const metadata = stats.statsMetadata; + lines.push(''); + + if (metadata.isIncremental) { + lines.push(chalk.yellow('ℹ️ Showing incremental update stats')); + if (metadata.affectedLanguages && metadata.affectedLanguages.length > 0) { + const langs = metadata.affectedLanguages.map(capitalizeLanguage).join(', '); + lines.push(` ${chalk.gray('Languages affected:')} ${langs}`); + } + } else { + const updatesSince = metadata.incrementalUpdatesSince || 0; + if (updatesSince > 0) { + const plural = updatesSince === 1 ? 
'update' : 'updates'; + lines.push(chalk.gray(`πŸ’‘ ${updatesSince} incremental ${plural} since last full index`)); + } + } + + if (metadata.lastFullIndex) { + const timeSince = getTimeSince(new Date(metadata.lastFullIndex)); + lines.push(chalk.gray(` Last full index: ${timeSince}`)); + } + + if (metadata.warning) { + lines.push(''); + lines.push(chalk.yellow(`⚠️ ${metadata.warning}`)); + } + } + + return lines; +} + +/** + * Format detailed stats output + */ +export function formatDetailedStats( + stats: DetailedIndexStats, + repoPath: string, + options: { showPackages?: boolean } = {} +): string { + const sections: string[] = []; + + // Overview section + sections.push(createOverviewSection(stats, repoPath).join('\n')); + + // Language breakdown + if (stats.byLanguage && Object.keys(stats.byLanguage).length > 0) { + sections.push(''); + sections.push(chalk.bold.cyan('πŸ“ Language Breakdown')); + sections.push(''); + sections.push(createLanguageTable(stats.byLanguage).toString()); + } + + // Component types + if (stats.byComponentType && Object.keys(stats.byComponentType).length > 0) { + sections.push(''); + sections.push(chalk.bold.cyan('πŸ”§ Component Types')); + sections.push(''); + sections.push(createComponentTypesTable(stats.byComponentType).toString()); + } + + // Package stats (if requested and available) + if (options.showPackages && stats.byPackage && Object.keys(stats.byPackage).length > 0) { + sections.push(''); + sections.push(chalk.bold.cyan('πŸ“¦ Packages')); + sections.push(''); + const packageTable = createPackageTable(stats.byPackage); + sections.push(packageTable.toString()); + } + + return sections.join('\n'); +} + +/** + * Create a package stats table + */ +function createPackageTable( + byPackage: Record<string, { name: string; files: number; components: number }> +): Table.Table { + const table = new Table({ + head: [chalk.cyan('Package'), chalk.cyan('Files'), chalk.cyan('Components')], + style: { + head: [], + border: ['gray'], + }, + colAligns: ['left', 'right', 'right'], + }); + + // Sort by 
component count (descending) + const entries = Object.entries(byPackage).sort(([, a], [, b]) => b.components - a.components); + + for (const [, pkg] of entries) { + table.push([pkg.name, formatNumber(pkg.files), formatNumber(pkg.components)]); + } + + return table; +} diff --git a/packages/cli/src/utils/output.ts b/packages/cli/src/utils/output.ts new file mode 100644 index 0000000..6aebb84 --- /dev/null +++ b/packages/cli/src/utils/output.ts @@ -0,0 +1,320 @@ +/** + * Clean output utilities for user-facing CLI output + * Separates user output from debug logging + */ + +import type { DetailedIndexStats, LanguageStats, SupportedLanguage } from '@lytics/dev-agent-core'; +import chalk from 'chalk'; +import Table from 'cli-table3'; +import { getTimeSince } from './date-utils.js'; +import { capitalizeLanguage, formatNumber } from './formatters.js'; + +/** + * Output interface for clean, logger-free output + */ +export const output = { + /** + * Print a line to stdout (no logger prefix) + */ + log(message: string = ''): void { + console.log(message); + }, + + /** + * Print an error to stderr + */ + error(message: string): void { + console.error(chalk.red(`βœ— ${message}`)); + }, + + /** + * Print a success message + */ + success(message: string): void { + console.log(chalk.green(`βœ“ ${message}`)); + }, + + /** + * Print a warning message + */ + warn(message: string): void { + console.log(chalk.yellow(`⚠ ${message}`)); + }, + + /** + * Print an info message + */ + info(message: string): void { + console.log(chalk.blue(`β„Ή ${message}`)); + }, +}; + +/** + * Format a compact one-line summary + */ +export function formatCompactSummary(stats: DetailedIndexStats, repoName: string): string { + const health = getHealthStatus(stats); + const timeSince = stats.startTime ? 
getTimeSince(new Date(stats.startTime)) : 'unknown'; + + return `πŸ“Š ${chalk.bold(repoName)} β€’ ${formatNumber(stats.filesScanned)} files β€’ ${formatNumber(stats.documentsIndexed)} components β€’ Indexed ${timeSince} β€’ ${health}`; +} + +/** + * Get health status indicator + */ +function getHealthStatus(stats: DetailedIndexStats): string { + const { filesScanned, documentsIndexed, vectorsStored, errors } = stats; + + const hasFiles = filesScanned > 0; + const hasDocuments = documentsIndexed > 0; + const hasVectors = vectorsStored > 0; + const hasErrors = errors && errors.length > 0; + const errorRate = hasErrors && documentsIndexed > 0 ? errors.length / documentsIndexed : 0; + + if (!hasFiles) { + return `${chalk.red('βœ—')} No files`; + } + + if (!hasDocuments || !hasVectors) { + return `${chalk.yellow('⚠')} Incomplete`; + } + + if (hasErrors && errorRate > 0.1) { + return `${chalk.yellow('⚠')} ${(errorRate * 100).toFixed(0)}% errors`; + } + + return `${chalk.green('βœ“')} Healthy`; +} + +/** + * Format language breakdown in compact table + */ +export function formatLanguageBreakdown( + byLanguage: Partial<Record<SupportedLanguage, LanguageStats>>, + options: { verbose?: boolean } = {} +): string { + const entries = Object.entries(byLanguage).sort(([, a], [, b]) => b.components - a.components); + + const lines: string[] = []; + + for (const [language, stats] of entries) { + const name = capitalizeLanguage(language).padEnd(12); + const files = formatNumber(stats.files).padStart(5); + const components = formatNumber(stats.components).padStart(6); + const loc = options.verbose ? 
formatNumber(stats.lines).padStart(10) : ''; + + if (options.verbose) { + lines.push( + `${name} ${chalk.gray(files)} files ${chalk.gray(components)} components ${chalk.gray(loc)} LOC` + ); + } else { + lines.push(`${name} ${chalk.gray(files)} files ${chalk.gray(components)} components`); + } + } + + return lines.join('\n'); +} + +/** + * Format component types breakdown + */ +export function formatComponentTypes(byComponentType: Partial<Record<string, number>>): string { + const entries = Object.entries(byComponentType) + .filter((entry): entry is [string, number] => entry[1] !== undefined) + .sort(([, a], [, b]) => b - a) + .slice(0, 3); // Top 3 only + + const parts = entries.map(([type, count]) => { + const name = type.charAt(0).toUpperCase() + type.slice(1); + return `${name} (${formatNumber(count)})`; + }); + + return `πŸ”§ ${chalk.gray('Top Components:')} ${parts.join(' β€’ ')}`; +} + +/** + * Format GitHub stats in compact form + */ +export function formatGitHubSummary(githubStats: { + repository: string; + totalDocuments: number; + byType: { issue?: number; pull_request?: number }; + byState: { open?: number; closed?: number; merged?: number }; + lastIndexed: string; +}): string { + const issues = githubStats.byType.issue || 0; + const prs = githubStats.byType.pull_request || 0; + const open = githubStats.byState.open || 0; + const merged = githubStats.byState.merged || 0; + + const timeSince = getTimeSince(new Date(githubStats.lastIndexed)); + + return [ + `πŸ”— ${chalk.bold(githubStats.repository)} β€’ ${formatNumber(githubStats.totalDocuments)} documents`, + ` ${chalk.gray(issues.toString())} issues β€’ ${chalk.gray(prs.toString())} PRs β€’ ${chalk.gray(open.toString())} open β€’ ${chalk.gray(merged.toString())} merged β€’ Synced ${timeSince}`, + ].join('\n'); +} + +/** + * Format detailed stats with tables (for verbose mode) + */ +export function formatDetailedLanguageTable( + byLanguage: Partial<Record<SupportedLanguage, LanguageStats>> +): string { + const table = new Table({ + head: [ + chalk.cyan('Language'), + 
chalk.cyan('Files'), + chalk.cyan('Components'), + chalk.cyan('Lines of Code'), + ], + style: { + head: [], + border: ['gray'], + }, + colAligns: ['left', 'right', 'right', 'right'], + }); + + const entries = Object.entries(byLanguage).sort(([, a], [, b]) => b.components - a.components); + + for (const [language, stats] of entries) { + table.push([ + capitalizeLanguage(language), + formatNumber(stats.files), + formatNumber(stats.components), + formatNumber(stats.lines), + ]); + } + + // Add totals row + const totals = entries.reduce( + (acc, [, stats]) => ({ + files: acc.files + stats.files, + components: acc.components + stats.components, + lines: acc.lines + stats.lines, + }), + { files: 0, components: 0, lines: 0 } + ); + + table.push([ + chalk.bold('Total'), + chalk.bold(formatNumber(totals.files)), + chalk.bold(formatNumber(totals.components)), + chalk.bold(formatNumber(totals.lines)), + ]); + + return table.toString(); +} + +/** + * Format index success summary (compact) + */ +export function formatIndexSummary(stats: { + code: { files: number; documents: number; vectors: number; duration: number; size: string }; + git?: { commits: number; duration: number }; + github?: { documents: number; duration: number }; + total: { duration: number; storage: string }; +}): string { + const lines: string[] = []; + + // One-line summary + const parts: string[] = []; + parts.push(`${formatNumber(stats.code.files)} files`); + parts.push(`${formatNumber(stats.code.documents)} components`); + if (stats.git) parts.push(`${formatNumber(stats.git.commits)} commits`); + if (stats.github) parts.push(`${formatNumber(stats.github.documents)} GitHub docs`); + + lines.push(`πŸ“Š ${chalk.bold('Indexed:')} ${parts.join(' β€’ ')}`); + + // Timing and storage + lines.push( + ` ${chalk.gray('Duration:')} ${stats.total.duration}s β€’ ${chalk.gray('Storage:')} ${stats.code.size}` + ); + + // Next step + lines.push(''); + lines.push(` ${chalk.gray('Search with:')} ${chalk.cyan('dev search 
"<query>"')}` + + return lines.join('\n'); +} + +/** + * Format update summary + */ +export function formatUpdateSummary(stats: { + filesUpdated: number; + documentsReindexed: number; + duration: number; +}): string { + if (stats.filesUpdated === 0) { + return `${chalk.green('βœ“')} Index is up to date`; + } + + return [ + `${chalk.green('βœ“')} Updated ${formatNumber(stats.filesUpdated)} files β€’ ${formatNumber(stats.documentsReindexed)} components β€’ ${stats.duration}s`, + ].join('\n'); +} + +/** + * Format search results (compact) + */ +export function formatSearchResults( + results: Array<{ + score: number; + metadata: { + name?: string; + type?: string; + path?: string; + file?: string; + startLine?: number; + endLine?: number; + signature?: string; + docstring?: string; + }; + }>, + repoPath: string, + options: { verbose?: boolean } = {} +): string { + if (results.length === 0) { + return ''; + } + + const lines: string[] = []; + + for (let i = 0; i < results.length; i++) { + const result = results[i]; + const metadata = result.metadata; + const score = (result.score * 100).toFixed(1); + + const name = metadata.name || metadata.type || 'Unknown'; + const filePath = metadata.path ?? metadata.file; + const relativePath = filePath ? filePath.replace(`${repoPath}/`, '') : 'unknown'; + const location = `${relativePath}:${metadata.startLine}-${metadata.endLine}`; + + if (options.verbose) { + // Verbose: Multi-line with details + lines.push(chalk.bold(`${i + 1}. ${chalk.cyan(name)} ${chalk.gray(`(${score}% match)`)}`)); + lines.push(` ${chalk.gray('File:')} ${location}`); + + if (metadata.signature) { + lines.push(` ${chalk.gray('Signature:')} ${chalk.yellow(metadata.signature)}`); + } + + if (metadata.docstring) { + const doc = String(metadata.docstring); + const truncated = doc.length > 80 ? 
`${doc.substring(0, 77)}...` : doc; + lines.push(` ${chalk.gray('Doc:')} ${truncated}`); + } + lines.push(''); + } else { + // Compact: One line per result + const scoreColor = + result.score > 0.8 ? chalk.green : result.score > 0.6 ? chalk.yellow : chalk.gray; + lines.push( + `${chalk.gray((i + 1).toString().padStart(2))} ${chalk.cyan(name.padEnd(30).substring(0, 30))} ${scoreColor(`${score}%`)} ${chalk.gray(location)}` + ); + } + } + + return lines.join('\n'); +} diff --git a/packages/core/package.json b/packages/core/package.json index b651ca2..c30baa8 100644 --- a/packages/core/package.json +++ b/packages/core/package.json @@ -30,9 +30,9 @@ "test:watch": "vitest" }, "devDependencies": { - "tree-sitter-wasms": "^0.1.13", "@types/mdast": "^4.0.4", "@types/node": "^24.10.1", + "tree-sitter-wasms": "^0.1.13", "typescript": "^5.3.3" }, "dependencies": { @@ -45,6 +45,7 @@ "remark-stringify": "^11.0.0", "ts-morph": "^27.0.2", "unified": "^11.0.5", - "web-tree-sitter": "^0.25.10" + "web-tree-sitter": "^0.25.10", + "zod": "^4.1.13" } } diff --git a/packages/core/src/indexer/__tests__/detailed-stats.integration.test.ts b/packages/core/src/indexer/__tests__/detailed-stats.integration.test.ts index 90c3db7..2226a28 100644 --- a/packages/core/src/indexer/__tests__/detailed-stats.integration.test.ts +++ b/packages/core/src/indexer/__tests__/detailed-stats.integration.test.ts @@ -68,12 +68,12 @@ describe('Detailed Stats Integration', () => { expect(stats.byLanguage?.javascript).toBeDefined(); // TypeScript should have 2 components (function + class) - expect(stats.byLanguage?.typescript.files).toBe(1); - expect(stats.byLanguage?.typescript.components).toBeGreaterThanOrEqual(2); + expect(stats.byLanguage?.typescript?.files).toBe(1); + expect(stats.byLanguage?.typescript?.components).toBeGreaterThanOrEqual(2); // JavaScript should have 1 component (function) - expect(stats.byLanguage?.javascript.files).toBe(1); - 
expect(stats.byLanguage?.javascript.components).toBeGreaterThanOrEqual(1); + expect(stats.byLanguage?.javascript?.files).toBe(1); + expect(stats.byLanguage?.javascript?.components).toBeGreaterThanOrEqual(1); }); it('should collect component type stats', async () => { @@ -191,7 +191,12 @@ This is a test project. }); await indexer.initialize(); - const _initialStats = (await indexer.index()) as DetailedIndexStats; + const initialStats = (await indexer.index()) as DetailedIndexStats; + + // Verify initial stats metadata + expect(initialStats.statsMetadata).toBeDefined(); + expect(initialStats.statsMetadata?.isIncremental).toBe(false); + expect(initialStats.statsMetadata?.incrementalUpdatesSince).toBe(0); // Add new file await fs.writeFile( @@ -211,12 +216,29 @@ This is a test project. since: new Date(Date.now() - 1000), })) as DetailedIndexStats; - await indexer.close(); + // Verify incremental update stats show only the new JavaScript file + expect(updateStats.statsMetadata).toBeDefined(); + expect(updateStats.statsMetadata?.isIncremental).toBe(true); + expect(updateStats.statsMetadata?.incrementalUpdatesSince).toBe(1); + expect(updateStats.statsMetadata?.affectedLanguages).toContain('javascript'); - // Verify update stats show the new JavaScript file expect(updateStats.byLanguage).toBeDefined(); expect(updateStats.byLanguage?.javascript).toBeDefined(); - expect(updateStats.byLanguage?.javascript.files).toBeGreaterThanOrEqual(1); + expect(updateStats.byLanguage?.javascript?.files).toBe(1); // Only the new file + + // Verify getStats() returns the full picture (both TypeScript and JavaScript) + const fullStats = (await indexer.getStats()) as DetailedIndexStats; + expect(fullStats.byLanguage).toBeDefined(); + expect(fullStats.byLanguage?.typescript).toBeDefined(); + expect(fullStats.byLanguage?.typescript?.files).toBe(1); + expect(fullStats.byLanguage?.javascript).toBeDefined(); + expect(fullStats.byLanguage?.javascript?.files).toBe(1); + + // Full stats metadata 
should show it's not incremental + expect(fullStats.statsMetadata?.isIncremental).toBe(false); + expect(fullStats.statsMetadata?.incrementalUpdatesSince).toBe(1); + + await indexer.close(); }); it('should calculate line counts correctly', async () => { @@ -247,7 +269,8 @@ This is a test project. await indexer.close(); // Verify line count is captured - expect(stats.byLanguage?.typescript.lines).toBeGreaterThan(0); + expect(stats.byLanguage?.typescript).toBeDefined(); + expect(stats.byLanguage?.typescript?.lines).toBeGreaterThan(0); }); it('should handle empty repository gracefully', async () => { diff --git a/packages/core/src/indexer/__tests__/stats-aggregator.test.ts b/packages/core/src/indexer/__tests__/stats-aggregator.test.ts index 7b71eb3..00ba774 100644 --- a/packages/core/src/indexer/__tests__/stats-aggregator.test.ts +++ b/packages/core/src/indexer/__tests__/stats-aggregator.test.ts @@ -230,7 +230,7 @@ describe('StatsAggregator', () => { // Should match the more specific package expect(stats.byPackage['packages/core'].components).toBe(1); - expect(stats.byPackage['packages'].components).toBe(0); + expect(stats.byPackage.packages.components).toBe(0); }); it('should handle mixed languages in a package', () => { @@ -346,8 +346,9 @@ describe('StatsAggregator', () => { // Should complete in reasonable time (<100ms for 10k docs) expect(duration).toBeLessThan(100); - expect(stats.byLanguage.typescript.files).toBe(10000); - expect(stats.byLanguage.typescript.components).toBe(10000); + expect(stats.byLanguage.typescript).toBeDefined(); + expect(stats.byLanguage.typescript?.files).toBe(10000); + expect(stats.byLanguage.typescript?.components).toBe(10000); }); }); @@ -438,7 +439,8 @@ describe('StatsAggregator', () => { const stats = aggregator.getDetailedStats(); - expect(stats.byLanguage.typescript.files).toBe(1); + expect(stats.byLanguage.typescript).toBeDefined(); + expect(stats.byLanguage.typescript?.files).toBe(1); 
expect(Object.keys(stats.byPackage).length).toBe(0); }); @@ -460,7 +462,8 @@ describe('StatsAggregator', () => { const stats = aggregator.getDetailedStats(); - expect(stats.byLanguage.typescript.lines).toBe(1); + expect(stats.byLanguage.typescript).toBeDefined(); + expect(stats.byLanguage.typescript?.lines).toBe(1); }); }); }); diff --git a/packages/core/src/indexer/__tests__/stats-merger.test.ts b/packages/core/src/indexer/__tests__/stats-merger.test.ts new file mode 100644 index 0000000..4cebaff --- /dev/null +++ b/packages/core/src/indexer/__tests__/stats-merger.test.ts @@ -0,0 +1,411 @@ +import { describe, expect, it } from 'vitest'; +import { + addIncrementalComponentStats, + addIncrementalLanguageStats, + addIncrementalPackageStats, + type MergeableStats, + mergeStats, + subtractDeletedFiles, +} from '../stats-merger'; +import type { FileMetadata, LanguageStats, PackageStats, SupportedLanguage } from '../types'; + +describe('stats-merger', () => { + describe('subtractDeletedFiles', () => { + it('should subtract file count for deleted files', () => { + const stats: Partial<Record<SupportedLanguage, LanguageStats>> = { + typescript: { files: 3, components: 10, lines: 100 }, + javascript: { files: 2, components: 5, lines: 50 }, + }; + + const deleted = [ + { + path: 'deleted.ts', + metadata: { language: 'typescript' } as FileMetadata, + }, + ]; + + const result = subtractDeletedFiles(stats, deleted); + + expect(result.typescript).toEqual({ files: 2, components: 10, lines: 100 }); + expect(result.javascript).toEqual({ files: 2, components: 5, lines: 50 }); + }); + + it('should remove language when no files left', () => { + const stats: Partial<Record<SupportedLanguage, LanguageStats>> = { + typescript: { files: 1, components: 2, lines: 20 }, + }; + + const deleted = [ + { + path: 'only.ts', + metadata: { language: 'typescript' } as FileMetadata, + }, + ]; + + const result = subtractDeletedFiles(stats, deleted); + + expect(result.typescript).toBeUndefined(); + }); + + it('should handle deleting from non-existent language gracefully', () => { + 
const stats: Partial<Record<SupportedLanguage, LanguageStats>> = { + typescript: { files: 2, components: 5, lines: 50 }, + }; + + const deleted = [ + { + path: 'deleted.go', + metadata: { language: 'go' } as FileMetadata, + }, + ]; + + const result = subtractDeletedFiles(stats, deleted); + + expect(result).toEqual(stats); + }); + + it('should not mutate original stats', () => { + const stats: Partial<Record<SupportedLanguage, LanguageStats>> = { + typescript: { files: 3, components: 10, lines: 100 }, + }; + + const original = JSON.parse(JSON.stringify(stats)); + const deleted = [ + { + path: 'deleted.ts', + metadata: { language: 'typescript' } as FileMetadata, + }, + ]; + + subtractDeletedFiles(stats, deleted); + + expect(stats).toEqual(original); + }); + + it('should handle multiple deletions of same language', () => { + const stats: Partial<Record<SupportedLanguage, LanguageStats>> = { + typescript: { files: 5, components: 20, lines: 200 }, + }; + + const deleted = [ + { + path: 'deleted1.ts', + metadata: { language: 'typescript' } as FileMetadata, + }, + { + path: 'deleted2.ts', + metadata: { language: 'typescript' } as FileMetadata, + }, + ]; + + const result = subtractDeletedFiles(stats, deleted); + + expect(result.typescript).toEqual({ files: 3, components: 20, lines: 200 }); + }); + }); + + describe('addIncrementalLanguageStats', () => { + it('should add new language', () => { + const current: Partial<Record<SupportedLanguage, LanguageStats>> = { + typescript: { files: 2, components: 10, lines: 100 }, + }; + + const incremental: Partial<Record<SupportedLanguage, LanguageStats>> = { + javascript: { files: 1, components: 3, lines: 30 }, + }; + + const result = addIncrementalLanguageStats(current, incremental); + + expect(result.typescript).toEqual({ files: 2, components: 10, lines: 100 }); + expect(result.javascript).toEqual({ files: 1, components: 3, lines: 30 }); + }); + + it('should merge with existing language', () => { + const current: Partial<Record<SupportedLanguage, LanguageStats>> = { + typescript: { files: 2, components: 10, lines: 100 }, + }; + + const incremental: Partial<Record<SupportedLanguage, LanguageStats>> = { + typescript: { files: 1, components: 5, lines: 50 }, + }; + + const result = addIncrementalLanguageStats(current, 
incremental); + + expect(result.typescript).toEqual({ files: 3, components: 15, lines: 150 }); + }); + + it('should not mutate original stats', () => { + const current: Partial<Record<SupportedLanguage, LanguageStats>> = { + typescript: { files: 2, components: 10, lines: 100 }, + }; + + const original = JSON.parse(JSON.stringify(current)); + const incremental: Partial<Record<SupportedLanguage, LanguageStats>> = { + javascript: { files: 1, components: 3, lines: 30 }, + }; + + addIncrementalLanguageStats(current, incremental); + + expect(current).toEqual(original); + }); + + it('should handle empty incremental stats', () => { + const current: Partial<Record<SupportedLanguage, LanguageStats>> = { + typescript: { files: 2, components: 10, lines: 100 }, + }; + + const result = addIncrementalLanguageStats(current, {}); + + expect(result).toEqual(current); + }); + }); + + describe('addIncrementalComponentStats', () => { + it('should add new component types', () => { + const current = { + function: 10, + class: 5, + }; + + const incremental = { + interface: 3, + }; + + const result = addIncrementalComponentStats(current, incremental); + + expect(result).toEqual({ + function: 10, + class: 5, + interface: 3, + }); + }); + + it('should merge with existing types', () => { + const current = { + function: 10, + class: 5, + }; + + const incremental = { + function: 3, + class: 2, + }; + + const result = addIncrementalComponentStats(current, incremental); + + expect(result).toEqual({ + function: 13, + class: 7, + }); + }); + + it('should skip non-numeric values', () => { + const current = { + function: 10, + }; + + const incremental = { + function: 3, + invalid: 'not-a-number' as any, + }; + + const result = addIncrementalComponentStats(current, incremental); + + expect(result).toEqual({ + function: 13, + }); + }); + }); + + describe('addIncrementalPackageStats', () => { + it('should add new package', () => { + const current: Record<string, PackageStats> = { + pkg1: { + name: 'package-1', + path: 'packages/pkg1', + files: 5, + components: 20, + languages: { typescript: 5 }, + }, + }; + + const incremental: Record<string, PackageStats> = { + 
pkg2: { + name: 'package-2', + path: 'packages/pkg2', + files: 3, + components: 10, + languages: { javascript: 3 }, + }, + }; + + const result = addIncrementalPackageStats(current, incremental); + + expect(result.pkg1).toEqual(current.pkg1); + expect(result.pkg2).toEqual(incremental.pkg2); + }); + + it('should merge with existing package', () => { + const current: Record<string, PackageStats> = { + pkg1: { + name: 'package-1', + path: 'packages/pkg1', + files: 5, + components: 20, + languages: { typescript: 5 }, + }, + }; + + const incremental: Record<string, PackageStats> = { + pkg1: { + name: 'package-1', + path: 'packages/pkg1', + files: 2, + components: 8, + languages: { typescript: 1, javascript: 1 }, + }, + }; + + const result = addIncrementalPackageStats(current, incremental); + + expect(result.pkg1).toEqual({ + name: 'package-1', + path: 'packages/pkg1', + files: 7, + components: 28, + languages: { typescript: 6, javascript: 1 }, + }); + }); + }); + + describe('mergeStats', () => { + it('should perform full merge operation', () => { + const currentStats: MergeableStats = { + byLanguage: { + typescript: { files: 10, components: 50, lines: 500 }, + javascript: { files: 5, components: 20, lines: 200 }, + }, + byComponentType: { + function: 40, + class: 20, + }, + }; + + const deletedFiles = [ + { + path: 'deleted.js', + metadata: { language: 'javascript' } as FileMetadata, + }, + ]; + + const changedFiles = [ + { + path: 'changed.ts', + metadata: { language: 'typescript' } as FileMetadata, + }, + ]; + + const incrementalStats = { + byLanguage: { + typescript: { files: 1, components: 6, lines: 60 }, + go: { files: 1, components: 3, lines: 30 }, + } as Partial<Record<SupportedLanguage, LanguageStats>>, + byComponentType: { + function: 5, + struct: 3, + }, + byPackage: {}, + }; + + const result = mergeStats({ + currentStats, + deletedFiles, + changedFiles, + incrementalStats, + }); + + // TypeScript: 10 - 1 (changed) + 1 (re-added) = 10 files + expect(result.byLanguage?.typescript).toEqual({ + files: 10, + components: 56, + lines: 560, + }); + + 
// JavaScript: 5 - 1 (deleted) = 4 files + expect(result.byLanguage?.javascript).toEqual({ + files: 4, + components: 20, + lines: 200, + }); + + // Go: new language + expect(result.byLanguage?.go).toEqual({ + files: 1, + components: 3, + lines: 30, + }); + + // Component types merged + expect(result.byComponentType).toEqual({ + function: 45, + class: 20, + struct: 3, + }); + }); + + it('should handle empty current stats', () => { + const currentStats: MergeableStats = { + byLanguage: {}, + byComponentType: {}, + }; + + const incrementalStats = { + byLanguage: { + typescript: { files: 1, components: 5, lines: 50 }, + } as Partial<Record<SupportedLanguage, LanguageStats>>, + byComponentType: { + function: 3, + }, + byPackage: {}, + }; + + const result = mergeStats({ + currentStats, + deletedFiles: [], + changedFiles: [], + incrementalStats, + }); + + expect(result.byLanguage).toEqual(incrementalStats.byLanguage); + expect(result.byComponentType).toEqual(incrementalStats.byComponentType); + }); + + it('should not mutate input stats', () => { + const currentStats: MergeableStats = { + byLanguage: { + typescript: { files: 10, components: 50, lines: 500 }, + }, + byComponentType: { + function: 40, + }, + }; + + const original = JSON.parse(JSON.stringify(currentStats)); + + mergeStats({ + currentStats, + deletedFiles: [], + changedFiles: [], + incrementalStats: { + byLanguage: { + javascript: { files: 1, components: 3, lines: 30 }, + } as Partial<Record<SupportedLanguage, LanguageStats>>, + byComponentType: {}, + byPackage: {}, + }, + }); + + expect(currentStats).toEqual(original); + }); + }); +}); diff --git a/packages/core/src/indexer/__tests__/test-factories.ts b/packages/core/src/indexer/__tests__/test-factories.ts new file mode 100644 index 0000000..9615367 --- /dev/null +++ b/packages/core/src/indexer/__tests__/test-factories.ts @@ -0,0 +1,97 @@ +/** + * Test factories for creating test data + * Promotes DRY principles and makes tests more readable + */ + +import type { + DetailedIndexStats, + FileMetadata, + LanguageStats, + StatsMetadata, + 
SupportedLanguage, +} from '../types'; + +/** + * Create language stats for testing + */ +export function createLanguageStats(overrides: Partial<LanguageStats> = {}): LanguageStats { + return { + files: 1, + components: 5, + lines: 100, + ...overrides, + }; +} + +/** + * Create file metadata for testing + */ +export function createFileMetadata(overrides: Partial<FileMetadata> = {}): FileMetadata { + return { + path: 'src/test.ts', + hash: 'abc123', + lastModified: new Date(), + lastIndexed: new Date(), + documentIds: ['doc1', 'doc2'], + size: 1024, + language: 'typescript', + ...overrides, + }; +} + +/** + * Create stats metadata for testing + */ +export function createStatsMetadata(overrides: Partial<StatsMetadata> = {}): StatsMetadata { + const now = new Date(); + return { + isIncremental: false, + lastFullIndex: now, + lastUpdate: now, + incrementalUpdatesSince: 0, + ...overrides, + }; +} + +/** + * Create detailed index stats for testing + */ +export function createDetailedIndexStats( + overrides: Partial<DetailedIndexStats> = {} +): DetailedIndexStats { + const now = new Date(); + return { + filesScanned: 10, + documentsExtracted: 50, + documentsIndexed: 50, + vectorsStored: 50, + duration: 5000, + errors: [], + startTime: now, + endTime: now, + repositoryPath: '/test/repo', + byLanguage: { + typescript: createLanguageStats(), + }, + byComponentType: { + function: 30, + class: 15, + interface: 5, + }, + statsMetadata: createStatsMetadata(), + ...overrides, + }; +} + +/** + * Create a map of language stats by language + */ +export function createLanguageStatsMap( + languages: Array<{ lang: SupportedLanguage; stats?: Partial<LanguageStats> }> +): Partial<Record<SupportedLanguage, LanguageStats>> { + const result: Partial<Record<SupportedLanguage, LanguageStats>> = {}; + for (const { lang, stats } of languages) { + result[lang] = createLanguageStats(stats); + } + return result; +} diff --git a/packages/core/src/indexer/index.ts b/packages/core/src/indexer/index.ts index f1a8403..c43f8e7 100644 --- a/packages/core/src/indexer/index.ts +++ b/packages/core/src/indexer/index.ts @@ -11,6 +11,7 @@ import {
getCurrentSystemResources, getOptimalConcurrency } from '../utils/concu import { VectorStorage } from '../vector'; import type { EmbeddingDocument, SearchOptions, SearchResult } from '../vector/types'; import { StatsAggregator } from './stats-aggregator'; +import { mergeStats } from './stats-merger'; import type { DetailedIndexStats, FileMetadata, @@ -19,6 +20,7 @@ import type { IndexerState, IndexOptions, IndexStats, + SupportedLanguage, UpdateOptions, } from './types'; import { getExtensionForLanguage, prepareDocumentsForEmbedding } from './utils'; @@ -224,9 +226,6 @@ export class RepositoryIndexer { logger?.info({ documentsIndexed, errors: errors.length }, 'Embedding complete'); - // Update state - await this.updateState(scanResult.documents); - // Phase 4: Complete const endTime = new Date(); onProgress?.({ @@ -251,8 +250,23 @@ export class RepositoryIndexer { endTime, repositoryPath: this.config.repositoryPath, ...detailedStats, + statsMetadata: { + isIncremental: false, + lastFullIndex: endTime, + lastUpdate: endTime, + incrementalUpdatesSince: 0, + }, }; + // Update state with file metadata and detailed stats + await this.updateState(scanResult.documents, detailedStats); + + // Reset incremental update counter after full index + if (this.state) { + this.state.incrementalUpdatesSince = 0; + this.state.lastUpdate = endTime; + } + return stats; } catch (error) { errors.push({ @@ -338,7 +352,8 @@ export class RepositoryIndexer { // Scan and index changed + added files let documentsExtracted = 0; let documentsIndexed = 0; - const statsAggregator = new StatsAggregator(); + let incrementalStats: ReturnType | null = null; + const affectedLanguages = new Set(); if (filesToReindex.length > 0) { const scanResult = await scanRepository({ @@ -350,26 +365,47 @@ export class RepositoryIndexer { documentsExtracted = scanResult.documents.length; - // Aggregate detailed statistics + // Calculate stats for incremental changes + const statsAggregator = new StatsAggregator(); 
for (const doc of scanResult.documents) { statsAggregator.addDocument(doc); + affectedLanguages.add(doc.language); } + incrementalStats = statsAggregator.getDetailedStats(); // Index new documents const embeddingDocuments = prepareDocumentsForEmbedding(scanResult.documents); await this.vectorStorage.addDocuments(embeddingDocuments); documentsIndexed = embeddingDocuments.length; + // Merge incremental stats into state (updates the full repository stats) + this.applyStatsMerge(deleted, changed, incrementalStats); + // Update state with new documents await this.updateState(scanResult.documents); } else { - // Only deletions - still need to save state + // Only deletions - need to update stats by removing deleted file contributions + if (deleted.length > 0) { + this.applyStatsMerge(deleted, [], null); + } + // Save state await this.saveState(); } const endTime = new Date(); - const detailedStats = statsAggregator.getDetailedStats(); + // Update metadata + const incrementalUpdatesSince = (this.state.incrementalUpdatesSince || 0) + 1; + if (this.state) { + this.state.incrementalUpdatesSince = incrementalUpdatesSince; + this.state.lastUpdate = endTime; + } + + // Build metadata + const lastFullIndex = this.state?.lastIndexTime || endTime; + const warning = this.getStatsWarning(incrementalUpdatesSince); + + // Return incremental stats (what changed) with metadata return { filesScanned: filesToReindex.length, documentsExtracted, @@ -380,7 +416,16 @@ export class RepositoryIndexer { startTime, endTime, repositoryPath: this.config.repositoryPath, - ...detailedStats, + // Include incremental stats if we calculated them + ...(incrementalStats || {}), + statsMetadata: { + isIncremental: true, + lastFullIndex, + lastUpdate: endTime, + incrementalUpdatesSince, + affectedLanguages: Array.from(affectedLanguages) as SupportedLanguage[], + warning, + }, }; } @@ -394,12 +439,16 @@ export class RepositoryIndexer { /** * Get indexing statistics */ - async getStats(): Promise { + async 
getStats(): Promise { if (!this.state) { return null; } const vectorStats = await this.vectorStorage.getStats(); + const lastFullIndex = this.state.lastIndexTime; + const lastUpdate = this.state.lastUpdate || lastFullIndex; + const incrementalUpdatesSince = this.state.incrementalUpdatesSince || 0; + const warning = this.getStatsWarning(incrementalUpdatesSince); return { filesScanned: this.state.stats.totalFiles, @@ -411,9 +460,77 @@ export class RepositoryIndexer { startTime: this.state.lastIndexTime, endTime: this.state.lastIndexTime, repositoryPath: this.state.repositoryPath, + byLanguage: this.state.stats.byLanguage, + byComponentType: this.state.stats.byComponentType, + byPackage: this.state.stats.byPackage, + statsMetadata: { + isIncremental: false, // getStats returns full picture + lastFullIndex, + lastUpdate, + incrementalUpdatesSince, + warning, + }, }; } + /** + * Apply stat merging using pure functions + * Wrapper around the pure mergeStats function that updates state + */ + private applyStatsMerge( + deleted: string[], + changed: string[], + incrementalStats: ReturnType | null + ): void { + if (!this.state) { + return; + } + + // Prepare file metadata for deleted and changed files + const deletedFiles = deleted + .map((path) => ({ path, metadata: this.state?.files[path] })) + .filter((f) => f.metadata !== undefined); + + const changedFiles = changed + .map((path) => ({ path, metadata: this.state?.files[path] })) + .filter((f) => f.metadata !== undefined); + + // Use pure function to compute new stats + const mergedStats = mergeStats({ + currentStats: { + byLanguage: this.state.stats.byLanguage || {}, + byComponentType: this.state.stats.byComponentType || {}, + byPackage: this.state.stats.byPackage || {}, + }, + deletedFiles: deletedFiles.filter((f) => f.metadata !== undefined) as Array<{ + path: string; + metadata: FileMetadata; + }>, + changedFiles: changedFiles.filter((f) => f.metadata !== undefined) as Array<{ + path: string; + metadata: 
FileMetadata; + }>, + incrementalStats, + }); + + // Update state with merged stats + this.state.stats.byLanguage = mergedStats.byLanguage; + this.state.stats.byComponentType = mergedStats.byComponentType; + this.state.stats.byPackage = mergedStats.byPackage; + } + + /** + * Get warning message for stale stats + * Extracted for testability + */ + private getStatsWarning(incrementalUpdatesSince: number): string | undefined { + const threshold = 10; + if (incrementalUpdatesSince > threshold) { + return "Consider running 'dev index' for most accurate statistics"; + } + return undefined; + } + /** * Close the indexer and cleanup resources */ @@ -463,7 +580,23 @@ export class RepositoryIndexer { /** * Update state with newly indexed documents */ - private async updateState(documents: Document[]): Promise { + private async updateState( + documents: Document[], + detailedStats?: { + byLanguage?: Record; + byComponentType?: Partial>; + byPackage?: Record< + string, + { + name: string; + path: string; + files: number; + components: number; + languages: Partial>; + } + >; + } + ): Promise { if (!this.state) { this.state = { version: INDEXER_VERSION, @@ -523,6 +656,19 @@ export class RepositoryIndexer { this.state.stats.totalVectors = documents.length; this.state.lastIndexTime = new Date(); + // Save detailed stats if provided + if (detailedStats) { + if (detailedStats.byLanguage) { + this.state.stats.byLanguage = detailedStats.byLanguage; + } + if (detailedStats.byComponentType) { + this.state.stats.byComponentType = detailedStats.byComponentType; + } + if (detailedStats.byPackage) { + this.state.stats.byPackage = detailedStats.byPackage; + } + } + // Save state await this.saveState(); } diff --git a/packages/core/src/indexer/stats-aggregator.ts b/packages/core/src/indexer/stats-aggregator.ts index c503a78..31eff66 100644 --- a/packages/core/src/indexer/stats-aggregator.ts +++ b/packages/core/src/indexer/stats-aggregator.ts @@ -60,15 +60,12 @@ export class StatsAggregator { 
* Get aggregated statistics */ getDetailedStats(): { - byLanguage: Record; - byComponentType: Record; + byLanguage: Partial>; + byComponentType: Partial>; byPackage: Record; } { return { - byLanguage: Object.fromEntries(this.languageStats) as Record< - SupportedLanguage, - LanguageStats - >, + byLanguage: Object.fromEntries(this.languageStats), byComponentType: Object.fromEntries(this.componentTypeStats), byPackage: Object.fromEntries(this.packageStats), }; diff --git a/packages/core/src/indexer/stats-merger.ts b/packages/core/src/indexer/stats-merger.ts new file mode 100644 index 0000000..7ba5828 --- /dev/null +++ b/packages/core/src/indexer/stats-merger.ts @@ -0,0 +1,212 @@ +/** + * Pure functions for merging incremental stats into existing repository stats + * Extracted for testability and reusability + */ + +import type { StatsAggregator } from './stats-aggregator'; +import type { FileMetadata, LanguageStats, PackageStats, SupportedLanguage } from './types'; + +/** + * Stats that can be merged + */ +export interface MergeableStats { + byLanguage?: Partial>; + byComponentType?: Partial>; + byPackage?: Record; +} + +/** + * Input for stat merging operation + */ +export interface StatMergeInput { + currentStats: MergeableStats; + deletedFiles: Array<{ path: string; metadata: FileMetadata }>; + changedFiles: Array<{ path: string; metadata: FileMetadata }>; + incrementalStats: ReturnType | null; +} + +/** + * Merge incremental stats into existing repository stats + * Pure function that returns new stats without mutations + */ +export function mergeStats(input: StatMergeInput): MergeableStats { + const { currentStats, deletedFiles, changedFiles, incrementalStats } = input; + + // Deep clone to avoid mutations + const merged: MergeableStats = { + byLanguage: currentStats.byLanguage ? { ...currentStats.byLanguage } : {}, + byComponentType: currentStats.byComponentType ? { ...currentStats.byComponentType } : {}, + byPackage: currentStats.byPackage ? 
{ ...currentStats.byPackage } : {}, + }; + + // Ensure language stats are cloned deeply + if (merged.byLanguage) { + for (const [lang, stats] of Object.entries(merged.byLanguage)) { + if (stats) { + merged.byLanguage[lang as SupportedLanguage] = { ...stats }; + } + } + } + + // Process deletions + merged.byLanguage = subtractDeletedFiles(merged.byLanguage || {}, deletedFiles); + + // Process changes (remove old contribution) + merged.byLanguage = subtractChangedFiles(merged.byLanguage || {}, changedFiles); + + // Add new/changed file contributions + if (incrementalStats) { + merged.byLanguage = addIncrementalLanguageStats( + merged.byLanguage || {}, + incrementalStats.byLanguage || {} + ); + + merged.byComponentType = addIncrementalComponentStats( + merged.byComponentType || {}, + incrementalStats.byComponentType || {} + ); + + merged.byPackage = addIncrementalPackageStats( + merged.byPackage || {}, + incrementalStats.byPackage || {} + ); + } + + return merged; +} + +/** + * Subtract deleted files from language stats + * Pure function + */ +export function subtractDeletedFiles( + stats: Partial>, + deletedFiles: Array<{ path: string; metadata: FileMetadata }> +): Partial> { + const result = { ...stats }; + + for (const { metadata } of deletedFiles) { + const lang = metadata.language as SupportedLanguage; + const langStats = result[lang]; + + if (langStats) { + const updated = { ...langStats }; + updated.files = Math.max(0, updated.files - 1); + + if (updated.files === 0) { + // Remove language if no files left + delete result[lang]; + } else { + result[lang] = updated; + } + } + } + + return result; +} + +/** + * Subtract changed files from language stats (they'll be re-added with new stats) + * Pure function + */ +export function subtractChangedFiles( + stats: Partial>, + changedFiles: Array<{ path: string; metadata: FileMetadata }> +): Partial> { + // Same logic as deletions - we subtract the old contribution + return subtractDeletedFiles(stats, changedFiles); +} 
+ +/** + * Add incremental language stats + * Pure function + */ +export function addIncrementalLanguageStats( + currentStats: Partial>, + incrementalStats: Partial> +): Partial> { + const result = { ...currentStats }; + + for (const [lang, stats] of Object.entries(incrementalStats)) { + const langKey = lang as SupportedLanguage; + const current = result[langKey]; + + if (!current) { + // New language, just add it + result[langKey] = { ...stats }; + } else { + // Merge with existing + result[langKey] = { + files: current.files + stats.files, + components: current.components + stats.components, + lines: current.lines + stats.lines, + }; + } + } + + return result; +} + +/** + * Add incremental component type stats + * Pure function + */ +export function addIncrementalComponentStats( + currentStats: Partial>, + incrementalStats: Partial> +): Partial> { + const result = { ...currentStats }; + + for (const [type, count] of Object.entries(incrementalStats)) { + if (typeof count === 'number') { + result[type] = (result[type] || 0) + count; + } + } + + return result; +} + +/** + * Add incremental package stats + * Pure function + */ +export function addIncrementalPackageStats( + currentStats: Record, + incrementalStats: Record +): Record { + const result = { ...currentStats }; + + for (const [pkgPath, pkgStats] of Object.entries(incrementalStats)) { + const current = result[pkgPath]; + + if (!current) { + // New package + result[pkgPath] = { + name: pkgStats.name, + path: pkgStats.path, + files: pkgStats.files, + components: pkgStats.components, + languages: { ...pkgStats.languages }, + }; + } else { + // Merge with existing + const languages = { ...current.languages }; + for (const [lang, count] of Object.entries(pkgStats.languages)) { + if (typeof count === 'number') { + languages[lang as SupportedLanguage] = + ((languages[lang as SupportedLanguage] as number) || 0) + count; + } + } + + result[pkgPath] = { + name: current.name, + path: current.path, + files: 
current.files + pkgStats.files, + components: current.components + pkgStats.components, + languages, + }; + } + } + + return result; +} diff --git a/packages/core/src/indexer/types.ts b/packages/core/src/indexer/types.ts index 4465faa..59fdb4d 100644 --- a/packages/core/src/indexer/types.ts +++ b/packages/core/src/indexer/types.ts @@ -61,6 +61,29 @@ export interface IndexProgress { percentComplete: number; } +/** + * Metadata about the freshness and source of statistics + */ +export interface StatsMetadata { + /** Whether this is from an incremental update (vs full index) */ + isIncremental: boolean; + + /** Timestamp of the last full index */ + lastFullIndex: Date; + + /** Timestamp of the last update (full or incremental) */ + lastUpdate: Date; + + /** Number of incremental updates since last full index */ + incrementalUpdatesSince: number; + + /** Languages affected by this update (only set for incremental updates) */ + affectedLanguages?: SupportedLanguage[]; + + /** Warning message if stats may be stale */ + warning?: string; +} + /** * Statistics from an indexing operation */ @@ -91,6 +114,9 @@ export interface IndexStats { /** Repository path that was indexed */ repositoryPath: string; + + /** Metadata about stats freshness and source */ + statsMetadata?: StatsMetadata; } /** @@ -158,7 +184,7 @@ export interface PackageStats { */ export interface DetailedIndexStats extends IndexStats { /** Statistics broken down by language */ - byLanguage?: Record; + byLanguage?: Partial>; /** Statistics broken down by component type */ byComponentType?: Partial>; @@ -212,6 +238,12 @@ export interface IndexerState { /** Last full index timestamp */ lastIndexTime: Date; + /** Last update timestamp (full or incremental) */ + lastUpdate?: Date; + + /** Number of incremental updates since last full index */ + incrementalUpdatesSince?: number; + /** File metadata map (path -> metadata) */ files: Record; @@ -220,6 +252,9 @@ export interface IndexerState { totalFiles: number; 
totalDocuments: number; totalVectors: number; + byLanguage?: Partial>; + byComponentType?: Partial>; + byPackage?: Record; }; } diff --git a/packages/core/src/storage/metadata.ts b/packages/core/src/storage/metadata.ts index 18aaf4c..027e6e7 100644 --- a/packages/core/src/storage/metadata.ts +++ b/packages/core/src/storage/metadata.ts @@ -7,6 +7,7 @@ import { execSync } from 'node:child_process'; import * as fs from 'node:fs/promises'; import * as path from 'node:path'; import { getGitRemote, normalizeGitRemote } from './path'; +import { validateRepositoryMetadata } from './validation.js'; export interface RepositoryMetadata { version: string; @@ -73,7 +74,9 @@ export async function loadMetadata(storagePath: string): Promise; + +/** + * Validate repository metadata from JSON + * Returns null if validation fails (for backward compatibility with current error handling) + */ +export function validateRepositoryMetadata(data: unknown): RepositoryMetadataData | null { + const result = RepositoryMetadataSchema.safeParse(data); + return result.success ? 
result.data : null; +} diff --git a/packages/mcp-server/package.json b/packages/mcp-server/package.json index 9045d4f..ca6512c 100644 --- a/packages/mcp-server/package.json +++ b/packages/mcp-server/package.json @@ -19,7 +19,8 @@ "dependencies": { "@lytics/dev-agent-core": "workspace:*", "@lytics/dev-agent-subagents": "workspace:*", - "@lytics/kero": "workspace:*" + "@lytics/kero": "workspace:*", + "zod": "^4.1.13" }, "devDependencies": { "@types/node": "^22.0.0", diff --git a/packages/mcp-server/src/adapters/__tests__/explore-adapter.test.ts b/packages/mcp-server/src/adapters/__tests__/explore-adapter.test.ts index 6195887..b1d7140 100644 --- a/packages/mcp-server/src/adapters/__tests__/explore-adapter.test.ts +++ b/packages/mcp-server/src/adapters/__tests__/explore-adapter.test.ts @@ -64,7 +64,8 @@ describe('ExploreAdapter', () => { ); expect(result.success).toBe(false); - expect(result.error?.code).toBe('INVALID_ACTION'); + expect(result.error?.code).toBe('INVALID_PARAMS'); + expect(result.error?.message).toContain('action'); }); it('should reject empty query', async () => { @@ -77,7 +78,8 @@ describe('ExploreAdapter', () => { ); expect(result.success).toBe(false); - expect(result.error?.code).toBe('INVALID_QUERY'); + expect(result.error?.code).toBe('INVALID_PARAMS'); + expect(result.error?.message).toContain('query'); }); it('should reject invalid limit', async () => { @@ -91,7 +93,8 @@ describe('ExploreAdapter', () => { ); expect(result.success).toBe(false); - expect(result.error?.code).toBe('INVALID_LIMIT'); + expect(result.error?.code).toBe('INVALID_PARAMS'); + expect(result.error?.message).toContain('limit'); }); it('should reject invalid threshold', async () => { @@ -105,7 +108,8 @@ describe('ExploreAdapter', () => { ); expect(result.success).toBe(false); - expect(result.error?.code).toBe('INVALID_THRESHOLD'); + expect(result.error?.code).toBe('INVALID_PARAMS'); + expect(result.error?.message).toContain('threshold'); }); it('should reject invalid format', 
async () => { @@ -119,7 +123,8 @@ describe('ExploreAdapter', () => { ); expect(result.success).toBe(false); - expect(result.error?.code).toBe('INVALID_FORMAT'); + expect(result.error?.code).toBe('INVALID_PARAMS'); + expect(result.error?.message).toContain('format'); }); }); diff --git a/packages/mcp-server/src/adapters/__tests__/github-adapter.test.ts b/packages/mcp-server/src/adapters/__tests__/github-adapter.test.ts index 8da055c..669363e 100644 --- a/packages/mcp-server/src/adapters/__tests__/github-adapter.test.ts +++ b/packages/mcp-server/src/adapters/__tests__/github-adapter.test.ts @@ -87,7 +87,8 @@ describe('GitHubAdapter', () => { ); expect(result.success).toBe(false); - expect(result.error?.code).toBe('INVALID_ACTION'); + expect(result.error?.code).toBe('INVALID_PARAMS'); + expect(result.error?.message).toContain('action'); }); it('should reject search without query', async () => { @@ -99,7 +100,8 @@ describe('GitHubAdapter', () => { ); expect(result.success).toBe(false); - expect(result.error?.code).toBe('MISSING_QUERY'); + expect(result.error?.code).toBe('INVALID_PARAMS'); + expect(result.error?.message).toContain('query'); }); it('should reject context without number', async () => { @@ -111,7 +113,8 @@ describe('GitHubAdapter', () => { ); expect(result.success).toBe(false); - expect(result.error?.code).toBe('MISSING_NUMBER'); + expect(result.error?.code).toBe('INVALID_PARAMS'); + expect(result.error?.message).toContain('number'); }); it('should reject related without number', async () => { @@ -123,7 +126,8 @@ describe('GitHubAdapter', () => { ); expect(result.success).toBe(false); - expect(result.error?.code).toBe('MISSING_NUMBER'); + expect(result.error?.code).toBe('INVALID_PARAMS'); + expect(result.error?.message).toContain('number'); }); it('should reject invalid limit', async () => { @@ -137,7 +141,8 @@ describe('GitHubAdapter', () => { ); expect(result.success).toBe(false); - expect(result.error?.code).toBe('INVALID_LIMIT'); + 
expect(result.error?.code).toBe('INVALID_PARAMS'); + expect(result.error?.message).toContain('limit'); }); it('should reject invalid format', async () => { @@ -151,7 +156,8 @@ describe('GitHubAdapter', () => { ); expect(result.success).toBe(false); - expect(result.error?.code).toBe('INVALID_FORMAT'); + expect(result.error?.code).toBe('INVALID_PARAMS'); + expect(result.error?.message).toContain('format'); }); }); diff --git a/packages/mcp-server/src/adapters/__tests__/history-adapter.test.ts b/packages/mcp-server/src/adapters/__tests__/history-adapter.test.ts index 77270a1..319cb9d 100644 --- a/packages/mcp-server/src/adapters/__tests__/history-adapter.test.ts +++ b/packages/mcp-server/src/adapters/__tests__/history-adapter.test.ts @@ -179,14 +179,16 @@ describe('HistoryAdapter', () => { const result = await adapter.execute({}, mockContext); expect(result.success).toBe(false); - expect(result.error?.code).toBe('MISSING_INPUT'); + expect(result.error?.code).toBe('INVALID_PARAMS'); + expect(result.error?.message).toContain('query'); }); it('should validate limit range', async () => { const result = await adapter.execute({ query: 'test', limit: 100 }, mockContext); expect(result.success).toBe(false); - expect(result.error?.code).toBe('INVALID_LIMIT'); + expect(result.error?.code).toBe('INVALID_PARAMS'); + expect(result.error?.message).toContain('limit'); }); }); diff --git a/packages/mcp-server/src/adapters/__tests__/map-adapter.test.ts b/packages/mcp-server/src/adapters/__tests__/map-adapter.test.ts index 7c45eb9..de81070 100644 --- a/packages/mcp-server/src/adapters/__tests__/map-adapter.test.ts +++ b/packages/mcp-server/src/adapters/__tests__/map-adapter.test.ts @@ -118,28 +118,32 @@ describe('MapAdapter', () => { const result = await adapter.execute({ depth: 10 }, execContext); expect(result.success).toBe(false); - expect(result.error?.code).toBe('INVALID_DEPTH'); + expect(result.error?.code).toBe('INVALID_PARAMS'); + 
expect(result.error?.message).toContain('depth'); }); it('should reject depth less than 1', async () => { const result = await adapter.execute({ depth: 0 }, execContext); expect(result.success).toBe(false); - expect(result.error?.code).toBe('INVALID_DEPTH'); + expect(result.error?.code).toBe('INVALID_PARAMS'); + expect(result.error?.message).toContain('depth'); }); it('should reject invalid focus type', async () => { const result = await adapter.execute({ focus: 123 }, execContext); expect(result.success).toBe(false); - expect(result.error?.code).toBe('INVALID_FOCUS'); + expect(result.error?.code).toBe('INVALID_PARAMS'); + expect(result.error?.message).toContain('focus'); }); it('should reject invalid token budget', async () => { const result = await adapter.execute({ tokenBudget: 100 }, execContext); expect(result.success).toBe(false); - expect(result.error?.code).toBe('INVALID_TOKEN_BUDGET'); + expect(result.error?.code).toBe('INVALID_PARAMS'); + expect(result.error?.message).toContain('tokenBudget'); }); }); diff --git a/packages/mcp-server/src/adapters/__tests__/plan-adapter.test.ts b/packages/mcp-server/src/adapters/__tests__/plan-adapter.test.ts index daf98b5..ad05780 100644 --- a/packages/mcp-server/src/adapters/__tests__/plan-adapter.test.ts +++ b/packages/mcp-server/src/adapters/__tests__/plan-adapter.test.ts @@ -157,22 +157,24 @@ describe('PlanAdapter', () => { const result = await adapter.execute({ issue: 'invalid' }, mockExecutionContext); expect(result.success).toBe(false); - expect(result.error?.code).toBe('INVALID_ISSUE'); - expect(result.error?.message).toContain('positive number'); + expect(result.error?.code).toBe('INVALID_PARAMS'); + expect(result.error?.message).toContain('issue'); }); it('should reject invalid issue number (negative)', async () => { const result = await adapter.execute({ issue: -1 }, mockExecutionContext); expect(result.success).toBe(false); - expect(result.error?.code).toBe('INVALID_ISSUE'); + 
+    expect(result.error?.code).toBe('INVALID_PARAMS');
+    expect(result.error?.message).toContain('issue');
   });

   it('should reject invalid issue number (zero)', async () => {
     const result = await adapter.execute({ issue: 0 }, mockExecutionContext);

     expect(result.success).toBe(false);
-    expect(result.error?.code).toBe('INVALID_ISSUE');
+    expect(result.error?.code).toBe('INVALID_PARAMS');
+    expect(result.error?.message).toContain('issue');
   });

   it('should reject invalid format', async () => {
@@ -182,7 +184,8 @@ describe('PlanAdapter', () => {
     );

     expect(result.success).toBe(false);
-    expect(result.error?.code).toBe('INVALID_FORMAT');
+    expect(result.error?.code).toBe('INVALID_PARAMS');
+    expect(result.error?.message).toContain('format');
     expect(result.error?.message).toContain('compact');
   });
 });
diff --git a/packages/mcp-server/src/adapters/__tests__/refs-adapter.test.ts b/packages/mcp-server/src/adapters/__tests__/refs-adapter.test.ts
index ad9abfc..d014f71 100644
--- a/packages/mcp-server/src/adapters/__tests__/refs-adapter.test.ts
+++ b/packages/mcp-server/src/adapters/__tests__/refs-adapter.test.ts
@@ -123,7 +123,8 @@ describe('RefsAdapter', () => {
     const result = await adapter.execute({ name: '' }, execContext);

     expect(result.success).toBe(false);
-    expect(result.error?.code).toBe('INVALID_NAME');
+    expect(result.error?.code).toBe('INVALID_PARAMS');
+    expect(result.error?.message).toContain('name');
   });

   it('should reject invalid direction', async () => {
@@ -133,14 +134,16 @@ describe('RefsAdapter', () => {
     );

     expect(result.success).toBe(false);
-    expect(result.error?.code).toBe('INVALID_DIRECTION');
+    expect(result.error?.code).toBe('INVALID_PARAMS');
+    expect(result.error?.message).toContain('direction');
   });

   it('should reject invalid limit', async () => {
     const result = await adapter.execute({ name: 'createPlan', limit: 100 }, execContext);

     expect(result.success).toBe(false);
-    expect(result.error?.code).toBe('INVALID_LIMIT');
+    expect(result.error?.code).toBe('INVALID_PARAMS');
+    expect(result.error?.message).toContain('limit');
   });
 });
diff --git a/packages/mcp-server/src/adapters/__tests__/search-adapter.test.ts b/packages/mcp-server/src/adapters/__tests__/search-adapter.test.ts
index f658871..8553877 100644
--- a/packages/mcp-server/src/adapters/__tests__/search-adapter.test.ts
+++ b/packages/mcp-server/src/adapters/__tests__/search-adapter.test.ts
@@ -104,14 +104,16 @@ describe('SearchAdapter', () => {
     expect(result.success).toBe(false);
     expect(result.error).toBeDefined();
-    expect(result.error?.code).toBe('INVALID_QUERY');
+    expect(result.error?.code).toBe('INVALID_PARAMS');
+    expect(result.error?.message).toContain('query');
   });

   it('should reject non-string query', async () => {
     const result = await adapter.execute({ query: 123 }, execContext);

     expect(result.success).toBe(false);
-    expect(result.error?.code).toBe('INVALID_QUERY');
+    expect(result.error?.code).toBe('INVALID_PARAMS');
+    expect(result.error?.message).toContain('query');
   });

   it('should accept valid query', async () => {
@@ -158,7 +160,8 @@ describe('SearchAdapter', () => {
     );

     expect(result.success).toBe(false);
-    expect(result.error?.code).toBe('INVALID_FORMAT');
+    expect(result.error?.code).toBe('INVALID_PARAMS');
+    expect(result.error?.message).toContain('format');
   });

   it('should use default format when not specified', async () => {
@@ -192,7 +195,8 @@ describe('SearchAdapter', () => {
     );

     expect(result.success).toBe(false);
-    expect(result.error?.code).toBe('INVALID_LIMIT');
+    expect(result.error?.code).toBe('INVALID_PARAMS');
+    expect(result.error?.message).toContain('limit');
   });

   it('should reject limit above 50', async () => {
@@ -205,7 +209,8 @@ describe('SearchAdapter', () => {
     );

     expect(result.success).toBe(false);
-    expect(result.error?.code).toBe('INVALID_LIMIT');
+    expect(result.error?.code).toBe('INVALID_PARAMS');
+    expect(result.error?.message).toContain('limit');
   });
 });
@@ -232,7 +237,8 @@ describe('SearchAdapter', () => {
     );

     expect(result.success).toBe(false);
-    expect(result.error?.code).toBe('INVALID_SCORE_THRESHOLD');
+    expect(result.error?.code).toBe('INVALID_PARAMS');
+    expect(result.error?.message).toContain('scoreThreshold');
   });

   it('should reject threshold above 1', async () => {
@@ -245,7 +251,8 @@ describe('SearchAdapter', () => {
     );

     expect(result.success).toBe(false);
-    expect(result.error?.code).toBe('INVALID_SCORE_THRESHOLD');
+    expect(result.error?.code).toBe('INVALID_PARAMS');
+    expect(result.error?.message).toContain('scoreThreshold');
   });
 });
diff --git a/packages/mcp-server/src/adapters/__tests__/status-adapter.test.ts b/packages/mcp-server/src/adapters/__tests__/status-adapter.test.ts
index 7a2ca4d..14704a7 100644
--- a/packages/mcp-server/src/adapters/__tests__/status-adapter.test.ts
+++ b/packages/mcp-server/src/adapters/__tests__/status-adapter.test.ts
@@ -147,8 +147,8 @@ describe('StatusAdapter', () => {
     const result = await adapter.execute({ section: 'invalid' }, mockExecutionContext);

     expect(result.success).toBe(false);
-    expect(result.error?.code).toBe('INVALID_SECTION');
-    expect(result.error?.message).toContain('summary');
+    expect(result.error?.code).toBe('INVALID_PARAMS');
+    expect(result.error?.message).toContain('section');
   });

   it('should reject invalid format', async () => {
@@ -158,8 +158,8 @@ describe('StatusAdapter', () => {
     );

     expect(result.success).toBe(false);
-    expect(result.error?.code).toBe('INVALID_FORMAT');
-    expect(result.error?.message).toContain('compact');
+    expect(result.error?.code).toBe('INVALID_PARAMS');
+    expect(result.error?.message).toContain('format');
   });
 });
diff --git a/packages/mcp-server/src/adapters/built-in/explore-adapter.ts b/packages/mcp-server/src/adapters/built-in/explore-adapter.ts
index 8dc8ef2..b387c46 100644
--- a/packages/mcp-server/src/adapters/built-in/explore-adapter.ts
+++ b/packages/mcp-server/src/adapters/built-in/explore-adapter.ts
@@ -13,8 +13,10 @@ import type {
   RelationshipResult,
   SimilarCodeResult,
 } from '@lytics/dev-agent-subagents';
+import { ExploreArgsSchema } from '../../schemas/index.js';
 import { ToolAdapter } from '../tool-adapter';
 import type { AdapterContext, ToolDefinition, ToolExecutionContext, ToolResult } from '../types';
+import { validateArgs } from '../validation.js';

 export interface ExploreAdapterConfig {
   repositoryPath: string;
@@ -119,69 +121,13 @@ export class ExploreAdapter extends ToolAdapter {
   }

   async execute(args: Record<string, unknown>, context: ToolExecutionContext): Promise<ToolResult> {
-    const {
-      action,
-      query,
-      limit = this.defaultLimit,
-      threshold = this.defaultThreshold,
-      fileTypes,
-      format = this.defaultFormat,
-    } = args;
-
-    // Validate action
-    if (action !== 'pattern' && action !== 'similar' && action !== 'relationships') {
-      return {
-        success: false,
-        error: {
-          code: 'INVALID_ACTION',
-          message: 'Action must be "pattern", "similar", or "relationships"',
-        },
-      };
-    }
-
-    // Validate query
-    if (typeof query !== 'string' || query.trim().length === 0) {
-      return {
-        success: false,
-        error: {
-          code: 'INVALID_QUERY',
-          message: 'Query must be a non-empty string',
-        },
-      };
-    }
-
-    // Validate limit
-    if (typeof limit !== 'number' || limit < 1 || limit > 100) {
-      return {
-        success: false,
-        error: {
-          code: 'INVALID_LIMIT',
-          message: 'Limit must be between 1 and 100',
-        },
-      };
-    }
-
-    // Validate threshold
-    if (typeof threshold !== 'number' || threshold < 0 || threshold > 1) {
-      return {
-        success: false,
-        error: {
-          code: 'INVALID_THRESHOLD',
-          message: 'Threshold must be between 0 and 1',
-        },
-      };
+    // Validate args with Zod
+    const validation = validateArgs(ExploreArgsSchema, args);
+    if (!validation.success) {
+      return validation.error;
     }

-    // Validate format
-    if (format !== 'compact' && format !== 'verbose') {
-      return {
-        success: false,
-        error: {
-          code: 'INVALID_FORMAT',
-          message: 'Format must be either "compact" or "verbose"',
-        },
-      };
-    }
+    const { action, query, limit, threshold, fileTypes, format } = validation.data;

     try {
       context.logger.debug('Executing exploration', {
diff --git a/packages/mcp-server/src/adapters/built-in/github-adapter.ts b/packages/mcp-server/src/adapters/built-in/github-adapter.ts
index c1372fa..993a5e6 100644
--- a/packages/mcp-server/src/adapters/built-in/github-adapter.ts
+++ b/packages/mcp-server/src/adapters/built-in/github-adapter.ts
@@ -10,8 +10,10 @@ import {
   type GitHubSearchResult,
 } from '@lytics/dev-agent-subagents';
 import { estimateTokensForText } from '../../formatters/utils';
+import { GitHubArgsSchema } from '../../schemas/index.js';
 import { ToolAdapter } from '../tool-adapter';
 import type { AdapterContext, ToolDefinition, ToolExecutionContext, ToolResult } from '../types';
+import { validateArgs } from '../validation.js';

 export interface GitHubAdapterConfig {
   repositoryPath: string;
@@ -226,71 +228,13 @@ export class GitHubAdapter extends ToolAdapter {
   }

   async execute(args: Record<string, unknown>, context: ToolExecutionContext): Promise<ToolResult> {
-    const {
-      action,
-      query,
-      number,
-      type,
-      state,
-      labels,
-      author,
-      limit = this.defaultLimit,
-      format = this.defaultFormat,
-    } = args;
-
-    // Validate action
-    if (action !== 'search' && action !== 'context' && action !== 'related') {
-      return {
-        success: false,
-        error: {
-          code: 'INVALID_ACTION',
-          message: 'Action must be "search", "context", or "related"',
-        },
-      };
-    }
-
-    // Validate action-specific requirements
-    if (action === 'search' && (typeof query !== 'string' || query.trim().length === 0)) {
-      return {
-        success: false,
-        error: {
-          code: 'MISSING_QUERY',
-          message: 'Search action requires a query parameter',
-        },
-      };
-    }
-
-    if ((action === 'context' || action === 'related') && typeof number !== 'number') {
-      return {
-        success: false,
-        error: {
-          code: 'MISSING_NUMBER',
-          message: `${action} action requires a number parameter`,
-        },
-      };
-    }
-
-    // Validate limit
-    if (typeof limit !== 'number' || limit < 1 || limit > 100) {
-      return {
-        success: false,
-        error: {
-          code: 'INVALID_LIMIT',
-          message: 'Limit must be between 1 and 100',
-        },
-      };
+    // Validate args with Zod
+    const validation = validateArgs(GitHubArgsSchema, args);
+    if (!validation.success) {
+      return validation.error;
     }

-    // Validate format
-    if (format !== 'compact' && format !== 'verbose') {
-      return {
-        success: false,
-        error: {
-          code: 'INVALID_FORMAT',
-          message: 'Format must be either "compact" or "verbose"',
-        },
-      };
-    }
+    const { action, query, number, type, state, labels, author, limit, format } = validation.data;

     try {
       const startTime = Date.now();
diff --git a/packages/mcp-server/src/adapters/built-in/health-adapter.ts b/packages/mcp-server/src/adapters/built-in/health-adapter.ts
index 7e98e34..f30db4f 100644
--- a/packages/mcp-server/src/adapters/built-in/health-adapter.ts
+++ b/packages/mcp-server/src/adapters/built-in/health-adapter.ts
@@ -5,8 +5,10 @@
  */

 import * as fs from 'node:fs/promises';
+import { HealthArgsSchema } from '../../schemas/index.js';
 import { ToolAdapter } from '../tool-adapter';
 import type { AdapterContext, ToolDefinition, ToolExecutionContext, ToolResult } from '../types';
+import { validateArgs } from '../validation.js';

 export interface HealthCheckConfig {
   repositoryPath: string;
@@ -74,7 +76,13 @@ export class HealthAdapter extends ToolAdapter {
   }

   async execute(args: Record<string, unknown>, context: ToolExecutionContext): Promise<ToolResult> {
-    const verbose = args.verbose === true;
+    // Validate args with Zod (no type assertions!)
+ const validation = validateArgs(HealthArgsSchema, args); + if (!validation.success) { + return validation.error; + } + + const { verbose } = validation.data; try { const health = await this.performHealthChecks(verbose); diff --git a/packages/mcp-server/src/adapters/built-in/history-adapter.ts b/packages/mcp-server/src/adapters/built-in/history-adapter.ts index 14098c5..7febbd8 100644 --- a/packages/mcp-server/src/adapters/built-in/history-adapter.ts +++ b/packages/mcp-server/src/adapters/built-in/history-adapter.ts @@ -5,8 +5,10 @@ import type { GitCommit, GitIndexer, LocalGitExtractor } from '@lytics/dev-agent-core'; import { estimateTokensForText, startTimer } from '../../formatters/utils'; +import { HistoryArgsSchema } from '../../schemas/index.js'; import { ToolAdapter } from '../tool-adapter'; import type { AdapterContext, ToolDefinition, ToolExecutionContext, ToolResult } from '../types'; +import { validateArgs } from '../validation.js'; /** * History adapter configuration @@ -117,42 +119,13 @@ export class HistoryAdapter extends ToolAdapter { } async execute(args: Record, context: ToolExecutionContext): Promise { - const { - query, - file, - limit = this.config.defaultLimit, - since, - author, - tokenBudget = this.config.defaultTokenBudget, - } = args as { - query?: string; - file?: string; - limit?: number; - since?: string; - author?: string; - tokenBudget?: number; - }; - - // Validate inputs - if (!query && !file) { - return { - success: false, - error: { - code: 'MISSING_INPUT', - message: 'Either "query" or "file" must be provided', - }, - }; + // Validate args with Zod + const validation = validateArgs(HistoryArgsSchema, args); + if (!validation.success) { + return validation.error; } - if (typeof limit !== 'number' || limit < 1 || limit > 50) { - return { - success: false, - error: { - code: 'INVALID_LIMIT', - message: 'Limit must be a number between 1 and 50', - }, - }; - } + const { query, file, limit, since, author, tokenBudget } = 
validation.data; try { const timer = startTimer(); diff --git a/packages/mcp-server/src/adapters/built-in/map-adapter.ts b/packages/mcp-server/src/adapters/built-in/map-adapter.ts index 4831ae3..3fd17e9 100644 --- a/packages/mcp-server/src/adapters/built-in/map-adapter.ts +++ b/packages/mcp-server/src/adapters/built-in/map-adapter.ts @@ -11,8 +11,10 @@ import { type RepositoryIndexer, } from '@lytics/dev-agent-core'; import { estimateTokensForText, startTimer } from '../../formatters/utils'; +import { MapArgsSchema } from '../../schemas/index.js'; import { ToolAdapter } from '../tool-adapter'; import type { AdapterContext, ToolDefinition, ToolExecutionContext, ToolResult } from '../types'; +import { validateArgs } from '../validation.js'; /** * Map adapter configuration @@ -118,52 +120,13 @@ export class MapAdapter extends ToolAdapter { } async execute(args: Record, context: ToolExecutionContext): Promise { - const { - depth = this.config.defaultDepth, - focus, - includeExports = true, - tokenBudget = this.config.defaultTokenBudget, - includeChangeFrequency = false, - } = args as { - depth?: number; - focus?: string; - includeExports?: boolean; - tokenBudget?: number; - includeChangeFrequency?: boolean; - }; - - // Validate depth - if (typeof depth !== 'number' || depth < 1 || depth > 5) { - return { - success: false, - error: { - code: 'INVALID_DEPTH', - message: 'Depth must be a number between 1 and 5', - }, - }; - } - - // Validate focus if provided - if (focus !== undefined && typeof focus !== 'string') { - return { - success: false, - error: { - code: 'INVALID_FOCUS', - message: 'Focus must be a string path', - }, - }; + // Validate args with Zod + const validation = validateArgs(MapArgsSchema, args); + if (!validation.success) { + return validation.error; } - // Validate tokenBudget - if (typeof tokenBudget !== 'number' || tokenBudget < 500 || tokenBudget > 10000) { - return { - success: false, - error: { - code: 'INVALID_TOKEN_BUDGET', - message: 'Token 
budget must be a number between 500 and 10000', - }, - }; - } + const { depth, focus, includeExports, tokenBudget, includeChangeFrequency } = validation.data; try { const timer = startTimer(); diff --git a/packages/mcp-server/src/adapters/built-in/plan-adapter.ts b/packages/mcp-server/src/adapters/built-in/plan-adapter.ts index 9e7516f..067ed4e 100644 --- a/packages/mcp-server/src/adapters/built-in/plan-adapter.ts +++ b/packages/mcp-server/src/adapters/built-in/plan-adapter.ts @@ -9,8 +9,10 @@ import type { GitIndexer, RepositoryIndexer } from '@lytics/dev-agent-core'; import type { ContextAssemblyOptions } from '@lytics/dev-agent-subagents'; import { assembleContext, formatContextPackage } from '@lytics/dev-agent-subagents'; import { estimateTokensForText, startTimer } from '../../formatters/utils'; +import { PlanArgsSchema } from '../../schemas/index.js'; import { ToolAdapter } from '../tool-adapter'; import type { AdapterContext, ToolDefinition, ToolExecutionContext, ToolResult } from '../types'; +import { validateArgs } from '../validation.js'; /** * Plan adapter configuration @@ -125,36 +127,14 @@ export class PlanAdapter extends ToolAdapter { } async execute(args: Record, context: ToolExecutionContext): Promise { - const { - issue, - format = this.defaultFormat, - includeCode = true, - includePatterns = true, - tokenBudget = 4000, - includeGitHistory = true, - } = args; - - // Validate issue number - if (typeof issue !== 'number' || issue < 1) { - return { - success: false, - error: { - code: 'INVALID_ISSUE', - message: 'Issue must be a positive number', - }, - }; + // Validate args with Zod + const validation = validateArgs(PlanArgsSchema, args); + if (!validation.success) { + return validation.error; } - // Validate format - if (format !== 'compact' && format !== 'verbose') { - return { - success: false, - error: { - code: 'INVALID_FORMAT', - message: 'Format must be either "compact" or "verbose"', - }, - }; - } + const { issue, format, includeCode, 
includePatterns, tokenBudget, includeGitHistory } = + validation.data; try { const timer = startTimer(); diff --git a/packages/mcp-server/src/adapters/built-in/refs-adapter.ts b/packages/mcp-server/src/adapters/built-in/refs-adapter.ts index b728e47..9745a1b 100644 --- a/packages/mcp-server/src/adapters/built-in/refs-adapter.ts +++ b/packages/mcp-server/src/adapters/built-in/refs-adapter.ts @@ -5,8 +5,10 @@ import type { CalleeInfo, RepositoryIndexer, SearchResult } from '@lytics/dev-agent-core'; import { estimateTokensForText, startTimer } from '../../formatters/utils'; +import { RefsArgsSchema } from '../../schemas/index.js'; import { ToolAdapter } from '../tool-adapter'; import type { AdapterContext, ToolDefinition, ToolExecutionContext, ToolResult } from '../types'; +import { validateArgs } from '../validation.js'; /** * Direction of relationship query @@ -104,48 +106,13 @@ export class RefsAdapter extends ToolAdapter { } async execute(args: Record, context: ToolExecutionContext): Promise { - const { - name, - direction = 'both', - limit = this.config.defaultLimit, - } = args as { - name: string; - direction?: RefDirection; - limit?: number; - }; - - // Validate name - if (typeof name !== 'string' || name.trim().length === 0) { - return { - success: false, - error: { - code: 'INVALID_NAME', - message: 'Name must be a non-empty string', - }, - }; - } - - // Validate direction - if (!['callees', 'callers', 'both'].includes(direction)) { - return { - success: false, - error: { - code: 'INVALID_DIRECTION', - message: 'Direction must be "callees", "callers", or "both"', - }, - }; + // Validate args with Zod + const validation = validateArgs(RefsArgsSchema, args); + if (!validation.success) { + return validation.error; } - // Validate limit - if (typeof limit !== 'number' || limit < 1 || limit > 50) { - return { - success: false, - error: { - code: 'INVALID_LIMIT', - message: 'Limit must be a number between 1 and 50', - }, - }; - } + const { name, direction, limit } 
= validation.data; try { const timer = startTimer(); diff --git a/packages/mcp-server/src/adapters/built-in/search-adapter.ts b/packages/mcp-server/src/adapters/built-in/search-adapter.ts index 3ca969a..9d4b3b6 100644 --- a/packages/mcp-server/src/adapters/built-in/search-adapter.ts +++ b/packages/mcp-server/src/adapters/built-in/search-adapter.ts @@ -5,9 +5,11 @@ import type { RepositoryIndexer } from '@lytics/dev-agent-core'; import { CompactFormatter, type FormatMode, VerboseFormatter } from '../../formatters'; +import { SearchArgsSchema } from '../../schemas/index.js'; import { findRelatedTestFiles, formatRelatedFiles } from '../../utils/related-files'; import { ToolAdapter } from '../tool-adapter'; import type { AdapterContext, ToolDefinition, ToolExecutionContext, ToolResult } from '../types'; +import { validateArgs } from '../validation.js'; /** * Search adapter configuration @@ -125,71 +127,13 @@ export class SearchAdapter extends ToolAdapter { } async execute(args: Record, context: ToolExecutionContext): Promise { - const { - query, - format = this.config.defaultFormat, - limit = this.config.defaultLimit, - scoreThreshold = 0, - tokenBudget, - } = args; - - // Validate query - if (typeof query !== 'string' || query.trim().length === 0) { - return { - success: false, - error: { - code: 'INVALID_QUERY', - message: 'Query must be a non-empty string', - }, - }; + // Validate args with Zod + const validation = validateArgs(SearchArgsSchema, args); + if (!validation.success) { + return validation.error; } - // Validate format - if (format !== 'compact' && format !== 'verbose') { - return { - success: false, - error: { - code: 'INVALID_FORMAT', - message: 'Format must be either "compact" or "verbose"', - }, - }; - } - - // Validate limit - if (typeof limit !== 'number' || limit < 1 || limit > 50) { - return { - success: false, - error: { - code: 'INVALID_LIMIT', - message: 'Limit must be a number between 1 and 50', - }, - }; - } - - // Validate scoreThreshold - 
if (typeof scoreThreshold !== 'number' || scoreThreshold < 0 || scoreThreshold > 1) { - return { - success: false, - error: { - code: 'INVALID_SCORE_THRESHOLD', - message: 'Score threshold must be a number between 0 and 1', - }, - }; - } - - // Validate tokenBudget if provided - if ( - tokenBudget !== undefined && - (typeof tokenBudget !== 'number' || tokenBudget < 500 || tokenBudget > 10000) - ) { - return { - success: false, - error: { - code: 'INVALID_TOKEN_BUDGET', - message: 'Token budget must be a number between 500 and 10000', - }, - }; - } + const { query, format, limit, scoreThreshold, tokenBudget } = validation.data; try { const startTime = Date.now(); diff --git a/packages/mcp-server/src/adapters/built-in/status-adapter.ts b/packages/mcp-server/src/adapters/built-in/status-adapter.ts index fb25b29..13488f6 100644 --- a/packages/mcp-server/src/adapters/built-in/status-adapter.ts +++ b/packages/mcp-server/src/adapters/built-in/status-adapter.ts @@ -8,8 +8,10 @@ import * as path from 'node:path'; import type { RepositoryIndexer } from '@lytics/dev-agent-core'; import { GitHubIndexer } from '@lytics/dev-agent-subagents'; import { estimateTokensForText } from '../../formatters/utils'; +import { StatusArgsSchema } from '../../schemas/index.js'; import { ToolAdapter } from '../tool-adapter'; import type { AdapterContext, ToolDefinition, ToolExecutionContext, ToolResult } from '../types'; +import { validateArgs } from '../validation.js'; /** * Status section types @@ -203,41 +205,20 @@ export class StatusAdapter extends ToolAdapter { } async execute(args: Record, context: ToolExecutionContext): Promise { - const { section = this.defaultSection, format = 'compact' } = args; - - // Validate section - const validSections: StatusSection[] = ['summary', 'repo', 'indexes', 'github', 'health']; - if (!validSections.includes(section as StatusSection)) { - return { - success: false, - error: { - code: 'INVALID_SECTION', - message: `Section must be one of: 
${validSections.join(', ')}`, - }, - }; + // Validate args with Zod (no type assertions!) + const validation = validateArgs(StatusArgsSchema, args); + if (!validation.success) { + return validation.error; } - // Validate format - if (format !== 'compact' && format !== 'verbose') { - return { - success: false, - error: { - code: 'INVALID_FORMAT', - message: 'Format must be either "compact" or "verbose"', - }, - }; - } + const { section, format } = validation.data; try { const startTime = Date.now(); context.logger.debug('Executing status check', { section, format }); // Generate status content based on section - const content = await this.generateStatus( - section as StatusSection, - format as string, - context - ); + const content = await this.generateStatus(section, format, context); const duration_ms = Date.now() - startTime; const tokens = estimateTokensForText(content); diff --git a/packages/mcp-server/src/adapters/validation.ts b/packages/mcp-server/src/adapters/validation.ts new file mode 100644 index 0000000..9b11e52 --- /dev/null +++ b/packages/mcp-server/src/adapters/validation.ts @@ -0,0 +1,68 @@ +/** + * Validation utilities for MCP adapters + * + * Following TypeScript Standards: + * - Rule #2: No Type Assertions Without Validation + * - Result types for explicit error handling + */ + +import type { z } from 'zod'; +import type { ToolResult } from './types'; + +/** + * Convert Zod validation error to ToolResult format + * + * Provides detailed error messages with field paths + */ +export function handleValidationError(error: z.ZodError): ToolResult { + const firstError = error.issues[0]; + const path = firstError.path.length > 0 ? 
`${firstError.path.join('.')}: ` : ''; + + return { + success: false, + error: { + code: 'INVALID_PARAMS', + message: `${path}${firstError.message}`, + details: error.issues.map((e) => ({ + path: e.path.join('.'), + message: e.message, + })), + recoverable: true, + suggestion: 'Check the tool input schema and try again', + }, + }; +} + +/** + * Type-safe validation wrapper + * + * Returns either validated data or a ToolResult error + * + * @example + * ```typescript + * const validation = validateArgs(ExploreArgsSchema, args); + * if (!validation.success) { + * return validation.error; + * } + * // validation.data is now fully typed! + * const { action, query } = validation.data; + * ``` + */ +export function validateArgs( + schema: T, + args: unknown +): { success: true; data: z.infer } | { success: false; error: ToolResult } { + const result = schema.safeParse(args); + + if (!result.success) { + return { + success: false, + error: handleValidationError(result.error), + }; + } + + return { + success: true, + data: result.data, + }; +} diff --git a/packages/mcp-server/src/schemas/__tests__/schemas.test.ts b/packages/mcp-server/src/schemas/__tests__/schemas.test.ts new file mode 100644 index 0000000..4edb69a --- /dev/null +++ b/packages/mcp-server/src/schemas/__tests__/schemas.test.ts @@ -0,0 +1,390 @@ +/** + * Schema validation tests + * + * These tests validate schemas in isolation (pure functions, no mocks needed) + * Adapter tests focus on business logic, not validation + */ + +import { describe, expect, it } from 'vitest'; +import { + ExploreArgsSchema, + GitHubArgsSchema, + HealthArgsSchema, + HistoryArgsSchema, + MapArgsSchema, + PlanArgsSchema, + RefsArgsSchema, + SearchArgsSchema, + StatusArgsSchema, +} from '../index'; + +describe('ExploreArgsSchema', () => { + it('should validate valid input', () => { + const result = ExploreArgsSchema.safeParse({ + action: 'pattern', + query: 'authentication', + }); + + expect(result.success).toBe(true); + if 
(result.success) { + expect(result.data.action).toBe('pattern'); + expect(result.data.limit).toBe(10); // default + } + }); + + it('should apply defaults', () => { + const result = ExploreArgsSchema.safeParse({ + action: 'similar', + query: 'test', + }); + + expect(result.success).toBe(true); + if (result.success) { + expect(result.data).toMatchObject({ + limit: 10, + threshold: 0.7, + format: 'compact', + }); + } + }); + + it('should reject invalid action', () => { + const result = ExploreArgsSchema.safeParse({ + action: 'invalid', + query: 'test', + }); + + expect(result.success).toBe(false); + if (!result.success) { + expect(result.error.issues[0].path).toEqual(['action']); + } + }); + + it('should reject empty query', () => { + const result = ExploreArgsSchema.safeParse({ + action: 'pattern', + query: '', + }); + + expect(result.success).toBe(false); + }); + + it('should reject out-of-range limit', () => { + const result = ExploreArgsSchema.safeParse({ + action: 'pattern', + query: 'test', + limit: 200, + }); + + expect(result.success).toBe(false); + }); + + it('should reject out-of-range threshold', () => { + const result = ExploreArgsSchema.safeParse({ + action: 'pattern', + query: 'test', + threshold: 1.5, + }); + + expect(result.success).toBe(false); + }); + + it('should reject unknown properties', () => { + const result = ExploreArgsSchema.safeParse({ + action: 'pattern', + query: 'test', + unknownProp: 'value', + }); + + expect(result.success).toBe(false); + }); +}); + +describe('SearchArgsSchema', () => { + it('should validate valid input', () => { + const result = SearchArgsSchema.safeParse({ + query: 'authentication flow', + }); + + expect(result.success).toBe(true); + }); + + it('should apply defaults', () => { + const result = SearchArgsSchema.safeParse({ + query: 'test', + }); + + expect(result.success).toBe(true); + if (result.success) { + expect(result.data).toMatchObject({ + format: 'compact', + limit: 10, + scoreThreshold: 0, + }); + } + }); + + 
it('should validate tokenBudget range', () => { + const validResult = SearchArgsSchema.safeParse({ + query: 'test', + tokenBudget: 5000, + }); + expect(validResult.success).toBe(true); + + const invalidResult = SearchArgsSchema.safeParse({ + query: 'test', + tokenBudget: 50000, + }); + expect(invalidResult.success).toBe(false); + }); +}); + +describe('RefsArgsSchema', () => { + it('should validate valid input', () => { + const result = RefsArgsSchema.safeParse({ + name: 'createPlan', + }); + + expect(result.success).toBe(true); + if (result.success) { + expect(result.data.direction).toBe('both'); // default + } + }); + + it('should validate direction values', () => { + const validDirections = ['callees', 'callers', 'both']; + for (const direction of validDirections) { + const result = RefsArgsSchema.safeParse({ + name: 'test', + direction, + }); + expect(result.success).toBe(true); + } + + const invalidResult = RefsArgsSchema.safeParse({ + name: 'test', + direction: 'invalid', + }); + expect(invalidResult.success).toBe(false); + }); + + it('should reject empty name', () => { + const result = RefsArgsSchema.safeParse({ + name: '', + }); + + expect(result.success).toBe(false); + }); +}); + +describe('MapArgsSchema', () => { + it('should validate valid input with defaults', () => { + const result = MapArgsSchema.safeParse({}); + + expect(result.success).toBe(true); + if (result.success) { + expect(result.data).toMatchObject({ + depth: 2, + includeExports: true, + includeChangeFrequency: false, + tokenBudget: 2000, + }); + } + }); + + it('should validate depth range', () => { + const validResult = MapArgsSchema.safeParse({ depth: 3 }); + expect(validResult.success).toBe(true); + + const tooLowResult = MapArgsSchema.safeParse({ depth: 0 }); + expect(tooLowResult.success).toBe(false); + + const tooHighResult = MapArgsSchema.safeParse({ depth: 10 }); + expect(tooHighResult.success).toBe(false); + }); +}); + +describe('HistoryArgsSchema', () => { + it('should validate with 
query', () => { + const result = HistoryArgsSchema.safeParse({ + query: 'authentication refactor', + }); + + expect(result.success).toBe(true); + }); + + it('should validate with file', () => { + const result = HistoryArgsSchema.safeParse({ + file: 'src/auth/token.ts', + }); + + expect(result.success).toBe(true); + }); + + it('should reject when neither query nor file provided', () => { + const result = HistoryArgsSchema.safeParse({ + author: 'john@example.com', + }); + + expect(result.success).toBe(false); + if (!result.success) { + expect(result.error.issues[0].message).toContain('query or file'); + } + }); + + it('should accept both query and file', () => { + const result = HistoryArgsSchema.safeParse({ + query: 'bug fix', + file: 'src/index.ts', + }); + + expect(result.success).toBe(true); + }); +}); + +describe('PlanArgsSchema', () => { + it('should validate valid input', () => { + const result = PlanArgsSchema.safeParse({ + issue: 42, + }); + + expect(result.success).toBe(true); + if (result.success) { + expect(result.data).toMatchObject({ + includeCode: true, + includeGitHistory: true, + includePatterns: true, + tokenBudget: 4000, + format: 'compact', + }); + } + }); + + it('should reject non-positive issue numbers', () => { + const zeroResult = PlanArgsSchema.safeParse({ issue: 0 }); + expect(zeroResult.success).toBe(false); + + const negativeResult = PlanArgsSchema.safeParse({ issue: -1 }); + expect(negativeResult.success).toBe(false); + }); + + it('should reject non-integer issue numbers', () => { + const result = PlanArgsSchema.safeParse({ issue: 42.5 }); + expect(result.success).toBe(false); + }); +}); + +describe('GitHubArgsSchema', () => { + it('should validate search action with query', () => { + const result = GitHubArgsSchema.safeParse({ + action: 'search', + query: 'authentication bug', + }); + + expect(result.success).toBe(true); + }); + + it('should reject search action without query', () => { + const result = GitHubArgsSchema.safeParse({ + 
action: 'search', + }); + + expect(result.success).toBe(false); + if (!result.success) { + expect(result.error.issues[0].message).toContain('query'); + } + }); + + it('should validate context action with number', () => { + const result = GitHubArgsSchema.safeParse({ + action: 'context', + number: 42, + }); + + expect(result.success).toBe(true); + }); + + it('should reject context action without number', () => { + const result = GitHubArgsSchema.safeParse({ + action: 'context', + }); + + expect(result.success).toBe(false); + if (!result.success) { + expect(result.error.issues[0].message).toContain('number'); + } + }); + + it('should validate related action with number', () => { + const result = GitHubArgsSchema.safeParse({ + action: 'related', + number: 42, + }); + + expect(result.success).toBe(true); + }); + + it('should validate optional filters', () => { + const result = GitHubArgsSchema.safeParse({ + action: 'search', + query: 'bug', + type: 'issue', + state: 'open', + author: 'username', + labels: ['bug', 'urgent'], + }); + + expect(result.success).toBe(true); + }); +}); + +describe('StatusArgsSchema', () => { + it('should validate with defaults', () => { + const result = StatusArgsSchema.safeParse({}); + + expect(result.success).toBe(true); + if (result.success) { + expect(result.data).toMatchObject({ + format: 'compact', + section: 'summary', + }); + } + }); + + it('should validate all section values', () => { + const sections = ['summary', 'repo', 'indexes', 'github', 'health']; + for (const section of sections) { + const result = StatusArgsSchema.safeParse({ section }); + expect(result.success).toBe(true); + } + }); +}); + +describe('HealthArgsSchema', () => { + it('should validate with default', () => { + const result = HealthArgsSchema.safeParse({}); + + expect(result.success).toBe(true); + if (result.success) { + expect(result.data.verbose).toBe(false); + } + }); + + it('should validate verbose flag', () => { + const result = HealthArgsSchema.safeParse({ 
verbose: true }); + + expect(result.success).toBe(true); + if (result.success) { + expect(result.data.verbose).toBe(true); + } + }); + + it('should reject unknown properties', () => { + const result = HealthArgsSchema.safeParse({ + verbose: true, + unknownProp: 'value', + }); + + expect(result.success).toBe(false); + }); +}); diff --git a/packages/mcp-server/src/schemas/index.ts b/packages/mcp-server/src/schemas/index.ts new file mode 100644 index 0000000..fbc3e08 --- /dev/null +++ b/packages/mcp-server/src/schemas/index.ts @@ -0,0 +1,186 @@ +/** + * Zod schemas for MCP adapter validation + * + * Following TypeScript Standards Rule #2: No Type Assertions Without Validation + * See: docs/TYPESCRIPT_STANDARDS.md + */ + +import { z } from 'zod'; + +// ============================================================================ +// Shared Base Schemas +// ============================================================================ + +/** + * Common format option for output + */ +export const FormatSchema = z.enum(['compact', 'verbose']); + +/** + * Base schema for queries with pagination and formatting + */ +export const BaseQuerySchema = z.object({ + format: FormatSchema.default('compact'), + limit: z.number().int().min(1).max(50).default(10), +}); + +// ============================================================================ +// Explore Adapter +// ============================================================================ + +export const ExploreArgsSchema = z + .object({ + action: z.enum(['pattern', 'similar', 'relationships']), + query: z.string().min(1, 'Query must be a non-empty string'), + limit: z.number().int().min(1).max(100).default(10), + threshold: z.number().min(0).max(1).default(0.7), + fileTypes: z.array(z.string()).optional(), + format: FormatSchema.default('compact'), + }) + .strict(); // Reject unknown properties + +export type ExploreArgs = z.infer; + +// ============================================================================ +// Search 
Adapter +// ============================================================================ + +export const SearchArgsSchema = z + .object({ + query: z.string().min(1, 'Query must be a non-empty string'), + format: FormatSchema.default('compact'), + limit: z.number().int().min(1).max(50).default(10), + scoreThreshold: z.number().min(0).max(1).default(0), + tokenBudget: z.number().int().min(500).max(10000).optional(), + }) + .strict(); + +export type SearchArgs = z.infer<typeof SearchArgsSchema>; + +// ============================================================================ +// Refs Adapter +// ============================================================================ + +export const RefsArgsSchema = z + .object({ + name: z.string().min(1, 'Name must be a non-empty string'), + direction: z.enum(['callees', 'callers', 'both']).default('both'), + limit: z.number().int().min(1).max(50).default(20), + }) + .strict(); + +export type RefsArgs = z.infer<typeof RefsArgsSchema>; + +// ============================================================================ +// Map Adapter +// ============================================================================ + +export const MapArgsSchema = z + .object({ + depth: z.number().int().min(1).max(5).default(2), + focus: z.string().optional(), + includeExports: z.boolean().default(true), + includeChangeFrequency: z.boolean().default(false), + tokenBudget: z.number().int().min(500).max(10000).default(2000), + }) + .strict(); + +export type MapArgs = z.infer<typeof MapArgsSchema>; + +// ============================================================================ +// History Adapter +// ============================================================================ + +export const HistoryArgsSchema = z + .object({ + query: z.string().min(1).optional(), + file: z.string().optional(), + author: z.string().optional(), + since: z.string().optional(), // ISO date or relative like "2 weeks ago" + limit: z.number().int().min(1).max(50).default(10), + tokenBudget: z.number().int().min(100).max(10000).default(2000), 
+ }) + .refine((data) => data.query || data.file, { + message: 'Either query or file must be provided', + }) + .strict(); + +export type HistoryArgs = z.infer<typeof HistoryArgsSchema>; + +// ============================================================================ +// Plan Adapter +// ============================================================================ + +export const PlanArgsSchema = z + .object({ + issue: z.number().int().positive({ message: 'Issue number must be a positive integer' }), + includeCode: z.boolean().default(true), + includeGitHistory: z.boolean().default(true), + includePatterns: z.boolean().default(true), + tokenBudget: z.number().int().min(1000).max(10000).default(4000), + format: FormatSchema.default('compact'), + }) + .strict(); + +export type PlanArgs = z.infer<typeof PlanArgsSchema>; + +// ============================================================================ +// GitHub Adapter +// ============================================================================ + +export const GitHubArgsSchema = z + .object({ + action: z.enum(['search', 'context', 'related']), + query: z.string().min(1).optional(), + number: z.number().int().positive().optional(), + type: z.enum(['issue', 'pull_request']).optional(), + state: z.enum(['open', 'closed', 'merged']).optional(), + author: z.string().optional(), + labels: z.array(z.string()).optional(), + limit: z.number().int().min(1).max(50).default(10), + format: FormatSchema.default('compact'), + }) + .refine( + (data) => { + // search requires query + if (data.action === 'search' && !data.query) { + return false; + } + // context/related require number + if ((data.action === 'context' || data.action === 'related') && !data.number) { + return false; + } + return true; + }, + { + message: 'Invalid combination: search requires "query", context/related require "number"', + } + ) + .strict(); + +export type GitHubArgs = z.infer<typeof GitHubArgsSchema>; + +// ============================================================================ +// Status Adapter +// 
============================================================================ + +export const StatusArgsSchema = z + .object({ + format: FormatSchema.default('compact'), + section: z.enum(['summary', 'repo', 'indexes', 'github', 'health']).default('summary'), + }) + .strict(); + +export type StatusArgs = z.infer<typeof StatusArgsSchema>; + +// ============================================================================ +// Health Adapter +// ============================================================================ + +export const HealthArgsSchema = z + .object({ + verbose: z.boolean().default(false), + }) + .strict(); + +export type HealthArgs = z.infer<typeof HealthArgsSchema>; diff --git a/packages/subagents/package.json b/packages/subagents/package.json index fd1d40d..a9c9a01 100644 --- a/packages/subagents/package.json +++ b/packages/subagents/package.json @@ -31,7 +31,8 @@ }, "dependencies": { "@lytics/dev-agent-core": "workspace:*", - "@lytics/kero": "workspace:*" + "@lytics/kero": "workspace:*", + "zod": "^4.1.13" }, "devDependencies": { "@types/node": "^22.0.0", diff --git a/packages/subagents/src/coordinator/__tests__/coordinator.integration.test.ts b/packages/subagents/src/coordinator/__tests__/coordinator.integration.test.ts index 0ffd009..bff8670 100644 --- a/packages/subagents/src/coordinator/__tests__/coordinator.integration.test.ts +++ b/packages/subagents/src/coordinator/__tests__/coordinator.integration.test.ts @@ -7,8 +7,9 @@ import { mkdir, rm } from 'node:fs/promises'; import { tmpdir } from 'node:os'; import { join } from 'node:path'; import { RepositoryIndexer } from '@lytics/dev-agent-core'; -import { afterEach, beforeEach, describe, expect, it } from 'vitest'; +import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'; import { ExplorerAgent } from '../../explorer'; +import { CoordinatorLogger } from '../../logger'; import { SubagentCoordinator } from '../coordinator'; describe('Coordinator → Explorer Integration', () => { @@ -108,6 +109,9 @@ describe('Coordinator → 
Explorer Integration', () => { }); it('should route similar code request to Explorer', async () => { + // Suppress error logs for validation failures (missing filePath) + const errorSpy = vi.spyOn(CoordinatorLogger.prototype, 'error').mockImplementation(() => {}); + const response = await coordinator.sendMessage({ type: 'request', sender: 'test', @@ -130,6 +134,8 @@ describe('Coordinator → Explorer Integration', () => { expect(result.action).toBe('similar'); expect(Array.isArray(result.results)).toBe(true); } + + errorSpy.mockRestore(); }); it('should route relationships request to Explorer', async () => { @@ -175,6 +181,9 @@ describe('Coordinator → Explorer Integration', () => { }); it('should handle unknown actions gracefully', async () => { + // Suppress error logs for this intentional error test + const errorSpy = vi.spyOn(CoordinatorLogger.prototype, 'error').mockImplementation(() => {}); + const response = await coordinator.sendMessage({ type: 'request', sender: 'test', @@ -186,14 +195,19 @@ }); expect(response).toBeDefined(); - expect(response?.type).toBe('response'); + expect(response?.type).toBe('error'); const result = response?.payload as { error?: string }; expect(result.error).toBeDefined(); - expect(result.error).toContain('Unknown action'); + expect(result.error).toContain('Invalid exploration request'); + + errorSpy.mockRestore(); }); it('should handle non-existent agent gracefully', async () => { + // Suppress error logs for this intentional error test + const errorSpy = vi.spyOn(CoordinatorLogger.prototype, 'error').mockImplementation(() => {}); + const response = await coordinator.sendMessage({ type: 'request', sender: 'test', @@ -207,6 +221,8 @@ const error = response?.payload as { error: string }; expect(error.error).toContain('not found'); + + errorSpy.mockRestore(); }); }); diff --git 
a/packages/subagents/src/coordinator/__tests__/github-coordinator.integration.test.ts b/packages/subagents/src/coordinator/__tests__/github-coordinator.integration.test.ts index 311c341..2b09e6a 100644 --- a/packages/subagents/src/coordinator/__tests__/github-coordinator.integration.test.ts +++ b/packages/subagents/src/coordinator/__tests__/github-coordinator.integration.test.ts @@ -10,6 +10,7 @@ import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'; import type { GitHubAgentConfig } from '../../github/agent'; import { GitHubAgent } from '../../github/agent'; import type { GitHubContextResult, GitHubDocument } from '../../github/types'; +import { CoordinatorLogger } from '../../logger'; import { SubagentCoordinator } from '../coordinator'; // Mock GitHub utilities to avoid actual gh CLI calls @@ -122,6 +123,9 @@ describe('Coordinator → GitHub Integration', () => { }); it('should route search request to GitHub agent', async () => { + // Suppress error logs (search without indexed data is expected to be handled) + const errorSpy = vi.spyOn(CoordinatorLogger.prototype, 'error').mockImplementation(() => {}); + // Index first (required for search) const indexResponse = await coordinator.sendMessage({ type: 'request', @@ -158,9 +162,14 @@ const result = searchResponse?.payload as unknown as GitHubContextResult; expect(result.action).toBe('search'); expect(Array.isArray(result.results)).toBe(true); + + errorSpy.mockRestore(); }); it('should handle context requests', async () => { + // Suppress error logs (GitHub data not indexed error is expected) + const errorSpy = vi.spyOn(CoordinatorLogger.prototype, 'error').mockImplementation(() => {}); + const response = await coordinator.sendMessage({ type: 'request', sender: 'test', @@ -177,9 +186,14 @@ const result = response?.payload as unknown as GitHubContextResult; 
expect(result.action).toBe('context'); + + errorSpy.mockRestore(); }); it('should handle related requests', async () => { + // Suppress error logs (GitHub data not indexed error is expected) + const errorSpy = vi.spyOn(CoordinatorLogger.prototype, 'error').mockImplementation(() => {}); + const response = await coordinator.sendMessage({ type: 'request', sender: 'test', @@ -196,6 +210,8 @@ describe('Coordinator β†’ GitHub Integration', () => { const result = response?.payload as unknown as GitHubContextResult; expect(result.action).toBe('related'); + + errorSpy.mockRestore(); }); it('should handle non-request messages gracefully', async () => { @@ -213,6 +229,9 @@ describe('Coordinator β†’ GitHub Integration', () => { describe('Error Handling', () => { it('should handle invalid actions', async () => { + // Suppress error logs for this intentional error test + const errorSpy = vi.spyOn(CoordinatorLogger.prototype, 'error').mockImplementation(() => {}); + const response = await coordinator.sendMessage({ type: 'request', sender: 'test', @@ -225,9 +244,14 @@ describe('Coordinator β†’ GitHub Integration', () => { expect(response).toBeDefined(); expect(response?.type).toBe('response'); + + errorSpy.mockRestore(); }); it('should handle missing required fields', async () => { + // Suppress error logs for this intentional error test + const errorSpy = vi.spyOn(CoordinatorLogger.prototype, 'error').mockImplementation(() => {}); + const response = await coordinator.sendMessage({ type: 'request', sender: 'test', @@ -240,6 +264,8 @@ describe('Coordinator β†’ GitHub Integration', () => { }); expect(response).toBeDefined(); + + errorSpy.mockRestore(); }); }); diff --git a/packages/subagents/src/explorer/__tests__/index.test.ts b/packages/subagents/src/explorer/__tests__/index.test.ts index 86a23a3..5e0ac07 100644 --- a/packages/subagents/src/explorer/__tests__/index.test.ts +++ b/packages/subagents/src/explorer/__tests__/index.test.ts @@ -564,6 +564,9 @@ describe('ExplorerAgent', 
() => { describe('error handling', () => { it('should handle unknown actions', async () => { + // Suppress error logs for this intentional error test + const errorSpy = vi.spyOn(CoordinatorLogger.prototype, 'error').mockImplementation(() => {}); + const message: Message = { id: 'msg-9', type: 'request', @@ -579,13 +582,19 @@ describe('ExplorerAgent', () => { const response = await explorer.handleMessage(message); expect(response).toBeDefined(); - expect(response?.type).toBe('response'); + expect(response?.type).toBe('error'); const result = response?.payload as { error?: string }; expect(result.error).toBeDefined(); + expect(result.error).toContain('Invalid exploration request'); + + errorSpy.mockRestore(); }); it('should return error response on failure', async () => { + // Suppress error logs for this intentional error test + const errorSpy = vi.spyOn(CoordinatorLogger.prototype, 'error').mockImplementation(() => {}); + // Create explorer without initialization const uninitializedExplorer = new ExplorerAgent(); @@ -603,6 +612,8 @@ describe('ExplorerAgent', () => { }; await expect(uninitializedExplorer.handleMessage(message)).rejects.toThrow(); + + errorSpy.mockRestore(); }); it('should ignore non-request messages', async () => { diff --git a/packages/subagents/src/explorer/index.ts b/packages/subagents/src/explorer/index.ts index 5e942e2..9bb4386 100644 --- a/packages/subagents/src/explorer/index.ts +++ b/packages/subagents/src/explorer/index.ts @@ -3,6 +3,7 @@ * Explores and analyzes code patterns using semantic search */ +import { validateExplorationRequest } from '../schemas/messages.js'; import type { Agent, AgentContext, Message } from '../types'; import type { CodeInsights, @@ -53,7 +54,7 @@ export class ExplorerAgent implements Agent { } try { - const request = message.payload as unknown as ExplorationRequest; + const request = validateExplorationRequest(message.payload); logger.debug('Processing exploration request', { action: request.action }); let 
result: ExplorationResult | ExplorationError; diff --git a/packages/subagents/src/github/agent.ts b/packages/subagents/src/github/agent.ts index 39c6189..38dc3cb 100644 --- a/packages/subagents/src/github/agent.ts +++ b/packages/subagents/src/github/agent.ts @@ -3,6 +3,7 @@ * Provides rich context from GitHub issues, PRs, and discussions */ +import { validateGitHubContextRequest } from '../schemas/messages.js'; import type { Agent, AgentContext, Message } from '../types'; import { GitHubIndexer } from './indexer'; import type { @@ -67,7 +68,7 @@ export class GitHubAgent implements Agent { } try { - const request = message.payload as unknown as GitHubContextRequest; + const request = validateGitHubContextRequest(message.payload); logger.debug('Processing GitHub request', { action: request.action }); let result: GitHubContextResult | GitHubContextError; diff --git a/packages/subagents/src/planner/__tests__/index.test.ts b/packages/subagents/src/planner/__tests__/index.test.ts index 14ec28d..f5a5240 100644 --- a/packages/subagents/src/planner/__tests__/index.test.ts +++ b/packages/subagents/src/planner/__tests__/index.test.ts @@ -162,8 +162,8 @@ describe('PlannerAgent', () => { const response = await planner.handleMessage(message); expect(response).toBeTruthy(); - expect(response?.type).toBe('response'); - expect((response?.payload as { error?: string }).error).toContain('Unknown action'); + expect(response?.type).toBe('error'); + expect((response?.payload as { error?: string }).error).toContain('Invalid planning request'); }); it('should generate correct response message structure', async () => { diff --git a/packages/subagents/src/planner/index.ts b/packages/subagents/src/planner/index.ts index 4c3c4bf..fec7559 100644 --- a/packages/subagents/src/planner/index.ts +++ b/packages/subagents/src/planner/index.ts @@ -3,6 +3,7 @@ * Analyzes GitHub issues and creates actionable development plans */ +import { validatePlanningRequest } from '../schemas/messages.js'; import 
type { Agent, AgentContext, Message } from '../types'; import type { Plan, PlanningError, PlanningRequest, PlanningResult } from './types'; @@ -33,7 +34,7 @@ export class PlannerAgent implements Agent { } try { - const request = message.payload as unknown as PlanningRequest; + const request = validatePlanningRequest(message.payload); logger.debug('Processing planning request', { action: request.action }); let result: PlanningResult | PlanningError; diff --git a/packages/subagents/src/planner/utils/github.ts b/packages/subagents/src/planner/utils/github.ts index 4ffb915..788fa2c 100644 --- a/packages/subagents/src/planner/utils/github.ts +++ b/packages/subagents/src/planner/utils/github.ts @@ -4,6 +4,7 @@ */ import { execSync } from 'node:child_process'; +import { type GitHubIssueData, GitHubIssueSchema } from '../../schemas/github-cli.js'; import type { GitHubComment, GitHubIssue } from '../types'; /** @@ -65,27 +66,33 @@ export async function fetchGitHubIssue( cwd: repositoryPath, // Run in the repository directory }); - const data = JSON.parse(output); + // Parse and validate GitHub CLI response + const rawData = JSON.parse(output); + const parseResult = GitHubIssueSchema.safeParse(rawData); - // Parse comments if included - let comments: GitHubComment[] | undefined; - if (options.includeComments && data.comments) { - comments = data.comments.map( - (c: { author?: { login: string }; body: string; createdAt?: string }) => ({ - author: c.author?.login, - body: c.body, - createdAt: c.createdAt, - }) - ); + if (!parseResult.success) { + const firstError = parseResult.error.issues[0]; + const path = firstError.path.length > 0 ? 
`${firstError.path.join('.')}: ` : ''; + throw new Error(`Invalid GitHub CLI response: ${path}${firstError.message}`); } + const data: GitHubIssueData = parseResult.data; + + // Transform comments if included + const comments: GitHubComment[] | undefined = data.comments?.map((c) => ({ + author: c.author?.login, + body: c.body, + createdAt: c.createdAt, + })); + + // Transform to internal type return { number: data.number, title: data.title, - body: data.body || '', - state: data.state.toLowerCase() as 'open' | 'closed', - labels: data.labels?.map((l: { name: string }) => l.name) || [], - assignees: data.assignees?.map((a: { login: string }) => a.login) || [], + body: data.body ?? '', + state: data.state, + labels: data.labels.map((l) => l.name), + assignees: data.assignees.map((a) => a.login), author: data.author?.login, createdAt: data.createdAt, updatedAt: data.updatedAt, diff --git a/packages/subagents/src/schemas/__tests__/github-cli.test.ts b/packages/subagents/src/schemas/__tests__/github-cli.test.ts new file mode 100644 index 0000000..c97edf3 --- /dev/null +++ b/packages/subagents/src/schemas/__tests__/github-cli.test.ts @@ -0,0 +1,308 @@ +/** + * GitHub CLI Schema Tests + * Validates external data from gh CLI commands + */ + +import { describe, expect, it } from 'vitest'; +import { + GitHubCommentSchema, + GitHubIssueSchema, + GitHubIssuesArraySchema, + GitHubPullRequestSchema, + GitHubPullRequestsArraySchema, +} from '../github-cli.js'; + +describe('GitHubCommentSchema', () => { + it('should validate valid comment', () => { + const input = { + author: { login: 'octocat' }, + body: 'Great work!', + createdAt: '2024-01-15T10:30:00Z', + }; + + const result = GitHubCommentSchema.safeParse(input); + expect(result.success).toBe(true); + if (result.success) { + expect(result.data.author?.login).toBe('octocat'); + expect(result.data.body).toBe('Great work!'); + } + }); + + it('should allow missing optional fields', () => { + const input = { body: 'Comment without 
author' }; + + const result = GitHubCommentSchema.safeParse(input); + expect(result.success).toBe(true); + if (result.success) { + expect(result.data.author).toBeUndefined(); + } + }); + + it('should reject invalid comment', () => { + const input = { author: 'not-an-object', body: 123 }; + + const result = GitHubCommentSchema.safeParse(input); + expect(result.success).toBe(false); + }); +}); + +describe('GitHubIssueSchema', () => { + it('should validate valid issue', () => { + const input = { + number: 42, + title: 'Add new feature', + body: 'Description here', + state: 'OPEN', + labels: [{ name: 'bug' }, { name: 'enhancement' }], + assignees: [{ login: 'dev1' }], + author: { login: 'octocat' }, + createdAt: '2024-01-01T00:00:00Z', + updatedAt: '2024-01-15T00:00:00Z', + }; + + const result = GitHubIssueSchema.safeParse(input); + expect(result.success).toBe(true); + if (result.success) { + expect(result.data.number).toBe(42); + expect(result.data.state).toBe('open'); // Transformed to lowercase + expect(result.data.labels).toHaveLength(2); + } + }); + + it('should handle null body', () => { + const input = { + number: 1, + title: 'Issue without body', + body: null, + state: 'open', + labels: [], + assignees: [], + createdAt: '2024-01-01T00:00:00Z', + updatedAt: '2024-01-01T00:00:00Z', + }; + + const result = GitHubIssueSchema.safeParse(input); + expect(result.success).toBe(true); + if (result.success) { + expect(result.data.body).toBeNull(); + } + }); + + it('should default empty arrays', () => { + const input = { + number: 1, + title: 'Minimal issue', + body: '', + state: 'closed', + createdAt: '2024-01-01T00:00:00Z', + updatedAt: '2024-01-01T00:00:00Z', + }; + + const result = GitHubIssueSchema.safeParse(input); + expect(result.success).toBe(true); + if (result.success) { + expect(result.data.labels).toEqual([]); + expect(result.data.assignees).toEqual([]); + } + }); + + it('should include comments if provided', () => { + const input = { + number: 1, + title: 
'Issue with comments', + body: '', + state: 'open', + createdAt: '2024-01-01T00:00:00Z', + updatedAt: '2024-01-01T00:00:00Z', + comments: [ + { author: { login: 'user1' }, body: 'First comment' }, + { body: 'Anonymous comment' }, + ], + }; + + const result = GitHubIssueSchema.safeParse(input); + expect(result.success).toBe(true); + if (result.success) { + expect(result.data.comments).toHaveLength(2); + expect(result.data.comments?.[0].author?.login).toBe('user1'); + } + }); + + it('should reject invalid issue number', () => { + const input = { + number: -1, + title: 'Invalid issue', + body: '', + state: 'open', + createdAt: '2024-01-01T00:00:00Z', + updatedAt: '2024-01-01T00:00:00Z', + }; + + const result = GitHubIssueSchema.safeParse(input); + expect(result.success).toBe(false); + }); + + it('should reject missing required fields', () => { + const input = { + number: 1, + title: 'Incomplete issue', + // Missing body, state, createdAt, updatedAt + }; + + const result = GitHubIssueSchema.safeParse(input); + expect(result.success).toBe(false); + }); +}); + +describe('GitHubPullRequestSchema', () => { + it('should validate valid PR', () => { + const input = { + number: 10, + title: 'Fix bug', + body: 'Fixes #42', + state: 'MERGED', + labels: [{ name: 'bugfix' }], + assignees: [], + author: { login: 'contributor' }, + createdAt: '2024-01-01T00:00:00Z', + updatedAt: '2024-01-15T00:00:00Z', + mergedAt: '2024-01-15T00:00:00Z', + isDraft: false, + }; + + const result = GitHubPullRequestSchema.safeParse(input); + expect(result.success).toBe(true); + if (result.success) { + expect(result.data.number).toBe(10); + expect(result.data.state).toBe('merged'); // Transformed to lowercase + expect(result.data.mergedAt).toBeDefined(); + expect(result.data.isDraft).toBe(false); + } + }); + + it('should handle draft PR without mergedAt', () => { + const input = { + number: 20, + title: 'WIP: New feature', + body: null, + state: 'open', + createdAt: '2024-01-01T00:00:00Z', + updatedAt: 
'2024-01-01T00:00:00Z', + isDraft: true, + }; + + const result = GitHubPullRequestSchema.safeParse(input); + expect(result.success).toBe(true); + if (result.success) { + expect(result.data.isDraft).toBe(true); + expect(result.data.mergedAt).toBeUndefined(); + } + }); + + it('should default isDraft to false', () => { + const input = { + number: 30, + title: 'Regular PR', + body: '', + state: 'open', + createdAt: '2024-01-01T00:00:00Z', + updatedAt: '2024-01-01T00:00:00Z', + }; + + const result = GitHubPullRequestSchema.safeParse(input); + expect(result.success).toBe(true); + if (result.success) { + expect(result.data.isDraft).toBe(false); + } + }); +}); + +describe('Array Schemas', () => { + it('should validate array of issues', () => { + const input = [ + { + number: 1, + title: 'First issue', + body: '', + state: 'open', + createdAt: '2024-01-01T00:00:00Z', + updatedAt: '2024-01-01T00:00:00Z', + }, + { + number: 2, + title: 'Second issue', + body: null, + state: 'closed', + createdAt: '2024-01-02T00:00:00Z', + updatedAt: '2024-01-02T00:00:00Z', + }, + ]; + + const result = GitHubIssuesArraySchema.safeParse(input); + expect(result.success).toBe(true); + if (result.success) { + expect(result.data).toHaveLength(2); + } + }); + + it('should validate array of PRs', () => { + const input = [ + { + number: 10, + title: 'PR 1', + body: '', + state: 'open', + createdAt: '2024-01-01T00:00:00Z', + updatedAt: '2024-01-01T00:00:00Z', + }, + ]; + + const result = GitHubPullRequestsArraySchema.safeParse(input); + expect(result.success).toBe(true); + if (result.success) { + expect(result.data).toHaveLength(1); + } + }); + + it('should reject array with invalid items', () => { + const input = [ + { number: 'not-a-number', title: 'Invalid' }, // Invalid + ]; + + const result = GitHubIssuesArraySchema.safeParse(input); + expect(result.success).toBe(false); + }); + + it('should accept empty array', () => { + const result = GitHubIssuesArraySchema.safeParse([]); + 
expect(result.success).toBe(true); + if (result.success) { + expect(result.data).toHaveLength(0); + } + }); +}); + +describe('Real GitHub CLI Output', () => { + it('should validate actual gh issue view output', () => { + // Real example from gh CLI + const ghOutput = { + assignees: [{ login: 'octocat' }], + author: { login: 'contributor' }, + body: '## Description\n\nThis is a bug report.', + createdAt: '2024-01-01T12:00:00Z', + labels: [{ name: 'bug' }, { name: 'high-priority' }], + number: 42, + state: 'OPEN', + title: 'Fix critical bug', + updatedAt: '2024-01-15T14:30:00Z', + }; + + const result = GitHubIssueSchema.safeParse(ghOutput); + expect(result.success).toBe(true); + if (result.success) { + expect(result.data.number).toBe(42); + expect(result.data.state).toBe('open'); + expect(result.data.labels[0].name).toBe('bug'); + } + }); +}); diff --git a/packages/subagents/src/schemas/__tests__/messages.test.ts b/packages/subagents/src/schemas/__tests__/messages.test.ts new file mode 100644 index 0000000..fcdc358 --- /dev/null +++ b/packages/subagents/src/schemas/__tests__/messages.test.ts @@ -0,0 +1,281 @@ +/** + * Subagent Message Schema Tests + * Validates inter-agent message payloads + */ + +import { describe, expect, it } from 'vitest'; +import { + ExplorationRequestSchema, + GitHubContextRequestSchema, + PlanningRequestSchema, + validateExplorationRequest, + validateGitHubContextRequest, + validatePlanningRequest, +} from '../messages.js'; + +describe('PlanningRequestSchema', () => { + it('should validate valid planning request', () => { + const input = { + action: 'plan', + issueNumber: 42, + useExplorer: true, + detailLevel: 'detailed', + strategy: 'sequential', + }; + + const result = PlanningRequestSchema.safeParse(input); + expect(result.success).toBe(true); + if (result.success) { + expect(result.data.action).toBe('plan'); + expect(result.data.issueNumber).toBe(42); + expect(result.data.useExplorer).toBe(true); + } + }); + + it('should allow optional 
fields', () => { + const input = { + action: 'plan', + issueNumber: 1, + }; + + const result = PlanningRequestSchema.safeParse(input); + expect(result.success).toBe(true); + }); + + it('should reject negative issue number', () => { + const input = { + action: 'plan', + issueNumber: -1, + }; + + const result = PlanningRequestSchema.safeParse(input); + expect(result.success).toBe(false); + }); + + it('should reject invalid action', () => { + const input = { + action: 'invalid', + issueNumber: 1, + }; + + const result = PlanningRequestSchema.safeParse(input); + expect(result.success).toBe(false); + }); + + it('should reject invalid detail level', () => { + const input = { + action: 'plan', + issueNumber: 1, + detailLevel: 'invalid', + }; + + const result = PlanningRequestSchema.safeParse(input); + expect(result.success).toBe(false); + }); +}); + +describe('ExplorationRequestSchema', () => { + it('should validate pattern exploration', () => { + const input = { + action: 'pattern' as const, + query: 'authentication', + threshold: 0.8, + limit: 10, + }; + + const result = ExplorationRequestSchema.safeParse(input); + expect(result.success).toBe(true); + if (result.success && result.data.action === 'pattern') { + expect(result.data.action).toBe('pattern'); + expect(result.data.query).toBe('authentication'); + } + }); + + it('should validate similar code search', () => { + const input = { + action: 'similar' as const, + filePath: 'src/auth/token.ts', + }; + + const result = ExplorationRequestSchema.safeParse(input); + expect(result.success).toBe(true); + }); + + it('should validate relationships query', () => { + const input = { + action: 'relationships' as const, + component: 'MyClass', + type: 'all' as const, + }; + + const result = ExplorationRequestSchema.safeParse(input); + expect(result.success).toBe(true); + if (result.success && result.data.action === 'relationships') { + expect(result.data.type).toBe('all'); + } + }); + + it('should reject empty query', () => { + 
+    const input = {
+      action: 'pattern' as const,
+      query: '',
+    };
+
+    const result = ExplorationRequestSchema.safeParse(input);
+    expect(result.success).toBe(false);
+  });
+
+  it('should reject invalid action', () => {
+    const input = {
+      action: 'invalid',
+      query: 'test',
+    };
+
+    const result = ExplorationRequestSchema.safeParse(input);
+    expect(result.success).toBe(false);
+  });
+
+  it('should reject threshold out of range', () => {
+    const input = {
+      action: 'pattern' as const,
+      query: 'test',
+      threshold: 1.5,
+    };
+
+    const result = ExplorationRequestSchema.safeParse(input);
+    expect(result.success).toBe(false);
+  });
+});
+
+describe('GitHubContextRequestSchema', () => {
+  it('should validate context request for issue', () => {
+    const input = {
+      action: 'context' as const,
+      issueNumber: 42,
+    };
+
+    const result = GitHubContextRequestSchema.safeParse(input);
+    expect(result.success).toBe(true);
+    if (result.success) {
+      expect(result.data.action).toBe('context');
+      expect(result.data.issueNumber).toBe(42);
+    }
+  });
+
+  it('should validate search with options', () => {
+    const input = {
+      action: 'search' as const,
+      query: 'authentication bug',
+      searchOptions: {
+        type: 'issue',
+        state: 'open',
+        labels: ['bug', 'high-priority'],
+      },
+    };
+
+    const result = GitHubContextRequestSchema.safeParse(input);
+    expect(result.success).toBe(true);
+    if (result.success) {
+      expect(result.data.query).toBe('authentication bug');
+    }
+  });
+
+  it('should validate context request', () => {
+    const input = {
+      action: 'context' as const,
+      issueNumber: 42,
+      includeCodeContext: true,
+    };
+
+    const result = GitHubContextRequestSchema.safeParse(input);
+    expect(result.success).toBe(true);
+  });
+
+  it('should allow requests without optional fields', () => {
+    const input = {
+      action: 'index' as const,
+    };
+
+    const result = GitHubContextRequestSchema.safeParse(input);
+    expect(result.success).toBe(true);
+  });
+
+  it('should reject invalid action', () => {
+    const input = {
+      action: 'invalid-action',
+    };
+
+    const result = GitHubContextRequestSchema.safeParse(input);
+    expect(result.success).toBe(false);
+  });
+});
+
+describe('Validation Functions', () => {
+  describe('validatePlanningRequest', () => {
+    it('should return validated data for valid input', () => {
+      const input = {
+        action: 'plan',
+        issueNumber: 42,
+      };
+
+      const result = validatePlanningRequest(input);
+      expect(result.action).toBe('plan');
+      expect(result.issueNumber).toBe(42);
+    });
+
+    it('should throw descriptive error for invalid input', () => {
+      const input = {
+        action: 'plan',
+        issueNumber: 'not-a-number',
+      };
+
+      expect(() => validatePlanningRequest(input)).toThrow('Invalid planning request');
+      expect(() => validatePlanningRequest(input)).toThrow('issueNumber');
+    });
+  });
+
+  describe('validateExplorationRequest', () => {
+    it('should return validated data for valid input', () => {
+      const input = {
+        action: 'pattern' as const,
+        query: 'auth',
+      };
+
+      const result = validateExplorationRequest(input);
+      expect(result.action).toBe('pattern');
+      if (result.action === 'pattern') {
+        expect(result.query).toBe('auth');
+      }
+    });
+
+    it('should throw descriptive error for invalid input', () => {
+      const input = {
+        action: 'pattern' as const,
+        query: '',
+      };
+
+      expect(() => validateExplorationRequest(input)).toThrow('Invalid exploration request');
+    });
+  });
+
+  describe('validateGitHubContextRequest', () => {
+    it('should return validated data for valid input', () => {
+      const input = {
+        action: 'context' as const,
+        issueNumber: 42,
+      };
+
+      const result = validateGitHubContextRequest(input);
+      expect(result.action).toBe('context');
+      expect(result.issueNumber).toBe(42);
+    });
+
+    it('should throw descriptive error for invalid input', () => {
+      const input = {
+        action: 'invalid',
+      };
+
+      expect(() => validateGitHubContextRequest(input)).toThrow('Invalid GitHub context request');
+    });
+  });
+});
diff --git a/packages/subagents/src/schemas/github-cli.ts b/packages/subagents/src/schemas/github-cli.ts
new file mode 100644
index 0000000..38493db
--- /dev/null
+++ b/packages/subagents/src/schemas/github-cli.ts
@@ -0,0 +1,74 @@
+/**
+ * Zod schemas for GitHub CLI output validation
+ *
+ * Following TypeScript Standards Rule #2: No Type Assertions Without Validation
+ * External data from `gh` CLI must be validated at runtime
+ */
+
+import { z } from 'zod';
+
+/**
+ * GitHub user object from gh CLI
+ */
+const GitHubUserSchema = z.object({
+  login: z.string(),
+});
+
+/**
+ * GitHub label object from gh CLI
+ */
+const GitHubLabelSchema = z.object({
+  name: z.string(),
+});
+
+/**
+ * GitHub comment from gh CLI
+ */
+export const GitHubCommentSchema = z.object({
+  author: GitHubUserSchema.optional(),
+  body: z.string(),
+  createdAt: z.string().optional(),
+});
+
+/**
+ * GitHub issue from gh CLI
+ */
+export const GitHubIssueSchema = z.object({
+  number: z.number().int().positive(),
+  title: z.string(),
+  body: z.string().nullable(),
+  state: z.string().transform((s) => s.toLowerCase() as 'open' | 'closed'),
+  labels: z.array(GitHubLabelSchema).default([]),
+  assignees: z.array(GitHubUserSchema).default([]),
+  author: GitHubUserSchema.optional(),
+  createdAt: z.string(),
+  updatedAt: z.string(),
+  comments: z.array(GitHubCommentSchema).optional(),
+});
+
+export type GitHubIssueData = z.infer<typeof GitHubIssueSchema>;
+
+/**
+ * GitHub Pull Request from gh CLI
+ */
+export const GitHubPullRequestSchema = z.object({
+  number: z.number().int().positive(),
+  title: z.string(),
+  body: z.string().nullable(),
+  state: z.string().transform((s) => s.toLowerCase() as 'open' | 'closed' | 'merged'),
+  labels: z.array(GitHubLabelSchema).default([]),
+  assignees: z.array(GitHubUserSchema).default([]),
+  author: GitHubUserSchema.optional(),
+  createdAt: z.string(),
+  updatedAt: z.string(),
+  mergedAt: z.string().nullable().optional(),
+  isDraft: z.boolean().default(false),
+});
+
+export type GitHubPullRequestData = z.infer<typeof GitHubPullRequestSchema>;
+
+/**
+ * Array of issues/PRs (for bulk fetching)
+ */
+export const GitHubIssuesArraySchema = z.array(GitHubIssueSchema);
+export const GitHubPullRequestsArraySchema = z.array(GitHubPullRequestSchema);
diff --git a/packages/subagents/src/schemas/messages.ts b/packages/subagents/src/schemas/messages.ts
new file mode 100644
index 0000000..733e506
--- /dev/null
+++ b/packages/subagents/src/schemas/messages.ts
@@ -0,0 +1,113 @@
+/**
+ * Zod schemas for subagent message validation
+ *
+ * Following TypeScript Standards Rule #2: No Type Assertions Without Validation
+ * Inter-agent message payloads must be validated at runtime
+ */
+
+import { z } from 'zod';
+
+/**
+ * Planning request schema
+ */
+export const PlanningRequestSchema = z.object({
+  action: z.literal('plan'),
+  issueNumber: z.number().int().positive('Issue number must be positive'),
+  useExplorer: z.boolean().optional(),
+  detailLevel: z.enum(['simple', 'detailed']).optional(),
+  strategy: z.enum(['sequential', 'parallel']).optional(),
+});
+
+export type PlanningRequestData = z.infer<typeof PlanningRequestSchema>;
+
+/**
+ * Exploration request schema (discriminated union)
+ */
+const PatternSearchRequestSchema = z.object({
+  action: z.literal('pattern'),
+  query: z.string().min(1),
+  limit: z.number().int().positive().optional(),
+  threshold: z.number().min(0).max(1).optional(),
+  fileTypes: z.array(z.string()).optional(),
+});
+
+const SimilarCodeRequestSchema = z.object({
+  action: z.literal('similar'),
+  filePath: z.string().min(1),
+  limit: z.number().int().positive().optional(),
+  threshold: z.number().min(0).max(1).optional(),
+});
+
+const RelationshipRequestSchema = z.object({
+  action: z.literal('relationships'),
+  component: z.string().min(1),
+  type: z.enum(['imports', 'exports', 'dependencies', 'usages', 'all']).optional(),
+  limit: z.number().int().positive().optional(),
+});
+
+const InsightsRequestSchema = z.object({
+  action: z.literal('insights'),
+  type: z.enum(['patterns', 'complexity', 'coverage', 'all']).optional(),
+});
+
+export const ExplorationRequestSchema = z.discriminatedUnion('action', [
+  PatternSearchRequestSchema,
+  SimilarCodeRequestSchema,
+  RelationshipRequestSchema,
+  InsightsRequestSchema,
+]);
+
+export type ExplorationRequestData = z.infer<typeof ExplorationRequestSchema>;
+
+/**
+ * GitHub context request schema
+ */
+export const GitHubContextRequestSchema = z.object({
+  action: z.enum(['index', 'search', 'context', 'related']),
+  // For index action
+  indexOptions: z.any().optional(),
+  // For search action
+  query: z.string().optional(),
+  searchOptions: z.any().optional(),
+  // For context/related actions
+  issueNumber: z.number().int().positive().optional(),
+  prNumber: z.number().int().positive().optional(),
+  // Include code context from Explorer
+  includeCodeContext: z.boolean().optional(),
+});
+
+export type GitHubContextRequestData = z.infer<typeof GitHubContextRequestSchema>;
+
+/**
+ * Validate a subagent message payload
+ * Returns validated data or throws descriptive error
+ */
+export function validatePlanningRequest(payload: unknown): PlanningRequestData {
+  const result = PlanningRequestSchema.safeParse(payload);
+  if (!result.success) {
+    const firstError = result.error.issues[0];
+    const path = firstError.path.length > 0 ? `${firstError.path.join('.')}: ` : '';
+    throw new Error(`Invalid planning request: ${path}${firstError.message}`);
+  }
+  return result.data;
+}
+
+export function validateExplorationRequest(payload: unknown): ExplorationRequestData {
+  const result = ExplorationRequestSchema.safeParse(payload);
+  if (!result.success) {
+    const firstError = result.error.issues[0];
+    const path = firstError.path.length > 0 ? `${firstError.path.join('.')}: ` : '';
+    throw new Error(`Invalid exploration request: ${path}${firstError.message}`);
+  }
+  return result.data;
+}
+
+export function validateGitHubContextRequest(payload: unknown): GitHubContextRequestData {
+  const result = GitHubContextRequestSchema.safeParse(payload);
+  if (!result.success) {
+    const firstError = result.error.issues[0];
+    const path = firstError.path.length > 0 ? `${firstError.path.join('.')}: ` : '';
+    throw new Error(`Invalid GitHub context request: ${path}${firstError.message}`);
+  }
+  return result.data;
+}
diff --git a/pnpm-lock.yaml b/pnpm-lock.yaml
index 4ad1762..817d015 100644
--- a/pnpm-lock.yaml
+++ b/pnpm-lock.yaml
@@ -72,12 +72,18 @@ importers:
       chalk:
         specifier: ^5.6.2
         version: 5.6.2
+      cli-table3:
+        specifier: ^0.6.5
+        version: 0.6.5
       commander:
         specifier: ^12.1.0
         version: 12.1.0
       ora:
         specifier: ^8.0.1
         version: 8.2.0
+      terminal-size:
+        specifier: ^4.0.0
+        version: 4.0.0
     devDependencies:
       '@types/node':
         specifier: ^22.0.0
@@ -118,6 +124,9 @@ importers:
       web-tree-sitter:
         specifier: ^0.25.10
         version: 0.25.10
+      zod:
+        specifier: ^4.1.13
+        version: 4.1.13
     devDependencies:
       '@types/mdast':
         specifier: ^4.0.4
@@ -202,6 +211,9 @@ importers:
       '@lytics/kero':
         specifier: workspace:*
         version: link:../logger
+      zod:
+        specifier: ^4.1.13
+        version: 4.1.13
     devDependencies:
       '@types/node':
         specifier: ^22.0.0
@@ -221,6 +233,9 @@ importers:
       '@lytics/kero':
         specifier: workspace:*
         version: link:../logger
+      zod:
+        specifier: ^4.1.13
+        version: 4.1.13
     devDependencies:
       '@types/node':
         specifier: ^22.0.0
@@ -544,6 +559,13 @@ packages:
       prettier: 2.8.8
     dev: true
 
+  /@colors/colors@1.5.0:
+    resolution: {integrity: sha512-ooWCrlZP11i8GImSjTHYHLkvFDP48nS4+204nGb1RiX/WXYHmJA2III9/e2DWVabCESdW7hBAEzHRqUn9OUVvQ==}
+    engines: {node: '>=0.1.90'}
+    requiresBuild: true
+    dev: false
+    optional: true
+
   /@commitlint/cli@20.1.0(@types/node@24.10.1)(typescript@5.9.3):
     resolution: {integrity: sha512-pW5ujjrOovhq5RcYv5xCpb4GkZxkO2+GtOdBW2/qrr0Ll9tl3PX0aBBobGQl3mdZUbOBgwAexEQLeH6uxL0VYg==}
     engines: {node: '>=v18'}
@@ -2131,7 +2153,6 @@ packages:
   /ansi-regex@5.0.1:
     resolution: {integrity: sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ==}
     engines: {node: '>=8'}
-    dev: true
 
   /ansi-regex@6.2.2:
     resolution: {integrity: sha512-Bq3SmSpyFHaWjPk8If9yc6svM8c56dB5BAtW4Qbw5jHTwwXXcTLoRMkpDJp6VL0XzlWaCHTXrkFURMYmD0sLqg==}
@@ -2447,6 +2468,15 @@
     engines: {node: '>=6'}
     dev: false
 
+  /cli-table3@0.6.5:
+    resolution: {integrity: sha512-+W/5efTR7y5HRD7gACw9yQjqMVvEMLBHmboM/kPWam+H+Hmyrgjh6YncVKK122YZkXrLudzTuAukUw9FnMf7IQ==}
+    engines: {node: 10.* || >= 12.*}
+    dependencies:
+      string-width: 4.2.3
+    optionalDependencies:
+      '@colors/colors': 1.5.0
+    dev: false
+
   /cliui@8.0.1:
     resolution: {integrity: sha512-BSeNnyus75C4//NQ9gQt1/csTXyo/8Sb+afLAkzAptFuMsod9HFokGNudZpi/oQV73hnVK+sR+5PVRMd+Dr7YQ==}
     engines: {node: '>=12'}
@@ -2686,7 +2716,6 @@
 
   /emoji-regex@8.0.0:
     resolution: {integrity: sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A==}
-    dev: true
 
   /end-of-stream@1.4.5:
     resolution: {integrity: sha512-ooEGc6HP26xXq/N+GCGOT0JKCLDGrq2bQUZrQ7gyrJiZANJ/8YDTxTpQBXGMn+WbIQXNVpyWymm7KYVICQnyOg==}
@@ -3151,7 +3180,6 @@
   /is-fullwidth-code-point@3.0.0:
     resolution: {integrity: sha512-zymm5+u+sCsSWyD9qNaejV3DFvhCKclKdizYaJUuHA83RLjb7nSuGnddCHGv0hk+KY7BMAlsWeK4Ueg6EV6XQg==}
     engines: {node: '>=8'}
-    dev: true
 
   /is-glob@4.0.3:
     resolution: {integrity: sha512-xelSayHH36ZgE7ZWhli7pW34hNbNl8Ojv5KVmkJD4hBdD3th8Tfk9vYasLM+mXWOZhFkgZfxhLSnrwRr4elSSg==}
@@ -4357,7 +4385,6 @@
       emoji-regex: 8.0.0
       is-fullwidth-code-point: 3.0.0
       strip-ansi: 6.0.1
-    dev: true
 
   /string-width@7.2.0:
     resolution: {integrity: sha512-tsaTIkKW9b4N+AEj+SVA+WhJzV7/zMhcSu78mLKWSk7cXMOSHsBKFWUs0fWwq8QyK3MgJBQRX6Gbi4kYbdvGkQ==}
@@ -4379,7 +4406,6 @@
     engines: {node: '>=8'}
     dependencies:
       ansi-regex: 5.0.1
-    dev: true
 
   /strip-ansi@7.1.2:
     resolution: {integrity: sha512-gmBGslpoQJtgnMAvOVqGZpEz9dyoKTCzy2nfz/n8aIFhN/jCE/rCmcxabB6jOOHV+0WNnylOxaxBQPSvcWklhA==}
@@ -4487,6 +4513,11 @@
     engines: {node: '>=8'}
     dev: true
 
+  /terminal-size@4.0.0:
+    resolution: {integrity: sha512-rcdty1xZ2/BkWa4ANjWRp4JGpda2quksXIHgn5TMjNBPZfwzJIgR68DKfSYiTL+CZWowDX/sbOo5ME/FRURvYQ==}
+    engines: {node: '>=18'}
+    dev: false
+
   /text-decoder@1.2.3:
     resolution: {integrity: sha512-3/o9z3X0X0fTupwsYvR03pJ/DjWuqqrfwBgTQzdWDiQSm9KitAyz/9WqsT2JQW7KV2m+bC2ol/zqpW37NHxLaA==}
     dependencies:
@@ -5243,6 +5274,10 @@
     engines: {node: '>=12.20'}
     dev: true
 
+  /zod@4.1.13:
+    resolution: {integrity: sha512-AvvthqfqrAhNH9dnfmrfKzX5upOdjUVJYFqNSlkmGf64gRaTzlPwz99IHYnVs28qYAybvAlBV+H7pn0saFY4Ig==}
+    dev: false
+
   /zwitch@2.0.4:
     resolution: {integrity: sha512-bXE4cR/kVZhKZX/RjPEflHaKVhUVl85noU3v6b8apfQEc1x4A+zBxjZ4lN8LqGd6WZ3dl98pY4o717VFmoPp+A==}
    dev: false