This guide provides comprehensive instructions for testing all terminal features of Ano.
Run the automated test script:

```bash
# Make script executable
chmod +x test-script.sh

# Run all tests
./test-script.sh
```

This will test all CLI commands systematically and provide a summary report.
Use this checklist to manually verify each feature works correctly.
- Ano CLI is built and linked: `npm run build && npm link`
- Git is configured with user name and email
- Test files are created (or use test-script.sh setup)
- Initialize a new team configuration: `ano team init --name "My Team"`
- Verify `.ano/config.json` is created
- Check that config has correct project name
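For the config verification step, a small Python sketch can automate the check. The field names here (`projectName`, `members`, `requirements`) are assumptions — inspect a real `.ano/config.json` to confirm the actual schema:

```python
import json
import tempfile
from pathlib import Path

# Hypothetical config shape -- the real .ano/config.json schema may differ.
sample_config = {
    "projectName": "My Team",
    "members": [],
    "requirements": {"minApprovals": 1, "requiredRoles": []},
}

def load_team_config(path):
    """Read a team config file and check the fields this checklist verifies."""
    config = json.loads(Path(path).read_text())
    for key in ("projectName", "members", "requirements"):
        if key not in config:
            raise KeyError(f"missing required field: {key}")
    return config

# Round-trip the sample through a temp file, standing in for `ano team init`.
with tempfile.TemporaryDirectory() as d:
    cfg_path = Path(d) / "config.json"
    cfg_path.write_text(json.dumps(sample_config, indent=2))
    loaded = load_team_config(cfg_path)
    print(loaded["projectName"])  # -> My Team
```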
- Display current team configuration: `ano team show`
- Verify output shows team name, members, roles, and requirements
- Add a team member with default role: `ano team add-member "Alice Smith" "alice@example.com"`
- Add a team member with specific role: `ano team add-member "Bob Jones" "bob@example.com" --role lead`
- Verify members appear in `ano team show`
- Remove a team member: `ano team remove-member "alice@example.com"`
- Verify member is removed from config
- Set minimum approvals required: `ano team set-requirements --min-approvals 2`
- Set required roles: `ano team set-requirements --required-roles "lead,senior"`
- Add single-line annotation: `ano annotate plan.md:10 "This needs clarification" --type question`
- Add multi-line annotation: `ano annotate plan.md:15-20 "This entire section is problematic" --type concern`
- Add annotations with each type:
  - `--type concern`
  - `--type question`
  - `--type suggestion`
  - `--type blocker`
- Verify annotation appears in the `.annotations.json` file
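For orientation, here is one plausible shape for a record in `.annotations.json`, sketched in Python. Every field name below is an assumption, not the documented format:

```python
import uuid
from datetime import datetime, timezone

VALID_TYPES = {"concern", "question", "suggestion", "blocker"}

def make_annotation(line_start, line_end, text, kind):
    """Build one annotation record; the field names are hypothetical."""
    if kind not in VALID_TYPES:
        raise ValueError(f"unknown annotation type: {kind}")
    return {
        "id": str(uuid.uuid4()),
        "lines": [line_start, line_end],
        "text": text,
        "type": kind,
        "status": "open",
        "replies": [],
        "createdAt": datetime.now(timezone.utc).isoformat(),
    }

ann = make_annotation(15, 20, "This entire section is problematic", "concern")
print(ann["status"], ann["type"])  # -> open concern
```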
- List all annotations: `ano list plan.md`
- Filter by type: `ano list plan.md --type blocker`
- Filter by status: `ano list plan.md --status open` and `ano list plan.md --status resolved`
- JSON output: `ano list plan.md --json`
- Verify output formatting is readable
- Check colors and formatting in terminal
- Add a minor suggestion (nit): `ano nit plan.md:5 "Typo: 'teh' should be 'the'"`
- Verify it creates a `suggestion`-type annotation
- Add a question quickly: `ano q plan.md:8 "What about edge cases?"`
- Verify it creates a `question`-type annotation
- Add a blocker quickly: `ano block plan.md:12 "Security review required"`
- Verify it creates a `blocker`-type annotation
- Get annotation ID from `ano list`
- Add reply to existing annotation: `ano reply plan.md <annotation-id> "Here's my response"`
- Add multiple replies to same annotation
- Verify replies appear in `ano list` output
- Check reply threading in web UI
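If replies are stored nested under their parent annotation (an assumption about Ano's storage, though a common design), threading falls out naturally and deleting an annotation removes its thread with it:

```python
from datetime import datetime, timezone

def add_reply(annotation, author, text):
    """Append a reply under its parent; nesting keeps the thread together."""
    annotation.setdefault("replies", []).append({
        "author": author,
        "text": text,
        "createdAt": datetime.now(timezone.utc).isoformat(),
    })
    return annotation

ann = {"id": "abc123", "text": "What about edge cases?", "replies": []}
add_reply(ann, "alice@example.com", "Here's my response")
add_reply(ann, "bob@example.com", "Covered in section 3")
print(len(ann["replies"]))  # -> 2
```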
- Resolve an open annotation: `ano resolve plan.md <annotation-id>`
- Verify status changes to `resolved`
- Check that resolved items are hidden by default in lists
- Reopen a resolved annotation: `ano reopen plan.md <annotation-id>`
- Verify status changes back to `open`
- Add basic approval: `ano approve plan.md "Looks good to me"`
- Add approval with title: `ano approve plan.md "LGTM" --title "Senior Engineer"`
- Request changes: `ano approve plan.md "Needs work" --request-changes`
- Verify approvals are stored separately from annotations
- Check approval status: `ano check plan.md`
- Verify output shows:
  - Approval count vs requirement
  - Individual approvals with timestamps
  - Approval status (approved/changes requested/pending)
  - Exit code (0 if approved, 1 if not)
- JSON output: `ano check plan.md --json`
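The pass/fail logic behind `ano check` can be approximated as follows. This is a sketch under assumed field names (`requestChanges`, `role`), not the actual implementation:

```python
def check_approvals(approvals, min_approvals, required_roles=()):
    """Return (approved, reason). A changes-requested approval blocks outright."""
    if any(a.get("requestChanges") for a in approvals):
        return False, "changes requested"
    if len(approvals) < min_approvals:
        return False, f"{len(approvals)}/{min_approvals} approvals"
    have_roles = {a.get("role") for a in approvals}
    missing = [r for r in required_roles if r not in have_roles]
    if missing:
        return False, "missing required roles: " + ", ".join(missing)
    return True, "approved"

approvals = [
    {"author": "bob@example.com", "role": "lead", "message": "LGTM"},
    {"author": "carol@example.com", "role": "senior", "message": "Ship it!"},
]
ok, reason = check_approvals(approvals, min_approvals=2, required_roles=("lead",))
print(ok, reason)  # -> True approved

# Exit code mirrors the CLI contract: 0 if approved, 1 if not.
exit_code = 0 if ok else 1
```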
- Quick approve with "LGTM": `ano lgtm plan.md`
- Verify it adds approval
- Quick approve with "Ship it!": `ano shipit plan.md`
- With optional message: `ano shipit plan.md "Ready for production"`
- Modify a file (add/remove lines)
- Run sync to relocate annotations: `ano sync plan.md`
- Verify annotations moved to correct new positions
- Test with various file modifications:
  - Insert lines before annotations
  - Delete lines before annotations
  - Modify annotated lines slightly
  - Major file restructure
- Create annotation on line 10
- Insert 5 lines at top of file
- Run sync
- Verify annotation is now on line 15
- Check that context matching works correctly
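Context matching presumably works by remembering the annotated line's text (and perhaps its neighbors) and searching for it after the file changes. A deliberately simplified sketch that matches on exact line text only — the real tool likely uses fuzzier surrounding-context matching:

```python
def relocate(annotated_line_no, old_lines, new_lines):
    """Find the annotated line's new 1-based position after the file changed.

    Simplified: match on exact line text, preferring the occurrence closest
    to the original position. Returns None if the line can't be found.
    """
    target = old_lines[annotated_line_no - 1]
    candidates = [i + 1 for i, line in enumerate(new_lines) if line == target]
    if not candidates:
        return None  # line was removed or changed too much
    return min(candidates, key=lambda pos: abs(pos - annotated_line_no))

old = [f"line {i}" for i in range(1, 21)]   # 20-line file
new = ["inserted"] * 5 + old                # 5 lines inserted at the top
print(relocate(10, old, new))  # annotation on line 10 moves to line 15
```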
- Export annotations to JSON: `ano export plan.md -o backup.json`
- Verify JSON file is created
- Check JSON structure is valid
- Import annotations from JSON: `ano import plan.md backup.json`
- Verify annotations are restored
- Test overwriting existing annotations
- Show differences in annotations: `ano diff plan.md`
- Modify annotations and run diff again
- Verify diff shows changes correctly
- Delete a specific annotation: `ano delete plan.md <annotation-id>`
- Verify annotation is removed
- Check that replies are also deleted
- Start web server: `ano serve plan.md`
- Verify server starts on port 3000 (default)
- Custom port: `ano serve plan.md --port 8080`
- Open browser to http://localhost:3000
- Test web UI features:
  - View annotations inline
  - Add new annotations via UI
  - Reply to annotations
  - Resolve/reopen annotations
  - Filter by type/status
  - Activity feed shows recent changes
  - Diff view works
  - Edit file content directly
  - Real-time updates via SSE
  - Keyboard shortcuts (j/k navigation, r to resolve)
  - Export HTML
  - Copy for Claude
  - Shareable URLs
- Display main help
- Verify all commands are listed
- Display version number: `ano --version`
- Per-command help:
  - `ano annotate --help`
  - `ano approve --help`
  - `ano team --help`
  - `ano check --help`
  - `ano serve --help`
- Verify help text is clear and accurate
- Start MCP server: `npm run mcp`
- Verify server starts without errors
- Test in Claude Code (requires MCP configuration)
  - `read_annotations` tool works
  - `add_annotation` tool works
  - `resolve_annotation` tool works
  - `approve_file` tool works
- Configure approval gate hook in Claude Code settings
- Create a plan file
- Try to execute without approvals
- Verify Claude is blocked
- Add required approvals
- Verify Claude can now execute
- Missing file: try commands on a non-existent file (`ano list nonexistent.md`); verify a clear error message
- Invalid line number: annotate a line beyond the file length (`ano annotate plan.md:999 "Test"`); verify error handling
- Invalid annotation ID: use a non-existent ID (`ano resolve plan.md invalid-id`); verify the error message
- No team config: run team commands without init; verify a helpful error message
- Invalid JSON: corrupt the annotation file manually; verify graceful error handling
- Permission errors: test on read-only files; verify error messages
- Port already in use: start the server on an occupied port; verify error handling:

```bash
ano serve plan.md --port 3000
# In another terminal
ano serve plan.md --port 3000
```
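The corrupt-JSON case is easy to automate: feed the loader broken input and confirm it reports the problem instead of crashing. The loader below is a hypothetical stand-in for Ano's, not its real code:

```python
import json

def load_annotations(raw_text):
    """Parse an annotation store, reporting corruption instead of crashing."""
    try:
        data = json.loads(raw_text)
    except json.JSONDecodeError as exc:
        return None, f"annotation file is corrupt: {exc.msg} (line {exc.lineno})"
    if not isinstance(data, dict):
        return None, "annotation file has unexpected top-level type"
    return data, None

good, err = load_annotations('{"plan.md": []}')
assert err is None
bad, err = load_annotations('{"plan.md": [')   # truncated on purpose
print(err)
```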
- Large files: Test with 1000+ line file
- Many annotations: Create 100+ annotations
- Web UI responsiveness: Test with large datasets
- Sync performance: Test on files with many annotations
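A fixture generator helps with the performance checks: it writes a 1000-line file plus a store with 100+ annotations to point the CLI at. The `.annotations.json` layout here is an assumption:

```python
import json
import random
import tempfile
from pathlib import Path

def make_fixture(directory, n_lines=1000, n_annotations=120):
    """Write a large test file and a matching (hypothetical) annotation store."""
    directory = Path(directory)
    plan = directory / "plan.md"
    plan.write_text("\n".join(f"Step {i}: do something" for i in range(1, n_lines + 1)))
    annotations = []
    for i in range(n_annotations):
        line = random.randint(1, n_lines)
        annotations.append({
            "id": f"ann-{i}",
            "lines": [line, line],
            "text": f"note {i}",
            "type": "question",
            "status": "open",
        })
    (directory / ".annotations.json").write_text(json.dumps({"plan.md": annotations}))
    return plan, annotations

with tempfile.TemporaryDirectory() as d:
    plan, anns = make_fixture(d)
    line_count = len(plan.read_text().splitlines())
    print(line_count, len(anns))  # -> 1000 120
```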
- Commit annotation files
- Branch and merge with conflicts
- Verify annotations work across branches
- MCP server integration
- Approval gate hooks
- Auto-open web viewer
- Multiple team members adding annotations
- Import/export between users
- Approval workflow with multiple reviewers
If possible, test on multiple platforms:
- macOS
- Linux
- Windows (WSL)
Test web viewer in:
- Chrome/Chromium
- Firefox
- Safari
- Edge
After any code changes, re-run:
- Automated test script: `./test-script.sh`
- Critical path tests:
  - Create annotation
  - List annotations
  - Approve file
  - Check approval
  - Serve web UI
When reporting bugs, include:
- Command run
- Expected behavior
- Actual behavior
- Error messages
- Environment (OS, Node version)
- Ano version (`ano --version`)
After testing:
```bash
# Remove test directory
rm -rf .ano-test

# Remove test artifacts
rm -f annotations-backup.json
rm -f *.annotations.json

# Reset team config if needed
rm -rf .ano
```

Before each release:
- Run automated test script
- Complete manual testing checklist
- Test MCP integration with Claude Code
- Test approval gate hook
- Cross-platform verification
- Browser compatibility check
- Some tests require manual verification (e.g., visual checks in web UI)
- Web UI tests are best done manually with browser DevTools open
- MCP tests require Claude Code to be installed and configured
- Team features require multiple git identities or team members
Current status:
- ❌ Unit tests: Not implemented yet
- ❌ Integration tests: Not implemented yet
- ✅ Manual testing: Checklist provided
- ✅ Automated CLI testing: Script provided
Next steps: Add unit tests with a framework like Vitest or Jest.