GitHub Issues for Testing Implementation
Copy these issues into your GitHub repository to track your testing implementation progress.
Milestone 1: Testing Infrastructure Setup (Week 1-2)
Issue #1: Install Testing Dependencies
Labels: setup, dependencies, priority: high
Description:
Install and configure Vitest and Playwright testing frameworks along with all required dependencies.
Tasks:
- Install Vitest dependencies: `npm install -D vitest @vitest/ui @vitest/coverage-v8 happy-dom`
- Install Playwright: `npm install -D @playwright/test playwright`
- Install Testing Library: `npm install -D @testing-library/dom`
- Install TypeScript types: `npm install -D @types/node`
- Run `npx playwright install` to install browsers
- Verify installations with `npx vitest --version` and `npx playwright --version`
Acceptance Criteria:
- All dependencies installed without errors
- Version commands return proper versions
- Playwright browsers installed successfully
References:
- See `PACKAGE_JSON_ADDITIONS.md` for exact versions
- See `QUICK_START.md` for installation guide
Issue #2: Add Testing Scripts to package.json
Labels: setup, configuration, priority: high
Description:
Add npm scripts for running unit tests and E2E tests.
Tasks:
- Add unit testing scripts (test, test:watch, test:ui, test:coverage)
- Add E2E testing scripts (test:e2e, test:e2e:ui, test:e2e:debug, test:e2e:report)
- Add combined test script (test:all)
- Verify all scripts work with `npm run test -- --version`
Scripts to Add:
```json
{
  "scripts": {
    "test": "vitest run",
    "test:watch": "vitest",
    "test:ui": "vitest --ui",
    "test:coverage": "vitest run --coverage",
    "test:unit": "vitest run --reporter=verbose",
    "test:e2e": "playwright test",
    "test:e2e:ui": "playwright test --ui",
    "test:e2e:headed": "playwright test --headed",
    "test:e2e:debug": "playwright test --debug",
    "test:e2e:report": "playwright show-report",
    "test:e2e:chromium": "playwright test --project=chromium",
    "test:all": "npm run test && npm run test:e2e"
  }
}
```

Acceptance Criteria:
- All scripts added to package.json
- Each script runs without errors
- Help/version commands work properly
References:
- See `PACKAGE_JSON_ADDITIONS.md`
Issue #3: Create Vitest Configuration
Labels: setup, configuration, priority: high
Description:
Set up Vitest configuration file with proper settings for the Astro project.
Tasks:
- Create `vitest.config.ts` in project root
- Configure happy-dom environment
- Set up path aliases (@components, @utils, @lib, etc.)
- Configure coverage settings with 60% thresholds
- Set up test file patterns and exclusions
- Configure setup file path
File Location: vitest.config.ts (root)
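A sketch of what this configuration might look like, assuming the aliases and paths listed in the tasks above (the exact project layout may differ):

```typescript
// vitest.config.ts — hedged sketch; adjust aliases and thresholds to the real layout
import { defineConfig } from 'vitest/config';
import { fileURLToPath } from 'node:url';

export default defineConfig({
  test: {
    environment: 'happy-dom',
    setupFiles: ['./src/test/setup.ts'],
    include: ['src/**/*.{test,spec}.ts'],
    exclude: ['node_modules', 'dist', 'e2e'],
    coverage: {
      provider: 'v8',
      // 60% thresholds as specified in the tasks
      thresholds: { lines: 60, functions: 60, branches: 60, statements: 60 },
    },
  },
  resolve: {
    alias: {
      '@components': fileURLToPath(new URL('./src/components', import.meta.url)),
      '@utils': fileURLToPath(new URL('./src/utils', import.meta.url)),
      '@lib': fileURLToPath(new URL('./src/lib', import.meta.url)),
    },
  },
});
```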
Acceptance Criteria:
- Configuration file created and valid
- TypeScript recognizes the config
- Path aliases work in test files
- Coverage thresholds set to 60%
References:
- Use provided `vitest.config.ts` file
- [Vitest Configuration Docs](https://vitest.dev/config/)
Issue #4: Create Playwright Configuration
Labels: setup, configuration, priority: high
Description:
Set up Playwright configuration for E2E testing across multiple browsers.
Tasks:
- Create `playwright.config.ts` in project root
- Configure base URL (http://localhost:4321)
- Set up browser projects (Chromium, Firefox, WebKit)
- Configure mobile viewports (Pixel 5, iPhone 12)
- Set up web server for local testing
- Configure screenshots and video on failure
- Set up HTML and JSON reporters
File Location: playwright.config.ts (root)
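A sketch covering the tasks above; the device names come from Playwright's built-in registry, and the dev-server command is an assumption:

```typescript
// playwright.config.ts — hedged sketch, not the repo's actual config
import { defineConfig, devices } from '@playwright/test';

export default defineConfig({
  testDir: './e2e',
  reporter: [['html'], ['json', { outputFile: 'test-results/results.json' }]],
  use: {
    baseURL: 'http://localhost:4321',
    screenshot: 'only-on-failure',
    video: 'retain-on-failure',
  },
  projects: [
    { name: 'chromium', use: { ...devices['Desktop Chrome'] } },
    { name: 'firefox', use: { ...devices['Desktop Firefox'] } },
    { name: 'webkit', use: { ...devices['Desktop Safari'] } },
    { name: 'mobile-chrome', use: { ...devices['Pixel 5'] } },
    { name: 'mobile-safari', use: { ...devices['iPhone 12'] } },
  ],
  webServer: {
    command: 'npm run dev', // assumed dev command
    url: 'http://localhost:4321',
    reuseExistingServer: !process.env.CI,
  },
});
```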
Acceptance Criteria:
- Configuration file created and valid
- All browser projects configured
- Web server starts automatically
- Screenshot/video capture configured
References:
- Use provided `playwright.config.ts` file
- [Playwright Configuration Docs](https://playwright.dev/docs/test-configuration)
Issue #5: Create Test Setup and Directory Structure
Labels: setup, infrastructure, priority: high
Description:
Create test directories and setup files for organizing tests.
Tasks:
- Create `src/test/` directory
- Create `src/test/setup.ts` with environment mocks
- Create `src/test/mocks/` directory for mock utilities
- Create `e2e/` directory in project root
- Create `.gitignore` entries for test outputs
- Update `tsconfig.json` to include test files
Directory Structure:
├── e2e/
│ └── .gitkeep
├── src/
│ └── test/
│ ├── setup.ts
│ └── mocks/
│ └── .gitkeep
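As one illustration of the storage mocks `setup.ts` could install (`createStorageMock` is a hypothetical name, not from the repo):

```typescript
// Minimal in-memory stand-in for localStorage/sessionStorage.
interface StorageLike {
  readonly length: number;
  clear(): void;
  getItem(key: string): string | null;
  key(index: number): string | null;
  removeItem(key: string): void;
  setItem(key: string, value: string): void;
}

export function createStorageMock(): StorageLike {
  let store = new Map<string, string>();
  return {
    get length() { return store.size; },
    clear() { store = new Map(); },
    getItem(key) { return store.get(key) ?? null; },
    key(index) { return [...store.keys()][index] ?? null; },
    removeItem(key) { store.delete(key); },
    setItem(key, value) { store.set(key, String(value)); },
  };
}

// In setup.ts the mock would be installed globally, e.g.:
// Object.defineProperty(globalThis, 'localStorage', { value: createStorageMock() });
```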
Acceptance Criteria:
- All directories created
- Setup file properly configures test environment
- Mock environment variables set
- localStorage and sessionStorage mocked
- Git ignores coverage and report directories
References:
- Use provided `setup.ts` file
- See `TESTING_STRATEGY.md` for directory structure
Issue #6: Update TypeScript Configuration for Tests
Labels: setup, configuration, priority: medium
Description:
Update tsconfig.json to properly recognize test files and types.
Tasks:
- Add vitest types to compilerOptions.types
- Add @testing-library/dom types
- Include test files in src/**/*.test.ts pattern
- Include spec files in src/**/*.spec.ts pattern
- Verify path aliases work in tests
- Exclude coverage and report directories
Acceptance Criteria:
- TypeScript recognizes test files
- No type errors in test imports
- Vitest globals work without imports (if enabled)
- Path aliases resolve correctly
Milestone 2: Core Unit Tests (Week 3-4)
Issue #7: Write Unit Tests for Utility Functions
Labels: testing, unit-test, priority: high
Description:
Create unit tests for common utility functions used throughout the portfolio.
Tasks:
- Test `formatDate` function (if it exists)
- Test `calculateReadingTime` function
- Test `generateSlug` function
- Test any string manipulation utilities
- Test date/time formatting utilities
- Achieve 70%+ coverage for utils
Test Files:
- `src/utils/formatDate.test.ts`
- `src/utils/readingTime.test.ts`
- `src/utils/slug.test.ts`
Acceptance Criteria:
- All utility functions have tests
- Tests cover happy path and edge cases
- Coverage > 70% for utilities
- All tests pass
Example:

```typescript
describe('calculateReadingTime', () => {
  it('should calculate reading time correctly', () => {
    const text = 'word '.repeat(200);
    expect(calculateReadingTime(text)).toBe(1);
  });
});
```

References:
- See `example-utils.test.ts` for patterns
- See `TESTING_STRATEGY.md` section on Unit Testing
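For context, hedged sketches of the kind of utilities these tests would target (the real signatures in `src/utils/` may differ):

```typescript
// Reading time at an assumed 200 words per minute, never reporting less than 1 minute.
export function calculateReadingTime(text: string, wordsPerMinute = 200): number {
  const words = text.trim().split(/\s+/).filter(Boolean).length;
  return Math.max(1, Math.ceil(words / wordsPerMinute));
}

// URL-safe slug: lowercase, strip punctuation, collapse whitespace to single hyphens.
export function generateSlug(title: string): string {
  return title
    .toLowerCase()
    .trim()
    .replace(/[^a-z0-9\s-]/g, '')
    .replace(/\s+/g, '-')
    .replace(/-+/g, '-');
}
```

Edge cases worth covering, per the acceptance criteria: empty strings, punctuation-only titles, and texts shorter than one minute of reading.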
Issue #8: Write Unit Tests for Supabase Integration
Labels: testing, unit-test, integration, priority: high
Description:
Create comprehensive tests for Supabase view counter functionality.
Tasks:
- Create mock Supabase client
- Test `getViewCount` function
- Test `incrementViewCount` function
- Test error handling for network failures
- Test handling of missing posts
- Mock Supabase responses
Test Files:
- `src/lib/supabase.test.ts`
- `src/test/mocks/supabase.ts`
Acceptance Criteria:
- All Supabase functions tested
- Mock client properly simulates API
- Error cases handled gracefully
- Tests don't make real API calls
- Coverage > 80% for Supabase integration
Example:

```typescript
vi.mock('@supabase/supabase-js', () => ({
  createClient: vi.fn(() => ({
    from: vi.fn(() => ({
      select: vi.fn().mockReturnThis(),
      eq: vi.fn().mockResolvedValue({ data: { views: 42 } }),
    })),
  })),
}));
```

References:
- See `TESTING_STRATEGY.md` section on Supabase testing
- [Vitest Mocking Docs](https://vitest.dev/guide/mocking.html)
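An alternative to module-level mocking is dependency injection, which keeps tests free of real API calls by construction. The sketch below is illustrative only: the table name `page_views` and the exact function shape are assumptions, and the client interface covers just the calls this sketch uses.

```typescript
// Narrow, hand-written slice of the Supabase query-builder surface.
interface ViewsClient {
  from(table: string): {
    select(columns: string): {
      eq(column: string, value: string): Promise<{ data: { views: number } | null; error: unknown }>;
    };
  };
}

// Falls back to 0 on network errors or missing posts, per the error-handling tasks.
export async function getViewCount(client: ViewsClient, slug: string): Promise<number> {
  const { data, error } = await client.from('page_views').select('views').eq('slug', slug);
  if (error || !data) return 0;
  return data.views;
}
```

In tests, a mock object satisfying `ViewsClient` is passed in place of the real client, so no network traffic occurs.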
Issue #9: Write Tests for API Endpoints
Labels: testing, unit-test, api, priority: high
Description:
Test all API endpoints in the portfolio (view counter, contact form, etc.).
Tasks:
- Test `/api/views/[slug]` endpoint
- Test contact form API endpoint (if it exists)
- Mock external service calls
- Test error responses (400, 404, 500)
- Test request validation
- Test response format
Test Files:
- `src/pages/api/views.test.ts`
- `src/pages/api/contact.test.ts`
Acceptance Criteria:
- All API endpoints have tests
- Success and error cases covered
- No real API calls made
- Proper status codes tested
- Response format validated
References:
- See `TESTING_STRATEGY.md` API testing section
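The request validation and status codes above can be sketched framework-agnostically; `handleViews` and its `store` parameter are illustrative names, not the actual Astro endpoint:

```typescript
interface ViewStore {
  get(slug: string): number | undefined;
}

// Validates the slug, then returns 400/404/200 with a JSON body.
export function handleViews(slug: string, store: ViewStore): { status: number; body: string } {
  if (!/^[a-z0-9]+(-[a-z0-9]+)*$/.test(slug)) {
    return { status: 400, body: JSON.stringify({ error: 'invalid slug' }) };
  }
  const views = store.get(slug);
  if (views === undefined) {
    return { status: 404, body: JSON.stringify({ error: 'not found' }) };
  }
  return { status: 200, body: JSON.stringify({ views }) };
}
```

Tests would supply an in-memory `ViewStore`, so success, validation, and not-found paths can all be exercised without real API calls.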
Issue #10: Write Tests for Content Collections
Labels: testing, unit-test, content, priority: medium
Description:
Test content collection schemas and validation.
Tasks:
- Test blog post schema validation
- Test portfolio item schema
- Test content sorting/filtering
- Test frontmatter parsing
- Verify required fields
Test Files:
src/content/config.test.ts
Acceptance Criteria:
- Content schemas validated
- Invalid content rejected
- Required fields enforced
- Date parsing works correctly
Milestone 3: E2E Critical Paths (Week 5-6)
Issue #11: Write E2E Tests for Homepage
Labels: testing, e2e, priority: high
Description:
Create comprehensive E2E tests for homepage functionality.
Tasks:
- Test homepage loads successfully
- Test navigation menu displays
- Test navigation to all main pages
- Test dark mode toggle
- Test responsive design (mobile/desktop)
- Test SEO meta tags
- Test footer links
- Test no console errors on load
Test File: e2e/homepage.spec.ts
Acceptance Criteria:
- All homepage features tested
- Tests pass on Chromium, Firefox, WebKit
- Mobile and desktop viewports covered
- No critical console errors
- SEO tags verified
References:
- Use provided `homepage.spec.ts` file
Issue #12: Write E2E Tests for Blog
Labels: testing, e2e, priority: high
Description:
Test blog listing page and individual blog post pages.
Tasks:
- Test blog listing page loads
- Test blog posts display correctly
- Test navigation to individual posts
- Test view counter increments
- Test reading time displays
- Test category filtering (if exists)
- Test blog post metadata (title, date, author)
- Test blog navigation (next/previous)
Test File: e2e/blog.spec.ts
Acceptance Criteria:
- Blog listing works
- Individual posts load
- View counter functionality verified
- Reading time accurate
- All metadata present
Example:

```typescript
test('should increment view counter', async ({ page }) => {
  await page.goto('/blog/test-post-slug');
  const viewCounter = page.locator('[data-testid="view-count"]');
  await expect(viewCounter).toBeVisible();
});
```

Issue #13: Write E2E Tests for Contact Form
Labels: testing, e2e, priority: high
Description:
Test contact form validation and submission.
Tasks:
- Test form displays correctly
- Test required field validation
- Test email format validation
- Test successful form submission
- Test error handling
- Mock email service API
- Test success/error messages
Test File: e2e/contact.spec.ts
Acceptance Criteria:
- Form validation works
- Email format checked
- Submission succeeds with valid data
- Error messages display properly
- No real emails sent in tests
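The email-format rule these tests exercise could be as simple as the following sketch (the real form may validate differently; `isValidEmail` is an illustrative name):

```typescript
// One non-space/@ run, an @, another run, a dot, and a final run.
export function isValidEmail(value: string): boolean {
  return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(value.trim());
}
```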
Example:

```typescript
test('should validate email format', async ({ page }) => {
  await page.goto('/contact');
  await page.fill('input[name="email"]', 'invalid-email');
  await page.click('button[type="submit"]');
  await expect(page.locator('.error')).toBeVisible();
});
```

Issue #14: Write E2E Tests for Portfolio Page
Labels: testing, e2e, priority: medium
Description:
Test portfolio project showcase functionality.
Tasks:
- Test portfolio page loads
- Test projects display correctly
- Test project filtering (if exists)
- Test project detail modals/pages
- Test external links open correctly
- Test responsive layout
Test File: e2e/portfolio.spec.ts
Acceptance Criteria:
- Portfolio page loads successfully
- Projects display with correct data
- Filters work (if applicable)
- Links navigate properly
- Mobile layout works
Issue #15: Write SEO and Accessibility Tests
Labels: testing, e2e, accessibility, seo, priority: medium
Description:
Add comprehensive SEO and accessibility testing.
Tasks:
- Install @axe-core/playwright for accessibility
- Test meta tags on all pages
- Test Open Graph tags
- Test Twitter Card tags
- Test structured data (JSON-LD)
- Test heading hierarchy
- Run axe accessibility scans
- Test keyboard navigation
- Test ARIA labels
Test Files:
- `e2e/seo.spec.ts`
- `e2e/accessibility.spec.ts`
Acceptance Criteria:
- All pages have proper meta tags
- No critical accessibility violations
- Heading hierarchy correct (one h1)
- Keyboard navigation works
- ARIA labels present
Example:

```typescript
import AxeBuilder from '@axe-core/playwright';

test('should not have accessibility violations', async ({ page }) => {
  await page.goto('/');
  const results = await new AxeBuilder({ page }).analyze();
  expect(results.violations).toEqual([]);
});
```

Milestone 4: CI/CD Integration (Week 7)
Issue #16: Set Up GitHub Actions Workflow
Labels: ci-cd, automation, priority: high
Description:
Configure GitHub Actions to run tests automatically on push and PR.
Tasks:
- Create `.github/workflows/test.yml`
- Configure unit test job
- Configure E2E test job (multi-browser)
- Configure lint/typecheck job
- Configure build check job
- Set up test result artifacts
- Set up coverage reporting
- Add status badges to README
File Location: .github/workflows/test.yml
Acceptance Criteria:
- Workflow runs on push to main branch
- Workflow runs on pull requests
- All test jobs execute successfully
- Test reports uploaded as artifacts
- Build succeeds in CI
References:
- Use provided `test-workflow.yml` file
- [GitHub Actions Docs](https://docs.github.com/en/actions)
Issue #17: Add Codecov Integration
Labels: ci-cd, coverage, priority: medium
Description:
Integrate Codecov for test coverage reporting and tracking.
Tasks:
- Sign up for Codecov account
- Add CODECOV_TOKEN to GitHub secrets
- Update workflow to upload coverage
- Add coverage badge to README
- Set coverage thresholds
- Configure Codecov settings
Acceptance Criteria:
- Coverage uploads to Codecov on CI runs
- Coverage badge displays in README
- Coverage tracked over time
- Pull requests show coverage diff
Issue #18: Configure Netlify Build Settings for Tests
Labels: ci-cd, deployment, priority: medium
Description:
Update Netlify configuration to run tests before deployment.
Tasks:
- Update `netlify.toml` build command
- Add test command to build pipeline
- Configure environment variables
- Set up deploy previews with tests
- Add Lighthouse plugin for performance
Configuration:

```toml
[build]
  command = "npm run build && npm run test"
  publish = "dist"

[[plugins]]
  package = "@netlify/plugin-lighthouse"
```

Acceptance Criteria:
- Tests run before each deployment
- Failed tests block deployment
- Environment variables set correctly
- Deploy previews include test results
Milestone 5: Documentation and Polish (Week 8)
Issue #19: Document Testing Patterns
Labels: documentation, priority: medium
Description:
Create comprehensive documentation for the testing infrastructure.
Tasks:
- Create TESTING.md in project root
- Document how to run tests
- Document how to write new tests
- Add testing guidelines
- Document mocking patterns
- Add troubleshooting guide
- Include example test patterns
File Location: TESTING.md (root)
Acceptance Criteria:
- New contributors can understand testing setup
- All test commands documented
- Common patterns explained
- Troubleshooting section complete
Issue #20: Add Test Coverage Badge to README
Labels: documentation, priority: low
Description:
Add visual indicators of test status and coverage to README.
Tasks:
- Add GitHub Actions status badge
- Add Codecov coverage badge
- Add Playwright badge
- Update README with testing section
- Link to testing documentation
Example Badges:

```markdown
[](https://github.com/Nerajno/portfolio_v3/actions)
[](https://codecov.io/gh/Nerajno/portfolio_v3)
```

Acceptance Criteria:
- Badges display correctly
- Badges link to proper pages
- Testing section in README
- Instructions for running tests
Issue #21: Create Test Data Fixtures
Labels: testing, infrastructure, priority: low
Description:
Create reusable test data fixtures for consistent testing.
Tasks:
- Create `e2e/fixtures/` directory
- Create test blog post data
- Create test portfolio project data
- Create test contact form data
- Export fixtures for reuse
- Document fixture usage
File Locations:
- `e2e/fixtures/blog-posts.ts`
- `e2e/fixtures/portfolio-items.ts`
- `e2e/fixtures/contact-form.ts`
Acceptance Criteria:
- Fixtures available for all tests
- Consistent test data across tests
- Easy to update test data
- Well-documented
Milestone 6: Advanced Testing (Week 9-10)
Issue #22: Add Visual Regression Testing
Labels: testing, e2e, visual, priority: low
Description:
Implement visual regression testing to catch UI changes.
Tasks:
- Configure Playwright screenshot comparison
- Take baseline screenshots
- Add visual tests for key pages
- Set up screenshot diffing
- Configure pixel tolerance
- Add to CI pipeline
Acceptance Criteria:
- Baseline screenshots captured
- Visual changes detected
- Acceptable threshold configured
- CI runs visual tests
Issue #23: Add Performance Testing
Labels: testing, performance, priority: low
Description:
Add performance benchmarks and testing.
Tasks:
- Test page load times
- Test Time to Interactive (TTI)
- Test First Contentful Paint (FCP)
- Test Largest Contentful Paint (LCP)
- Set performance budgets
- Add Lighthouse CI
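One way to wire the Lighthouse CI budgets mentioned above (a sketch; the URL, preview command, and 0.9 thresholds are assumptions to adjust):

```json
{
  "ci": {
    "collect": {
      "startServerCommand": "npm run preview",
      "url": ["http://localhost:4321/"]
    },
    "assert": {
      "assertions": {
        "categories:performance": ["error", { "minScore": 0.9 }],
        "categories:accessibility": ["error", { "minScore": 0.9 }]
      }
    }
  }
}
```

Saved as `lighthouserc.json`, this makes `npx lhci autorun` fail the CI job when a category score drops below the configured minimum.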
Acceptance Criteria:
- Performance metrics tracked
- Performance budgets enforced
- Lighthouse scores > 90
- Performance regressions caught
Issue #24: Add Cross-Browser Screenshot Tests
Labels: testing, e2e, cross-browser, priority: low
Description:
Verify visual consistency across different browsers.
Tasks:
- Take screenshots on Chromium, Firefox, WebKit
- Compare layouts across browsers
- Test mobile vs desktop
- Document browser differences
- Fix critical differences
Acceptance Criteria:
- Screenshots from all browsers
- Critical differences documented
- Major layout issues resolved
- Tests run on all browsers in CI
Milestone 7: Maintenance and Optimization (Ongoing)
Issue #25: Optimize Test Execution Time
Labels: optimization, performance, priority: low
Description:
Improve test suite performance and execution time.
Tasks:
- Identify slow tests
- Parallelize test execution
- Optimize test setup/teardown
- Use test filtering effectively
- Cache dependencies in CI
- Reduce E2E test overhead
Acceptance Criteria:
- Unit tests run < 30 seconds
- E2E tests run < 5 minutes
- CI pipeline optimized
- Local test experience improved
Issue #26: Set Up Test Coverage Monitoring
Labels: monitoring, coverage, priority: medium
Description:
Implement ongoing test coverage monitoring and goals.
Tasks:
- Set up coverage trends tracking
- Configure coverage gates (min 60%)
- Add coverage reports to PRs
- Create coverage improvement plan
- Identify untested code areas
Acceptance Criteria:
- Coverage tracked over time
- PRs show coverage changes
- Coverage gates enforced
- Regular coverage reviews
Issue #27: Add Pre-commit Hooks for Tests
Labels: automation, git, priority: low
Description:
Run tests automatically before commits to catch issues early.
Tasks:
- Install husky
- Configure pre-commit hook
- Run unit tests on commit
- Run linter on commit
- Configure lint-staged
- Document hook setup
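The wiring could look roughly like this (husky v9 syntax; the lint-staged setup is an assumption to adapt to the project's linters):

```shell
npm install -D husky lint-staged
npx husky init
echo "npx lint-staged" > .husky/pre-commit
```

A `lint-staged` section in package.json can then map changed files to commands, e.g. running `vitest related --run` so only tests touching the staged files execute.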
Acceptance Criteria:
- Tests run before commit
- Failing tests prevent commit
- Only changed files tested
- Hook documented
Summary
Total Issues: 27
Estimated Timeline: 10-12 weeks
Priority Breakdown:
- High Priority: 12 issues
- Medium Priority: 8 issues
- Low Priority: 7 issues
Labels to Create:
`setup`, `testing`, `unit-test`, `e2e`, `ci-cd`, `documentation`, `configuration`, `accessibility`, `seo`, `performance`, `optimization`, `priority: high`, `priority: medium`, `priority: low`
Suggested Milestones:
- Testing Infrastructure Setup (Issues #1-6)
- Core Unit Tests (Issues #7-10)
- E2E Critical Paths (Issues #11-15)
- CI/CD Integration (Issues #16-18)
- Documentation and Polish (Issues #19-21)
- Advanced Testing (Issues #22-24)
- Maintenance and Optimization (Issues #25-27)