
Commit 9008f14

Authored by djm81 (Dominikus Nold) and co-authors
chore: merge dev branch to main (#4)
* chore: remove mcp.json from version control
* fix: correct test paths for new .specfact structure
  - Fix test_compare_with_smart_defaults: remove duplicate mkdir for plans directory
  - Fix test_compare_output_to_specfact_reports: move auto-derived plan to .specfact/plans/ instead of .specfact/reports/brownfield/
  - Fix test_team_collaboration_workflow: use correct pattern 'report-*.md' for comparison reports and check auto-derived plans in .specfact/plans/
* fix: update .gitignore
* fix: correct pre-merge check to only flag root-level temporary files
  - Fix regex to only match temporary files at project root (not in tests/)
  - Patterns match .gitignore: /test_*.py, /debug_*.py, /trigger_*.py, /temp_*.py
  - Use grep -v '/' to ensure files are at root level (no subdirectories)
  - Also check for analysis artifacts at root
  - Exclude legitimate test files in the tests/ directory
* fix: add conftest.py to fix tools imports in tests
  - Create tests/conftest.py to add the project root to sys.path
  - Fixes ModuleNotFoundError for the 'tools' module in Python 3.11 tests
  - Update test_smart_test_coverage.py to use a consistent sys.path approach
  - Required because pytest --import-mode=importlib doesn't respect pythonpath during test collection
* fix: install specfact-cli from local source in specfact-gate workflow
  - Change from 'pip install specfact-cli' (PyPI) to 'pip install -e .' (local source)
  - Required because the package is not yet published to PyPI
  - Matches the approach used in the pr-orchestrator.yml workflow
  - Install hatch first to ensure proper environment setup
* refactor: combine hatch installation and CLI setup steps
  - Combine the 'Install dependencies' and 'Install SpecFact CLI from source' steps
  - Ensures hatch is available before use
  - Matches the pattern used in the pr-orchestrator.yml cli-validation job
  - More efficient and clearer workflow structure
* fix: make repro main command the default callback
  - Replace @app.command() with @app.callback(invoke_without_command=True)
  - Allows 'specfact repro --verbose --budget 90' without requiring a subcommand
  - Fixes workflow error: 'No such option: --verbose'
  - main() now runs when repro is called without a subcommand
* fix: remove invalid -k filter from Python 3.11 compatibility tests
  - Remove the -k 'contract' filter that was deselecting all tests (exit code 5)
  - Run unit and integration tests instead for the Python 3.11 compatibility check
  - Skip E2E tests to keep the compatibility check fast
  - Tests are advisory (don't fail the build) to allow gradual compatibility work
* fix: sync __version__ with pyproject.toml (0.4.0)
  - Update __version__ in __init__.py to match the pyproject.toml version
  - Ensures version consistency across package metadata
* fix: exclude tests, docs, tools from PyPI package
  - Add [tool.hatch.build.targets.sdist] configuration
  - Include only essential files: src/, README.md, LICENSE.md, pyproject.toml
  - Exclude development files: tests/, tools/, docs/, .github/, .cursor/, contracts/, reports/, etc.
  - Ensure a clean PyPI package with only production code
* docs: fix uvx command syntax in all documentation
  - Update all occurrences of 'uvx specfact' to 'uvx --from specfact-cli specfact'
  - Fixes an issue where uvx couldn't find the package by command name alone
  - Package name is 'specfact-cli'; command name is 'specfact'
  - Updated in: README.md, docs/README.md, docs/getting-started/installation.md, docs/guides/competitive-analysis.md, AGENTS.md
* fix: Contract test directory handling and GitHub Pages legal files (#3)
* fix: handle directories in contract test file scanning
  - Add is_file() check in _get_modified_files() to skip directories
  - Add IsADirectoryError handling in _get_file_hash() and _compute_file_hash()
  - Fix contract test error when scanning the resources directory
  - Ensure only Python files are processed for contract validation
* fix: include LICENSE.md and TRADEMARKS.md in GitHub Pages
  - Copy LICENSE.md and TRADEMARKS.md to docs/ before building
  - Add root files to the workflow paths trigger
  - Update docs/index.md to use relative links for LICENSE and TRADEMARKS
  - Ensure legal information is included in the published documentation
* feat: enable PR orchestrator workflow for dev branch
  - Add the dev branch to pull_request and push triggers
  - Ensure CI/CD runs on PRs to both main and dev branches
  - Maintains the same path ignore rules for both branches
* feat: enable specfact-gate workflow for dev branch
  - Add the dev branch to pull_request and push triggers
  - Ensure contract validation runs on PRs to both main and dev branches
* fix: replace percent format with f-string in plan.py
  - Fix UP031 ruff error by using an f-string instead of percent format
  - Update prompt text to use modern Python string formatting
* fix: resolve import sorting conflict between isort and ruff
  - Remove isort from the format and lint scripts to avoid conflicts
  - Configure ruff's isort to match the black profile (multi_line_output=3, combine_as_imports)
  - Use ruff for both import sorting and formatting (more reliable and modern)
  - Fix I001 import sorting errors in plan.py
  - This resolves the conflict where format and lint produced different results because isort and ruff had different import sorting configurations
* fix: use hatch run in GitHub workflow to ensure tools are available
  - Change specfact repro to hatch run specfact repro in specfact-gate.yml
  - Ensures all tools (semgrep, basedpyright, ruff, crosshair) are available in PATH
  - Fix I001 import sorting in plan.py
* fix: replace try-except-pass with contextlib.suppress in logger_setup
  - Fix SIM105 ruff error by using contextlib.suppress(Exception)
  - Replace nested try-except-pass blocks with contextlib.suppress
* fix: exclude logger_setup.py from CrossHair analysis
  - Exclude logger_setup.py from CrossHair due to a known signature analysis bug
  - CrossHair has issues analyzing functions with *args/**kwargs patterns and decorators
  - Contract exploration remains advisory; this prevents false failures
* fix: resolve ruff errors and CrossHair syntax issue
  - Fix C414: remove unnecessary list() call in sorted()
  - Fix B007: rename unused loop variable story_idx to _story_idx
  - Fix CrossHair invocation: CrossHair doesn't support an --exclude flag, so common/ is excluded by only including the other directories
* fix: use unpacking instead of list concatenation for CrossHair targets
  - Fix RUF005: use unpacking (*crosshair_targets) instead of list concatenation
* fix: resolve RET504 and SIM102 ruff errors
  - Fix RET504: remove unnecessary assignment before return in feature_keys.py
  - Fix SIM102: combine nested if statements into a single if in fsm.py
* fix: make CrossHair failures non-blocking
  - Treat CrossHair failures as warnings (advisory only); contract exploration is advisory, not blocking
  - CrossHair has known issues analyzing certain function signatures with decorators
  - Only count non-CrossHair failures for exit code determination
* fix: ruff check (three follow-up lint passes)

Co-authored-by: Dominikus Nold <[email protected]>

* feat: dynamic CrossHair detection, GitHub Action integration, and enforcement report enhancements (v0.4.2) (#5)
  - Replace the hard-coded skip list with dynamic signature issue detection
  - Add comprehensive metadata to enforcement reports (plan, budget, config)
  - Add structured findings extraction from tool output
  - Add auto-fix support for Semgrep via the --fix flag
  - Add a GitHub Action workflow with PR annotations and comments
  - Add a GitHub annotations utility with contract-first validation
  - Add a comprehensive test suite for the new features
  - Sync versions to 0.4.2 across all files
  - Fixes: CrossHair signature analysis limitations blocking CI/CD
* fix: handle CrossHair signature analysis limitations in GitHub annotations
  - Detect signature analysis limitations in create_annotations_from_report
  - Treat signature issues as non-blocking (notice level, not error)
  - Filter signature issues from failed checks in PR comments
  - Add a dedicated section for signature analysis limitations in PR comments
  - Prevents workflow failures for non-blocking CrossHair signature issues
* fix: escape GitHub Actions syntax in Jinja2 template
  - Use {% raw %} blocks to escape GitHub Actions expressions
  - Fixes Jinja2 UndefinedError for the 'steps' variable
  - All 5 previously failing tests now pass: test_import_speckit_via_cli_command, test_import_speckit_generates_github_action, test_import_speckit_with_full_workflow, test_generate_from_template, test_generate_github_action
* fix: handle CrossHair signature issues in ReproChecker and fix ruff whitespace
  - Detect CrossHair signature analysis limitations in ReproChecker.run_check()
  - Mark signature issues as skipped instead of failed
  - Fix whitespace issues in test_directory_structure_workflow.py (W293)
  - Prevents local specfact repro failures on CrossHair signature limitations
* chore: remove duplicate specfact-gate.yml workflow
  - specfact.yml is the new comprehensive workflow with PR annotations
  - specfact-gate.yml was the old workflow with the same triggers
  - Removing it prevents the workflow from running twice on each push
* fix: show all ruff errors by using --output-format=full
  - Add the --output-format=full flag to the ruff check command
  - Ensures all linting errors are reported instead of a truncated list
* fix: remove whitespace from blank lines in test_analyze_command.py
  - Fix 20 W293 whitespace errors in dedent() strings
* fix: remove whitespace from blank lines in test files
  - Fix W293 whitespace errors in tests/integration/analyzers/test_code_analyzer_integration.py, tests/unit/analyzers/test_code_analyzer.py, tests/unit/tools/test_smart_test_coverage.py, and tests/unit/utils/test_ide_setup.py
  - All 68 whitespace errors fixed; the remaining 2 SIM105 suggestions are style recommendations, not errors
* fix: replace try-except-pass with contextlib.suppress for SIM105
  - Replace the try-except-pass pattern with contextlib.suppress(SystemExit)
  - Fixes the 2 SIM105 errors in test_smart_test_coverage.py

Co-authored-by: Dominikus Nold <[email protected]>
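Several of the ruff fixes above (SIM105, RUF005, UP031) follow the same before/after pattern. A minimal sketch with illustrative values, not the project's actual code:

```python
import contextlib

# SIM105: replace try/except/pass with contextlib.suppress.
# Before:
#   try:
#       raise SystemExit(1)
#   except SystemExit:
#       pass
with contextlib.suppress(SystemExit):
    raise SystemExit(1)

# RUF005: prefer unpacking over list concatenation.
# Before: targets = base_targets + crosshair_targets
base_targets = ["src/cli", "src/core"]
crosshair_targets = ["src/models"]
targets = [*base_targets, *crosshair_targets]

# UP031: prefer f-strings over percent formatting.
# Before: prompt = "Release version %s?" % version
version = "0.4.2"
prompt = f"Release version {version}?"

print(targets)
print(prompt)
```

Each rewrite is behavior-preserving; ruff flags the older forms purely as style and readability issues.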
1 parent: 507ad05 · commit: 9008f14

File tree: 93 files changed (+3767, −493 lines)


.cursor/rules/testing-and-build-guide.mdc

Lines changed: 48 additions & 5 deletions
````diff
@@ -215,21 +215,64 @@ For a quick list of all available script options, run:
 bash ./scripts/rebuild_containers.sh --help
 ```
+
+## Branch Protection & Workflow
+
+### Branch Protection Rules
+
+This repository has branch protection enabled for `dev` and `main` branches:
+
+- **No direct commits**: All changes must be made via Pull Requests
+- **Required PRs**: Create feature/bugfix/hotfix branches and submit PRs
+- **CI/CD gates**: All PRs must pass CI/CD checks before merging
+- **Approval required**: PRs may require approval before merging (depending on repository settings)
+
+### Development Workflow
+
+1. **Create a feature branch**:
+   ```bash
+   git checkout -b feature/your-feature-name
+   # or
+   git checkout -b bugfix/your-bugfix-name
+   # or
+   git checkout -b hotfix/your-hotfix-name
+   ```
+
+2. **Make your changes** and test locally:
+   ```bash
+   hatch run format
+   hatch run lint
+   hatch run contract-test
+   hatch test --cover -v
+   ```
+
+3. **Commit and push**:
+   ```bash
+   git add .
+   git commit -m "feat: your feature description"
+   git push origin feature/your-feature-name
+   ```
+
+4. **Create a Pull Request** to `dev` or `main` via GitHub
+
+5. **Wait for CI/CD checks** to pass before merging
+
 ## Release Guidelines
 
-A "release" in this project corresponds to a set of versioned Docker images pushed to the GitHub Container Registry. This process is automated.
+A "release" in this project corresponds to a set of versioned Docker images and PyPI packages. This process is automated.
 
 ### Automated Release Workflow
 
 Our release process is handled automatically by GitHub Actions. Here is how it works:
 
-1. **Trigger**: A push or pull request to the `main` or `dev` branch triggers the `Tests` workflow.
-2. **Testing**: The workflow runs the complete test suite using `hatch run smart-test`.
-3. **Build & Push**: If the tests pass, the `Build and Push Docker Images` workflow is automatically triggered. This workflow:
+1. **Trigger**: A push to the `main` branch (after successful PR merge) triggers the release workflow.
+2. **Testing**: The workflow runs the complete test suite using `hatch run contract-test`.
+3. **Package Validation**: The package is built and validated.
+4. **PyPI Publication**: If the version in `pyproject.toml` is newer than the latest PyPI version, the package is automatically published to PyPI.
+5. **Build & Push**: Docker images are built and pushed to GHCR. This workflow:
    - Builds all service images.
    - Tags the images with two tags:
      - A unique, immutable tag (the Git commit SHA).
-     - A floating tag (`latest` for the `main` branch, `dev` for the `dev` branch).
+     - A floating tag (`latest` for the `main` branch).
    - Pushes both tags to the GitHub Container Registry (GHCR).
 
 ```mermaid
````

.cursorrules

Lines changed: 7 additions & 0 deletions
```diff
@@ -5,13 +5,20 @@
 - When starting a new chat session, capture the current timestamp from the client system using the `run_terminal_cmd` tool with `date "+%Y-%m-%d %H:%M:%S %z"` to ensure accurate timestamps are used in logs, commits, and other time-sensitive operations.
 - When starting a new chat session, get familiar with the build and test guide (refer to `.cursor/rules/testing-and-build-guide.mdc`).
 - When starting a new task, first check the project overview and current status in `README.md` and `AGENTS.md`.
+- **Branch Protection**: This repository has branch protection enabled for `dev` and `main` branches. All changes must be made via Pull Requests:
+  - Create a feature branch: `git checkout -b feature/your-feature-name`
+  - Create a bugfix branch: `git checkout -b bugfix/your-bugfix-name`
+  - Create a hotfix branch: `git checkout -b hotfix/your-hotfix-name`
+  - Push your branch and create a PR to `dev` or `main`
+  - Direct commits to `dev` or `main` are not allowed
 - After any code changes, follow these steps in order:
   1. Apply linting and formatting to ensure code quality: `hatch run format`
   2. Type checking: `hatch run type-check` (basedpyright)
   3. **Contract-first approach**: Run `hatch run contract-test` for contract validation
   4. Run full test suite: `hatch test --cover -v`
   5. Verify all tests pass and contracts are satisfied
   6. Fix any issues and repeat steps until all tests pass
+- **Version Management**: When updating the version in `pyproject.toml`, ensure it's newer than the latest PyPI version. The CI/CD pipeline will automatically publish to PyPI only if the new version is greater than the published version.
 - **Contract-first**: All public APIs must have `@icontract` decorators and `@beartype` type checking
 - **CLI focus**: Commands should follow typer patterns with rich console output
 - **Data validation**: Use Pydantic models for all data structures
```
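The version-gating rule above can be sketched with `packaging.version`, the same comparison the publish script relies on (assuming the `packaging` library is installed; the version strings here are illustrative):

```python
from packaging import version


def should_publish(local: str, published: str) -> bool:
    """Publish only when the local version is strictly newer than PyPI's."""
    return version.parse(local) > version.parse(published)


# A 0.4.2 bump over a published 0.4.0 triggers publication...
print(should_publish("0.4.2", "0.4.0"))
# ...while re-running CI on an unchanged version skips it.
print(should_publish("0.4.0", "0.4.0"))
# Pre-releases sort below their final release under PEP 440.
print(should_publish("0.4.2rc1", "0.4.2"))
```

Using `packaging.version` rather than plain string comparison matters here: `"0.10.0" > "0.9.0"` is False as strings but True as PEP 440 versions.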

.github/workflows/github-pages.yml

Lines changed: 76 additions & 0 deletions
New file (all 76 lines added):

```yaml
name: GitHub Pages

on:
  push:
    branches:
      - main
    paths:
      - 'docs/**'
      - '.github/workflows/github-pages.yml'
      - '_config.yml'
      - 'docs/Gemfile'
      - 'docs/index.md'
      - 'LICENSE.md'
      - 'TRADEMARKS.md'
  workflow_dispatch:

permissions:
  contents: read
  pages: write
  id-token: write

concurrency:
  group: "pages"
  cancel-in-progress: false

jobs:
  build:
    name: Build GitHub Pages
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Setup Ruby (for Jekyll)
        uses: ruby/setup-ruby@v1
        with:
          ruby-version: '3.1'
          bundler-cache: true
          working-directory: ./docs

      - name: Install Jekyll dependencies
        run: |
          cd docs
          bundle install

      - name: Copy root files to docs
        run: |
          # Copy important root files to docs directory for inclusion in GitHub Pages
          cp LICENSE.md docs/LICENSE.md
          cp TRADEMARKS.md docs/TRADEMARKS.md

      - name: Build with Jekyll
        run: |
          jekyll build --source docs --destination _site
        env:
          JEKYLL_ENV: production

      - name: Setup Pages
        uses: actions/configure-pages@v4

      - name: Upload artifact
        uses: actions/upload-pages-artifact@v3
        with:
          path: _site

  deploy:
    name: Deploy to GitHub Pages
    runs-on: ubuntu-latest
    needs: build
    steps:
      - name: Deploy to GitHub Pages
        id: deployment
        uses: actions/deploy-pages@v4
```

.github/workflows/pr-orchestrator.yml

Lines changed: 58 additions & 2 deletions
```diff
@@ -4,13 +4,13 @@ name: PR Orchestrator - SpecFact CLI
 
 on:
   pull_request:
-    branches: [main]
+    branches: [main, dev]
     paths-ignore:
       - "docs/**"
       - "**.md"
       - "**.mdc"
   push:
-    branches: [main]
+    branches: [main, dev]
     paths-ignore:
       - "docs/**"
       - "**.md"
@@ -288,6 +288,62 @@ jobs:
           path: dist/
           if-no-files-found: error
 
+  publish-pypi:
+    name: Publish to PyPI
+    runs-on: ubuntu-latest
+    needs: [package-validation]
+    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
+    permissions:
+      contents: read
+    steps:
+      - name: Checkout
+        uses: actions/checkout@v4
+
+      - name: Set up Python 3.12
+        uses: actions/setup-python@v5
+        with:
+          python-version: "3.12"
+          cache: "pip"
+          cache-dependency-path: |
+            pyproject.toml
+
+      - name: Install dependencies
+        run: |
+          python -m pip install --upgrade pip
+          pip install build twine packaging
+          # Note: tomllib is part of the Python 3.11+ standard library
+          # This project requires Python >= 3.11, so no additional TOML library is needed
+
+      - name: Make script executable
+        run: chmod +x .github/workflows/scripts/check-and-publish-pypi.sh
+
+      - name: Check version and publish to PyPI
+        id: publish
+        env:
+          PYPI_API_TOKEN: ${{ secrets.PYPI_API_TOKEN }}
+        run: |
+          ./.github/workflows/scripts/check-and-publish-pypi.sh
+
+      - name: Summary
+        if: always()
+        run: |
+          PUBLISHED="${{ steps.publish.outputs.published }}"
+          VERSION="${{ steps.publish.outputs.version }}"
+
+          {
+            echo "## PyPI Publication Summary"
+            echo "| Parameter | Value |"
+            echo "|-----------|--------|"
+            echo "| Version | $VERSION |"
+            echo "| Published | $PUBLISHED |"
+
+            if [ "$PUBLISHED" = "true" ]; then
+              echo "| Status | ✅ Published to PyPI |"
+            else
+              echo "| Status | ⏭️ Skipped (version not newer) |"
+            fi
+          } >> "$GITHUB_STEP_SUMMARY"
+
   build-and-push-container:
     name: Build and Push Container
     runs-on: ubuntu-latest
```
.github/workflows/scripts/check-and-publish-pypi.sh

Lines changed: 115 additions & 0 deletions

New file (all 115 lines added):

```bash
#!/usr/bin/env bash
set -euo pipefail

# check-and-publish-pypi.sh
# Extracts version from pyproject.toml, compares with latest PyPI version,
# and publishes if the new version is greater.
# Usage: check-and-publish-pypi.sh

echo "🔍 Checking PyPI version..."

# Extract version from pyproject.toml
# Note: tomllib is part of the Python 3.11+ standard library.
# This project requires Python >= 3.11, so tomllib is always available.
LOCAL_VERSION=$(python << 'PYTHON_SCRIPT'
import sys
import tomllib

try:
    with open('pyproject.toml', 'rb') as f:
        data = tomllib.load(f)
    print(data['project']['version'])
except FileNotFoundError:
    print('Error: pyproject.toml not found', file=sys.stderr)
    sys.exit(1)
except KeyError as e:
    print(f'Error: Could not find version in pyproject.toml: {e}', file=sys.stderr)
    sys.exit(1)
PYTHON_SCRIPT
)

echo "📦 Local version: $LOCAL_VERSION"

# Get latest PyPI version
echo "🌐 Querying PyPI for latest version..."
PYPI_VERSION=$(python << 'PYTHON_SCRIPT'
import json
import sys
import urllib.error
import urllib.request

try:
    url = 'https://pypi.org/pypi/specfact-cli/json'
    with urllib.request.urlopen(url, timeout=10) as response:
        data = json.loads(response.read())
    print(data['info']['version'])
except urllib.error.HTTPError as e:
    if e.code == 404:
        # Package not yet on PyPI: treat as first release
        print('0.0.0')
    else:
        print(f'Error: HTTP {e.code}', file=sys.stderr)
        sys.exit(1)
except Exception as e:
    print(f'Error querying PyPI: {e}', file=sys.stderr)
    sys.exit(1)
PYTHON_SCRIPT
)

if [ -z "$PYPI_VERSION" ]; then
  echo "⚠️ Could not determine PyPI version, assuming first release"
  PYPI_VERSION="0.0.0"
fi

echo "📦 Latest PyPI version: $PYPI_VERSION"

# Compare versions using Python packaging (heredoc is unquoted so the
# shell substitutes $LOCAL_VERSION and $PYPI_VERSION into the script)
SHOULD_PUBLISH=$(python << PYTHON_SCRIPT
import sys

from packaging import version

local_ver = version.parse('$LOCAL_VERSION')
pypi_ver = version.parse('$PYPI_VERSION')

if local_ver > pypi_ver:
    print('true')
else:
    print('false')
    print(f'⚠️ Local version ({local_ver}) is not greater than PyPI version ({pypi_ver})', file=sys.stderr)
    print('Skipping PyPI publication.', file=sys.stderr)
PYTHON_SCRIPT
)

if [ "$SHOULD_PUBLISH" = "true" ]; then
  echo "✅ Version $LOCAL_VERSION is newer than PyPI version $PYPI_VERSION"
  echo "🚀 Publishing to PyPI..."

  # Build package
  echo "📦 Building package..."
  python -m pip install --upgrade build twine
  python -m build

  # Validate package
  echo "🔍 Validating package..."
  twine check dist/*

  # Publish to PyPI
  echo "📤 Publishing to PyPI..."
  if [ -z "${PYPI_API_TOKEN:-}" ]; then
    echo "❌ Error: PYPI_API_TOKEN secret is not set"
    exit 1
  fi
  twine upload dist/* \
    --username __token__ \
    --password "${PYPI_API_TOKEN}" \
    --non-interactive \
    --skip-existing

  echo "✅ Successfully published version $LOCAL_VERSION to PyPI"

  # Set outputs for the workflow
  echo "published=true" >> "$GITHUB_OUTPUT"
  echo "version=$LOCAL_VERSION" >> "$GITHUB_OUTPUT"
else
  echo "⏭️ Skipping PyPI publication (version not newer)"
  echo "published=false" >> "$GITHUB_OUTPUT"
  echo "version=$LOCAL_VERSION" >> "$GITHUB_OUTPUT"
fi
```
