diff --git a/projects/unit3/build-mcp-server/solution/README.md b/projects/unit3/build-mcp-server/solution/README.md new file mode 100644 index 0000000..6c90cb0 --- /dev/null +++ b/projects/unit3/build-mcp-server/solution/README.md @@ -0,0 +1,73 @@ +# Module 1: Basic MCP Server with PR Template Tools + +This module implements a basic MCP server that provides tools for analyzing git changes and suggesting appropriate PR templates. + +## Setup + +### 1. Install uv + +Follow the official installation instructions at: https://docs.astral.sh/uv/getting-started/installation/ + +### 2. Install dependencies + +```bash +# Install all dependencies +uv sync + +# Or install with dev dependencies for testing +uv sync --all-extras +``` + +### 3. Configure the MCP Server + +Add the server to Claude Code: + +```bash +# Add the MCP server +claude mcp add pr-agent -- uv --directory /absolute/path/to/module1/solution run server.py + +# Verify it's configured +claude mcp list +``` + +## Tools Available + +1. **analyze_file_changes** - Get the full diff and list of changed files +2. **get_pr_templates** - List available PR templates with their content +3. **suggest_template** - Let Claude analyze changes and suggest a template + +## Usage Example + +1. Make some changes in a git repository +2. Ask Claude: "Can you analyze my changes and suggest a PR template?" +3. Claude will: + - Use `analyze_file_changes` to see what changed + - Analyze the diff to understand the nature of changes + - Use `suggest_template` to recommend the most appropriate template + - Help you fill out the template based on the specific changes + +## How It Works + +Unlike traditional template systems that rely on file extensions or simple patterns, this MCP server provides Claude with raw git data and lets Claude's intelligence determine: +- What type of change is being made (bug fix, feature, refactor, etc.) 
+- Which template is most appropriate +- How to fill out the template based on the actual code changes + +This approach leverages Claude's understanding of code and context rather than rigid rules. + +## Running Tests + +```bash +# Run the validation script +uv run python validate_solution.py + +# Run unit tests +uv run pytest test_server.py -v +``` + +## Running the Server Directly + +```bash +# Start the MCP server +uv run server.py +``` \ No newline at end of file diff --git a/projects/unit3/build-mcp-server/solution/manual_test.md b/projects/unit3/build-mcp-server/solution/manual_test.md new file mode 100644 index 0000000..0dcfa4c --- /dev/null +++ b/projects/unit3/build-mcp-server/solution/manual_test.md @@ -0,0 +1,115 @@ +# Manual Testing Guide for Module 1 Solution + +## Prerequisites + +1. Ensure you're in a git repository with some changes +2. Install uv following instructions at: https://docs.astral.sh/uv/getting-started/installation/ +3. Install dependencies: + ```bash + uv sync --all-extras + ``` + +## Test 1: Validate the Solution + +Run the automated validation script: +```bash +uv run python validate_solution.py +``` + +This will check: +- Git environment +- Python imports +- Server creation +- Tool registration +- Tool execution +- Template creation + +## Test 2: Run Unit Tests + +```bash +uv run pytest test_server.py -v +``` + +## Test 3: Test with Claude Code + +1. **Configure MCP Server** + + Add the server to Claude Code: + ```bash + # Add the MCP server + claude mcp add pr-agent -- uv --directory /absolute/path/to/module1/solution run server.py + + # Verify it's configured + claude mcp list + ``` + +2. **Restart Claude Code** to pick up the new server + +3. **Make Some Git Changes** + + In any git repository: + ```bash + echo "test change" >> README.md + git add README.md + ``` + +4. **Test with Claude** + + Ask Claude: + - "Can you analyze my git changes?" + - "What PR templates are available?" 
+ - "Based on my changes, which PR template should I use?" + +## Test 4: Direct Server Testing + +You can also test the server directly: + +```python +import asyncio +from server import analyze_file_changes, get_pr_templates, suggest_template + +async def test(): + # Test analyze_file_changes + changes = await analyze_file_changes("main", True) + print("Changes:", changes[:200] + "...") + + # Test get_pr_templates + templates = await get_pr_templates() + print("Templates available:", len(json.loads(templates))) + + # Test suggest_template + suggestion = await suggest_template( + "Fixed authentication bug", + "bug" + ) + print("Suggestion:", json.loads(suggestion)["recommended_template"]["type"]) + +asyncio.run(test()) +``` + +## Expected Behavior + +1. **analyze_file_changes** should return JSON with: + - base_branch + - files_changed + - statistics + - commits + - diff (if include_diff=True) + +2. **get_pr_templates** should return JSON array of templates with: + - filename + - type + - content + +3. 
**suggest_template** should return JSON with: + - recommended_template + - reasoning + - template_content + - usage_hint + +## Troubleshooting + +- **"Git not found"**: Ensure you're in a git repository +- **Import errors**: Check virtual environment is activated +- **MCP connection failed**: Verify the path in the claude mcp add command is absolute +- **No tools showing**: Restart Claude Code after adding the server \ No newline at end of file diff --git a/projects/unit3/build-mcp-server/solution/pyproject.toml b/projects/unit3/build-mcp-server/solution/pyproject.toml new file mode 100644 index 0000000..c2c9f7e --- /dev/null +++ b/projects/unit3/build-mcp-server/solution/pyproject.toml @@ -0,0 +1,28 @@ +[project] +name = "pr-agent" +version = "1.0.0" +description = "MCP server for PR template suggestions" +readme = "README.md" +requires-python = ">=3.10" +dependencies = [ + "mcp[cli]>=1.0.0", +] + +[project.optional-dependencies] +dev = [ + "pytest>=8.3.0", + "pytest-asyncio>=0.21.0", +] + +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" + +[tool.hatch.build.targets.wheel] +packages = ["."] + +[tool.uv] +dev-dependencies = [ + "pytest>=8.3.0", + "pytest-asyncio>=0.21.0", +] \ No newline at end of file diff --git a/projects/unit3/build-mcp-server/solution/server.py b/projects/unit3/build-mcp-server/solution/server.py new file mode 100644 index 0000000..ceb8049 --- /dev/null +++ b/projects/unit3/build-mcp-server/solution/server.py @@ -0,0 +1,222 @@ +#!/usr/bin/env python3 +""" +Module 1: Basic MCP Server with PR Template Tools +A minimal MCP server that provides tools for analyzing file changes and suggesting PR templates. 
+""" + +import json +import os +import subprocess +from typing import Dict, List, Any, Optional +from pathlib import Path + +from mcp.server.fastmcp import FastMCP + +# Initialize the FastMCP server +mcp = FastMCP("pr-agent") + +# PR template directory (shared between starter and solution) +TEMPLATES_DIR = Path(__file__).parent.parent.parent / "templates" + + +@mcp.tool() +async def analyze_file_changes( + base_branch: str = "main", + include_diff: bool = True, + max_diff_lines: int = 500, + working_directory: Optional[str] = None +) -> str: + """Get the full diff and list of changed files in the current git repository. + + Args: + base_branch: Base branch to compare against (default: main) + include_diff: Include the full diff content (default: true) + max_diff_lines: Maximum number of diff lines to include (default: 500) + working_directory: Directory to run git commands in (default: current directory) + """ + try: + # Try to get working directory from roots first + if working_directory is None: + try: + context = mcp.get_context() + roots_result = await context.session.list_roots() + # Get the first root - Claude Code sets this to the CWD + root = roots_result.roots[0] + # FileUrl object has a .path property that gives us the path directly + working_directory = root.uri.path + except Exception as e: + # If we can't get roots, fall back to current directory + pass + + # Use provided working directory or current directory + cwd = working_directory if working_directory else os.getcwd() + + # Debug output + debug_info = { + "provided_working_directory": working_directory, + "actual_cwd": cwd, + "server_process_cwd": os.getcwd(), + "server_file_location": str(Path(__file__).parent), + "roots_check": None + } + + # Add roots debug info + try: + context = mcp.get_context() + roots_result = await context.session.list_roots() + debug_info["roots_check"] = { + "found": True, + "count": len(roots_result.roots), + "roots": [str(root.uri) for root in roots_result.roots] + } 
+ except Exception as e: + debug_info["roots_check"] = { + "found": False, + "error": str(e) + } + + # Get list of changed files + files_result = subprocess.run( + ["git", "diff", "--name-status", f"{base_branch}...HEAD"], + capture_output=True, + text=True, + check=True, + cwd=cwd + ) + + # Get diff statistics + stat_result = subprocess.run( + ["git", "diff", "--stat", f"{base_branch}...HEAD"], + capture_output=True, + text=True, + cwd=cwd + ) + + # Get the actual diff if requested + diff_content = "" + truncated = False + if include_diff: + diff_result = subprocess.run( + ["git", "diff", f"{base_branch}...HEAD"], + capture_output=True, + text=True, + cwd=cwd + ) + diff_lines = diff_result.stdout.split('\n') + + # Check if we need to truncate + if len(diff_lines) > max_diff_lines: + diff_content = '\n'.join(diff_lines[:max_diff_lines]) + diff_content += f"\n\n... Output truncated. Showing {max_diff_lines} of {len(diff_lines)} lines ..." + diff_content += "\n... Use max_diff_lines parameter to see more ..." 
+ truncated = True + else: + diff_content = diff_result.stdout + + # Get commit messages for context + commits_result = subprocess.run( + ["git", "log", "--oneline", f"{base_branch}..HEAD"], + capture_output=True, + text=True, + cwd=cwd + ) + + analysis = { + "base_branch": base_branch, + "files_changed": files_result.stdout, + "statistics": stat_result.stdout, + "commits": commits_result.stdout, + "diff": diff_content if include_diff else "Diff not included (set include_diff=true to see full diff)", + "truncated": truncated, + "total_diff_lines": len(diff_lines) if include_diff else 0, + "_debug": debug_info + } + + return json.dumps(analysis, indent=2) + + except subprocess.CalledProcessError as e: + return json.dumps({"error": f"Git error: {e.stderr}"}) + except Exception as e: + return json.dumps({"error": str(e)}) + + +@mcp.tool() +async def get_pr_templates() -> str: + """List available PR templates with their content.""" + templates = [] + + # Define default templates + default_templates = { + "bug.md": "Bug Fix", + "feature.md": "Feature", + "docs.md": "Documentation", + "refactor.md": "Refactor", + "test.md": "Test", + "performance.md": "Performance", + "security.md": "Security" + } + + for filename, template_type in default_templates.items(): + template_path = TEMPLATES_DIR / filename + + # Read template content + content = template_path.read_text() + + templates.append({ + "filename": filename, + "type": template_type, + "content": content + }) + + return json.dumps(templates, indent=2) + + +@mcp.tool() +async def suggest_template(changes_summary: str, change_type: str) -> str: + """Let Claude analyze the changes and suggest the most appropriate PR template. + + Args: + changes_summary: Your analysis of what the changes do + change_type: The type of change you've identified (bug, feature, docs, refactor, test, etc.) 
+ """ + + # Get available templates + templates_response = await get_pr_templates() + templates = json.loads(templates_response) + + # Map change types to template files + type_mapping = { + "bug": "bug.md", + "fix": "bug.md", + "feature": "feature.md", + "enhancement": "feature.md", + "docs": "docs.md", + "documentation": "docs.md", + "refactor": "refactor.md", + "cleanup": "refactor.md", + "test": "test.md", + "testing": "test.md", + "performance": "performance.md", + "optimization": "performance.md", + "security": "security.md" + } + + # Find matching template + template_file = type_mapping.get(change_type.lower(), "feature.md") + selected_template = next( + (t for t in templates if t["filename"] == template_file), + templates[0] # Default to first template if no match + ) + + suggestion = { + "recommended_template": selected_template, + "reasoning": f"Based on your analysis: '{changes_summary}', this appears to be a {change_type} change.", + "template_content": selected_template["content"], + "usage_hint": "Claude can help you fill out this template based on the specific changes in your PR." 
+ } + + return json.dumps(suggestion, indent=2) + + +if __name__ == "__main__": + mcp.run() \ No newline at end of file diff --git a/projects/unit3/build-mcp-server/solution/test_server.py b/projects/unit3/build-mcp-server/solution/test_server.py new file mode 100644 index 0000000..280167b --- /dev/null +++ b/projects/unit3/build-mcp-server/solution/test_server.py @@ -0,0 +1,216 @@ +#!/usr/bin/env python3 +""" +Unit tests for Module 1: Basic MCP Server +Run these tests to validate your implementation +""" + +import json +import pytest +import asyncio +from pathlib import Path +from unittest.mock import patch, MagicMock + +# Import your implemented functions +try: + from server import ( + mcp, + analyze_file_changes, + get_pr_templates, + suggest_template + ) + IMPORTS_SUCCESSFUL = True +except ImportError as e: + IMPORTS_SUCCESSFUL = False + IMPORT_ERROR = str(e) + + +class TestImplementation: + """Test that the required functions are implemented.""" + + def test_imports(self): + """Test that all required functions can be imported.""" + assert IMPORTS_SUCCESSFUL, f"Failed to import required functions: {IMPORT_ERROR if not IMPORTS_SUCCESSFUL else ''}" + assert mcp is not None, "FastMCP server instance not found" + assert callable(analyze_file_changes), "analyze_file_changes should be a callable function" + assert callable(get_pr_templates), "get_pr_templates should be a callable function" + assert callable(suggest_template), "suggest_template should be a callable function" + + +@pytest.mark.skipif(not IMPORTS_SUCCESSFUL, reason="Imports failed") +class TestAnalyzeFileChanges: + """Test the analyze_file_changes tool.""" + + @pytest.mark.asyncio + async def test_returns_json_string(self): + """Test that analyze_file_changes returns a JSON string.""" + with patch('subprocess.run') as mock_run: + mock_run.return_value = MagicMock(stdout="", stderr="") + + result = await analyze_file_changes() + + assert isinstance(result, str), "Should return a string" + # Should be 
valid JSON + data = json.loads(result) + assert isinstance(data, dict), "Should return a JSON object" + + @pytest.mark.asyncio + async def test_includes_required_fields(self): + """Test that the result includes expected fields.""" + with patch('subprocess.run') as mock_run: + mock_run.return_value = MagicMock(stdout="M\tfile1.py\n", stderr="") + + result = await analyze_file_changes() + data = json.loads(result) + + # For starter code, accept error messages; for full implementation, expect data + is_implemented = not ("error" in data and "Not implemented" in str(data.get("error", ""))) + if is_implemented: + # Check for some expected fields (flexible to allow different implementations) + assert any(key in data for key in ["files_changed", "files", "changes", "diff"]), \ + "Result should include file change information" + else: + # Starter code - just verify it returns something structured + assert isinstance(data, dict), "Should return a JSON object even if not implemented" + + @pytest.mark.asyncio + async def test_output_limiting(self): + """Test that large diffs are properly truncated.""" + with patch('subprocess.run') as mock_run: + # Create a mock diff with many lines + large_diff = "\n".join([f"+ line {i}" for i in range(1000)]) + + # Set up mock responses + mock_run.side_effect = [ + MagicMock(stdout="M\tfile1.py\n", stderr=""), # files changed + MagicMock(stdout="1 file changed, 1000 insertions(+)", stderr=""), # stats + MagicMock(stdout=large_diff, stderr=""), # diff + MagicMock(stdout="abc123 Initial commit", stderr="") # commits + ] + + # Test with default limit (500 lines) + result = await analyze_file_changes(include_diff=True) + data = json.loads(result) + + # Check if it's implemented + if "error" not in data or "Not implemented" not in str(data.get("error", "")): + if "diff" in data and data["diff"] != "Diff not included (set include_diff=true to see full diff)": + diff_lines = data["diff"].split('\n') + # Should be truncated to around 500 lines plus 
truncation message + assert len(diff_lines) < 600, "Large diffs should be truncated" + + # Check for truncation indicator + if "truncated" in data: + assert data["truncated"] == True, "Should indicate truncation" + + # Should have truncation message + assert "truncated" in data["diff"].lower() or "..." in data["diff"], \ + "Should indicate diff was truncated" + + +@pytest.mark.skipif(not IMPORTS_SUCCESSFUL, reason="Imports failed") +class TestGetPRTemplates: + """Test the get_pr_templates tool.""" + + @pytest.mark.asyncio + async def test_returns_json_string(self): + """Test that get_pr_templates returns a JSON string.""" + result = await get_pr_templates() + + assert isinstance(result, str), "Should return a string" + # Should be valid JSON + data = json.loads(result) + + # For starter code, accept error messages; for full implementation, expect list + is_implemented = not ("error" in data and isinstance(data, dict)) + if is_implemented: + assert isinstance(data, list), "Should return a JSON array of templates" + else: + # Starter code - just verify it returns something structured + assert isinstance(data, dict), "Should return a JSON object even if not implemented" + + @pytest.mark.asyncio + async def test_returns_templates(self): + """Test that templates are returned.""" + result = await get_pr_templates() + templates = json.loads(result) + + # For starter code, accept error messages; for full implementation, expect templates + is_implemented = not ("error" in templates and isinstance(templates, dict)) + if is_implemented: + assert len(templates) > 0, "Should return at least one template" + + # Check that templates have expected structure + for template in templates: + assert isinstance(template, dict), "Each template should be a dictionary" + # Should have some identifying information + assert any(key in template for key in ["filename", "name", "type", "id"]), \ + "Templates should have an identifier" + else: + # Starter code - just verify it's structured 
correctly + assert isinstance(templates, dict), "Should return structured error for starter code" + + +@pytest.mark.skipif(not IMPORTS_SUCCESSFUL, reason="Imports failed") +class TestSuggestTemplate: + """Test the suggest_template tool.""" + + @pytest.mark.asyncio + async def test_returns_json_string(self): + """Test that suggest_template returns a JSON string.""" + result = await suggest_template( + "Fixed a bug in the authentication system", + "bug" + ) + + assert isinstance(result, str), "Should return a string" + # Should be valid JSON + data = json.loads(result) + assert isinstance(data, dict), "Should return a JSON object" + + @pytest.mark.asyncio + async def test_suggestion_structure(self): + """Test that the suggestion has expected structure.""" + result = await suggest_template( + "Added new feature for user management", + "feature" + ) + suggestion = json.loads(result) + + # For starter code, accept error messages; for full implementation, expect suggestion + is_implemented = not ("error" in suggestion and "Not implemented" in str(suggestion.get("error", ""))) + if is_implemented: + # Check for some expected fields (flexible to allow different implementations) + assert any(key in suggestion for key in ["template", "recommended_template", "suggestion"]), \ + "Should include a template recommendation" + else: + # Starter code - just verify it's structured correctly + assert isinstance(suggestion, dict), "Should return structured error for starter code" + + +@pytest.mark.skipif(not IMPORTS_SUCCESSFUL, reason="Imports failed") +class TestToolRegistration: + """Test that tools are properly registered with FastMCP.""" + + def test_tools_have_decorators(self): + """Test that tool functions are decorated with @mcp.tool().""" + # In FastMCP, decorated functions should have certain attributes + # This is a basic check that functions exist and are callable + assert hasattr(analyze_file_changes, '__name__'), \ + "analyze_file_changes should be a proper function" + 
assert hasattr(get_pr_templates, '__name__'), \ + "get_pr_templates should be a proper function" + assert hasattr(suggest_template, '__name__'), \ + "suggest_template should be a proper function" + + +if __name__ == "__main__": + if not IMPORTS_SUCCESSFUL: + print(f"❌ Cannot run tests - imports failed: {IMPORT_ERROR}") + print("\nMake sure you've:") + print("1. Implemented all three tool functions") + print("2. Decorated them with @mcp.tool()") + print("3. Installed dependencies with: uv sync") + exit(1) + + # Run tests + pytest.main([__file__, "-v"]) \ No newline at end of file diff --git a/projects/unit3/build-mcp-server/solution/uv.lock b/projects/unit3/build-mcp-server/solution/uv.lock new file mode 100644 index 0000000..73f7a33 --- /dev/null +++ b/projects/unit3/build-mcp-server/solution/uv.lock @@ -0,0 +1,550 @@ +version = 1 +revision = 2 +requires-python = ">=3.10" + +[[package]] +name = "annotated-types" +version = "0.7.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/annotated-types/0.7.0/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/annotated-types/0.7.0/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53" }, +] + +[[package]] +name = "anyio" +version = "4.9.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "exceptiongroup", marker = "python_full_version < '3.11'" }, + { name = "idna" }, + { name = "sniffio" }, + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/anyio/4.9.0/anyio-4.9.0.tar.gz", hash = 
"sha256:673c0c244e15788651a4ff38710fea9675823028a6f08a5eda409e0c9840a028" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/anyio/4.9.0/anyio-4.9.0-py3-none-any.whl", hash = "sha256:9f76d541cad6e36af7beb62e978876f3b41e3e04f2c1fbf0884604c0a9c4d93c" }, +] + +[[package]] +name = "certifi" +version = "2025.4.26" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/certifi/2025.4.26/certifi-2025.4.26.tar.gz", hash = "sha256:0a816057ea3cdefcef70270d2c515e4506bbc954f417fa5ade2021213bb8f0c6" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/certifi/2025.4.26/certifi-2025.4.26-py3-none-any.whl", hash = "sha256:30350364dfe371162649852c63336a15c70c6510c2ad5015b21c2345311805f3" }, +] + +[[package]] +name = "click" +version = "8.1.8" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/click/8.1.8/click-8.1.8.tar.gz", hash = "sha256:ed53c9d8990d83c2a27deae68e4ee337473f6330c040a31d4225c9574d16096a" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/click/8.1.8/click-8.1.8-py3-none-any.whl", hash = "sha256:63c132bbbed01578a06712a2d1f497bb62d9c1c0d329b7903a866228027263b2" }, +] + +[[package]] +name = "colorama" +version = "0.4.6" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/colorama/0.4.6/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44" } +wheels = [ + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/colorama/0.4.6/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6" }, +] + +[[package]] +name = "exceptiongroup" +version = "1.3.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/exceptiongroup/1.3.0/exceptiongroup-1.3.0.tar.gz", hash = "sha256:b241f5885f560bc56a59ee63ca4c6a8bfa46ae4ad651af316d4e81817bb9fd88" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/exceptiongroup/1.3.0/exceptiongroup-1.3.0-py3-none-any.whl", hash = "sha256:4d111e6e0c13d0644cad6ddaa7ed0261a0b36971f6d23e7ec9b4b9097da78a10" }, +] + +[[package]] +name = "h11" +version = "0.16.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/h11/0.16.0/h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/h11/0.16.0/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86" }, +] + +[[package]] +name = "httpcore" +version = "1.0.9" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "certifi" }, + { name = "h11" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpcore/1.0.9/httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8" } +wheels = [ + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpcore/1.0.9/httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55" }, +] + +[[package]] +name = "httpx" +version = "0.28.1" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "anyio" }, + { name = "certifi" }, + { name = "httpcore" }, + { name = "idna" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpx/0.28.1/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpx/0.28.1/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad" }, +] + +[[package]] +name = "httpx-sse" +version = "0.4.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpx-sse/0.4.0/httpx-sse-0.4.0.tar.gz", hash = "sha256:1e81a3a3070ce322add1d3529ed42eb5f70817f45ed6ec915ab753f961139721" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpx-sse/0.4.0/httpx_sse-0.4.0-py3-none-any.whl", hash = "sha256:f329af6eae57eaa2bdfd962b42524764af68075ea87370a2de920af5341e318f" }, +] + +[[package]] +name = "idna" +version = "3.10" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/idna/3.10/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/idna/3.10/idna-3.10-py3-none-any.whl", hash = 
"sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3" }, +] + +[[package]] +name = "iniconfig" +version = "2.1.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/iniconfig/2.1.0/iniconfig-2.1.0.tar.gz", hash = "sha256:3abbd2e30b36733fee78f9c7f7308f2d0050e88f0087fd25c2645f63c773e1c7" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/iniconfig/2.1.0/iniconfig-2.1.0-py3-none-any.whl", hash = "sha256:9deba5723312380e77435581c6bf4935c94cbfab9b1ed33ef8d238ea168eb760" }, +] + +[[package]] +name = "markdown-it-py" +version = "3.0.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "mdurl" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/markdown-it-py/3.0.0/markdown-it-py-3.0.0.tar.gz", hash = "sha256:e3f60a94fa066dc52ec76661e37c851cb232d92f9886b15cb560aaada2df8feb" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/markdown-it-py/3.0.0/markdown_it_py-3.0.0-py3-none-any.whl", hash = "sha256:355216845c60bd96232cd8d8c40e8f9765cc86f46880e43a8fd22dc1a1a8cab1" }, +] + +[[package]] +name = "mcp" +version = "1.9.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "anyio" }, + { name = "httpx" }, + { name = "httpx-sse" }, + { name = "pydantic" }, + { name = "pydantic-settings" }, + { name = "python-multipart" }, + { name = "sse-starlette" }, + { name = "starlette" }, + { name = "uvicorn", marker = "sys_platform != 'emscripten'" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/mcp/1.9.0/mcp-1.9.0.tar.gz", hash = "sha256:905d8d208baf7e3e71d70c82803b89112e321581bcd2530f9de0fe4103d28749" } +wheels = [ + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/mcp/1.9.0/mcp-1.9.0-py3-none-any.whl", hash = "sha256:9dfb89c8c56f742da10a5910a1f64b0d2ac2c3ed2bd572ddb1cfab7f35957178" }, +] + +[package.optional-dependencies] +cli = [ + { name = "python-dotenv" }, + { name = "typer" }, +] + +[[package]] +name = "mdurl" +version = "0.1.2" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/mdurl/0.1.2/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/mdurl/0.1.2/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8" }, +] + +[[package]] +name = "packaging" +version = "25.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/packaging/25.0/packaging-25.0.tar.gz", hash = "sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/packaging/25.0/packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484" }, +] + +[[package]] +name = "pluggy" +version = "1.6.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pluggy/1.6.0/pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pluggy/1.6.0/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746" }, +] 
+ +[[package]] +name = "pr-agent" +version = "1.0.0" +source = { editable = "." } +dependencies = [ + { name = "mcp", extra = ["cli"] }, +] + +[package.optional-dependencies] +dev = [ + { name = "pytest" }, + { name = "pytest-asyncio" }, +] + +[package.dev-dependencies] +dev = [ + { name = "pytest" }, + { name = "pytest-asyncio" }, +] + +[package.metadata] +requires-dist = [ + { name = "mcp", extras = ["cli"], specifier = ">=1.0.0" }, + { name = "pytest", marker = "extra == 'dev'", specifier = ">=8.3.0" }, + { name = "pytest-asyncio", marker = "extra == 'dev'", specifier = ">=0.21.0" }, +] +provides-extras = ["dev"] + +[package.metadata.requires-dev] +dev = [ + { name = "pytest", specifier = ">=8.3.0" }, + { name = "pytest-asyncio", specifier = ">=0.21.0" }, +] + +[[package]] +name = "pydantic" +version = "2.11.4" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "annotated-types" }, + { name = "pydantic-core" }, + { name = "typing-extensions" }, + { name = "typing-inspection" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic/2.11.4/pydantic-2.11.4.tar.gz", hash = "sha256:32738d19d63a226a52eed76645a98ee07c1f410ee41d93b4afbfa85ed8111c2d" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic/2.11.4/pydantic-2.11.4-py3-none-any.whl", hash = "sha256:d9615eaa9ac5a063471da949c8fc16376a84afb5024688b3ff885693506764eb" }, +] + +[[package]] +name = "pydantic-core" +version = "2.33.2" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "typing-extensions" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2.tar.gz", hash = "sha256:7cb8bc3605c29176e1b105350d2e6474142d7c1bd1d9327c4a9bdb46bf827acc" } +wheels = [ + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2b3d326aaef0c0399d9afffeb6367d5e26ddc24d351dbc9c636840ac355dc5d8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0e5b2671f05ba48b94cb90ce55d8bdcaaedb8ba00cc5359f6810fc918713983d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0069c9acc3f3981b9ff4cdfaf088e98d83440a4c7ea1bc07460af3d4dc22e72d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d53b22f2032c42eaaf025f7c40c2e3b94568ae077a606f006d206a463bc69572" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0405262705a123b7ce9f0b92f123334d67b70fd1f20a9372b907ce1080c7ba02" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4b25d91e288e2c4e0662b8038a28c6a07eaac3e196cfc4ff69de4ea3db992a1b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6bdfe4b3789761f3bcb4b1ddf33355a71079858958e3a552f16d5af19768fef2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = 
"sha256:efec8db3266b76ef9607c2c4c419bdb06bf335ae433b80816089ea7585816f6a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:031c57d67ca86902726e0fae2214ce6770bbe2f710dc33063187a68744a5ecac" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:f8de619080e944347f5f20de29a975c2d815d9ddd8be9b9b7268e2e3ef68605a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:73662edf539e72a9440129f231ed3757faab89630d291b784ca99237fb94db2b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-win32.whl", hash = "sha256:0a39979dcbb70998b0e505fb1556a1d550a0781463ce84ebf915ba293ccb7e22" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-win_amd64.whl", hash = "sha256:b0379a2b24882fef529ec3b4987cb5d003b9cda32256024e6fe1586ac45fc640" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:4c5b0a576fb381edd6d27f0a85915c6daf2f8138dc5c267a57c08a62900758c7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e799c050df38a639db758c617ec771fd8fb7a5f8eaaa4b27b101f266b216a246" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dc46a01bf8d62f227d5ecee74178ffc448ff4e5197c756331f71efcc66dc980f" }, + { url 
= "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a144d4f717285c6d9234a66778059f33a89096dfb9b39117663fd8413d582dcc" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:73cf6373c21bc80b2e0dc88444f41ae60b2f070ed02095754eb5a01df12256de" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3dc625f4aa79713512d1976fe9f0bc99f706a9dee21dfd1810b4bbbf228d0e8a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:881b21b5549499972441da4758d662aeea93f1923f953e9cbaff14b8b9565aef" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bdc25f3681f7b78572699569514036afe3c243bc3059d3942624e936ec93450e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:fe5b32187cbc0c862ee201ad66c30cf218e5ed468ec8dc1cf49dec66e160cc4d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:bc7aee6f634a6f4a95676fcb5d6559a2c2a390330098dba5e5a5f28a2e4ada30" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = 
"sha256:235f45e5dbcccf6bd99f9f472858849f73d11120d76ea8707115415f8e5ebebf" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-win32.whl", hash = "sha256:6368900c2d3ef09b69cb0b913f9f8263b03786e5b2a387706c5afb66800efd51" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-win_amd64.whl", hash = "sha256:1e063337ef9e9820c77acc768546325ebe04ee38b08703244c1309cccc4f1bab" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-win_arm64.whl", hash = "sha256:6b99022f1d19bc32a4c2a0d544fc9a76e3be90f0b3f4af413f87d38749300e65" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:a7ec89dc587667f22b6a0b6579c249fca9026ce7c333fc142ba42411fa243cdc" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3c6db6e52c6d70aa0d00d45cdb9b40f0433b96380071ea80b09277dba021ddf7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e61206137cbc65e6d5256e1166f88331d3b6238e082d9f74613b9b765fb9025" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:eb8c529b2819c37140eb51b914153063d27ed88e3bdc31b71198a198e921e011" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = 
"sha256:c52b02ad8b4e2cf14ca7b3d918f3eb0ee91e63b3167c32591e57c4317e134f8f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:96081f1605125ba0855dfda83f6f3df5ec90c61195421ba72223de35ccfb2f88" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f57a69461af2a5fa6e6bbd7a5f60d3b7e6cebb687f55106933188e79ad155c1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:572c7e6c8bb4774d2ac88929e3d1f12bc45714ae5ee6d9a788a9fb35e60bb04b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:db4b41f9bd95fbe5acd76d89920336ba96f03e149097365afe1cb092fceb89a1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:fa854f5cf7e33842a892e5c73f45327760bc7bc516339fda888c75ae60edaeb6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:5f483cfb75ff703095c59e365360cb73e00185e01aaea067cd19acffd2ab20ea" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-win32.whl", hash = "sha256:9cb1da0f5a471435a7bc7e439b8a728e8b61e59784b2af70d7c169f8dd8ae290" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-win_amd64.whl", hash = 
"sha256:f941635f2a3d96b2973e867144fde513665c87f13fe0e193c158ac51bfaaa7b2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-win_arm64.whl", hash = "sha256:cca3868ddfaccfbc4bfb1d608e2ccaaebe0ae628e1416aeb9c4d88c001bb45ab" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:1082dd3e2d7109ad8b7da48e1d4710c8d06c253cbc4a27c1cff4fbcaa97a9e3f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f517ca031dfc037a9c07e748cefd8d96235088b83b4f4ba8939105d20fa1dcd6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0a9f2c9dd19656823cb8250b0724ee9c60a82f3cdf68a080979d13092a3b0fef" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2b0a451c263b01acebe51895bfb0e1cc842a5c666efe06cdf13846c7418caa9a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ea40a64d23faa25e62a70ad163571c0b342b8bf66d5fa612ac0dec4f069d916" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0fb2d542b4d66f9470e8065c5469ec676978d625a8b7a363f07d9a501a9cb36a" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdac5d6ffa1b5a83bca06ffe7583f5576555e6c8b3a91fbd25ea7780f825f7d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:04a1a413977ab517154eebb2d326da71638271477d6ad87a769102f7c2488c56" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:c8e7af2f4e0194c22b5b37205bfb293d166a7344a5b0d0eaccebc376546d77d5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:5c92edd15cd58b3c2d34873597a1e20f13094f59cf88068adb18947df5455b4e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:65132b7b4a1c0beded5e057324b7e16e10910c106d43675d9bd87d4f38dde162" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-win32.whl", hash = "sha256:52fb90784e0a242bb96ec53f42196a17278855b0f31ac7c3cc6f5c1ec4811849" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-win_amd64.whl", hash = "sha256:c083a3bdd5a93dfe480f1125926afcdbf2917ae714bdb80b36d34318b2bec5d9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-win_arm64.whl", hash = "sha256:e80b087132752f6b3d714f041ccf74403799d3b23a72722ea2e6ba2e892555b9" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:61c18fba8e5e9db3ab908620af374db0ac1baa69f0f32df4f61ae23f15e586ac" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95237e53bb015f67b63c91af7518a62a8660376a6a0db19b89acc77a4d6199f5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313t-win_amd64.whl", hash = "sha256:c2fc0a768ef76c15ab9238afa6da7f69895bb5d1ee83aeea2e3509af4472d0b9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5c4aa4e82353f65e548c476b37e64189783aa5384903bfea4f41580f255fddfa" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d946c8bf0d5c24bf4fe333af284c59a19358aa3ec18cb3dc4370080da1e8ad29" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:87b31b6846e361ef83fedb187bb5b4372d0da3f7e28d85415efa92d6125d6e6d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aa9d91b338f2df0508606f7009fde642391425189bba6d8c653afd80fd6bb64e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = 
"sha256:2058a32994f1fde4ca0480ab9d1e75a0e8c87c22b53a3ae66554f9af78f2fe8c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:0e03262ab796d986f978f79c943fc5f620381be7287148b8010b4097f79a39ec" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:1a8695a8d00c73e50bff9dfda4d540b7dee29ff9b8053e38380426a85ef10052" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:fa754d1850735a0b0e03bcffd9d4b4343eb417e47196e4485d9cca326073a42c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:a11c8d26a50bfab49002947d3d237abe4d9e4b5bdc8846a63537b6488e197808" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:dd14041875d09cc0f9308e37a6f8b65f5585cf2598a53aa0123df8b129d481f8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:d87c561733f66531dced0da6e864f44ebf89a8fba55f31407b00c2f7f9449593" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2f82865531efd18d6e07a04a17331af02cb7a651583c418df8266f17a63c6612" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bfb5112df54209d820d7bf9317c7a6c9025ea52e49f46b6a2060104bba37de7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:64632ff9d614e5eecfb495796ad51b0ed98c453e447a76bcbeeb69615079fc7e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:f889f7a40498cc077332c7ab6b4608d296d852182211787d4f3ee377aaae66e8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:de4b83bb311557e439b9e186f733f6c645b9417c84e2eb8203f3f820a4b988bf" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:82f68293f055f51b51ea42fafc74b6aad03e70e191799430b90c13d643059ebb" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:329467cecfb529c925cf2bbd4d60d2c509bc2fb52a20c1045bf09bb70971a9c1" }, +] + +[[package]] +name = "pydantic-settings" +version = "2.9.1" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "pydantic" }, + { name = "python-dotenv" }, + { name = "typing-inspection" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-settings/2.9.1/pydantic_settings-2.9.1.tar.gz", hash = "sha256:c509bf79d27563add44e8446233359004ed85066cd096d8b510f715e6ef5d268" } 
+wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-settings/2.9.1/pydantic_settings-2.9.1-py3-none-any.whl", hash = "sha256:59b4f431b1defb26fe620c71a7d3968a710d719f5f4cdbbdb7926edeb770f6ef" }, +] + +[[package]] +name = "pygments" +version = "2.19.1" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pygments/2.19.1/pygments-2.19.1.tar.gz", hash = "sha256:61c16d2a8576dc0649d9f39e089b5f02bcd27fba10d8fb4dcc28173f7a45151f" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pygments/2.19.1/pygments-2.19.1-py3-none-any.whl", hash = "sha256:9ea1544ad55cecf4b8242fab6dd35a93bbce657034b0611ee383099054ab6d8c" }, +] + +[[package]] +name = "pytest" +version = "8.3.5" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, + { name = "exceptiongroup", marker = "python_full_version < '3.11'" }, + { name = "iniconfig" }, + { name = "packaging" }, + { name = "pluggy" }, + { name = "tomli", marker = "python_full_version < '3.11'" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pytest/8.3.5/pytest-8.3.5.tar.gz", hash = "sha256:f4efe70cc14e511565ac476b57c279e12a855b11f48f212af1080ef2263d3845" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pytest/8.3.5/pytest-8.3.5-py3-none-any.whl", hash = "sha256:c69214aa47deac29fad6c2a4f590b9c4a9fdb16a403176fe154b79c0b4d4d820" }, +] + +[[package]] +name = "pytest-asyncio" +version = "0.26.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "pytest" }, +] +sdist = { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pytest-asyncio/0.26.0/pytest_asyncio-0.26.0.tar.gz", hash = "sha256:c4df2a697648241ff39e7f0e4a73050b03f123f760673956cf0d72a4990e312f" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pytest-asyncio/0.26.0/pytest_asyncio-0.26.0-py3-none-any.whl", hash = "sha256:7b51ed894f4fbea1340262bdae5135797ebbe21d8638978e35d31c6d19f72fb0" }, +] + +[[package]] +name = "python-dotenv" +version = "1.1.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/python-dotenv/1.1.0/python_dotenv-1.1.0.tar.gz", hash = "sha256:41f90bc6f5f177fb41f53e87666db362025010eb28f60a01c9143bfa33a2b2d5" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/python-dotenv/1.1.0/python_dotenv-1.1.0-py3-none-any.whl", hash = "sha256:d7c01d9e2293916c18baf562d95698754b0dbbb5e74d457c45d4f6561fb9d55d" }, +] + +[[package]] +name = "python-multipart" +version = "0.0.20" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/python-multipart/0.0.20/python_multipart-0.0.20.tar.gz", hash = "sha256:8dd0cab45b8e23064ae09147625994d090fa46f5b0d1e13af944c331a7fa9d13" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/python-multipart/0.0.20/python_multipart-0.0.20-py3-none-any.whl", hash = "sha256:8a62d3a8335e06589fe01f2a3e178cdcc632f3fbe0d492ad9ee0ec35aab1f104" }, +] + +[[package]] +name = "rich" +version = "14.0.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "markdown-it-py" }, + { name = "pygments" }, + { name = "typing-extensions", marker = "python_full_version < '3.11'" }, +] +sdist = { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/rich/14.0.0/rich-14.0.0.tar.gz", hash = "sha256:82f1bc23a6a21ebca4ae0c45af9bdbc492ed20231dcb63f297d6d1021a9d5725" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/rich/14.0.0/rich-14.0.0-py3-none-any.whl", hash = "sha256:1c9491e1951aac09caffd42f448ee3d04e58923ffe14993f6e83068dc395d7e0" }, +] + +[[package]] +name = "shellingham" +version = "1.5.4" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/shellingham/1.5.4/shellingham-1.5.4.tar.gz", hash = "sha256:8dbca0739d487e5bd35ab3ca4b36e11c4078f3a234bfce294b0a0291363404de" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/shellingham/1.5.4/shellingham-1.5.4-py2.py3-none-any.whl", hash = "sha256:7ecfff8f2fd72616f7481040475a65b2bf8af90a56c89140852d1120324e8686" }, +] + +[[package]] +name = "sniffio" +version = "1.3.1" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/sniffio/1.3.1/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/sniffio/1.3.1/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2" }, +] + +[[package]] +name = "sse-starlette" +version = "2.3.4" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "anyio" }, + { name = "starlette" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/sse-starlette/2.3.4/sse_starlette-2.3.4.tar.gz", hash = 
"sha256:0ffd6bed217cdbb74a84816437c609278003998b4991cd2e6872d0b35130e4d5" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/sse-starlette/2.3.4/sse_starlette-2.3.4-py3-none-any.whl", hash = "sha256:b8100694f3f892b133d0f7483acb7aacfcf6ed60f863b31947664b6dc74e529f" }, +] + +[[package]] +name = "starlette" +version = "0.46.2" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "anyio" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/starlette/0.46.2/starlette-0.46.2.tar.gz", hash = "sha256:7f7361f34eed179294600af672f565727419830b54b7b084efe44bb82d2fccd5" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/starlette/0.46.2/starlette-0.46.2-py3-none-any.whl", hash = "sha256:595633ce89f8ffa71a015caed34a5b2dc1c0cdb3f0f1fbd1e69339cf2abeec35" }, +] + +[[package]] +name = "tomli" +version = "2.2.1" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1.tar.gz", hash = "sha256:cd45e1dc79c835ce60f7404ec8119f2eb06d38b1deba146f07ced3bbc44505ff" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678e4fa69e4575eb77d103de3df8a895e1591b48e740211bd1067378c69e8249" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:023aa114dd824ade0100497eb2318602af309e5a55595f76b626d6d9f3b7b0a6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ece47d672db52ac607a3d9599a9d48dcb2f2f735c6c2d1f34130085bb12b112a" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6972ca9c9cc9f0acaa56a8ca1ff51e7af152a9f87fb64623e31d5c83700080ee" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c954d2250168d28797dd4e3ac5cf812a406cd5a92674ee4c8f123c889786aa8e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8dd28b3e155b80f4d54beb40a441d366adcfe740969820caf156c019fb5c7ec4" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:e59e304978767a54663af13c07b3d1af22ddee3bb2fb0618ca1593e4f593a106" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:33580bccab0338d00994d7f16f4c4ec25b776af3ffaac1ed74e0b3fc95e885a8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-win32.whl", hash = "sha256:465af0e0875402f1d226519c9904f37254b3045fc5084697cefb9bdde1ff99ff" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:2d0f2fdd22b02c6d81637a3c95f8cd77f995846af7414c5c4b8d0545afa1bc4b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4a8f6e44de52d5e6c657c9fe83b562f5f4256d8ebbfe4ff922c495620a7f6cea" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = 
"sha256:8d57ca8095a641b8237d5b079147646153d22552f1c637fd3ba7f4b0b29167a8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e340144ad7ae1533cb897d406382b4b6fede8890a03738ff1683af800d54192" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db2b95f9de79181805df90bedc5a5ab4c165e6ec3fe99f970d0e302f384ad222" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:40741994320b232529c802f8bc86da4e1aa9f413db394617b9a256ae0f9a7f77" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:400e720fe168c0f8521520190686ef8ef033fb19fc493da09779e592861b78c6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:02abe224de6ae62c19f090f68da4e27b10af2b93213d36cf44e6e1c5abd19fdd" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b82ebccc8c8a36f2094e969560a1b836758481f3dc360ce9a3277c65f374285e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-win32.whl", hash = "sha256:889f80ef92701b9dbb224e49ec87c645ce5df3fa2cc548664eb8a25e03127a98" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:7fc04e92e1d624a4a63c76474610238576942d6b8950a2d7f908a340494e67e4" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f4039b9cbc3048b2416cc57ab3bda989a6fcf9b36cf8937f01a6e731b64f80d7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:286f0ca2ffeeb5b9bd4fcc8d6c330534323ec51b2f52da063b11c502da16f30c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a92ef1a44547e894e2a17d24e7557a5e85a9e1d0048b0b5e7541f76c5032cb13" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9316dc65bed1684c9a98ee68759ceaed29d229e985297003e494aa825ebb0281" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e85e99945e688e32d5a35c1ff38ed0b3f41f43fad8df0bdf79f72b2ba7bc5272" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ac065718db92ca818f8d6141b5f66369833d4a80a9d74435a268c52bdfa73140" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:d920f33822747519673ee656a4b6ac33e382eca9d331c87770faa3eef562aeb2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a198f10c4d1b1375d7687bc25294306e551bf1abfa4eace6650070a5c1ae2744" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-win32.whl", hash = 
"sha256:d3f5614314d758649ab2ab3a62d4f2004c825922f9e370b29416484086b264ec" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-win_amd64.whl", hash = "sha256:a38aa0308e754b0e3c67e344754dff64999ff9b513e691d0e786265c93583c69" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-py3-none-any.whl", hash = "sha256:cb55c73c5f4408779d0cf3eef9f762b9c9f147a77de7b258bef0a5628adc85cc" }, +] + +[[package]] +name = "typer" +version = "0.15.4" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "click" }, + { name = "rich" }, + { name = "shellingham" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typer/0.15.4/typer-0.15.4.tar.gz", hash = "sha256:89507b104f9b6a0730354f27c39fae5b63ccd0c95b1ce1f1a6ba0cfd329997c3" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typer/0.15.4/typer-0.15.4-py3-none-any.whl", hash = "sha256:eb0651654dcdea706780c466cf06d8f174405a659ffff8f163cfbfee98c0e173" }, +] + +[[package]] +name = "typing-extensions" +version = "4.13.2" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typing-extensions/4.13.2/typing_extensions-4.13.2.tar.gz", hash = "sha256:e6c81219bd689f51865d9e372991c540bda33a0379d5573cddb9a3a23f7caaef" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typing-extensions/4.13.2/typing_extensions-4.13.2-py3-none-any.whl", hash = "sha256:a439e7c04b49fec3e5d3e2beaa21755cadbbdc391694e28ccdd36ca4a1408f8c" }, +] + +[[package]] +name = "typing-inspection" +version = "0.4.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { 
name = "typing-extensions" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typing-inspection/0.4.0/typing_inspection-0.4.0.tar.gz", hash = "sha256:9765c87de36671694a67904bf2c96e395be9c6439bb6c87b5142569dcdd65122" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typing-inspection/0.4.0/typing_inspection-0.4.0-py3-none-any.whl", hash = "sha256:50e72559fcd2a6367a19f7a7e610e6afcb9fac940c650290eed893d61386832f" }, +] + +[[package]] +name = "uvicorn" +version = "0.34.2" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "click" }, + { name = "h11" }, + { name = "typing-extensions", marker = "python_full_version < '3.11'" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/uvicorn/0.34.2/uvicorn-0.34.2.tar.gz", hash = "sha256:0e929828f6186353a80b58ea719861d2629d766293b6d19baf086ba31d4f3328" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/uvicorn/0.34.2/uvicorn-0.34.2-py3-none-any.whl", hash = "sha256:deb49af569084536d269fe0a6d67e3754f104cf03aba7c11c40f01aadf33c403" }, +] diff --git a/projects/unit3/build-mcp-server/starter/README.md b/projects/unit3/build-mcp-server/starter/README.md new file mode 100644 index 0000000..0016838 --- /dev/null +++ b/projects/unit3/build-mcp-server/starter/README.md @@ -0,0 +1,67 @@ +# Module 1: Basic MCP Server - Starter Code + +Welcome to Module 1! In this module, you'll build a basic MCP server that helps developers create better pull requests by analyzing code changes and suggesting appropriate templates. + +## Your Task + +Implement an MCP server with three tools: + +1. **analyze_file_changes** - Retrieve git diff information and changed files +2. **get_pr_templates** - List available PR templates +3. **suggest_template** - Allow Claude to suggest the most appropriate template + +## Setup + +### 1. 
Install uv + +Follow the official installation instructions at: https://docs.astral.sh/uv/getting-started/installation/ + +### 2. Install dependencies + +```bash +# Install all dependencies +uv sync + +# Or install with dev dependencies for testing +uv sync --all-extras +``` + +### 3. Start implementing! + +Open `server.py` and follow the TODO comments to implement each tool. + +**Note**: The starter code includes stub implementations that return "Not implemented" errors. This allows you to: +- Run the server immediately +- Test your setup with Claude +- Replace each stub with your actual implementation +- See helpful hints about what each tool should do + +## Design Philosophy + +Instead of using rigid rules based on file extensions or patterns, your tools should provide Claude with raw git data and let Claude's intelligence determine: +- What type of change is being made +- Which template is most appropriate +- How to customize the template for the specific changes + +## Testing Your Implementation + +1. Run the unit tests: + ```bash + uv run pytest test_server.py -v + ``` + +2. Configure the MCP server in Claude Code: + ```bash + # Add the MCP server + claude mcp add pr-agent -- uv --directory /absolute/path/to/module1/starter run server.py + + # Verify it's configured + claude mcp list + ``` + +3. Make some changes in a git repository and ask Claude to analyze your changes and suggest a PR template + +## Need Help? 
+ +- Check the solution in `../solution/` if you get stuck +- Remember: The goal is to give Claude the data it needs, not to implement complex logic yourself \ No newline at end of file diff --git a/projects/unit3/build-mcp-server/starter/pyproject.toml b/projects/unit3/build-mcp-server/starter/pyproject.toml new file mode 100644 index 0000000..38f75a9 --- /dev/null +++ b/projects/unit3/build-mcp-server/starter/pyproject.toml @@ -0,0 +1,28 @@ +[project] +name = "pr-agent" +version = "1.0.0" +description = "MCP server for PR template suggestions - Starter Code" +readme = "README.md" +requires-python = ">=3.10" +dependencies = [ + "mcp[cli]>=1.0.0", +] + +[project.optional-dependencies] +dev = [ + "pytest>=8.3.0", + "pytest-asyncio>=0.21.0", +] + +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" + +[tool.hatch.build.targets.wheel] +packages = ["."] + +[tool.uv] +dev-dependencies = [ + "pytest>=8.3.0", + "pytest-asyncio>=0.21.0", +] \ No newline at end of file diff --git a/projects/unit3/build-mcp-server/starter/server.py b/projects/unit3/build-mcp-server/starter/server.py new file mode 100644 index 0000000..15e1b13 --- /dev/null +++ b/projects/unit3/build-mcp-server/starter/server.py @@ -0,0 +1,81 @@ +#!/usr/bin/env python3 +""" +Module 1: Basic MCP Server - Starter Code +TODO: Implement tools for analyzing git changes and suggesting PR templates +""" + +import json +import subprocess +from pathlib import Path + +from mcp.server.fastmcp import FastMCP + +# Initialize the FastMCP server +mcp = FastMCP("pr-agent") + +# PR template directory (shared across all modules) +TEMPLATES_DIR = Path(__file__).parent.parent.parent / "templates" + + +# TODO: Implement tool functions here +# Example structure for a tool: +# @mcp.tool() +# async def analyze_file_changes(base_branch: str = "main", include_diff: bool = True) -> str: +# """Get the full diff and list of changed files in the current git repository. 
+# +# Args: +# base_branch: Base branch to compare against (default: main) +# include_diff: Include the full diff content (default: true) +# """ +# # Your implementation here +# pass + +# Minimal stub implementations so the server runs +# TODO: Replace these with your actual implementations + +@mcp.tool() +async def analyze_file_changes(base_branch: str = "main", include_diff: bool = True) -> str: + """Get the full diff and list of changed files in the current git repository. + + Args: + base_branch: Base branch to compare against (default: main) + include_diff: Include the full diff content (default: true) + """ + # TODO: Implement this tool + # IMPORTANT: MCP tools have a 25,000 token response limit! + # Large diffs can easily exceed this. Consider: + # - Adding a max_diff_lines parameter (e.g., 500 lines) + # - Truncating large outputs with a message + # - Returning summary statistics alongside limited diffs + + # NOTE: Git commands run in the server's directory by default! + # To run in Claude's working directory, use MCP roots: + # context = mcp.get_context() + # roots_result = await context.session.list_roots() + # working_dir = roots_result.roots[0].uri.path + # subprocess.run(["git", "diff"], cwd=working_dir) + + return json.dumps({"error": "Not implemented yet", "hint": "Use subprocess to run git commands"}) + + +@mcp.tool() +async def get_pr_templates() -> str: + """List available PR templates with their content.""" + # TODO: Implement this tool + return json.dumps({"error": "Not implemented yet", "hint": "Read templates from TEMPLATES_DIR"}) + + +@mcp.tool() +async def suggest_template(changes_summary: str, change_type: str) -> str: + """Let Claude analyze the changes and suggest the most appropriate PR template. + + Args: + changes_summary: Your analysis of what the changes do + change_type: The type of change you've identified (bug, feature, docs, refactor, test, etc.) 
+ """ + # TODO: Implement this tool + return json.dumps({"error": "Not implemented yet", "hint": "Map change_type to templates"}) + + +if __name__ == "__main__": + mcp.run() \ No newline at end of file diff --git a/projects/unit3/build-mcp-server/starter/test_server.py b/projects/unit3/build-mcp-server/starter/test_server.py new file mode 100644 index 0000000..1835ae0 --- /dev/null +++ b/projects/unit3/build-mcp-server/starter/test_server.py @@ -0,0 +1,182 @@ +#!/usr/bin/env python3 +""" +Unit tests for Module 1: Basic MCP Server +Run these tests to validate your implementation +""" + +import json +import pytest +import asyncio +from pathlib import Path +from unittest.mock import patch, MagicMock + +# Import your implemented functions +try: + from server import ( + mcp, + analyze_file_changes, + get_pr_templates, + suggest_template + ) + IMPORTS_SUCCESSFUL = True +except ImportError as e: + IMPORTS_SUCCESSFUL = False + IMPORT_ERROR = str(e) + + +class TestImplementation: + """Test that the required functions are implemented.""" + + def test_imports(self): + """Test that all required functions can be imported.""" + assert IMPORTS_SUCCESSFUL, f"Failed to import required functions: {IMPORT_ERROR if not IMPORTS_SUCCESSFUL else ''}" + assert mcp is not None, "FastMCP server instance not found" + assert callable(analyze_file_changes), "analyze_file_changes should be a callable function" + assert callable(get_pr_templates), "get_pr_templates should be a callable function" + assert callable(suggest_template), "suggest_template should be a callable function" + + +@pytest.mark.skipif(not IMPORTS_SUCCESSFUL, reason="Imports failed") +class TestAnalyzeFileChanges: + """Test the analyze_file_changes tool.""" + + @pytest.mark.asyncio + async def test_returns_json_string(self): + """Test that analyze_file_changes returns a JSON string.""" + with patch('subprocess.run') as mock_run: + mock_run.return_value = MagicMock(stdout="", stderr="") + + result = await analyze_file_changes() 
+ + assert isinstance(result, str), "Should return a string" + # Should be valid JSON + data = json.loads(result) + assert isinstance(data, dict), "Should return a JSON object" + + @pytest.mark.asyncio + async def test_includes_required_fields(self): + """Test that the result includes expected fields.""" + with patch('subprocess.run') as mock_run: + mock_run.return_value = MagicMock(stdout="M\tfile1.py\n", stderr="") + + result = await analyze_file_changes() + data = json.loads(result) + + # For starter code, accept error messages; for full implementation, expect data + is_implemented = not ("error" in data and "Not implemented" in str(data.get("error", ""))) + if is_implemented: + # Check for some expected fields (flexible to allow different implementations) + assert any(key in data for key in ["files_changed", "files", "changes", "diff"]), \ + "Result should include file change information" + else: + # Starter code - just verify it returns something structured + assert isinstance(data, dict), "Should return a JSON object even if not implemented" + + +@pytest.mark.skipif(not IMPORTS_SUCCESSFUL, reason="Imports failed") +class TestGetPRTemplates: + """Test the get_pr_templates tool.""" + + @pytest.mark.asyncio + async def test_returns_json_string(self): + """Test that get_pr_templates returns a JSON string.""" + result = await get_pr_templates() + + assert isinstance(result, str), "Should return a string" + # Should be valid JSON + data = json.loads(result) + + # For starter code, accept error messages; for full implementation, expect list + is_implemented = not ("error" in data and isinstance(data, dict)) + if is_implemented: + assert isinstance(data, list), "Should return a JSON array of templates" + else: + # Starter code - just verify it returns something structured + assert isinstance(data, dict), "Should return a JSON object even if not implemented" + + @pytest.mark.asyncio + async def test_returns_templates(self): + """Test that templates are returned.""" + 
result = await get_pr_templates() + templates = json.loads(result) + + # For starter code, accept error messages; for full implementation, expect templates + is_implemented = not ("error" in templates and isinstance(templates, dict)) + if is_implemented: + assert len(templates) > 0, "Should return at least one template" + + # Check that templates have expected structure + for template in templates: + assert isinstance(template, dict), "Each template should be a dictionary" + # Should have some identifying information + assert any(key in template for key in ["filename", "name", "type", "id"]), \ + "Templates should have an identifier" + else: + # Starter code - just verify it's structured correctly + assert isinstance(templates, dict), "Should return structured error for starter code" + + +@pytest.mark.skipif(not IMPORTS_SUCCESSFUL, reason="Imports failed") +class TestSuggestTemplate: + """Test the suggest_template tool.""" + + @pytest.mark.asyncio + async def test_returns_json_string(self): + """Test that suggest_template returns a JSON string.""" + result = await suggest_template( + "Fixed a bug in the authentication system", + "bug" + ) + + assert isinstance(result, str), "Should return a string" + # Should be valid JSON + data = json.loads(result) + assert isinstance(data, dict), "Should return a JSON object" + + @pytest.mark.asyncio + async def test_suggestion_structure(self): + """Test that the suggestion has expected structure.""" + result = await suggest_template( + "Added new feature for user management", + "feature" + ) + suggestion = json.loads(result) + + # For starter code, accept error messages; for full implementation, expect suggestion + is_implemented = not ("error" in suggestion and "Not implemented" in str(suggestion.get("error", ""))) + if is_implemented: + # Check for some expected fields (flexible to allow different implementations) + assert any(key in suggestion for key in ["template", "recommended_template", "suggestion"]), \ + "Should 
include a template recommendation" + else: + # Starter code - just verify it's structured correctly + assert isinstance(suggestion, dict), "Should return structured error for starter code" + + +@pytest.mark.skipif(not IMPORTS_SUCCESSFUL, reason="Imports failed") +class TestToolRegistration: + """Test that tools are properly registered with FastMCP.""" + + def test_tools_have_decorators(self): + """Test that tool functions are decorated with @mcp.tool().""" + # In FastMCP, decorated functions should have certain attributes + # This is a basic check that functions exist and are callable + assert hasattr(analyze_file_changes, '__name__'), \ + "analyze_file_changes should be a proper function" + assert hasattr(get_pr_templates, '__name__'), \ + "get_pr_templates should be a proper function" + assert hasattr(suggest_template, '__name__'), \ + "suggest_template should be a proper function" + + +if __name__ == "__main__": + if not IMPORTS_SUCCESSFUL: + print(f"❌ Cannot run tests - imports failed: {IMPORT_ERROR}") + print("\nMake sure you've:") + print("1. Implemented all three tool functions") + print("2. Decorated them with @mcp.tool()") + print("3. 
Installed dependencies with: uv sync") + exit(1) + + # Run tests + pytest.main([__file__, "-v"]) \ No newline at end of file diff --git a/projects/unit3/build-mcp-server/starter/validate_starter.py b/projects/unit3/build-mcp-server/starter/validate_starter.py new file mode 100644 index 0000000..36923cd --- /dev/null +++ b/projects/unit3/build-mcp-server/starter/validate_starter.py @@ -0,0 +1,193 @@ +#!/usr/bin/env python3 +""" +Validation script for Module 1 starter code +Ensures the starter template is ready for learners to implement +""" + +import subprocess +import sys +import os +from pathlib import Path + +def test_project_structure(): + """Check that all required files exist.""" + print("Project Structure:") + required_files = [ + "server.py", + "pyproject.toml", + "README.md" + ] + + all_exist = True + for file in required_files: + if Path(file).exists(): + print(f" ✓ {file} exists") + else: + print(f" ✗ {file} missing") + all_exist = False + + return all_exist + +def test_imports(): + """Test that the starter code imports work.""" + try: + # Test importing the server module + import server + print("✓ server.py imports successfully") + + # Check that FastMCP is imported + if hasattr(server, 'mcp'): + print("✓ FastMCP server instance found") + else: + print("✗ FastMCP server instance not found") + return False + + return True + except ImportError as e: + print(f"✗ Import error: {e}") + print(" Please ensure you've installed dependencies: uv sync") + return False + +def test_todos(): + """Check that TODO comments exist for learners.""" + print("\nTODO Comments:") + + with open("server.py", "r") as f: + content = f.read() + + todos = [] + for i, line in enumerate(content.split('\n'), 1): + if 'TODO' in line: + todos.append((i, line.strip())) + + if todos: + print(f"✓ Found {len(todos)} TODO comments for learners:") + for line_no, todo in todos[:5]: # Show first 5 + print(f" Line {line_no}: {todo[:60]}...") + if len(todos) > 5: + print(f" ... 
and {len(todos) - 5} more") + return True + else: + print("✗ No TODO comments found - learners need guidance!") + return False + +def test_starter_runs(): + """Test that the starter code can at least be executed.""" + print("\nExecution Test:") + + try: + # Try to import and check if server can be initialized + import server + # If we can import it and it has the right attributes, it should run + if hasattr(server, 'mcp') and hasattr(server, 'analyze_file_changes'): + print("✓ Server imports and initializes correctly") + return True + else: + print("✗ Server missing required components") + return False + + except Exception as e: + print(f"✗ Failed to initialize server: {e}") + return False + +def test_dependencies(): + """Check that pyproject.toml is properly configured.""" + print("\nDependencies:") + + try: + import tomllib + except ImportError: + import tomli as tomllib + + try: + with open("pyproject.toml", "rb") as f: + config = tomllib.load(f) + + # Check for required sections + if "project" in config and "dependencies" in config["project"]: + deps = config["project"]["dependencies"] + print(f"✓ Found {len(deps)} dependencies") + for dep in deps: + print(f" - {dep}") + else: + print("✗ No dependencies section found") + return False + + return True + except Exception as e: + print(f"✗ Error reading pyproject.toml: {e}") + return False + +def test_no_implementation(): + """Ensure starter code doesn't contain the solution.""" + print("\nImplementation Check:") + + with open("server.py", "r") as f: + content = f.read() + + # Check that tool functions are not implemented + solution_indicators = [ + "subprocess.run", # Git commands + "json.dumps", # Returning JSON + "git diff", # Git operations + "template", # Template logic + ] + + found_implementations = [] + for indicator in solution_indicators: + if indicator in content.lower(): + found_implementations.append(indicator) + + if found_implementations: + print(f"⚠️ Found possible solution code: {', 
'.join(found_implementations)}") + print(" Make sure these are only in comments/examples") + return True # Warning, not failure + else: + print("✓ No solution implementation found (good!)") + return True + +def main(): + """Run all validation checks.""" + print("Module 1 Starter Code Validation") + print("=" * 50) + + # Change to starter directory if needed + if Path("validate_starter.py").exists(): + os.chdir(Path("validate_starter.py").parent) + + tests = [ + ("Project Structure", test_project_structure), + ("Python Imports", test_imports), + ("TODO Comments", test_todos), + ("Starter Execution", test_starter_runs), + ("Dependencies", test_dependencies), + ("Clean Starter", test_no_implementation) + ] + + results = [] + for test_name, test_func in tests: + print(f"\n{test_name}:") + try: + results.append(test_func()) + except Exception as e: + print(f"✗ Test failed with error: {e}") + results.append(False) + + print("\n" + "=" * 50) + passed = sum(results) + total = len(results) + print(f"Checks passed: {passed}/{total}") + + if passed == total: + print("\n✓ Starter code is ready for learners!") + print("\nLearners should:") + print("1. Run: uv sync") + print("2. Follow the TODO comments in server.py") + print("3. Test with: uv run pytest test_server.py") + print("4. Configure Claude Code when ready") + else: + print("\n✗ Some checks failed. 
Please review the starter code.") + sys.exit(1) + +if __name__ == "__main__": + main() \ No newline at end of file diff --git a/projects/unit3/github-actions-integration/.gitignore b/projects/unit3/github-actions-integration/.gitignore new file mode 100644 index 0000000..452e01f --- /dev/null +++ b/projects/unit3/github-actions-integration/.gitignore @@ -0,0 +1,2 @@ +# Test data +github_events.json \ No newline at end of file diff --git a/projects/unit3/github-actions-integration/solution/README.md b/projects/unit3/github-actions-integration/solution/README.md new file mode 100644 index 0000000..0105ac7 --- /dev/null +++ b/projects/unit3/github-actions-integration/solution/README.md @@ -0,0 +1,168 @@ +# Module 2: GitHub Actions Integration + +This module extends the PR Agent with GitHub Actions webhook integration and MCP Prompts for standardized CI/CD workflows. + +## Features Added in Module 2 + +1. **GitHub Actions Tools**: + - `get_recent_actions_events()` - View recent webhook events + - `get_workflow_status()` - Check workflow statuses + +2. **MCP Prompts for CI/CD**: + - `analyze_ci_results` - Comprehensive CI/CD analysis + - `create_deployment_summary` - Team-friendly deployment updates + - `generate_pr_status_report` - Combined code and CI/CD report + - `troubleshoot_workflow_failure` - Systematic debugging guide + +3. 
**Webhook Server**: + - Separate script that runs on port 8080 + - Receives GitHub Actions events + - Stores events in `github_events.json` for the MCP server to read + +## Installation + +```bash +# From the solution directory +uv sync +``` + +## Setting Up Cloudflare Tunnel + +To receive GitHub webhooks locally, you'll need to set up Cloudflare Tunnel (cloudflared): + +### Step 1: Install cloudflared + +**macOS:** +```bash +brew install cloudflared +``` + +**Windows:** +Download the Windows installer from: +https://github.com/cloudflare/cloudflared/releases/latest/download/cloudflared-windows-amd64.msi + +**Linux:** +```bash +# For Debian/Ubuntu (amd64) +curl -L https://github.com/cloudflare/cloudflared/releases/latest/download/cloudflared-linux-amd64.deb -o cloudflared.deb +sudo dpkg -i cloudflared.deb + +# For other Linux distros, download the appropriate binary: +# https://github.com/cloudflare/cloudflared/releases/latest +``` + +### Step 2: Start the Tunnel + +```bash +# This creates a public URL that forwards to your local webhook server +cloudflared tunnel --url http://localhost:8080 +``` + +You'll see output like: +``` +Your quick tunnel has been created! Visit it at: +https://random-name-here.trycloudflare.com +``` + +### Step 3: Configure GitHub Webhook + +1. Go to your GitHub repository → Settings → Webhooks +2. Click "Add webhook" +3. Set **Payload URL** to: `https://your-tunnel-url.trycloudflare.com/webhook/github` +4. Set **Content type** to: `application/json` +5. Select events: + - Workflow runs + - Check runs + - Or choose "Send me everything" +6. Click "Add webhook" + +## Running the Server + +### For Development + +```bash +# Terminal 1: Start the webhook server +python webhook_server.py + +# Terminal 2: Start Cloudflare Tunnel (if testing with real GitHub) +cloudflared tunnel --url http://localhost:8080 + +# Terminal 3: Start the MCP server +uv run server.py +``` + +### With Claude Code + +1. 
Add to Claude Code settings: +```json +{ + "mcpServers": { + "pr-agent-actions": { + "command": "uv", + "args": ["run", "server.py"], + "cwd": "/path/to/github-actions-integration/solution" + } + } +} +``` + +2. Restart Claude Code +3. In a separate terminal, start the webhook server: `python webhook_server.py` +4. (Optional) Start Cloudflare Tunnel if testing with real GitHub webhooks + +## Testing Webhooks + +### Manual Test +```bash +# Send a test webhook +curl -X POST http://localhost:8080/webhook/github \ + -H "Content-Type: application/json" \ + -H "X-GitHub-Event: workflow_run" \ + -d '{ + "action": "completed", + "workflow_run": { + "id": 123456789, + "name": "CI Tests", + "head_branch": "main", + "run_number": 42, + "status": "completed", + "conclusion": "success", + "html_url": "https://github.com/user/repo/actions/runs/123456789", + "updated_at": "2024-01-01T10:00:00Z" + }, + "repository": { + "full_name": "user/repo" + }, + "sender": { + "login": "test-user" + } + }' +``` + +### With Claude Code + +After setting up webhooks and pushing a commit: + +1. **Check recent events**: + - Ask: "What GitHub Actions events have we received?" + - Claude will use `get_recent_actions_events()` + +2. **Analyze CI status**: + - Use the prompt: "Analyze CI Results" + - Claude will check workflows and provide insights + +3. 
**Create deployment summary**: + - Use the prompt: "Create Deployment Summary" + - Claude will format a team-friendly update + +## Module Structure + +- `server.py` - Main MCP server with Tools and Prompts +- `webhook_server.py` - Separate webhook server that stores events +- `github_events.json` - File where webhook events are stored (created automatically) +- `pyproject.toml` - Dependencies for both servers +- `README.md` - This file + +## Next Steps + +- Complete the exercises in the module +- Experiment with different prompt workflows +- Move on to Module 3 for Hugging Face Hub integration \ No newline at end of file diff --git a/projects/unit3/github-actions-integration/solution/manual_test.md b/projects/unit3/github-actions-integration/solution/manual_test.md new file mode 100644 index 0000000..ad4940b --- /dev/null +++ b/projects/unit3/github-actions-integration/solution/manual_test.md @@ -0,0 +1,266 @@ +# Manual Testing Guide - Module 2: GitHub Actions Integration + +This guide walks through comprehensive testing of the GitHub Actions integration module. + +## Prerequisites + +- Python environment with `uv` installed +- Claude Code configured +- Port 8080 available +- (Optional) Cloudflare Tunnel installed for real GitHub testing + +## Test Sequence + +### 1. Environment Setup + +```bash +cd projects/unit3/github-actions-integration/solution +uv sync +``` + +### 2. Start Services + +**Terminal 1 - Webhook Server:** +```bash +python webhook_server.py +``` + +Expected output: +``` +🚀 Starting webhook server on http://localhost:8080 +📝 Events will be saved to: /path/to/github_events.json +🔗 Webhook URL: http://localhost:8080/webhook/github +``` + +**Terminal 2 - MCP Server:** +```bash +uv run server.py +``` + +Expected output: +``` +Starting PR Agent MCP server... +To receive GitHub webhooks, run the webhook server separately: + python webhook_server.py +``` + +### 3. 
Claude Code Configuration + +Add to Claude Code settings and restart: +```json +{ + "mcpServers": { + "pr-agent-actions": { + "command": "uv", + "args": ["run", "server.py"], + "cwd": "/path/to/github-actions-integration/solution" + } + } +} +``` + +### 4. Test Module 1 Functionality + +Verify existing tools still work: + +``` +User: "Analyze my file changes" +Expected: Claude uses analyze_file_changes() tool + +User: "Show me available PR templates" +Expected: Claude lists all templates + +User: "Suggest a template for a bug fix" +Expected: Claude recommends bug.md template +``` + +### 5. Send Test Webhook Events + +**Success Event:** +```bash +curl -X POST http://localhost:8080/webhook/github \ + -H "Content-Type: application/json" \ + -H "X-GitHub-Event: workflow_run" \ + -d '{ + "action": "completed", + "workflow_run": { + "id": 123456789, + "name": "CI Tests", + "head_branch": "main", + "head_sha": "abc123", + "path": ".github/workflows/ci.yml", + "display_title": "CI Tests for PR #42", + "run_number": 42, + "event": "push", + "status": "completed", + "conclusion": "success", + "workflow_id": 12345, + "url": "https://api.github.com/repos/user/repo/actions/runs/123456789", + "html_url": "https://github.com/user/repo/actions/runs/123456789", + "created_at": "2024-01-01T10:00:00Z", + "updated_at": "2024-01-01T10:05:00Z", + "run_started_at": "2024-01-01T10:00:00Z" + }, + "repository": { + "id": 987654321, + "name": "repo", + "full_name": "user/repo", + "owner": { + "login": "user", + "type": "User" + } + }, + "sender": { + "login": "test-user", + "type": "User" + } + }' +``` + +**Failure Event:** +```bash +curl -X POST http://localhost:8080/webhook/github \ + -H "Content-Type: application/json" \ + -H "X-GitHub-Event: workflow_run" \ + -d '{ + "action": "completed", + "workflow_run": { + "id": 123456790, + "name": "Deploy to Production", + "head_branch": "release/v1.0", + "status": "completed", + "conclusion": "failure", + "run_number": 15, + "html_url": 
"https://github.com/user/repo/actions/runs/123456790", + "updated_at": "2024-01-01T10:10:00Z" + }, + "repository": {"full_name": "user/repo"}, + "sender": {"login": "test-user"} + }' +``` + +**In-Progress Event:** +```bash +curl -X POST http://localhost:8080/webhook/github \ + -H "Content-Type: application/json" \ + -H "X-GitHub-Event: workflow_run" \ + -d '{ + "action": "in_progress", + "workflow_run": { + "id": 123456791, + "name": "Build and Test", + "status": "in_progress", + "conclusion": null, + "run_number": 100, + "html_url": "https://github.com/user/repo/actions/runs/123456791", + "updated_at": "2024-01-01T10:15:00Z" + }, + "repository": {"full_name": "user/repo"}, + "sender": {"login": "test-user"} + }' +``` + +### 6. Test New Tools + +``` +User: "What GitHub Actions events have we received?" +Expected: Claude shows all 3 events with details + +User: "Show me the workflow status" +Expected: Claude shows: +- CI Tests: success +- Deploy to Production: failure +- Build and Test: in_progress +``` + +### 7. Test MCP Prompts + +Test each prompt by name: + +**Prompt 1:** +``` +User: "analyze ci results" +Expected: Claude provides formatted analysis showing: +- Overall health: Warning (due to failure) +- Failed workflows: Deploy to Production +- Successful workflows: CI Tests +- In-progress: Build and Test +- Recommendations based on failures +``` + +**Prompt 2:** +``` +User: "create deployment summary" +Expected: Claude creates Slack-style message about the deployment failure +``` + +**Prompt 3:** +``` +User: "generate pr status report" +Expected: Claude combines file changes analysis with CI/CD status +``` + +**Prompt 4:** +``` +User: "troubleshoot workflow failure" +Expected: Claude provides structured troubleshooting for Deploy to Production failure +``` + +### 8. Verify File Storage + +Check that events are persisted: +```bash +cat github_events.json +``` + +Should contain all 3 events in JSON array format. + +### 9. 
Test Event Limit + +Send additional events and verify that only the most recent 100 are kept in `github_events.json` (the cap only becomes observable once more than 100 events have been sent; manual verification). + +### 10. Test with Real GitHub (Optional) + +**Terminal 3 - Cloudflare Tunnel:** +```bash +cloudflared tunnel --url http://localhost:8080 +``` + +1. Note the tunnel URL (e.g., https://random-name.trycloudflare.com) +2. Configure GitHub webhook: + - Go to repo Settings → Webhooks + - Add webhook with URL: `https://your-tunnel.trycloudflare.com/webhook/github` + - Select "Workflow runs" events +3. Push a commit or manually trigger a workflow +4. Verify real events appear in Claude + +## Troubleshooting + +### No events showing up +- Check that the webhook server is running +- Verify `github_events.json` exists +- Ensure the curl commands match the examples above + +### Port 8080 already in use +```bash +lsof -i :8080  # Find the process ID +kill -9 <PID>  # Kill it +``` + +### Prompts not working +- Use exact lowercase names with spaces +- Don't use quotes around prompt names + +### Claude not finding server +- Restart Claude Code after config changes +- Check that the `cwd` path is absolute + +## Success Criteria + +- [ ] All Module 1 tools work +- [ ] Webhook server receives and stores events +- [ ] Both new tools return correct data +- [ ] All 4 prompts execute successfully +- [ ] Events persist in JSON file +- [ ] Multiple event types handled correctly +- [ ] Real GitHub integration works (if tested) \ No newline at end of file diff --git a/projects/unit3/github-actions-integration/solution/pyproject.toml b/projects/unit3/github-actions-integration/solution/pyproject.toml new file mode 100644 index 0000000..611a3ab --- /dev/null +++ b/projects/unit3/github-actions-integration/solution/pyproject.toml @@ -0,0 +1,29 @@ +[project] +name = "pr-agent-actions" +version = "2.0.0" +description = "MCP server with GitHub Actions integration and Prompts" +readme = "README.md" +requires-python = ">=3.10" +dependencies = [ + "mcp[cli]>=1.0.0", + "aiohttp>=3.10.0,<4.0.0", +] + +[project.optional-dependencies] +dev = [ + 
"pytest>=8.3.0", + "pytest-asyncio>=0.21.0", +] + +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" + +[tool.hatch.build.targets.wheel] +packages = ["."] + +[tool.uv] +dev-dependencies = [ + "pytest>=8.3.0", + "pytest-asyncio>=0.21.0", +] \ No newline at end of file diff --git a/projects/unit3/github-actions-integration/solution/server.py b/projects/unit3/github-actions-integration/solution/server.py new file mode 100644 index 0000000..4a11633 --- /dev/null +++ b/projects/unit3/github-actions-integration/solution/server.py @@ -0,0 +1,402 @@ +#!/usr/bin/env python3 +""" +Module 2: GitHub Actions Integration with MCP Prompts +Extends the PR agent with webhook handling and standardized CI/CD workflows using Prompts. +""" + +import json +import os +import subprocess +from typing import Dict, Any, Optional +from pathlib import Path + +from mcp.server.fastmcp import FastMCP + +# Initialize the FastMCP server +mcp = FastMCP("pr-agent-actions") + +# PR template directory (shared between starter and solution) +TEMPLATES_DIR = Path(__file__).parent.parent.parent / "templates" + +# File where webhook server stores events +EVENTS_FILE = Path(__file__).parent / "github_events.json" + + +# ===== Original Tools from Module 1 (with output limiting) ===== + +@mcp.tool() +async def analyze_file_changes( + base_branch: str = "main", + include_diff: bool = True, + max_diff_lines: int = 500, + working_directory: Optional[str] = None +) -> str: + """Get the full diff and list of changed files in the current git repository. 
+ + Args: + base_branch: Base branch to compare against (default: main) + include_diff: Include the full diff content (default: true) + max_diff_lines: Maximum number of diff lines to include (default: 500) + working_directory: Directory to run git commands in (default: current directory) + """ + try: + # Try to get working directory from roots first + if working_directory is None: + try: + context = mcp.get_context() + roots_result = await context.session.list_roots() + # Get the first root - Claude Code sets this to the CWD + root = roots_result.roots[0] + # FileUrl object has a .path property that gives us the path directly + working_directory = root.uri.path + except Exception as e: + # If we can't get roots, fall back to current directory + pass + + # Use provided working directory or current directory + cwd = working_directory if working_directory else os.getcwd() + # Get list of changed files + files_result = subprocess.run( + ["git", "diff", "--name-status", f"{base_branch}...HEAD"], + capture_output=True, + text=True, + check=True, + cwd=cwd + ) + + # Get diff statistics + stat_result = subprocess.run( + ["git", "diff", "--stat", f"{base_branch}...HEAD"], + capture_output=True, + text=True, + cwd=cwd + ) + + # Get the actual diff if requested + diff_content = "" + truncated = False + if include_diff: + diff_result = subprocess.run( + ["git", "diff", f"{base_branch}...HEAD"], + capture_output=True, + text=True, + cwd=cwd + ) + diff_lines = diff_result.stdout.split('\n') + + # Check if we need to truncate + if len(diff_lines) > max_diff_lines: + diff_content = '\n'.join(diff_lines[:max_diff_lines]) + diff_content += f"\n\n... Output truncated. Showing {max_diff_lines} of {len(diff_lines)} lines ..." + diff_content += "\n... Use max_diff_lines parameter to see more ..." 
+ truncated = True + else: + diff_content = diff_result.stdout + + # Get commit messages for context + commits_result = subprocess.run( + ["git", "log", "--oneline", f"{base_branch}..HEAD"], + capture_output=True, + text=True, + cwd=cwd + ) + + analysis = { + "base_branch": base_branch, + "files_changed": files_result.stdout, + "statistics": stat_result.stdout, + "commits": commits_result.stdout, + "diff": diff_content if include_diff else "Diff not included (set include_diff=true to see full diff)", + "truncated": truncated, + "total_diff_lines": len(diff_lines) if include_diff else 0 + } + + return json.dumps(analysis, indent=2) + + except subprocess.CalledProcessError as e: + return json.dumps({"error": f"Git error: {e.stderr}"}) + except Exception as e: + return json.dumps({"error": str(e)}) + + +@mcp.tool() +async def get_pr_templates() -> str: + """List available PR templates with their content.""" + templates = [] + + # Define default templates + default_templates = { + "bug.md": "Bug Fix", + "feature.md": "Feature", + "docs.md": "Documentation", + "refactor.md": "Refactor", + "test.md": "Test", + "performance.md": "Performance", + "security.md": "Security" + } + + for filename, template_type in default_templates.items(): + template_path = TEMPLATES_DIR / filename + + # Read template content + content = template_path.read_text() + + templates.append({ + "filename": filename, + "type": template_type, + "content": content + }) + + return json.dumps(templates, indent=2) + + +@mcp.tool() +async def suggest_template(changes_summary: str, change_type: str) -> str: + """Let Claude analyze the changes and suggest the most appropriate PR template. + + Args: + changes_summary: Your analysis of what the changes do + change_type: The type of change you've identified (bug, feature, docs, refactor, test, etc.) 
+ """ + + # Get available templates + templates_response = await get_pr_templates() + templates = json.loads(templates_response) + + # Map change types to template files + type_mapping = { + "bug": "bug.md", + "fix": "bug.md", + "feature": "feature.md", + "enhancement": "feature.md", + "docs": "docs.md", + "documentation": "docs.md", + "refactor": "refactor.md", + "cleanup": "refactor.md", + "test": "test.md", + "testing": "test.md", + "performance": "performance.md", + "optimization": "performance.md", + "security": "security.md" + } + + # Find matching template + template_file = type_mapping.get(change_type.lower(), "feature.md") + selected_template = next( + (t for t in templates if t["filename"] == template_file), + templates[0] # Default to first template if no match + ) + + suggestion = { + "recommended_template": selected_template, + "reasoning": f"Based on your analysis: '{changes_summary}', this appears to be a {change_type} change.", + "template_content": selected_template["content"], + "usage_hint": "Claude can help you fill out this template based on the specific changes in your PR." + } + + return json.dumps(suggestion, indent=2) + + +# ===== New Module 2: GitHub Actions Tools ===== + +@mcp.tool() +async def get_recent_actions_events(limit: int = 10) -> str: + """Get recent GitHub Actions events received via webhook. + + Args: + limit: Maximum number of events to return (default: 10) + """ + # Read events from file + if not EVENTS_FILE.exists(): + return json.dumps([]) + + with open(EVENTS_FILE, 'r') as f: + events = json.load(f) + + # Return most recent events + recent = events[-limit:] + return json.dumps(recent, indent=2) + + +@mcp.tool() +async def get_workflow_status(workflow_name: Optional[str] = None) -> str: + """Get the current status of GitHub Actions workflows. 
+ + Args: + workflow_name: Optional specific workflow name to filter by + """ + # Read events from file + if not EVENTS_FILE.exists(): + return json.dumps({"message": "No GitHub Actions events received yet"}) + + with open(EVENTS_FILE, 'r') as f: + events = json.load(f) + + if not events: + return json.dumps({"message": "No GitHub Actions events received yet"}) + + # Filter for workflow events + workflow_events = [ + e for e in events + if e.get("workflow_run") is not None + ] + + if workflow_name: + workflow_events = [ + e for e in workflow_events + if e["workflow_run"].get("name") == workflow_name + ] + + # Group by workflow and get latest status + workflows = {} + for event in workflow_events: + run = event["workflow_run"] + name = run["name"] + if name not in workflows or run["updated_at"] > workflows[name]["updated_at"]: + workflows[name] = { + "name": name, + "status": run["status"], + "conclusion": run.get("conclusion"), + "run_number": run["run_number"], + "updated_at": run["updated_at"], + "html_url": run["html_url"] + } + + return json.dumps(list(workflows.values()), indent=2) + + +# ===== New Module 2: MCP Prompts ===== + +@mcp.prompt() +async def analyze_ci_results(): + """Analyze recent CI/CD results and provide insights.""" + return """Please analyze the recent CI/CD results from GitHub Actions: + +1. First, call get_recent_actions_events() to fetch the latest CI/CD events +2. Then call get_workflow_status() to check current workflow states +3. Identify any failures or issues that need attention +4. 
Provide actionable next steps based on the results + +Format your response as: +## CI/CD Status Summary +- **Overall Health**: [Good/Warning/Critical] +- **Failed Workflows**: [List any failures with links] +- **Successful Workflows**: [List recent successes] +- **Recommendations**: [Specific actions to take] +- **Trends**: [Any patterns you notice]""" + + +@mcp.prompt() +async def create_deployment_summary(): + """Generate a deployment summary for team communication.""" + return """Create a deployment summary for team communication: + +1. Check workflow status with get_workflow_status() +2. Look specifically for deployment-related workflows +3. Note the deployment outcome, timing, and any issues + +Format as a concise message suitable for Slack: + +🚀 **Deployment Update** +- **Status**: [✅ Success / ❌ Failed / ⏳ In Progress] +- **Environment**: [Production/Staging/Dev] +- **Version/Commit**: [If available from workflow data] +- **Duration**: [If available] +- **Key Changes**: [Brief summary if available] +- **Issues**: [Any problems encountered] +- **Next Steps**: [Required actions if failed] + +Keep it brief but informative for team awareness.""" + + +@mcp.prompt() +async def generate_pr_status_report(): + """Generate a comprehensive PR status report including CI/CD results.""" + return """Generate a comprehensive PR status report: + +1. Use analyze_file_changes() to understand what changed +2. Use get_workflow_status() to check CI/CD status +3. Use suggest_template() to recommend the appropriate PR template +4. Combine all information into a cohesive report + +Create a detailed report with: + +## 📋 PR Status Report + +### 📝 Code Changes +- **Files Modified**: [Count by type - .py, .js, etc.] +- **Change Type**: [Feature/Bug/Refactor/etc.] 
+- **Impact Assessment**: [High/Medium/Low with reasoning] +- **Key Changes**: [Bullet points of main modifications] + +### 🔄 CI/CD Status +- **All Checks**: [✅ Passing / ❌ Failing / ⏳ Running] +- **Test Results**: [Pass rate, failed tests if any] +- **Build Status**: [Success/Failed with details] +- **Code Quality**: [Linting, coverage if available] + +### 📌 Recommendations +- **PR Template**: [Suggested template and why] +- **Next Steps**: [What needs to happen before merge] +- **Reviewers**: [Suggested reviewers based on files changed] + +### ⚠️ Risks & Considerations +- [Any deployment risks] +- [Breaking changes] +- [Dependencies affected]""" + + +@mcp.prompt() +async def troubleshoot_workflow_failure(): + """Help troubleshoot a failing GitHub Actions workflow.""" + return """Help troubleshoot failing GitHub Actions workflows: + +1. Use get_recent_actions_events() to find recent failures +2. Use get_workflow_status() to see which workflows are failing +3. Analyze the failure patterns and timing +4. Provide systematic troubleshooting steps + +Structure your response as: + +## 🔧 Workflow Troubleshooting Guide + +### ❌ Failed Workflow Details +- **Workflow Name**: [Name of failing workflow] +- **Failure Type**: [Test/Build/Deploy/Lint] +- **First Failed**: [When did it start failing] +- **Failure Rate**: [Intermittent or consistent] + +### 🔍 Diagnostic Information +- **Error Patterns**: [Common error messages or symptoms] +- **Recent Changes**: [What changed before failures started] +- **Dependencies**: [External services or resources involved] + +### 💡 Possible Causes (ordered by likelihood) +1. **[Most Likely]**: [Description and why] +2. **[Likely]**: [Description and why] +3. 
**[Possible]**: [Description and why] + +### ✅ Suggested Fixes +**Immediate Actions:** +- [ ] [Quick fix to try first] +- [ ] [Second quick fix] + +**Investigation Steps:** +- [ ] [How to gather more info] +- [ ] [Logs or data to check] + +**Long-term Solutions:** +- [ ] [Preventive measure] +- [ ] [Process improvement] + +### 📚 Resources +- [Relevant documentation links] +- [Similar issues or solutions]""" + + +if __name__ == "__main__": + # Run MCP server normally + print("Starting PR Agent MCP server...") + print("To receive GitHub webhooks, run the webhook server separately:") + print(" python webhook_server.py") + mcp.run() \ No newline at end of file diff --git a/projects/unit3/github-actions-integration/solution/test_server.py b/projects/unit3/github-actions-integration/solution/test_server.py new file mode 100644 index 0000000..0bb24fd --- /dev/null +++ b/projects/unit3/github-actions-integration/solution/test_server.py @@ -0,0 +1,193 @@ +#!/usr/bin/env python3 +""" +Unit tests for Module 1: Basic MCP Server +""" + +import json +import pytest +import asyncio +from pathlib import Path +from unittest.mock import patch, MagicMock, AsyncMock +from server import ( + mcp, + analyze_file_changes, + get_pr_templates, + suggest_template, + create_default_template, + TEMPLATES_DIR +) + + +class TestAnalyzeFileChanges: + """Test the analyze_file_changes tool.""" + + @pytest.mark.asyncio + async def test_analyze_with_diff(self): + """Test analyzing changes with full diff included.""" + mock_result = MagicMock() + mock_result.stdout = "M\tfile1.py\nA\tfile2.py\n" + mock_result.stderr = "" + + with patch('subprocess.run') as mock_run: + mock_run.return_value = mock_result + + result = await analyze_file_changes("main", include_diff=True) + + assert isinstance(result, str) + data = json.loads(result) + assert data["base_branch"] == "main" + assert "files_changed" in data + assert "statistics" in data + assert "commits" in data + assert "diff" in data + + 
@pytest.mark.asyncio + async def test_analyze_without_diff(self): + """Test analyzing changes without diff content.""" + mock_result = MagicMock() + mock_result.stdout = "M\tfile1.py\n" + + with patch('subprocess.run') as mock_run: + mock_run.return_value = mock_result + + result = await analyze_file_changes("main", include_diff=False) + + data = json.loads(result) + assert "Diff not included" in data["diff"] + + @pytest.mark.asyncio + async def test_analyze_git_error(self): + """Test handling git command errors.""" + with patch('subprocess.run') as mock_run: + mock_run.side_effect = Exception("Git not found") + + result = await analyze_file_changes("main", True) + + # The server reports failures as JSON with an "error" key + data = json.loads(result) + assert "Git not found" in data["error"] + + +class TestPRTemplates: + """Test PR template management.""" + + @pytest.mark.asyncio + async def test_get_templates(self, tmp_path, monkeypatch): + """Test getting available templates.""" + # Use temporary directory for templates + monkeypatch.setattr('server.TEMPLATES_DIR', tmp_path) + + result = await get_pr_templates() + + templates = json.loads(result) + assert len(templates) > 0 + assert any(t["type"] == "Bug Fix" for t in templates) + assert any(t["type"] == "Feature" for t in templates) + assert all("content" in t for t in templates) + + def test_create_default_template(self, tmp_path): + """Test creating default template files.""" + template_path = tmp_path / "test.md" + + create_default_template(template_path, "Bug Fix") + + assert template_path.exists() + content = template_path.read_text() + assert "## Bug Fix" in content + assert "Description" in content + assert "Root Cause" in content + + +class TestSuggestTemplate: + """Test template suggestion based on analysis.""" + + @pytest.mark.asyncio + async def test_suggest_bug_fix(self, tmp_path, monkeypatch): + """Test suggesting bug fix template.""" + monkeypatch.setattr('server.TEMPLATES_DIR', tmp_path) + + # Create templates first + await get_pr_templates() + + result = await suggest_template( + "Fixed null 
pointer exception in user service", + "bug" + ) + + suggestion = json.loads(result) + assert suggestion["recommended_template"]["filename"] == "bug.md" + assert "Bug Fix" in suggestion["recommended_template"]["type"] + assert "reasoning" in suggestion + + @pytest.mark.asyncio + async def test_suggest_feature(self, tmp_path, monkeypatch): + """Test suggesting feature template.""" + monkeypatch.setattr('server.TEMPLATES_DIR', tmp_path) + + await get_pr_templates() + + result = await suggest_template( + "Added new authentication method for API", + "feature" + ) + + suggestion = json.loads(result) + assert suggestion["recommended_template"]["filename"] == "feature.md" + + @pytest.mark.asyncio + async def test_suggest_with_type_variations(self, tmp_path, monkeypatch): + """Test template suggestion with various type names.""" + monkeypatch.setattr('server.TEMPLATES_DIR', tmp_path) + + await get_pr_templates() + + # Test variations + for change_type, expected_file in [ + ("fix", "bug.md"), + ("enhancement", "feature.md"), + ("documentation", "docs.md"), + ("cleanup", "refactor.md"), + ("testing", "test.md"), + ("optimization", "performance.md") + ]: + result = await suggest_template(f"Some {change_type} work", change_type) + suggestion = json.loads(result) + assert suggestion["recommended_template"]["filename"] == expected_file + + +class TestIntegration: + """Integration tests for the complete workflow.""" + + @pytest.mark.asyncio + async def test_full_workflow(self, tmp_path, monkeypatch): + """Test the complete workflow from analysis to suggestion.""" + monkeypatch.setattr('server.TEMPLATES_DIR', tmp_path) + + # Mock git commands + with patch('subprocess.run') as mock_run: + mock_run.return_value = MagicMock( + stdout="M\tsrc/main.py\nM\ttests/test_main.py\n", + stderr="" + ) + + # 1. Analyze changes + analysis_result = await analyze_file_changes("main", True) + + # 2. Get templates + templates_result = await get_pr_templates() + + # 3. 
Suggest template based on analysis + suggestion_result = await suggest_template( + "Updated main functionality and added tests", + "feature" + ) + + # Verify results + assert all(isinstance(r, str) for r in [analysis_result, templates_result, suggestion_result]) + + suggestion = json.loads(suggestion_result) + assert "recommended_template" in suggestion + assert "template_content" in suggestion + assert suggestion["recommended_template"]["type"] == "Feature" + + +if __name__ == "__main__": + pytest.main([__file__, "-v"]) \ No newline at end of file diff --git a/projects/unit3/github-actions-integration/solution/uv.lock b/projects/unit3/github-actions-integration/solution/uv.lock new file mode 100644 index 0000000..9854dc8 --- /dev/null +++ b/projects/unit3/github-actions-integration/solution/uv.lock @@ -0,0 +1,983 @@ +version = 1 +requires-python = ">=3.10" + +[[package]] +name = "aiohappyeyeballs" +version = "2.6.1" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohappyeyeballs/2.6.1/aiohappyeyeballs-2.6.1.tar.gz", hash = "sha256:c3f9d0113123803ccadfdf3f0faa505bc78e6a72d1cc4806cbd719826e943558" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohappyeyeballs/2.6.1/aiohappyeyeballs-2.6.1-py3-none-any.whl", hash = "sha256:f349ba8f4b75cb25c99c5c2d84e997e485204d2902a9597802b0371f09331fb8" }, +] + +[[package]] +name = "aiohttp" +version = "3.11.18" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "aiohappyeyeballs" }, + { name = "aiosignal" }, + { name = "async-timeout", marker = "python_full_version < '3.11'" }, + { name = "attrs" }, + { name = "frozenlist" }, + { name = "multidict" }, + { name = "propcache" }, + { name = "yarl" }, +] +sdist = { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18.tar.gz", hash = "sha256:ae856e1138612b7e412db63b7708735cff4d38d0399f6a5435d3dac2669f558a" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:96264854fedbea933a9ca4b7e0c745728f01380691687b7365d18d9e977179c4" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:9602044ff047043430452bc3a2089743fa85da829e6fc9ee0025351d66c332b6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5691dc38750fcb96a33ceef89642f139aa315c8a193bbd42a0c33476fd4a1609" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:554c918ec43f8480b47a5ca758e10e793bd7410b83701676a4782672d670da55" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8a4076a2b3ba5b004b8cffca6afe18a3b2c5c9ef679b4d1e9859cf76295f8d4f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:767a97e6900edd11c762be96d82d13a1d7c4fc4b329f054e88b57cdc21fded94" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f0ddc9337a0fb0e727785ad4f41163cc314376e82b31846d3835673786420ef1" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f414f37b244f2a97e79b98d48c5ff0789a0b4b4609b17d64fa81771ad780e415" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:fdb239f47328581e2ec7744ab5911f97afb10752332a6dd3d98e14e429e1a9e7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:f2c50bad73ed629cc326cc0f75aed8ecfb013f88c5af116f33df556ed47143eb" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:0a8d8f20c39d3fa84d1c28cdb97f3111387e48209e224408e75f29c6f8e0861d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:106032eaf9e62fd6bc6578c8b9e6dc4f5ed9a5c1c7fb2231010a1b4304393421" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:b491e42183e8fcc9901d8dcd8ae644ff785590f1727f76ca86e731c61bfe6643" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:ad8c745ff9460a16b710e58e06a9dec11ebc0d8f4dd82091cefb579844d69868" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-win32.whl", hash = "sha256:8e57da93e24303a883146510a434f0faf2f1e7e659f3041abc4e3fb3f6702a9f" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-win_amd64.whl", hash = "sha256:cc93a4121d87d9f12739fc8fab0a95f78444e571ed63e40bfc78cd5abe700ac9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:427fdc56ccb6901ff8088544bde47084845ea81591deb16f957897f0f0ba1be9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:2c828b6d23b984255b85b9b04a5b963a74278b7356a7de84fda5e3b76866597b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:5c2eaa145bb36b33af1ff2860820ba0589e165be4ab63a49aebfd0981c173b66" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3d518ce32179f7e2096bf4e3e8438cf445f05fedd597f252de9f54c728574756" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0700055a6e05c2f4711011a44364020d7a10fbbcd02fbf3e30e8f7e7fddc8717" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8bd1cde83e4684324e6ee19adfc25fd649d04078179890be7b29f76b501de8e4" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:73b8870fe1c9a201b8c0d12c94fe781b918664766728783241a79e0468427e4f" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:25557982dd36b9e32c0a3357f30804e80790ec2c4d20ac6bcc598533e04c6361" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:7e889c9df381a2433802991288a61e5a19ceb4f61bd14f5c9fa165655dcb1fd1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:9ea345fda05bae217b6cce2acf3682ce3b13d0d16dd47d0de7080e5e21362421" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:9f26545b9940c4b46f0a9388fd04ee3ad7064c4017b5a334dd450f616396590e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:3a621d85e85dccabd700294494d7179ed1590b6d07a35709bb9bd608c7f5dd1d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:9c23fd8d08eb9c2af3faeedc8c56e134acdaf36e2117ee059d7defa655130e5f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:d9e6b0e519067caa4fd7fb72e3e8002d16a68e84e62e7291092a5433763dc0dd" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-win32.whl", hash = "sha256:122f3e739f6607e5e4c6a2f8562a6f476192a682a52bda8b4c6d4254e1138f4d" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-win_amd64.whl", hash = "sha256:e6f3c0a3a1e73e88af384b2e8a0b9f4fb73245afd47589df2afcab6b638fa0e6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:63d71eceb9cad35d47d71f78edac41fcd01ff10cacaa64e473d1aec13fa02df2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:d1929da615840969929e8878d7951b31afe0bac883d84418f92e5755d7b49508" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:7d0aebeb2392f19b184e3fdd9e651b0e39cd0f195cdb93328bd124a1d455cd0e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3849ead845e8444f7331c284132ab314b4dac43bfae1e3cf350906d4fff4620f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5e8452ad6b2863709f8b3d615955aa0807bc093c34b8e25b3b52097fe421cb7f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3b8d2b42073611c860a37f718b3d61ae8b4c2b124b2e776e2c10619d920350ec" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:40fbf91f6a0ac317c0a07eb328a1384941872f6761f2e6f7208b63c4cc0a7ff6" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:44ff5625413fec55216da5eaa011cf6b0a2ed67a565914a212a51aa3755b0009" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:7f33a92a2fde08e8c6b0c61815521324fc1612f397abf96eed86b8e31618fdb4" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:11d5391946605f445ddafda5eab11caf310f90cdda1fd99865564e3164f5cff9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:3cc314245deb311364884e44242e00c18b5896e4fe6d5f942e7ad7e4cb640adb" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:0f421843b0f70740772228b9e8093289924359d306530bcd3926f39acbe1adda" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:e220e7562467dc8d589e31c1acd13438d82c03d7f385c9cd41a3f6d1d15807c1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ab2ef72f8605046115bc9aa8e9d14fd49086d405855f40b79ed9e5c1f9f4faea" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-win32.whl", hash = "sha256:12a62691eb5aac58d65200c7ae94d73e8a65c331c3a86a2e9670927e94339ee8" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-win_amd64.whl", hash = "sha256:364329f319c499128fd5cd2d1c31c44f234c58f9b96cc57f743d16ec4f3238c8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:474215ec618974054cf5dc465497ae9708543cbfc312c65212325d4212525811" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:6ced70adf03920d4e67c373fd692123e34d3ac81dfa1c27e45904a628567d804" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:2d9f6c0152f8d71361905aaf9ed979259537981f47ad099c8b3d81e0319814bd" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a35197013ed929c0aed5c9096de1fc5a9d336914d73ab3f9df14741668c0616c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:540b8a1f3a424f1af63e0af2d2853a759242a1769f9f1ab053996a392bd70118" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f9e6710ebebfce2ba21cee6d91e7452d1125100f41b906fb5af3da8c78b764c1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f8af2ef3b4b652ff109f98087242e2ab974b2b2b496304063585e3d78de0b000" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:28c3f975e5ae3dbcbe95b7e3dcd30e51da561a0a0f2cfbcdea30fc1308d72137" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:c28875e316c7b4c3e745172d882d8a5c835b11018e33432d281211af35794a93" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:13cd38515568ae230e1ef6919e2e33da5d0f46862943fcda74e7e915096815f3" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:0e2a92101efb9f4c2942252c69c63ddb26d20f46f540c239ccfa5af865197bb8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:e6d3e32b8753c8d45ac550b11a1090dd66d110d4ef805ffe60fa61495360b3b2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:ea4cf2488156e0f281f93cc2fd365025efcba3e2d217cbe3df2840f8c73db261" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9d4df95ad522c53f2b9ebc07f12ccd2cb15550941e11a5bbc5ddca2ca56316d7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-win32.whl", hash = "sha256:cdd1bbaf1e61f0d94aced116d6e95fe25942f7a5f42382195fd9501089db5d78" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-win_amd64.whl", hash = "sha256:bdd619c27e44382cf642223f11cfd4d795161362a5a1fc1fa3940397bc89db01" }, +] + +[[package]] +name = "aiosignal" +version = "1.3.2" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "frozenlist" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiosignal/1.3.2/aiosignal-1.3.2.tar.gz", hash = "sha256:a8c255c66fafb1e499c9351d0bf32ff2d8a0321595ebac3b93713656d2436f54" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiosignal/1.3.2/aiosignal-1.3.2-py2.py3-none-any.whl", hash = "sha256:45cde58e409a301715980c2b01d0c28bdde3770d8290b5eb2173759d9acb31a5" }, +] + +[[package]] +name = "annotated-types" +version = "0.7.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/annotated-types/0.7.0/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/annotated-types/0.7.0/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53" }, +] + +[[package]] +name = "anyio" +version = "4.9.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "exceptiongroup", marker = "python_full_version < '3.11'" }, + { name = "idna" }, + { name = "sniffio" }, + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/anyio/4.9.0/anyio-4.9.0.tar.gz", hash = 
"sha256:673c0c244e15788651a4ff38710fea9675823028a6f08a5eda409e0c9840a028" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/anyio/4.9.0/anyio-4.9.0-py3-none-any.whl", hash = "sha256:9f76d541cad6e36af7beb62e978876f3b41e3e04f2c1fbf0884604c0a9c4d93c" }, +] + +[[package]] +name = "async-timeout" +version = "5.0.1" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/async-timeout/5.0.1/async_timeout-5.0.1.tar.gz", hash = "sha256:d9321a7a3d5a6a5e187e824d2fa0793ce379a202935782d555d6e9d2735677d3" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/async-timeout/5.0.1/async_timeout-5.0.1-py3-none-any.whl", hash = "sha256:39e3809566ff85354557ec2398b55e096c8364bacac9405a7a1fa429e77fe76c" }, +] + +[[package]] +name = "attrs" +version = "25.3.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/attrs/25.3.0/attrs-25.3.0.tar.gz", hash = "sha256:75d7cefc7fb576747b2c81b4442d4d4a1ce0900973527c011d1030fd3bf4af1b" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/attrs/25.3.0/attrs-25.3.0-py3-none-any.whl", hash = "sha256:427318ce031701fea540783410126f03899a97ffc6f61596ad581ac2e40e3bc3" }, +] + +[[package]] +name = "certifi" +version = "2025.4.26" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/certifi/2025.4.26/certifi-2025.4.26.tar.gz", hash = "sha256:0a816057ea3cdefcef70270d2c515e4506bbc954f417fa5ade2021213bb8f0c6" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/certifi/2025.4.26/certifi-2025.4.26-py3-none-any.whl", hash = 
"sha256:30350364dfe371162649852c63336a15c70c6510c2ad5015b21c2345311805f3" }, +] + +[[package]] +name = "click" +version = "8.1.8" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/click/8.1.8/click-8.1.8.tar.gz", hash = "sha256:ed53c9d8990d83c2a27deae68e4ee337473f6330c040a31d4225c9574d16096a" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/click/8.1.8/click-8.1.8-py3-none-any.whl", hash = "sha256:63c132bbbed01578a06712a2d1f497bb62d9c1c0d329b7903a866228027263b2" }, +] + +[[package]] +name = "colorama" +version = "0.4.6" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/colorama/0.4.6/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/colorama/0.4.6/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6" }, +] + +[[package]] +name = "exceptiongroup" +version = "1.3.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/exceptiongroup/1.3.0/exceptiongroup-1.3.0.tar.gz", hash = "sha256:b241f5885f560bc56a59ee63ca4c6a8bfa46ae4ad651af316d4e81817bb9fd88" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/exceptiongroup/1.3.0/exceptiongroup-1.3.0-py3-none-any.whl", hash = 
"sha256:4d111e6e0c13d0644cad6ddaa7ed0261a0b36971f6d23e7ec9b4b9097da78a10" }, +] + +[[package]] +name = "frozenlist" +version = "1.6.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0.tar.gz", hash = "sha256:b99655c32c1c8e06d111e7f41c06c29a5318cb1835df23a45518e02a47c63b68" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:e6e558ea1e47fd6fa8ac9ccdad403e5dd5ecc6ed8dda94343056fa4277d5c65e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:f4b3cd7334a4bbc0c472164f3744562cb72d05002cc6fcf58adb104630bbc352" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9799257237d0479736e2b4c01ff26b5c7f7694ac9692a426cb717f3dc02fff9b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f3a7bb0fe1f7a70fb5c6f497dc32619db7d2cdd53164af30ade2f34673f8b1fc" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:36d2fc099229f1e4237f563b2a3e0ff7ccebc3999f729067ce4e64a97a7f2869" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f27a9f9a86dcf00708be82359db8de86b80d029814e6693259befe82bb58a106" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:75ecee69073312951244f11b8627e3700ec2bfe07ed24e3a685a5979f0412d24" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f2c7d5aa19714b1b01a0f515d078a629e445e667b9da869a3cd0e6fe7dec78bd" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:69bbd454f0fb23b51cadc9bdba616c9678e4114b6f9fa372d462ff2ed9323ec8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:7daa508e75613809c7a57136dec4871a21bca3080b3a8fc347c50b187df4f00c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:89ffdb799154fd4d7b85c56d5fa9d9ad48946619e0eb95755723fffa11022d75" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:920b6bd77d209931e4c263223381d63f76828bec574440f29eb497cf3394c249" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:d3ceb265249fb401702fce3792e6b44c1166b9319737d21495d3611028d95769" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:52021b528f1571f98a7d4258c58aa8d4b1a96d4f01d00d51f1089f2e0323cb02" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:0f2ca7810b809ed0f1917293050163c7654cefc57a49f337d5cd9de717b8fad3" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-win32.whl", hash = "sha256:0e6f8653acb82e15e5443dba415fb62a8732b68fe09936bb6d388c725b57f812" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-win_amd64.whl", hash = "sha256:f1a39819a5a3e84304cd286e3dc62a549fe60985415851b3337b6f5cc91907f1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ae8337990e7a45683548ffb2fee1af2f1ed08169284cd829cdd9a7fa7470530d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:8c952f69dd524558694818a461855f35d36cc7f5c0adddce37e962c85d06eac0" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:8f5fef13136c4e2dee91bfb9a44e236fff78fc2cd9f838eddfc470c3d7d90afe" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:716bbba09611b4663ecbb7cd022f640759af8259e12a6ca939c0a6acd49eedba" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:7b8c4dc422c1a3ffc550b465090e53b0bf4839047f3e436a34172ac67c45d595" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b11534872256e1666116f6587a1592ef395a98b54476addb5e8d352925cb5d4a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1c6eceb88aaf7221f75be6ab498dc622a151f5f88d536661af3ffc486245a626" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:62c828a5b195570eb4b37369fcbbd58e96c905768d53a44d13044355647838ff" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e1c6bd2c6399920c9622362ce95a7d74e7f9af9bfec05fff91b8ce4b9647845a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:49ba23817781e22fcbd45fd9ff2b9b8cdb7b16a42a4851ab8025cae7b22e96d0" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:431ef6937ae0f853143e2ca67d6da76c083e8b1fe3df0e96f3802fd37626e606" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:9d124b38b3c299ca68433597ee26b7819209cb8a3a9ea761dfe9db3a04bba584" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = 
"sha256:118e97556306402e2b010da1ef21ea70cb6d6122e580da64c056b96f524fbd6a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:fb3b309f1d4086b5533cf7bbcf3f956f0ae6469664522f1bde4feed26fba60f1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:54dece0d21dce4fdb188a1ffc555926adf1d1c516e493c2914d7c370e454bc9e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-win32.whl", hash = "sha256:654e4ba1d0b2154ca2f096bed27461cf6160bc7f504a7f9a9ef447c293caf860" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-win_amd64.whl", hash = "sha256:3e911391bffdb806001002c1f860787542f45916c3baf764264a52765d5a5603" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:c5b9e42ace7d95bf41e19b87cec8f262c41d3510d8ad7514ab3862ea2197bfb1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:ca9973735ce9f770d24d5484dcb42f68f135351c2fc81a7a9369e48cf2998a29" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6ac40ec76041c67b928ca8aaffba15c2b2ee3f5ae8d0cb0617b5e63ec119ca25" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:95b7a8a3180dfb280eb044fdec562f9b461614c0ef21669aea6f1d3dac6ee576" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:c444d824e22da6c9291886d80c7d00c444981a72686e2b59d38b285617cb52c8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bb52c8166499a8150bfd38478248572c924c003cbb45fe3bcd348e5ac7c000f9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b35298b2db9c2468106278537ee529719228950a5fdda686582f68f247d1dc6e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d108e2d070034f9d57210f22fefd22ea0d04609fc97c5f7f5a686b3471028590" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4e1be9111cb6756868ac242b3c2bd1f09d9aea09846e4f5c23715e7afb647103" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:94bb451c664415f02f07eef4ece976a2c65dcbab9c2f1705b7031a3a75349d8c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:d1a686d0b0949182b8faddea596f3fc11f44768d1f74d4cad70213b2e139d821" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-musllinux_1_2_i686.whl", hash = 
"sha256:ea8e59105d802c5a38bdbe7362822c522230b3faba2aa35c0fa1765239b7dd70" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:abc4e880a9b920bc5020bf6a431a6bb40589d9bca3975c980495f63632e8382f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:9a79713adfe28830f27a3c62f6b5406c37376c892b05ae070906f07ae4487046" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:9a0318c2068e217a8f5e3b85e35899f5a19e97141a45bb925bb357cfe1daf770" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-win32.whl", hash = "sha256:853ac025092a24bb3bf09ae87f9127de9fe6e0c345614ac92536577cf956dfcc" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-win_amd64.whl", hash = "sha256:2bdfe2d7e6c9281c6e55523acd6c2bf77963cb422fdc7d142fb0cb6621b66878" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:1d7fb014fe0fbfee3efd6a94fc635aeaa68e5e1720fe9e57357f2e2c6e1a647e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:01bcaa305a0fdad12745502bfd16a1c75b14558dabae226852f9159364573117" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:8b314faa3051a6d45da196a2c495e922f987dc848e967d8cfeaee8a0328b1cd4" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:da62fecac21a3ee10463d153549d8db87549a5e77eefb8c91ac84bb42bb1e4e3" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:d1eb89bf3454e2132e046f9599fbcf0a4483ed43b40f545551a39316d0201cd1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d18689b40cb3936acd971f663ccb8e2589c45db5e2c5f07e0ec6207664029a9c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e67ddb0749ed066b1a03fba812e2dcae791dd50e5da03be50b6a14d0c1a9ee45" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fc5e64626e6682638d6e44398c9baf1d6ce6bc236d40b4b57255c9d3f9761f1f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:437cfd39564744ae32ad5929e55b18ebd88817f9180e4cc05e7d53b75f79ce85" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:62dd7df78e74d924952e2feb7357d826af8d2f307557a779d14ddf94d7311be8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-musllinux_1_2_armv7l.whl", hash = 
"sha256:a66781d7e4cddcbbcfd64de3d41a61d6bdde370fc2e38623f30b2bd539e84a9f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:482fe06e9a3fffbcd41950f9d890034b4a54395c60b5e61fae875d37a699813f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:e4f9373c500dfc02feea39f7a56e4f543e670212102cc2eeb51d3a99c7ffbde6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:e69bb81de06827147b7bfbaeb284d85219fa92d9f097e32cc73675f279d70188" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:7613d9977d2ab4a9141dde4a149f4357e4065949674c5649f920fec86ecb393e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-win32.whl", hash = "sha256:4def87ef6d90429f777c9d9de3961679abf938cb6b7b63d4a7eb8a268babfce4" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-win_amd64.whl", hash = "sha256:37a8a52c3dfff01515e9bbbee0e6063181362f9de3db2ccf9bc96189b557cbfd" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:46138f5a0773d064ff663d273b309b696293d7a7c00a0994c5c13a5078134b64" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:f88bc0a2b9c2a835cb888b32246c27cdab5740059fb3688852bf91e915399b91" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:777704c1d7655b802c7850255639672e90e81ad6fa42b99ce5ed3fbf45e338dd" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:85ef8d41764c7de0dcdaf64f733a27352248493a85a80661f3c678acd27e31f2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:da5cb36623f2b846fb25009d9d9215322318ff1c63403075f812b3b2876c8506" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cbb56587a16cf0fb8acd19e90ff9924979ac1431baea8681712716a8337577b0" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c6154c3ba59cda3f954c6333025369e42c3acd0c6e8b6ce31eb5c5b8116c07e0" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2e8246877afa3f1ae5c979fe85f567d220f86a50dc6c493b9b7d8191181ae01e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7b0f6cce16306d2e117cf9db71ab3a9e8878a28176aeaf0dbe35248d97b28d0c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash 
= "sha256:1b8e8cd8032ba266f91136d7105706ad57770f3522eac4a111d77ac126a25a9b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:e2ada1d8515d3ea5378c018a5f6d14b4994d4036591a52ceaf1a1549dec8e1ad" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:cdb2c7f071e4026c19a3e32b93a09e59b12000751fc9b0b7758da899e657d215" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:03572933a1969a6d6ab509d509e5af82ef80d4a5d4e1e9f2e1cdd22c77a3f4d2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:77effc978947548b676c54bbd6a08992759ea6f410d4987d69feea9cd0919911" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:a2bda8be77660ad4089caf2223fdbd6db1858462c4b85b67fbfa22102021e497" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-win32.whl", hash = "sha256:a4d96dc5bcdbd834ec6b0f91027817214216b5b30316494d2b1aebffb87c534f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-win_amd64.whl", hash = "sha256:e18036cb4caa17ea151fd5f3d70be9d354c99eb8cf817a3ccde8a7873b074348" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-py3-none-any.whl", hash = "sha256:535eec9987adb04701266b92745d6cdcef2e77669299359c3009c3404dd5d191" }, +] + +[[package]] +name = "h11" +version = "0.16.0" +source = { registry = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/h11/0.16.0/h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/h11/0.16.0/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86" }, +] + +[[package]] +name = "httpcore" +version = "1.0.9" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "certifi" }, + { name = "h11" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpcore/1.0.9/httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpcore/1.0.9/httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55" }, +] + +[[package]] +name = "httpx" +version = "0.28.1" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "anyio" }, + { name = "certifi" }, + { name = "httpcore" }, + { name = "idna" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpx/0.28.1/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpx/0.28.1/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad" }, +] + +[[package]] +name = "httpx-sse" +version = "0.4.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpx-sse/0.4.0/httpx-sse-0.4.0.tar.gz", hash = "sha256:1e81a3a3070ce322add1d3529ed42eb5f70817f45ed6ec915ab753f961139721" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpx-sse/0.4.0/httpx_sse-0.4.0-py3-none-any.whl", hash = "sha256:f329af6eae57eaa2bdfd962b42524764af68075ea87370a2de920af5341e318f" }, +] + +[[package]] +name = "idna" +version = "3.10" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/idna/3.10/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/idna/3.10/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3" }, +] + +[[package]] +name = "iniconfig" +version = "2.1.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/iniconfig/2.1.0/iniconfig-2.1.0.tar.gz", hash = "sha256:3abbd2e30b36733fee78f9c7f7308f2d0050e88f0087fd25c2645f63c773e1c7" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/iniconfig/2.1.0/iniconfig-2.1.0-py3-none-any.whl", hash = "sha256:9deba5723312380e77435581c6bf4935c94cbfab9b1ed33ef8d238ea168eb760" }, +] + +[[package]] +name = "markdown-it-py" +version = "3.0.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "mdurl" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/markdown-it-py/3.0.0/markdown-it-py-3.0.0.tar.gz", hash = "sha256:e3f60a94fa066dc52ec76661e37c851cb232d92f9886b15cb560aaada2df8feb" } +wheels = [ + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/markdown-it-py/3.0.0/markdown_it_py-3.0.0-py3-none-any.whl", hash = "sha256:355216845c60bd96232cd8d8c40e8f9765cc86f46880e43a8fd22dc1a1a8cab1" }, +] + +[[package]] +name = "mcp" +version = "1.9.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "anyio" }, + { name = "httpx" }, + { name = "httpx-sse" }, + { name = "pydantic" }, + { name = "pydantic-settings" }, + { name = "python-multipart" }, + { name = "sse-starlette" }, + { name = "starlette" }, + { name = "uvicorn", marker = "sys_platform != 'emscripten'" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/mcp/1.9.0/mcp-1.9.0.tar.gz", hash = "sha256:905d8d208baf7e3e71d70c82803b89112e321581bcd2530f9de0fe4103d28749" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/mcp/1.9.0/mcp-1.9.0-py3-none-any.whl", hash = "sha256:9dfb89c8c56f742da10a5910a1f64b0d2ac2c3ed2bd572ddb1cfab7f35957178" }, +] + +[package.optional-dependencies] +cli = [ + { name = "python-dotenv" }, + { name = "typer" }, +] + +[[package]] +name = "mdurl" +version = "0.1.2" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/mdurl/0.1.2/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/mdurl/0.1.2/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8" }, +] + +[[package]] +name = "multidict" +version = "6.4.4" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "typing-extensions", marker = "python_full_version < '3.11'" }, +] +sdist = { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4.tar.gz", hash = "sha256:69ee9e6ba214b5245031b76233dd95408a0fd57fdb019ddcc1ead4790932a8e8" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:4f5f29794ac0e73d2a06ac03fd18870adc0135a9d384f4a306a951188ed02f95" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:c04157266344158ebd57b7120d9b0b35812285d26d0e78193e17ef57bfe2979a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:bb61ffd3ab8310d93427e460f565322c44ef12769f51f77277b4abad7b6f7223" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5e0ba18a9afd495f17c351d08ebbc4284e9c9f7971d715f196b79636a4d0de44" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:9faf1b1dcaadf9f900d23a0e6d6c8eadd6a95795a0e57fcca73acce0eb912065" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a4d1cb1327c6082c4fce4e2a438483390964c02213bc6b8d782cf782c9b1471f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:941f1bec2f5dbd51feeb40aea654c2747f811ab01bdd3422a48a4e4576b7d76a" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e5f8a146184da7ea12910a4cec51ef85e44f6268467fb489c3caf0cd512f29c2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:232b7237e57ec3c09be97206bfb83a0aa1c5d7d377faa019c68a210fa35831f1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:dc388f75a1c00000824bf28b7633e40854f4127ede80512b44c3cfeeea1839a2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:98af87593a666f739d9dba5d0ae86e01b0e1a9cfcd2e30d2d361fbbbd1a9162d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:aff4cafea2d120327d55eadd6b7f1136a8e5a0ecf6fb3b6863e8aca32cd8e50a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:169c4ba7858176b797fe551d6e99040c531c775d2d57b31bcf4de6d7a669847f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:b9eb4c59c54421a32b3273d4239865cb14ead53a606db066d7130ac80cc8ec93" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7cf3bd54c56aa16fdb40028d545eaa8d051402b61533c21e84046e05513d5780" }, + { 
url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f682c42003c7264134bfe886376299db4cc0c6cd06a3295b41b347044bcb5482" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a920f9cf2abdf6e493c519492d892c362007f113c94da4c239ae88429835bad1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:530d86827a2df6504526106b4c104ba19044594f8722d3e87714e847c74a0275" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-py3-none-any.whl", hash = "sha256:bd4557071b561a8b3b6075c3ce93cf9bfb6182cb241805c3d66ced3b75eff4ac" }, +] + +[[package]] +name = "packaging" +version = "25.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/packaging/25.0/packaging-25.0.tar.gz", hash = "sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/packaging/25.0/packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484" }, +] + +[[package]] +name = "pluggy" +version = "1.6.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pluggy/1.6.0/pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3" } +wheels = [ + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pluggy/1.6.0/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746" }, +] + +[[package]] +name = "pr-agent-actions" +version = "2.0.0" +source = { editable = "." } +dependencies = [ + { name = "aiohttp" }, + { name = "mcp", extra = ["cli"] }, +] + +[package.optional-dependencies] +dev = [ + { name = "pytest" }, + { name = "pytest-asyncio" }, +] + +[package.dev-dependencies] +dev = [ + { name = "pytest" }, + { name = "pytest-asyncio" }, +] + +[package.metadata] +requires-dist = [ + { name = "aiohttp", specifier = ">=3.10.0,<4.0.0" }, + { name = "mcp", extras = ["cli"], specifier = ">=1.0.0" }, + { name = "pytest", marker = "extra == 'dev'", specifier = ">=8.3.0" }, + { name = "pytest-asyncio", marker = "extra == 'dev'", specifier = ">=0.21.0" }, +] + +[package.metadata.requires-dev] +dev = [ + { name = "pytest", specifier = ">=8.3.0" }, + { name = "pytest-asyncio", specifier = ">=0.21.0" }, +] + +[[package]] +name = "propcache" +version = "0.3.1" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1.tar.gz", hash = "sha256:40d980c33765359098837527e18eddefc9a24cea5b45e078a7f3bb5b032c6ecf" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:f27785888d2fdd918bc36de8b8739f2d6c791399552333721b58193f68ea3e98" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d4e89cde74154c7b5957f87a355bb9c8ec929c167b59c83d90654ea36aeb6180" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:730178f476ef03d3d4d255f0c9fa186cb1d13fd33ffe89d39f2cda4da90ceb71" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:967a8eec513dbe08330f10137eacb427b2ca52118769e82ebcfcab0fba92a649" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5b9145c35cc87313b5fd480144f8078716007656093d23059e8993d3a8fa730f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9e64e948ab41411958670f1093c0a57acfdc3bee5cf5b935671bbd5313bcf229" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:319fa8765bfd6a265e5fa661547556da381e53274bc05094fc9ea50da51bfd46" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c66d8ccbc902ad548312b96ed8d5d266d0d2c6d006fd0f66323e9d8f2dd49be7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:2d219b0dbabe75e15e581fc1ae796109b07c8ba7d25b9ae8d650da582bed01b0" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:cd6a55f65241c551eb53f8cf4d2f4af33512c39da5d9777694e9d9c60872f519" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:9979643ffc69b799d50d3a7b72b5164a2e97e117009d7af6dfdd2ab906cb72cd" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:4cf9e93a81979f1424f1a3d155213dc928f1069d697e4353edb8a5eba67c6259" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:2fce1df66915909ff6c824bbb5eb403d2d15f98f1518e583074671a30fe0c21e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:4d0dfdd9a2ebc77b869a0b04423591ea8823f791293b527dc1bb896c1d6f1136" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-win32.whl", hash = "sha256:1f6cc0ad7b4560e5637eb2c994e97b4fa41ba8226069c9277eb5ea7101845b42" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-win_amd64.whl", hash = "sha256:47ef24aa6511e388e9894ec16f0fbf3313a53ee68402bc428744a367ec55b833" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:7f30241577d2fef2602113b70ef7231bf4c69a97e04693bde08ddab913ba0ce5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:43593c6772aa12abc3af7784bff4a41ffa921608dd38b77cf1dfd7f5c4e71371" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-macosx_11_0_arm64.whl", hash = 
"sha256:a75801768bbe65499495660b777e018cbe90c7980f07f8aa57d6be79ea6f71da" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f6f1324db48f001c2ca26a25fa25af60711e09b9aaf4b28488602776f4f9a744" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5cdb0f3e1eb6dfc9965d19734d8f9c481b294b5274337a8cb5cb01b462dcb7e0" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1eb34d90aac9bfbced9a58b266f8946cb5935869ff01b164573a7634d39fbcb5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f35c7070eeec2cdaac6fd3fe245226ed2a6292d3ee8c938e5bb645b434c5f256" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b23c11c2c9e6d4e7300c92e022046ad09b91fd00e36e83c44483df4afa990073" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:3e19ea4ea0bf46179f8a3652ac1426e6dcbaf577ce4b4f65be581e237340420d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:bd39c92e4c8f6cbf5f08257d6360123af72af9f4da75a690bef50da77362d25f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-musllinux_1_2_i686.whl", hash = 
"sha256:b0313e8b923b3814d1c4a524c93dfecea5f39fa95601f6a9b1ac96cd66f89ea0" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:e861ad82892408487be144906a368ddbe2dc6297074ade2d892341b35c59844a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:61014615c1274df8da5991a1e5da85a3ccb00c2d4701ac6f3383afd3ca47ab0a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:71ebe3fe42656a2328ab08933d420df5f3ab121772eef78f2dc63624157f0ed9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-win32.whl", hash = "sha256:58aa11f4ca8b60113d4b8e32d37e7e78bd8af4d1a5b5cb4979ed856a45e62005" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-win_amd64.whl", hash = "sha256:9532ea0b26a401264b1365146c440a6d78269ed41f83f23818d4b79497aeabe7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:f78eb8422acc93d7b69964012ad7048764bb45a54ba7a39bb9e146c72ea29723" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:89498dd49c2f9a026ee057965cdf8192e5ae070ce7d7a7bd4b66a8e257d0c976" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:09400e98545c998d57d10035ff623266927cb784d13dd2b31fd33b8a5316b85b" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aa8efd8c5adc5a2c9d3b952815ff8f7710cefdcaf5f2c36d26aff51aeca2f12f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c2fe5c910f6007e716a06d269608d307b4f36e7babee5f36533722660e8c4a70" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a0ab8cf8cdd2194f8ff979a43ab43049b1df0b37aa64ab7eca04ac14429baeb7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:563f9d8c03ad645597b8d010ef4e9eab359faeb11a0a2ac9f7b4bc8c28ebef25" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fb6e0faf8cb6b4beea5d6ed7b5a578254c6d7df54c36ccd3d8b3eb00d6770277" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:1c5c7ab7f2bb3f573d1cb921993006ba2d39e8621019dffb1c5bc94cdbae81e8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:050b571b2e96ec942898f8eb46ea4bfbb19bd5502424747e83badc2d4a99a44e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:e1c4d24b804b3a87e9350f79e2371a705a188d292fd310e663483af6ee6718ee" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:e4fe2a6d5ce975c117a6bb1e8ccda772d1e7029c1cca1acd209f91d30fa72815" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:feccd282de1f6322f56f6845bf1207a537227812f0a9bf5571df52bb418d79d5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ec314cde7314d2dd0510c6787326bbffcbdc317ecee6b7401ce218b3099075a7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-win32.whl", hash = "sha256:7d2d5a0028d920738372630870e7d9644ce437142197f8c827194fca404bf03b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-win_amd64.whl", hash = "sha256:88c423efef9d7a59dae0614eaed718449c09a5ac79a5f224a8b9664d603f04a3" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:f1528ec4374617a7a753f90f20e2f551121bb558fcb35926f99e3c42367164b8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:dc1915ec523b3b494933b5424980831b636fe483d7d543f7afb7b3bf00f0c10f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a110205022d077da24e60b3df8bcee73971be9575dec5573dd17ae5d81751111" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:d249609e547c04d190e820d0d4c8ca03ed4582bcf8e4e160a6969ddfb57b62e5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5ced33d827625d0a589e831126ccb4f5c29dfdf6766cac441d23995a65825dcb" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4114c4ada8f3181af20808bedb250da6bae56660e4b8dfd9cd95d4549c0962f7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:975af16f406ce48f1333ec5e912fe11064605d5c5b3f6746969077cc3adeb120" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a34aa3a1abc50740be6ac0ab9d594e274f59960d3ad253cd318af76b996dd654" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:9cec3239c85ed15bfaded997773fdad9fb5662b0a7cbc854a43f291eb183179e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:05543250deac8e61084234d5fc54f8ebd254e8f2b39a16b1dce48904f45b744b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:5cb5918253912e088edbf023788de539219718d3b10aef334476b62d2b53de53" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = 
"sha256:f3bbecd2f34d0e6d3c543fdb3b15d6b60dd69970c2b4c822379e5ec8f6f621d5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:aca63103895c7d960a5b9b044a83f544b233c95e0dcff114389d64d762017af7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:5a0a9898fdb99bf11786265468571e628ba60af80dc3f6eb89a3545540c6b0ef" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-win32.whl", hash = "sha256:3a02a28095b5e63128bcae98eb59025924f121f048a62393db682f049bf4ac24" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-win_amd64.whl", hash = "sha256:813fbb8b6aea2fc9659815e585e548fe706d6f663fa73dff59a1677d4595a037" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:a444192f20f5ce8a5e52761a031b90f5ea6288b1eef42ad4c7e64fef33540b8f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:0fbe94666e62ebe36cd652f5fc012abfbc2342de99b523f8267a678e4dfdee3c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:f011f104db880f4e2166bcdcf7f58250f7a465bc6b068dc84c824a3d4a5c94dc" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3e584b6d388aeb0001d6d5c2bd86b26304adde6d9bb9bfa9c4889805021b96de" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8a17583515a04358b034e241f952f1715243482fc2c2945fd99a1b03a0bd77d6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5aed8d8308215089c0734a2af4f2e95eeb360660184ad3912686c181e500b2e7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6d8e309ff9a0503ef70dc9a0ebd3e69cf7b3894c9ae2ae81fc10943c37762458" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b655032b202028a582d27aeedc2e813299f82cb232f969f87a4fde491a233f11" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:9f64d91b751df77931336b5ff7bafbe8845c5770b06630e27acd5dbb71e1931c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:19a06db789a4bd896ee91ebc50d059e23b3639c25d58eb35be3ca1cbe967c3bf" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:bef100c88d8692864651b5f98e871fb090bd65c8a41a1cb0ff2322db39c96c27" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:87380fb1f3089d2a0b8b00f006ed12bd41bd858fabfa7330c954c70f50ed8757" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:e474fc718e73ba5ec5180358aa07f6aded0ff5f2abe700e3115c37d75c947e18" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:17d1c688a443355234f3c031349da69444be052613483f3e4158eef751abcd8a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-win32.whl", hash = "sha256:359e81a949a7619802eb601d66d37072b79b79c2505e6d3fd8b945538411400d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-win_amd64.whl", hash = "sha256:e7fb9a84c9abbf2b2683fa3e7b0d7da4d8ecf139a1c635732a8bda29c5214b0e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-py3-none-any.whl", hash = "sha256:9a8ecf38de50a7f518c21568c80f985e776397b902f1ce0b01f799aba1608b40" }, +] + +[[package]] +name = "pydantic" +version = "2.11.4" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "annotated-types" }, + { name = "pydantic-core" }, + { name = "typing-extensions" }, + { name = "typing-inspection" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic/2.11.4/pydantic-2.11.4.tar.gz", hash = "sha256:32738d19d63a226a52eed76645a98ee07c1f410ee41d93b4afbfa85ed8111c2d" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic/2.11.4/pydantic-2.11.4-py3-none-any.whl", hash = "sha256:d9615eaa9ac5a063471da949c8fc16376a84afb5024688b3ff885693506764eb" }, +] + +[[package]] +name = "pydantic-core" +version = "2.33.2" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } 
+dependencies = [ + { name = "typing-extensions" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2.tar.gz", hash = "sha256:7cb8bc3605c29176e1b105350d2e6474142d7c1bd1d9327c4a9bdb46bf827acc" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2b3d326aaef0c0399d9afffeb6367d5e26ddc24d351dbc9c636840ac355dc5d8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0e5b2671f05ba48b94cb90ce55d8bdcaaedb8ba00cc5359f6810fc918713983d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0069c9acc3f3981b9ff4cdfaf088e98d83440a4c7ea1bc07460af3d4dc22e72d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d53b22f2032c42eaaf025f7c40c2e3b94568ae077a606f006d206a463bc69572" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0405262705a123b7ce9f0b92f123334d67b70fd1f20a9372b907ce1080c7ba02" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4b25d91e288e2c4e0662b8038a28c6a07eaac3e196cfc4ff69de4ea3db992a1b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:6bdfe4b3789761f3bcb4b1ddf33355a71079858958e3a552f16d5af19768fef2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:efec8db3266b76ef9607c2c4c419bdb06bf335ae433b80816089ea7585816f6a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:031c57d67ca86902726e0fae2214ce6770bbe2f710dc33063187a68744a5ecac" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:f8de619080e944347f5f20de29a975c2d815d9ddd8be9b9b7268e2e3ef68605a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:73662edf539e72a9440129f231ed3757faab89630d291b784ca99237fb94db2b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-win32.whl", hash = "sha256:0a39979dcbb70998b0e505fb1556a1d550a0781463ce84ebf915ba293ccb7e22" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-win_amd64.whl", hash = "sha256:b0379a2b24882fef529ec3b4987cb5d003b9cda32256024e6fe1586ac45fc640" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:4c5b0a576fb381edd6d27f0a85915c6daf2f8138dc5c267a57c08a62900758c7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e799c050df38a639db758c617ec771fd8fb7a5f8eaaa4b27b101f266b216a246" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dc46a01bf8d62f227d5ecee74178ffc448ff4e5197c756331f71efcc66dc980f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a144d4f717285c6d9234a66778059f33a89096dfb9b39117663fd8413d582dcc" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:73cf6373c21bc80b2e0dc88444f41ae60b2f070ed02095754eb5a01df12256de" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3dc625f4aa79713512d1976fe9f0bc99f706a9dee21dfd1810b4bbbf228d0e8a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:881b21b5549499972441da4758d662aeea93f1923f953e9cbaff14b8b9565aef" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bdc25f3681f7b78572699569514036afe3c243bc3059d3942624e936ec93450e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:fe5b32187cbc0c862ee201ad66c30cf218e5ed468ec8dc1cf49dec66e160cc4d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_armv7l.whl", hash = 
"sha256:bc7aee6f634a6f4a95676fcb5d6559a2c2a390330098dba5e5a5f28a2e4ada30" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:235f45e5dbcccf6bd99f9f472858849f73d11120d76ea8707115415f8e5ebebf" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-win32.whl", hash = "sha256:6368900c2d3ef09b69cb0b913f9f8263b03786e5b2a387706c5afb66800efd51" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-win_amd64.whl", hash = "sha256:1e063337ef9e9820c77acc768546325ebe04ee38b08703244c1309cccc4f1bab" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-win_arm64.whl", hash = "sha256:6b99022f1d19bc32a4c2a0d544fc9a76e3be90f0b3f4af413f87d38749300e65" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:a7ec89dc587667f22b6a0b6579c249fca9026ce7c333fc142ba42411fa243cdc" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3c6db6e52c6d70aa0d00d45cdb9b40f0433b96380071ea80b09277dba021ddf7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e61206137cbc65e6d5256e1166f88331d3b6238e082d9f74613b9b765fb9025" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:eb8c529b2819c37140eb51b914153063d27ed88e3bdc31b71198a198e921e011" 
}, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c52b02ad8b4e2cf14ca7b3d918f3eb0ee91e63b3167c32591e57c4317e134f8f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:96081f1605125ba0855dfda83f6f3df5ec90c61195421ba72223de35ccfb2f88" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f57a69461af2a5fa6e6bbd7a5f60d3b7e6cebb687f55106933188e79ad155c1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:572c7e6c8bb4774d2ac88929e3d1f12bc45714ae5ee6d9a788a9fb35e60bb04b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:db4b41f9bd95fbe5acd76d89920336ba96f03e149097365afe1cb092fceb89a1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:fa854f5cf7e33842a892e5c73f45327760bc7bc516339fda888c75ae60edaeb6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:5f483cfb75ff703095c59e365360cb73e00185e01aaea067cd19acffd2ab20ea" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-win32.whl", hash = "sha256:9cb1da0f5a471435a7bc7e439b8a728e8b61e59784b2af70d7c169f8dd8ae290" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-win_amd64.whl", hash = "sha256:f941635f2a3d96b2973e867144fde513665c87f13fe0e193c158ac51bfaaa7b2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-win_arm64.whl", hash = "sha256:cca3868ddfaccfbc4bfb1d608e2ccaaebe0ae628e1416aeb9c4d88c001bb45ab" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:1082dd3e2d7109ad8b7da48e1d4710c8d06c253cbc4a27c1cff4fbcaa97a9e3f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f517ca031dfc037a9c07e748cefd8d96235088b83b4f4ba8939105d20fa1dcd6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0a9f2c9dd19656823cb8250b0724ee9c60a82f3cdf68a080979d13092a3b0fef" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2b0a451c263b01acebe51895bfb0e1cc842a5c666efe06cdf13846c7418caa9a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ea40a64d23faa25e62a70ad163571c0b342b8bf66d5fa612ac0dec4f069d916" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0fb2d542b4d66f9470e8065c5469ec676978d625a8b7a363f07d9a501a9cb36a" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdac5d6ffa1b5a83bca06ffe7583f5576555e6c8b3a91fbd25ea7780f825f7d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:04a1a413977ab517154eebb2d326da71638271477d6ad87a769102f7c2488c56" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:c8e7af2f4e0194c22b5b37205bfb293d166a7344a5b0d0eaccebc376546d77d5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:5c92edd15cd58b3c2d34873597a1e20f13094f59cf88068adb18947df5455b4e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:65132b7b4a1c0beded5e057324b7e16e10910c106d43675d9bd87d4f38dde162" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-win32.whl", hash = "sha256:52fb90784e0a242bb96ec53f42196a17278855b0f31ac7c3cc6f5c1ec4811849" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-win_amd64.whl", hash = "sha256:c083a3bdd5a93dfe480f1125926afcdbf2917ae714bdb80b36d34318b2bec5d9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-win_arm64.whl", hash = "sha256:e80b087132752f6b3d714f041ccf74403799d3b23a72722ea2e6ba2e892555b9" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:61c18fba8e5e9db3ab908620af374db0ac1baa69f0f32df4f61ae23f15e586ac" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95237e53bb015f67b63c91af7518a62a8660376a6a0db19b89acc77a4d6199f5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313t-win_amd64.whl", hash = "sha256:c2fc0a768ef76c15ab9238afa6da7f69895bb5d1ee83aeea2e3509af4472d0b9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5c4aa4e82353f65e548c476b37e64189783aa5384903bfea4f41580f255fddfa" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d946c8bf0d5c24bf4fe333af284c59a19358aa3ec18cb3dc4370080da1e8ad29" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:87b31b6846e361ef83fedb187bb5b4372d0da3f7e28d85415efa92d6125d6e6d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aa9d91b338f2df0508606f7009fde642391425189bba6d8c653afd80fd6bb64e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = 
"sha256:2058a32994f1fde4ca0480ab9d1e75a0e8c87c22b53a3ae66554f9af78f2fe8c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:0e03262ab796d986f978f79c943fc5f620381be7287148b8010b4097f79a39ec" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:1a8695a8d00c73e50bff9dfda4d540b7dee29ff9b8053e38380426a85ef10052" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:fa754d1850735a0b0e03bcffd9d4b4343eb417e47196e4485d9cca326073a42c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:a11c8d26a50bfab49002947d3d237abe4d9e4b5bdc8846a63537b6488e197808" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:dd14041875d09cc0f9308e37a6f8b65f5585cf2598a53aa0123df8b129d481f8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:d87c561733f66531dced0da6e864f44ebf89a8fba55f31407b00c2f7f9449593" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2f82865531efd18d6e07a04a17331af02cb7a651583c418df8266f17a63c6612" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bfb5112df54209d820d7bf9317c7a6c9025ea52e49f46b6a2060104bba37de7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:64632ff9d614e5eecfb495796ad51b0ed98c453e447a76bcbeeb69615079fc7e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:f889f7a40498cc077332c7ab6b4608d296d852182211787d4f3ee377aaae66e8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:de4b83bb311557e439b9e186f733f6c645b9417c84e2eb8203f3f820a4b988bf" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:82f68293f055f51b51ea42fafc74b6aad03e70e191799430b90c13d643059ebb" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:329467cecfb529c925cf2bbd4d60d2c509bc2fb52a20c1045bf09bb70971a9c1" }, +] + +[[package]] +name = "pydantic-settings" +version = "2.9.1" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "pydantic" }, + { name = "python-dotenv" }, + { name = "typing-inspection" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-settings/2.9.1/pydantic_settings-2.9.1.tar.gz", hash = "sha256:c509bf79d27563add44e8446233359004ed85066cd096d8b510f715e6ef5d268" } 
+wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-settings/2.9.1/pydantic_settings-2.9.1-py3-none-any.whl", hash = "sha256:59b4f431b1defb26fe620c71a7d3968a710d719f5f4cdbbdb7926edeb770f6ef" }, +] + +[[package]] +name = "pygments" +version = "2.19.1" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pygments/2.19.1/pygments-2.19.1.tar.gz", hash = "sha256:61c16d2a8576dc0649d9f39e089b5f02bcd27fba10d8fb4dcc28173f7a45151f" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pygments/2.19.1/pygments-2.19.1-py3-none-any.whl", hash = "sha256:9ea1544ad55cecf4b8242fab6dd35a93bbce657034b0611ee383099054ab6d8c" }, +] + +[[package]] +name = "pytest" +version = "8.3.5" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, + { name = "exceptiongroup", marker = "python_full_version < '3.11'" }, + { name = "iniconfig" }, + { name = "packaging" }, + { name = "pluggy" }, + { name = "tomli", marker = "python_full_version < '3.11'" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pytest/8.3.5/pytest-8.3.5.tar.gz", hash = "sha256:f4efe70cc14e511565ac476b57c279e12a855b11f48f212af1080ef2263d3845" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pytest/8.3.5/pytest-8.3.5-py3-none-any.whl", hash = "sha256:c69214aa47deac29fad6c2a4f590b9c4a9fdb16a403176fe154b79c0b4d4d820" }, +] + +[[package]] +name = "pytest-asyncio" +version = "0.26.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "pytest" }, +] +sdist = { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pytest-asyncio/0.26.0/pytest_asyncio-0.26.0.tar.gz", hash = "sha256:c4df2a697648241ff39e7f0e4a73050b03f123f760673956cf0d72a4990e312f" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pytest-asyncio/0.26.0/pytest_asyncio-0.26.0-py3-none-any.whl", hash = "sha256:7b51ed894f4fbea1340262bdae5135797ebbe21d8638978e35d31c6d19f72fb0" }, +] + +[[package]] +name = "python-dotenv" +version = "1.1.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/python-dotenv/1.1.0/python_dotenv-1.1.0.tar.gz", hash = "sha256:41f90bc6f5f177fb41f53e87666db362025010eb28f60a01c9143bfa33a2b2d5" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/python-dotenv/1.1.0/python_dotenv-1.1.0-py3-none-any.whl", hash = "sha256:d7c01d9e2293916c18baf562d95698754b0dbbb5e74d457c45d4f6561fb9d55d" }, +] + +[[package]] +name = "python-multipart" +version = "0.0.20" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/python-multipart/0.0.20/python_multipart-0.0.20.tar.gz", hash = "sha256:8dd0cab45b8e23064ae09147625994d090fa46f5b0d1e13af944c331a7fa9d13" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/python-multipart/0.0.20/python_multipart-0.0.20-py3-none-any.whl", hash = "sha256:8a62d3a8335e06589fe01f2a3e178cdcc632f3fbe0d492ad9ee0ec35aab1f104" }, +] + +[[package]] +name = "rich" +version = "14.0.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "markdown-it-py" }, + { name = "pygments" }, + { name = "typing-extensions", marker = "python_full_version < '3.11'" }, +] +sdist = { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/rich/14.0.0/rich-14.0.0.tar.gz", hash = "sha256:82f1bc23a6a21ebca4ae0c45af9bdbc492ed20231dcb63f297d6d1021a9d5725" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/rich/14.0.0/rich-14.0.0-py3-none-any.whl", hash = "sha256:1c9491e1951aac09caffd42f448ee3d04e58923ffe14993f6e83068dc395d7e0" }, +] + +[[package]] +name = "shellingham" +version = "1.5.4" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/shellingham/1.5.4/shellingham-1.5.4.tar.gz", hash = "sha256:8dbca0739d487e5bd35ab3ca4b36e11c4078f3a234bfce294b0a0291363404de" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/shellingham/1.5.4/shellingham-1.5.4-py2.py3-none-any.whl", hash = "sha256:7ecfff8f2fd72616f7481040475a65b2bf8af90a56c89140852d1120324e8686" }, +] + +[[package]] +name = "sniffio" +version = "1.3.1" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/sniffio/1.3.1/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/sniffio/1.3.1/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2" }, +] + +[[package]] +name = "sse-starlette" +version = "2.3.4" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "anyio" }, + { name = "starlette" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/sse-starlette/2.3.4/sse_starlette-2.3.4.tar.gz", hash = 
"sha256:0ffd6bed217cdbb74a84816437c609278003998b4991cd2e6872d0b35130e4d5" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/sse-starlette/2.3.4/sse_starlette-2.3.4-py3-none-any.whl", hash = "sha256:b8100694f3f892b133d0f7483acb7aacfcf6ed60f863b31947664b6dc74e529f" }, +] + +[[package]] +name = "starlette" +version = "0.46.2" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "anyio" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/starlette/0.46.2/starlette-0.46.2.tar.gz", hash = "sha256:7f7361f34eed179294600af672f565727419830b54b7b084efe44bb82d2fccd5" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/starlette/0.46.2/starlette-0.46.2-py3-none-any.whl", hash = "sha256:595633ce89f8ffa71a015caed34a5b2dc1c0cdb3f0f1fbd1e69339cf2abeec35" }, +] + +[[package]] +name = "tomli" +version = "2.2.1" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1.tar.gz", hash = "sha256:cd45e1dc79c835ce60f7404ec8119f2eb06d38b1deba146f07ced3bbc44505ff" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678e4fa69e4575eb77d103de3df8a895e1591b48e740211bd1067378c69e8249" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:023aa114dd824ade0100497eb2318602af309e5a55595f76b626d6d9f3b7b0a6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ece47d672db52ac607a3d9599a9d48dcb2f2f735c6c2d1f34130085bb12b112a" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6972ca9c9cc9f0acaa56a8ca1ff51e7af152a9f87fb64623e31d5c83700080ee" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c954d2250168d28797dd4e3ac5cf812a406cd5a92674ee4c8f123c889786aa8e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8dd28b3e155b80f4d54beb40a441d366adcfe740969820caf156c019fb5c7ec4" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:e59e304978767a54663af13c07b3d1af22ddee3bb2fb0618ca1593e4f593a106" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:33580bccab0338d00994d7f16f4c4ec25b776af3ffaac1ed74e0b3fc95e885a8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-win32.whl", hash = "sha256:465af0e0875402f1d226519c9904f37254b3045fc5084697cefb9bdde1ff99ff" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:2d0f2fdd22b02c6d81637a3c95f8cd77f995846af7414c5c4b8d0545afa1bc4b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4a8f6e44de52d5e6c657c9fe83b562f5f4256d8ebbfe4ff922c495620a7f6cea" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = 
"sha256:8d57ca8095a641b8237d5b079147646153d22552f1c637fd3ba7f4b0b29167a8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e340144ad7ae1533cb897d406382b4b6fede8890a03738ff1683af800d54192" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db2b95f9de79181805df90bedc5a5ab4c165e6ec3fe99f970d0e302f384ad222" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:40741994320b232529c802f8bc86da4e1aa9f413db394617b9a256ae0f9a7f77" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:400e720fe168c0f8521520190686ef8ef033fb19fc493da09779e592861b78c6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:02abe224de6ae62c19f090f68da4e27b10af2b93213d36cf44e6e1c5abd19fdd" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b82ebccc8c8a36f2094e969560a1b836758481f3dc360ce9a3277c65f374285e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-win32.whl", hash = "sha256:889f80ef92701b9dbb224e49ec87c645ce5df3fa2cc548664eb8a25e03127a98" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:7fc04e92e1d624a4a63c76474610238576942d6b8950a2d7f908a340494e67e4" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f4039b9cbc3048b2416cc57ab3bda989a6fcf9b36cf8937f01a6e731b64f80d7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:286f0ca2ffeeb5b9bd4fcc8d6c330534323ec51b2f52da063b11c502da16f30c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a92ef1a44547e894e2a17d24e7557a5e85a9e1d0048b0b5e7541f76c5032cb13" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9316dc65bed1684c9a98ee68759ceaed29d229e985297003e494aa825ebb0281" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e85e99945e688e32d5a35c1ff38ed0b3f41f43fad8df0bdf79f72b2ba7bc5272" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ac065718db92ca818f8d6141b5f66369833d4a80a9d74435a268c52bdfa73140" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:d920f33822747519673ee656a4b6ac33e382eca9d331c87770faa3eef562aeb2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a198f10c4d1b1375d7687bc25294306e551bf1abfa4eace6650070a5c1ae2744" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-win32.whl", hash = 
"sha256:d3f5614314d758649ab2ab3a62d4f2004c825922f9e370b29416484086b264ec" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-win_amd64.whl", hash = "sha256:a38aa0308e754b0e3c67e344754dff64999ff9b513e691d0e786265c93583c69" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-py3-none-any.whl", hash = "sha256:cb55c73c5f4408779d0cf3eef9f762b9c9f147a77de7b258bef0a5628adc85cc" }, +] + +[[package]] +name = "typer" +version = "0.15.4" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "click" }, + { name = "rich" }, + { name = "shellingham" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typer/0.15.4/typer-0.15.4.tar.gz", hash = "sha256:89507b104f9b6a0730354f27c39fae5b63ccd0c95b1ce1f1a6ba0cfd329997c3" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typer/0.15.4/typer-0.15.4-py3-none-any.whl", hash = "sha256:eb0651654dcdea706780c466cf06d8f174405a659ffff8f163cfbfee98c0e173" }, +] + +[[package]] +name = "typing-extensions" +version = "4.13.2" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typing-extensions/4.13.2/typing_extensions-4.13.2.tar.gz", hash = "sha256:e6c81219bd689f51865d9e372991c540bda33a0379d5573cddb9a3a23f7caaef" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typing-extensions/4.13.2/typing_extensions-4.13.2-py3-none-any.whl", hash = "sha256:a439e7c04b49fec3e5d3e2beaa21755cadbbdc391694e28ccdd36ca4a1408f8c" }, +] + +[[package]] +name = "typing-inspection" +version = "0.4.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { 
name = "typing-extensions" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typing-inspection/0.4.0/typing_inspection-0.4.0.tar.gz", hash = "sha256:9765c87de36671694a67904bf2c96e395be9c6439bb6c87b5142569dcdd65122" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typing-inspection/0.4.0/typing_inspection-0.4.0-py3-none-any.whl", hash = "sha256:50e72559fcd2a6367a19f7a7e610e6afcb9fac940c650290eed893d61386832f" }, +] + +[[package]] +name = "uvicorn" +version = "0.34.2" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "click" }, + { name = "h11" }, + { name = "typing-extensions", marker = "python_full_version < '3.11'" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/uvicorn/0.34.2/uvicorn-0.34.2.tar.gz", hash = "sha256:0e929828f6186353a80b58ea719861d2629d766293b6d19baf086ba31d4f3328" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/uvicorn/0.34.2/uvicorn-0.34.2-py3-none-any.whl", hash = "sha256:deb49af569084536d269fe0a6d67e3754f104cf03aba7c11c40f01aadf33c403" }, +] + +[[package]] +name = "yarl" +version = "1.20.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "idna" }, + { name = "multidict" }, + { name = "propcache" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0.tar.gz", hash = "sha256:686d51e51ee5dfe62dec86e4866ee0e9ed66df700d55c828a615640adc885307" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:f1f6670b9ae3daedb325fa55fbe31c22c8228f6e0b513772c2e1c623caa6ab22" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:85a231fa250dfa3308f3c7896cc007a47bc76e9e8e8595c20b7426cac4884c62" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:1a06701b647c9939d7019acdfa7ebbfbb78ba6aa05985bb195ad716ea759a569" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7595498d085becc8fb9203aa314b136ab0516c7abd97e7d74f7bb4eb95042abe" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:af5607159085dcdb055d5678fc2d34949bd75ae6ea6b4381e784bbab1c3aa195" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:95b50910e496567434cb77a577493c26bce0f31c8a305135f3bda6a2483b8e10" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b594113a301ad537766b4e16a5a6750fcbb1497dcc1bc8a4daae889e6402a634" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:083ce0393ea173cd37834eb84df15b6853b555d20c52703e21fbababa8c129d2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4f1a350a652bbbe12f666109fbddfdf049b3ff43696d18c9ab1531fbba1c977a" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:fb0caeac4a164aadce342f1597297ec0ce261ec4532bbc5a9ca8da5622f53867" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:d88cc43e923f324203f6ec14434fa33b85c06d18d59c167a0637164863b8e995" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e52d6ed9ea8fd3abf4031325dc714aed5afcbfa19ee4a89898d663c9976eb487" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:ce360ae48a5e9961d0c730cf891d40698a82804e85f6e74658fb175207a77cb2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:06d06c9d5b5bc3eb56542ceeba6658d31f54cf401e8468512447834856fb0e61" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:c27d98f4e5c4060582f44e58309c1e55134880558f1add7a87c1bc36ecfade19" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-win32.whl", hash = "sha256:f4d3fa9b9f013f7050326e165c3279e22850d02ae544ace285674cb6174b5d6d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-win_amd64.whl", hash = "sha256:bc906b636239631d42eb8a07df8359905da02704a868983265603887ed68c076" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:fdb5204d17cb32b2de2d1e21c7461cabfacf17f3645e4b9039f210c5d3378bf3" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:eaddd7804d8e77d67c28d154ae5fab203163bd0998769569861258e525039d2a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:634b7ba6b4a85cf67e9df7c13a7fb2e44fa37b5d34501038d174a63eaac25ee2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6d409e321e4addf7d97ee84162538c7258e53792eb7c6defd0c33647d754172e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:ea52f7328a36960ba3231c6677380fa67811b414798a6e071c7085c57b6d20a9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c8703517b924463994c344dcdf99a2d5ce9eca2b6882bb640aa555fb5efc706a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:077989b09ffd2f48fb2d8f6a86c5fef02f63ffe6b1dd4824c76de7bb01e4f2e2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0acfaf1da020253f3533526e8b7dd212838fdc4109959a2c53cafc6db611bff2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b4230ac0b97ec5eeb91d96b324d66060a43fd0d2a9b603e3327ed65f084e41f8" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:0a6a1e6ae21cdd84011c24c78d7a126425148b24d437b5702328e4ba640a8902" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:86de313371ec04dd2531f30bc41a5a1a96f25a02823558ee0f2af0beaa7ca791" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:dd59c9dd58ae16eaa0f48c3d0cbe6be8ab4dc7247c3ff7db678edecbaf59327f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:a0bc5e05f457b7c1994cc29e83b58f540b76234ba6b9648a4971ddc7f6aa52da" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:c9471ca18e6aeb0e03276b5e9b27b14a54c052d370a9c0c04a68cefbd1455eb4" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:40ed574b4df723583a26c04b298b283ff171bcc387bc34c2683235e2487a65a5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-win32.whl", hash = "sha256:db243357c6c2bf3cd7e17080034ade668d54ce304d820c2a58514a4e51d0cfd6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-win_amd64.whl", hash = "sha256:8c12cd754d9dbd14204c328915e23b0c361b88f3cffd124129955e60a4fbfcfb" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:e06b9f6cdd772f9b665e5ba8161968e11e403774114420737f7884b5bd7bdf6f" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:b9ae2fbe54d859b3ade40290f60fe40e7f969d83d482e84d2c31b9bff03e359e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6d12b8945250d80c67688602c891237994d203d42427cb14e36d1a732eda480e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:087e9731884621b162a3e06dc0d2d626e1542a617f65ba7cc7aeab279d55ad33" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:69df35468b66c1a6e6556248e6443ef0ec5f11a7a4428cf1f6281f1879220f58" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3b2992fe29002fd0d4cbaea9428b09af9b8686a9024c840b8a2b8f4ea4abc16f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4c903e0b42aab48abfbac668b5a9d7b6938e721a6341751331bcd7553de2dcae" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bf099e2432131093cc611623e0b0bcc399b8cddd9a91eded8bfb50402ec35018" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8a7f62f5dc70a6c763bec9ebf922be52aa22863d9496a9a30124d65b489ea672" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:54ac15a8b60382b2bcefd9a289ee26dc0920cf59b05368c9b2b72450751c6eb8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:25b3bc0763a7aca16a0f1b5e8ef0f23829df11fb539a1b70476dcab28bd83da7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:b2586e36dc070fc8fad6270f93242124df68b379c3a251af534030a4a33ef594" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:866349da9d8c5290cfefb7fcc47721e94de3f315433613e01b435473be63daa6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:33bb660b390a0554d41f8ebec5cd4475502d84104b27e9b42f5321c5192bfcd1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:737e9f171e5a07031cbee5e9180f6ce21a6c599b9d4b2c24d35df20a52fabf4b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-win32.whl", hash = "sha256:839de4c574169b6598d47ad61534e6981979ca2c820ccb77bf70f4311dd2cc64" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-win_amd64.whl", hash = "sha256:3d7dbbe44b443b0c4aa0971cb07dcb2c2060e4a9bf8d1301140a33a93c98e18c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:2137810a20b933b1b1b7e5cf06a64c3ed3b4747b0e5d79c9447c00db0e2f752f" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:447c5eadd750db8389804030d15f43d30435ed47af1313303ed82a62388176d3" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:42fbe577272c203528d402eec8bf4b2d14fd49ecfec92272334270b850e9cd7d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:18e321617de4ab170226cd15006a565d0fa0d908f11f724a2c9142d6b2812ab0" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:4345f58719825bba29895011e8e3b545e6e00257abb984f9f27fe923afca2501" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5d9b980d7234614bc4674468ab173ed77d678349c860c3af83b1fffb6a837ddc" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:af4baa8a445977831cbaa91a9a84cc09debb10bc8391f128da2f7bd070fc351d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:123393db7420e71d6ce40d24885a9e65eb1edefc7a5228db2d62bcab3386a5c0" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ab47acc9332f3de1b39e9b702d9c916af7f02656b2a86a474d9db4e53ef8fd7a" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:4a34c52ed158f89876cba9c600b2c964dfc1ca52ba7b3ab6deb722d1d8be6df2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:04d8cfb12714158abf2618f792c77bc5c3d8c5f37353e79509608be4f18705c9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:7dc63ad0d541c38b6ae2255aaa794434293964677d5c1ec5d0116b0e308031f5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:f9d02b591a64e4e6ca18c5e3d925f11b559c763b950184a64cf47d74d7e41877" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:95fc9876f917cac7f757df80a5dda9de59d423568460fe75d128c813b9af558e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:bb769ae5760cd1c6a712135ee7915f9d43f11d9ef769cb3f75a23e398a92d384" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-win32.whl", hash = "sha256:70e0c580a0292c7414a1cead1e076c9786f685c1fc4757573d2967689b370e62" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-win_amd64.whl", hash = "sha256:4c43030e4b0af775a85be1fa0433119b1565673266a70bf87ef68a9d5ba3174c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:b6c4c3d0d6a0ae9b281e492b1465c72de433b782e6b5001c8e7249e085b69051" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:8681700f4e4df891eafa4f69a439a6e7d480d64e52bf460918f58e443bd3da7d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:84aeb556cb06c00652dbf87c17838eb6d92cfd317799a8092cee0e570ee11229" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f166eafa78810ddb383e930d62e623d288fb04ec566d1b4790099ae0f31485f1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:5d3d6d14754aefc7a458261027a562f024d4f6b8a798adb472277f675857b1eb" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2a8f64df8ed5d04c51260dbae3cc82e5649834eebea9eadfd829837b8093eb00" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4d9949eaf05b4d30e93e4034a7790634bbb41b8be2d07edd26754f2e38e491de" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9c366b254082d21cc4f08f522ac201d0d83a8b8447ab562732931d31d80eb2a5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:91bc450c80a2e9685b10e34e41aef3d44ddf99b3a498717938926d05ca493f6a" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:9c2aa4387de4bc3a5fe158080757748d16567119bef215bec643716b4fbf53f9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:d2cbca6760a541189cf87ee54ff891e1d9ea6406079c66341008f7ef6ab61145" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:798a5074e656f06b9fad1a162be5a32da45237ce19d07884d0b67a0aa9d5fdda" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:f106e75c454288472dbe615accef8248c686958c2e7dd3b8d8ee2669770d020f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:3b60a86551669c23dc5445010534d2c5d8a4e012163218fc9114e857c0586fdd" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:3e429857e341d5e8e15806118e0294f8073ba9c4580637e59ab7b238afca836f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-win32.whl", hash = "sha256:65a4053580fe88a63e8e4056b427224cd01edfb5f951498bfefca4052f0ce0ac" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-win_amd64.whl", hash = "sha256:53b2da3a6ca0a541c1ae799c349788d480e5144cac47dba0266c7cb6c76151fe" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-py3-none-any.whl", hash = "sha256:5d0fe6af927a47a230f31e6004621fd0959eaa915fc62acfafa67ff7229a3124" }, +] diff --git 
a/projects/unit3/github-actions-integration/solution/webhook_server.py b/projects/unit3/github-actions-integration/solution/webhook_server.py new file mode 100644 index 0000000..64941d1 --- /dev/null +++ b/projects/unit3/github-actions-integration/solution/webhook_server.py @@ -0,0 +1,57 @@ +#!/usr/bin/env python3 +""" +Simple webhook server for GitHub Actions events. +Stores events in a JSON file that the MCP server can read. +""" + +import json +from datetime import datetime, timezone +from pathlib import Path +from aiohttp import web + +# File to store events +EVENTS_FILE = Path(__file__).parent / "github_events.json" + +async def handle_webhook(request): + """Handle incoming GitHub webhook""" + try: + data = await request.json() + + # Create event record + event = { + "timestamp": datetime.now(timezone.utc).isoformat(), + "event_type": request.headers.get("X-GitHub-Event", "unknown"), + "action": data.get("action"), + "workflow_run": data.get("workflow_run"), + "check_run": data.get("check_run"), + "repository": data.get("repository", {}).get("full_name"), + "sender": data.get("sender", {}).get("login") + } + + # Load existing events + events = [] + if EVENTS_FILE.exists(): + with open(EVENTS_FILE, 'r') as f: + events = json.load(f) + + # Add new event and keep last 100 + events.append(event) + events = events[-100:] + + # Save events + with open(EVENTS_FILE, 'w') as f: + json.dump(events, f, indent=2) + + return web.json_response({"status": "received"}) + except Exception as e: + return web.json_response({"error": str(e)}, status=400) + +# Create app and add route +app = web.Application() +app.router.add_post('/webhook/github', handle_webhook) + +if __name__ == '__main__': + print("🚀 Starting webhook server on http://localhost:8080") + print("📝 Events will be saved to:", EVENTS_FILE) + print("🔗 Webhook URL: http://localhost:8080/webhook/github") + web.run_app(app, host='localhost', port=8080) \ No newline at end of file diff --git
a/projects/unit3/github-actions-integration/starter/README.md b/projects/unit3/github-actions-integration/starter/README.md new file mode 100644 index 0000000..fe12d95 --- /dev/null +++ b/projects/unit3/github-actions-integration/starter/README.md @@ -0,0 +1,55 @@ +# Module 2: GitHub Actions Integration - Starter Code + +This is your starting point for Module 2. You'll extend the PR Agent from Module 1 with webhook handling and MCP Prompts. + +## Your Task + +1. **Import Required Types**: Add the necessary imports for MCP Prompts +2. **Connect to Events File**: Define the path to read webhook events +3. **Implement GitHub Actions Tools**: Complete the tools to query events +4. **Create MCP Prompts**: Build prompts for standardized CI/CD workflows + +Note: The webhook server (`webhook_server.py`) is provided for you! + +## Getting Started + +1. Install dependencies: + ```bash + uv sync + ``` + +2. Start the webhook server (in a separate terminal): + ```bash + python webhook_server.py + ``` + +3. Follow the TODOs in `server.py` to implement Module 2 features + +## Testing Your Implementation + +1. Start your server: + ```bash + uv run server.py + ``` + +2. In another terminal, start Cloudflare Tunnel: + ```bash + cloudflared tunnel --url http://localhost:8080 + ``` + +3. Configure GitHub webhooks with your tunnel URL + +4. Test with Claude Code using the new prompts + +## Implementation Hints + +- The webhook server stores events in `github_events.json` +- Read the JSON file in your tools to get event data +- Prompts are simple functions that return strings with instructions +- Decorate prompt functions with `@mcp.prompt()` + +## Need Help? 
+ +- Review the Module 2 documentation +- Check the solution in the `solution/` directory +- Look at the MCP documentation for Prompts: https://modelcontextprotocol.io/docs/concepts/prompts \ No newline at end of file diff --git a/projects/unit3/github-actions-integration/starter/pyproject.toml b/projects/unit3/github-actions-integration/starter/pyproject.toml new file mode 100644 index 0000000..2e060d7 --- /dev/null +++ b/projects/unit3/github-actions-integration/starter/pyproject.toml @@ -0,0 +1,29 @@ +[project] +name = "pr-agent" +version = "1.0.0" +description = "MCP server for PR template suggestions" +readme = "README.md" +requires-python = ">=3.10" +dependencies = [ + "mcp[cli]>=1.0.0", + "aiohttp>=3.10.0,<4.0.0", +] + +[project.optional-dependencies] +dev = [ + "pytest>=8.3.0", + "pytest-asyncio>=0.21.0", +] + +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" + +[tool.hatch.build.targets.wheel] +packages = ["."] + +[tool.uv] +dev-dependencies = [ + "pytest>=8.3.0", + "pytest-asyncio>=0.21.0", +] \ No newline at end of file diff --git a/projects/unit3/github-actions-integration/starter/server.py b/projects/unit3/github-actions-integration/starter/server.py new file mode 100644 index 0000000..0485b67 --- /dev/null +++ b/projects/unit3/github-actions-integration/starter/server.py @@ -0,0 +1,259 @@ +#!/usr/bin/env python3 +""" +Module 2: GitHub Actions Integration - STARTER CODE +Extend your PR Agent with webhook handling and MCP Prompts for CI/CD workflows. 
+""" + +import json +import os +import subprocess +from typing import Dict, List, Any, Optional +from pathlib import Path +from datetime import datetime + +from mcp.server.fastmcp import FastMCP + +# Initialize the FastMCP server +mcp = FastMCP("pr-agent-actions") + +# PR template directory (shared between starter and solution) +TEMPLATES_DIR = Path(__file__).parent.parent.parent / "templates" + +# TODO: Add path to events file where webhook_server.py stores events +# Hint: EVENTS_FILE = Path(__file__).parent / "github_events.json" + + +# ===== Module 1 Tools (Already includes output limiting fix from Module 1) ===== + +@mcp.tool() +async def analyze_file_changes( + base_branch: str = "main", + include_diff: bool = True, + max_diff_lines: int = 500 +) -> str: + """Get the full diff and list of changed files in the current git repository. + + Args: + base_branch: Base branch to compare against (default: main) + include_diff: Include the full diff content (default: true) + max_diff_lines: Maximum number of diff lines to include (default: 500) + """ + try: + # Get list of changed files + files_result = subprocess.run( + ["git", "diff", "--name-status", f"{base_branch}...HEAD"], + capture_output=True, + text=True, + check=True + ) + + # Get diff statistics + stat_result = subprocess.run( + ["git", "diff", "--stat", f"{base_branch}...HEAD"], + capture_output=True, + text=True + ) + + # Get the actual diff if requested + diff_content = "" + truncated = False + if include_diff: + diff_result = subprocess.run( + ["git", "diff", f"{base_branch}...HEAD"], + capture_output=True, + text=True + ) + diff_lines = diff_result.stdout.split('\n') + + # Check if we need to truncate (learned from Module 1) + if len(diff_lines) > max_diff_lines: + diff_content = '\n'.join(diff_lines[:max_diff_lines]) + diff_content += f"\n\n... Output truncated. Showing {max_diff_lines} of {len(diff_lines)} lines ..." + diff_content += "\n... Use max_diff_lines parameter to see more ..." 
+ truncated = True + else: + diff_content = diff_result.stdout + + # Get commit messages for context + commits_result = subprocess.run( + ["git", "log", "--oneline", f"{base_branch}..HEAD"], + capture_output=True, + text=True + ) + + analysis = { + "base_branch": base_branch, + "files_changed": files_result.stdout, + "statistics": stat_result.stdout, + "commits": commits_result.stdout, + "diff": diff_content if include_diff else "Diff not included (set include_diff=true to see full diff)", + "truncated": truncated, + "total_diff_lines": len(diff_lines) if include_diff else 0 + } + + return json.dumps(analysis, indent=2) + + except subprocess.CalledProcessError as e: + return f"Error analyzing changes: {e.stderr}" + except Exception as e: + return f"Error: {str(e)}" + + +@mcp.tool() +async def get_pr_templates() -> str: + """List available PR templates with their content.""" + templates = [] + + # Define default templates + default_templates = { + "bug.md": "Bug Fix", + "feature.md": "Feature", + "docs.md": "Documentation", + "refactor.md": "Refactor", + "test.md": "Test", + "performance.md": "Performance", + "security.md": "Security" + } + + for filename, template_type in default_templates.items(): + template_path = TEMPLATES_DIR / filename + + # Read template content + content = template_path.read_text() + + templates.append({ + "filename": filename, + "type": template_type, + "content": content + }) + + return json.dumps(templates, indent=2) + + +@mcp.tool() +async def suggest_template(changes_summary: str, change_type: str) -> str: + """Let Claude analyze the changes and suggest the most appropriate PR template. + + Args: + changes_summary: Your analysis of what the changes do + change_type: The type of change you've identified (bug, feature, docs, refactor, test, etc.) 
+ """ + + # Get available templates + templates_response = await get_pr_templates() + templates = json.loads(templates_response) + + # Map change types to template files + type_mapping = { + "bug": "bug.md", + "fix": "bug.md", + "feature": "feature.md", + "enhancement": "feature.md", + "docs": "docs.md", + "documentation": "docs.md", + "refactor": "refactor.md", + "cleanup": "refactor.md", + "test": "test.md", + "testing": "test.md", + "performance": "performance.md", + "optimization": "performance.md", + "security": "security.md" + } + + # Find matching template + template_file = type_mapping.get(change_type.lower(), "feature.md") + selected_template = next( + (t for t in templates if t["filename"] == template_file), + templates[0] # Default to first template if no match + ) + + suggestion = { + "recommended_template": selected_template, + "reasoning": f"Based on your analysis: '{changes_summary}', this appears to be a {change_type} change.", + "template_content": selected_template["content"], + "usage_hint": "Claude can help you fill out this template based on the specific changes in your PR." + } + + return json.dumps(suggestion, indent=2) + + +# ===== Module 2: New GitHub Actions Tools ===== + +@mcp.tool() +async def get_recent_actions_events(limit: int = 10) -> str: + """Get recent GitHub Actions events received via webhook. + + Args: + limit: Maximum number of events to return (default: 10) + """ + # TODO: Implement this function + # 1. Check if EVENTS_FILE exists + # 2. Read the JSON file + # 3. Return the most recent events (up to limit) + # 4. Return empty list if file doesn't exist + + return json.dumps({"message": "TODO: Implement get_recent_actions_events"}) + + +@mcp.tool() +async def get_workflow_status(workflow_name: Optional[str] = None) -> str: + """Get the current status of GitHub Actions workflows. + + Args: + workflow_name: Optional specific workflow name to filter by + """ + # TODO: Implement this function + # 1. 
Read events from EVENTS_FILE + # 2. Filter events for workflow_run events + # 3. If workflow_name provided, filter by that name + # 4. Group by workflow and show latest status + # 5. Return formatted workflow status information + + return json.dumps({"message": "TODO: Implement get_workflow_status"}) + + +# ===== Module 2: MCP Prompts ===== + +@mcp.prompt() +async def analyze_ci_results(): + """Analyze recent CI/CD results and provide insights.""" + # TODO: Implement this prompt + # Return a string with instructions for Claude to: + # 1. Use get_recent_actions_events() + # 2. Use get_workflow_status() + # 3. Analyze results and provide insights + + return "TODO: Implement analyze_ci_results prompt" + + +@mcp.prompt() +async def create_deployment_summary(): + """Generate a deployment summary for team communication.""" + # TODO: Implement this prompt + # Return a string that guides Claude to create a deployment summary + + return "TODO: Implement create_deployment_summary prompt" + + +@mcp.prompt() +async def generate_pr_status_report(): + """Generate a comprehensive PR status report including CI/CD results.""" + # TODO: Implement this prompt + # Return a string that guides Claude to combine code changes with CI/CD status + + return "TODO: Implement generate_pr_status_report prompt" + + +@mcp.prompt() +async def troubleshoot_workflow_failure(): + """Help troubleshoot a failing GitHub Actions workflow.""" + # TODO: Implement this prompt + # Return a string that guides Claude through troubleshooting steps + + return "TODO: Implement troubleshoot_workflow_failure prompt" + + +if __name__ == "__main__": + print("Starting PR Agent MCP server...") + print("NOTE: Run webhook_server.py in a separate terminal to receive GitHub events") + mcp.run() \ No newline at end of file diff --git a/projects/unit3/github-actions-integration/starter/test_server.py b/projects/unit3/github-actions-integration/starter/test_server.py new file mode 100644 index 0000000..79da4c1 --- /dev/null 
+++ b/projects/unit3/github-actions-integration/starter/test_server.py @@ -0,0 +1,157 @@ +#!/usr/bin/env python3 +""" +Unit tests for Module 1: Basic MCP Server +Run these tests to validate your implementation +""" + +import json +import pytest +import asyncio +from pathlib import Path +from unittest.mock import patch, MagicMock + +# Import your implemented functions +try: + from server import ( + mcp, + analyze_file_changes, + get_pr_templates, + suggest_template + ) + IMPORTS_SUCCESSFUL = True +except ImportError as e: + IMPORTS_SUCCESSFUL = False + IMPORT_ERROR = str(e) + + +class TestImplementation: + """Test that the required functions are implemented.""" + + def test_imports(self): + """Test that all required functions can be imported.""" + assert IMPORTS_SUCCESSFUL, f"Failed to import required functions: {IMPORT_ERROR if not IMPORTS_SUCCESSFUL else ''}" + assert mcp is not None, "FastMCP server instance not found" + assert callable(analyze_file_changes), "analyze_file_changes should be a callable function" + assert callable(get_pr_templates), "get_pr_templates should be a callable function" + assert callable(suggest_template), "suggest_template should be a callable function" + + +@pytest.mark.skipif(not IMPORTS_SUCCESSFUL, reason="Imports failed") +class TestAnalyzeFileChanges: + """Test the analyze_file_changes tool.""" + + @pytest.mark.asyncio + async def test_returns_json_string(self): + """Test that analyze_file_changes returns a JSON string.""" + with patch('subprocess.run') as mock_run: + mock_run.return_value = MagicMock(stdout="", stderr="") + + result = await analyze_file_changes() + + assert isinstance(result, str), "Should return a string" + # Should be valid JSON + data = json.loads(result) + assert isinstance(data, dict), "Should return a JSON object" + + @pytest.mark.asyncio + async def test_includes_required_fields(self): + """Test that the result includes expected fields.""" + with patch('subprocess.run') as mock_run: + mock_run.return_value 
= MagicMock(stdout="M\tfile1.py\n", stderr="") + + result = await analyze_file_changes() + data = json.loads(result) + + # Check for some expected fields (flexible to allow different implementations) + assert any(key in data for key in ["files_changed", "files", "changes", "diff"]), \ + "Result should include file change information" + + +@pytest.mark.skipif(not IMPORTS_SUCCESSFUL, reason="Imports failed") +class TestGetPRTemplates: + """Test the get_pr_templates tool.""" + + @pytest.mark.asyncio + async def test_returns_json_string(self): + """Test that get_pr_templates returns a JSON string.""" + result = await get_pr_templates() + + assert isinstance(result, str), "Should return a string" + # Should be valid JSON + data = json.loads(result) + assert isinstance(data, list), "Should return a JSON array of templates" + + @pytest.mark.asyncio + async def test_returns_templates(self): + """Test that templates are returned.""" + result = await get_pr_templates() + templates = json.loads(result) + + assert len(templates) > 0, "Should return at least one template" + + # Check that templates have expected structure + for template in templates: + assert isinstance(template, dict), "Each template should be a dictionary" + # Should have some identifying information + assert any(key in template for key in ["filename", "name", "type", "id"]), \ + "Templates should have an identifier" + + +@pytest.mark.skipif(not IMPORTS_SUCCESSFUL, reason="Imports failed") +class TestSuggestTemplate: + """Test the suggest_template tool.""" + + @pytest.mark.asyncio + async def test_returns_json_string(self): + """Test that suggest_template returns a JSON string.""" + result = await suggest_template( + "Fixed a bug in the authentication system", + "bug" + ) + + assert isinstance(result, str), "Should return a string" + # Should be valid JSON + data = json.loads(result) + assert isinstance(data, dict), "Should return a JSON object" + + @pytest.mark.asyncio + async def 
test_suggestion_structure(self): + """Test that the suggestion has expected structure.""" + result = await suggest_template( + "Added new feature for user management", + "feature" + ) + suggestion = json.loads(result) + + # Check for some expected fields (flexible to allow different implementations) + assert any(key in suggestion for key in ["template", "recommended_template", "suggestion"]), \ + "Should include a template recommendation" + + +@pytest.mark.skipif(not IMPORTS_SUCCESSFUL, reason="Imports failed") +class TestToolRegistration: + """Test that tools are properly registered with FastMCP.""" + + def test_tools_have_decorators(self): + """Test that tool functions are decorated with @mcp.tool().""" + # In FastMCP, decorated functions should have certain attributes + # This is a basic check that functions exist and are callable + assert hasattr(analyze_file_changes, '__name__'), \ + "analyze_file_changes should be a proper function" + assert hasattr(get_pr_templates, '__name__'), \ + "get_pr_templates should be a proper function" + assert hasattr(suggest_template, '__name__'), \ + "suggest_template should be a proper function" + + +if __name__ == "__main__": + if not IMPORTS_SUCCESSFUL: + print(f"❌ Cannot run tests - imports failed: {IMPORT_ERROR}") + print("\nMake sure you've:") + print("1. Implemented all three tool functions") + print("2. Decorated them with @mcp.tool()") + print("3. 
Installed dependencies with: uv sync") + exit(1) + + # Run tests + pytest.main([__file__, "-v"]) \ No newline at end of file diff --git a/projects/unit3/github-actions-integration/starter/validate_starter.py b/projects/unit3/github-actions-integration/starter/validate_starter.py new file mode 100644 index 0000000..9885831 --- /dev/null +++ b/projects/unit3/github-actions-integration/starter/validate_starter.py @@ -0,0 +1,193 @@ +#!/usr/bin/env python3 +""" +Validation script for Module 1 starter code +Ensures the starter template is ready for learners to implement +""" + +import subprocess +import sys +import os +from pathlib import Path + +def test_project_structure(): + """Check that all required files exist.""" + print("Project Structure:") + required_files = [ + "server.py", + "pyproject.toml", + "README.md" + ] + + all_exist = True + for file in required_files: + if Path(file).exists(): + print(f" ✓ {file} exists") + else: + print(f" ✗ {file} missing") + all_exist = False + + return all_exist + +def test_imports(): + """Test that the starter code imports work.""" + try: + # Test importing the server module + import server + print("✓ server.py imports successfully") + + # Check that FastMCP is imported + if hasattr(server, 'mcp'): + print("✓ FastMCP server instance found") + else: + print("✗ FastMCP server instance not found") + return False + + return True + except ImportError as e: + print(f"✗ Import error: {e}") + print(" Please ensure you've installed dependencies: uv sync") + return False + +def test_todos(): + """Check that TODO comments exist for learners.""" + print("\nTODO Comments:") + + with open("server.py", "r") as f: + content = f.read() + + todos = [] + for i, line in enumerate(content.split('\n'), 1): + if 'TODO' in line: + todos.append((i, line.strip())) + + if todos: + print(f"✓ Found {len(todos)} TODO comments for learners:") + for line_no, todo in todos[:5]: # Show first 5 + print(f" Line {line_no}: {todo[:60]}...") + if len(todos) > 5: + 
print(f" ... and {len(todos) - 5} more") + return True + else: + print("✗ No TODO comments found - learners need guidance!") + return False + +def test_starter_runs(): + """Test that the starter code can at least be executed.""" + print("\nExecution Test:") + + try: + # Try to import and check if server can be initialized + import server + # If we can import it and it has the right attributes, it should run + if hasattr(server, 'mcp') and hasattr(server, 'get_recent_actions_events'): + print("✓ Server imports and initializes correctly") + return True + else: + print("✗ Server missing required components") + return False + + except Exception as e: + print(f"✗ Failed to initialize server: {e}") + return False + +def test_dependencies(): + """Check that pyproject.toml is properly configured.""" + print("\nDependencies:") + + try: + import tomllib + except ImportError: + import tomli as tomllib + + try: + with open("pyproject.toml", "rb") as f: + config = tomllib.load(f) + + # Check for required sections + if "project" in config and "dependencies" in config["project"]: + deps = config["project"]["dependencies"] + print(f"✓ Found {len(deps)} dependencies") + for dep in deps: + print(f" - {dep}") + else: + print("✗ No dependencies section found") + return False + + return True + except Exception as e: + print(f"✗ Error reading pyproject.toml: {e}") + return False + +def test_no_implementation(): + """Ensure starter code doesn't contain the solution.""" + print("\nImplementation Check:") + + with open("server.py", "r") as f: + content = f.read() + + # Check that tool functions are not implemented + solution_indicators = [ + "subprocess.run", # Git commands + "json.dumps", # Returning JSON + "git diff", # Git operations + "template", # Template logic + ] + + found_implementations = [] + for indicator in solution_indicators: + if indicator in content.lower(): + found_implementations.append(indicator) + + if found_implementations: + print(f"⚠️ Found possible solution code: 
{', '.join(found_implementations)}") + print(" Make sure these are only in comments/examples") + return True # Warning, not failure + else: + print("✓ No solution implementation found (good!)") + return True + +def main(): + """Run all validation checks.""" + print("Module 1 Starter Code Validation") + print("=" * 50) + + # Change to starter directory if needed + if Path("validate_starter.py").exists(): + os.chdir(Path("validate_starter.py").parent) + + tests = [ + ("Project Structure", test_project_structure), + ("Python Imports", test_imports), + ("TODO Comments", test_todos), + ("Starter Execution", test_starter_runs), + ("Dependencies", test_dependencies), + ("Clean Starter", test_no_implementation) + ] + + results = [] + for test_name, test_func in tests: + print(f"\n{test_name}:") + try: + results.append(test_func()) + except Exception as e: + print(f"✗ Test failed with error: {e}") + results.append(False) + + print("\n" + "=" * 50) + passed = sum(results) + total = len(results) + print(f"Checks passed: {passed}/{total}") + + if passed == total: + print("\n✓ Starter code is ready for learners!") + print("\nLearners should:") + print("1. Run: uv sync") + print("2. Follow the TODO comments in server.py") + print("3. Test with: uv run pytest test_server.py") + print("4. Configure Claude Desktop when ready") + else: + print("\n✗ Some checks failed. Please review the starter code.") + sys.exit(1) + +if __name__ == "__main__": + main() \ No newline at end of file diff --git a/projects/unit3/github-actions-integration/starter/webhook_server.py b/projects/unit3/github-actions-integration/starter/webhook_server.py new file mode 100644 index 0000000..64941d1 --- /dev/null +++ b/projects/unit3/github-actions-integration/starter/webhook_server.py @@ -0,0 +1,57 @@ +#!/usr/bin/env python3 +""" +Simple webhook server for GitHub Actions events. +Stores events in a JSON file that the MCP server can read. 
+""" + +import json +from datetime import datetime +from pathlib import Path +from aiohttp import web + +# File to store events +EVENTS_FILE = Path(__file__).parent / "github_events.json" + +async def handle_webhook(request): + """Handle incoming GitHub webhook""" + try: + data = await request.json() + + # Create event record + event = { + "timestamp": datetime.utcnow().isoformat(), + "event_type": request.headers.get("X-GitHub-Event", "unknown"), + "action": data.get("action"), + "workflow_run": data.get("workflow_run"), + "check_run": data.get("check_run"), + "repository": data.get("repository", {}).get("full_name"), + "sender": data.get("sender", {}).get("login") + } + + # Load existing events + events = [] + if EVENTS_FILE.exists(): + with open(EVENTS_FILE, 'r') as f: + events = json.load(f) + + # Add new event and keep last 100 + events.append(event) + events = events[-100:] + + # Save events + with open(EVENTS_FILE, 'w') as f: + json.dump(events, f, indent=2) + + return web.json_response({"status": "received"}) + except Exception as e: + return web.json_response({"error": str(e)}, status=400) + +# Create app and add route +app = web.Application() +app.router.add_post('/webhook/github', handle_webhook) + +if __name__ == '__main__': + print("🚀 Starting webhook server on http://localhost:8080") + print("📝 Events will be saved to:", EVENTS_FILE) + print("🔗 Webhook URL: http://localhost:8080/webhook/github") + web.run_app(app, host='localhost', port=8080) \ No newline at end of file diff --git a/projects/unit3/slack-notification/solution/README.md b/projects/unit3/slack-notification/solution/README.md new file mode 100644 index 0000000..21a138f --- /dev/null +++ b/projects/unit3/slack-notification/solution/README.md @@ -0,0 +1,43 @@ +# Module 3: Slack Notification - Complete Solution + +This is the complete implementation of Module 3, demonstrating how to integrate MCP Tools and Prompts for team communication via Slack. 
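At its core, the `send_slack_notification` tool this module adds reduces to a single HTTP POST to a Slack incoming webhook, returning errors as strings instead of raising. A minimal standard-library sketch of that idea (the actual solution uses `requests`; the webhook URL is read from the `SLACK_WEBHOOK_URL` environment variable, as in the setup steps below):

```python
import json
import os
import urllib.request

def send_slack_notification(message: str) -> str:
    """Post a message to a Slack incoming webhook (stdlib-only sketch)."""
    webhook_url = os.getenv("SLACK_WEBHOOK_URL")
    if not webhook_url:
        return "Error: SLACK_WEBHOOK_URL environment variable not set"
    # Slack incoming webhooks accept a JSON body with a "text" field
    request = urllib.request.Request(
        webhook_url,
        data=json.dumps({"text": message}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(request, timeout=10) as response:
            if response.status == 200:
                return "Message sent successfully to Slack"
            return f"Failed to send message. Status: {response.status}"
    except OSError as e:  # URLError/HTTPError and timeouts are OSError subclasses
        return f"Error sending message: {e}"
```

Returning error strings rather than raising keeps failures (missing URL, timeout, bad webhook) visible to Claude as readable tool results.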
+ +## What This Implements + +This solution extends Modules 1 and 2 with: + +1. **`send_slack_notification` tool** - Sends formatted messages to Slack via webhook with proper error handling +2. **`format_ci_failure_alert` prompt** - Creates rich failure alerts with Slack markdown +3. **`format_ci_success_summary` prompt** - Creates celebration messages for successful deployments + +## Setup and Usage + +1. Install dependencies: + ```bash + uv sync + ``` + +2. Set up Slack webhook: + ```bash + export SLACK_WEBHOOK_URL="https://hooks.slack.com/services/YOUR/WEBHOOK/URL" + ``` + +3. Start services: + ```bash + # Terminal 1: Webhook server + python webhook_server.py + + # Terminal 2: MCP server + uv run server.py + + # Terminal 3: Cloudflare tunnel (optional) + cloudflared tunnel --url http://localhost:8080 + ``` + +## Testing + +See `manual_test.md` for comprehensive testing instructions using curl commands to simulate GitHub webhook events. + +## Key Learning Outcomes + +This solution demonstrates all MCP primitives working together for real-world team automation. 
\ No newline at end of file diff --git a/projects/unit3/slack-notification/solution/github_events.json b/projects/unit3/slack-notification/solution/github_events.json new file mode 100644 index 0000000..8f1ac86 --- /dev/null +++ b/projects/unit3/slack-notification/solution/github_events.json @@ -0,0 +1,19 @@ +[ + { + "timestamp": "2025-05-30T22:42:00.904997", + "event_type": "workflow_run", + "action": "completed", + "workflow_run": { + "name": "CI", + "status": "completed", + "conclusion": "failure", + "run_number": 42, + "html_url": "https://github.com/test/repo/actions/runs/123", + "head_branch": "test", + "head_sha": "abc123" + }, + "check_run": null, + "repository": null, + "sender": null + } +] \ No newline at end of file diff --git a/projects/unit3/slack-notification/solution/manual_test.md b/projects/unit3/slack-notification/solution/manual_test.md new file mode 100644 index 0000000..e1726fa --- /dev/null +++ b/projects/unit3/slack-notification/solution/manual_test.md @@ -0,0 +1,254 @@ +# Manual Testing Guide - Slack Notification Module + +This guide provides curl-based tests so you can verify your implementation without setting up a full GitHub repository and Actions workflow. + +## Prerequisites + +1. MCP server running: `uv run server.py` +2. Webhook server running: `python webhook_server.py` +3. Slack webhook URL set: `export SLACK_WEBHOOK_URL="https://hooks.slack.com/services/..."` + +## Test 1: Direct Slack Tool Test + +Test your `send_slack_notification` tool directly via Claude Code: + +```bash +# Start your MCP server and connect with Claude Code, then ask: +# "Send a test message to Slack: 'Hello from MCP Course Module 3!'" +``` + +Expected result: Message appears in your Slack channel. + +## Test 2: Simulate GitHub Webhook Events + +### 2a. 
Simulate CI Failure Event + +Send a fake GitHub Actions failure event to your webhook server: + +```bash +curl -X POST http://localhost:8080/webhook/github \ + -H "Content-Type: application/json" \ + -H "X-GitHub-Event: workflow_run" \ + -d '{ + "action": "completed", + "workflow_run": { + "id": 123456789, + "name": "CI", + "status": "completed", + "conclusion": "failure", + "run_number": 42, + "created_at": "2024-01-15T10:30:00Z", + "updated_at": "2024-01-15T10:35:00Z", + "html_url": "https://github.com/user/repo/actions/runs/123456789", + "head_branch": "feature/slack-integration", + "head_sha": "abc123f456789", + "repository": { + "name": "mcp-course", + "full_name": "user/mcp-course", + "html_url": "https://github.com/user/mcp-course" + }, + "pull_requests": [{ + "number": 42, + "url": "https://api.github.com/repos/user/mcp-course/pulls/42", + "html_url": "https://github.com/user/mcp-course/pull/42" + }] + } + }' +``` + +### 2b. Simulate CI Success Event + +```bash +curl -X POST http://localhost:8080/webhook/github \ + -H "Content-Type: application/json" \ + -H "X-GitHub-Event: workflow_run" \ + -d '{ + "action": "completed", + "workflow_run": { + "id": 123456790, + "name": "Deploy", + "status": "completed", + "conclusion": "success", + "run_number": 43, + "created_at": "2024-01-15T11:30:00Z", + "updated_at": "2024-01-15T11:35:00Z", + "html_url": "https://github.com/user/repo/actions/runs/123456790", + "head_branch": "main", + "head_sha": "def456g789012", + "repository": { + "name": "mcp-course", + "full_name": "user/mcp-course", + "html_url": "https://github.com/user/mcp-course" + }, + "pull_requests": [{ + "number": 43, + "url": "https://api.github.com/repos/user/mcp-course/pulls/43", + "html_url": "https://github.com/user/mcp-course/pull/43" + }] + } + }' +``` + +## Test 3: End-to-End Workflow Tests + +After sending the webhook events above, test the complete workflow via Claude Code: + +### 3a. 
Test Failure Alert Workflow
+
+Ask Claude Code:
+```
+"Check recent CI events, find any failures, format them as a Slack alert, and send to the team"
+```
+
+Expected workflow:
+1. Claude calls `get_recent_actions_events()` → finds failure event
+2. Claude calls `format_ci_failure_alert()` → generates formatted message
+3. Claude calls `send_slack_notification()` → sends to Slack
+
+Expected Slack message:
+```
+❌ *CI Failed* - mcp-course
+
+> CI workflow failed on feature/slack-integration
+
+*Details:*
+• Workflow: `CI`
+• Branch: `feature/slack-integration`
+• Commit: `abc123f`
+
+*Next Steps:*
+• <https://github.com/user/repo/actions/runs/123456789|View Workflow Run>
+• <https://github.com/user/mcp-course/pull/42|View Pull Request>
+```
+
+### 3b. Test Success Summary Workflow
+
+Ask Claude Code:
+```
+"Check recent CI events, find any successful deployments, format them as a celebration message, and send to the team"
+```
+
+Expected workflow:
+1. Claude calls `get_recent_actions_events()` → finds success event
+2. Claude calls `format_ci_success_summary()` → generates formatted message
+3. Claude calls `send_slack_notification()` → sends to Slack
+
+Expected Slack message:
+```
+✅ *Deployment Successful* - mcp-course
+
+> Deploy workflow completed successfully on main
+
+*Changes:*
+• Module 3 Slack integration added
+• Team notification system implemented
+
+*Links:*
+• <https://github.com/user/repo/actions/runs/123456790|View Deployment>
+• <https://github.com/user/mcp-course/pull/43|View Pull Request>
+```
+
+## Test 4: Error Handling Tests
+
+### 4a. Test Missing Webhook URL
+
+```bash
+# Temporarily unset the environment variable
+unset SLACK_WEBHOOK_URL
+
+# Ask Claude Code to send a message
+# Expected: Error message about missing environment variable
+```
+
+### 4b. Test Invalid Webhook URL
+
+```bash
+# Set invalid webhook URL
+export SLACK_WEBHOOK_URL="https://invalid-webhook-url.com/test"
+
+# Ask Claude Code to send a message
+# Expected: Error message about connection failure
+```
+
+### 4c. Restore Valid Webhook URL
+
+```bash
+export SLACK_WEBHOOK_URL="https://hooks.slack.com/services/YOUR/ACTUAL/URL"
+```
+
+## Test 5: Prompt-Only Tests
+
+Test the formatting prompts without sending to Slack:
+
+### 5a. 
Test Failure Alert Prompt + +Ask Claude Code: +``` +"Use the format_ci_failure_alert prompt to create a failure message for the recent CI failure, but don't send it to Slack yet" +``` + +### 5b. Test Success Summary Prompt + +Ask Claude Code: +``` +"Use the format_ci_success_summary prompt to create a success message for the recent deployment, but don't send it to Slack yet" +``` + +## Test 6: Integration with Previous Modules + +Test that all previous module functionality still works: + +### 6a. Module 1 Integration + +Ask Claude Code: +``` +"Analyze current file changes, suggest a PR template, then create a Slack message about the PR status" +``` + +### 6b. Module 2 Integration + +Ask Claude Code: +``` +"Check workflow status, analyze the CI results, and create a comprehensive team update for Slack" +``` + +## Verification Checklist + +After running these tests, verify: + +- [ ] Direct Slack tool works (Test 1) +- [ ] Webhook server receives and stores events (Test 2) +- [ ] Failure alert workflow works end-to-end (Test 3a) +- [ ] Success summary workflow works end-to-end (Test 3b) +- [ ] Error handling works properly (Test 4) +- [ ] Prompts work independently (Test 5) +- [ ] Integration with previous modules works (Test 6) +- [ ] Slack messages display with proper formatting +- [ ] All tools and prompts are accessible to Claude Code + +## Troubleshooting + +### Webhook Server Issues +```bash +# Check if webhook server is running +curl http://localhost:8080/health + +# Check stored events +cat github_events.json +``` + +### MCP Server Issues +```bash +# Check if MCP server is responding +# Should see server startup messages when running uv run server.py +``` + +### Slack Issues +```bash +# Test webhook URL directly +curl -X POST -H 'Content-type: application/json' \ + --data '{"text":"Direct webhook test"}' \ + $SLACK_WEBHOOK_URL +``` + +This testing approach lets you validate your implementation without needing to set up a real GitHub repository with Actions 
workflows! \ No newline at end of file diff --git a/projects/unit3/slack-notification/solution/pyproject.toml b/projects/unit3/slack-notification/solution/pyproject.toml new file mode 100644 index 0000000..cd2b2d9 --- /dev/null +++ b/projects/unit3/slack-notification/solution/pyproject.toml @@ -0,0 +1,30 @@ +[project] +name = "pr-agent-slack" +version = "3.0.0" +description = "MCP server with Slack notifications integrating Tools and Prompts" +readme = "README.md" +requires-python = ">=3.10" +dependencies = [ + "mcp[cli]>=1.0.0", + "aiohttp>=3.10.0,<4.0.0", + "requests>=2.32.0,<3.0.0", +] + +[project.optional-dependencies] +dev = [ + "pytest>=8.3.0", + "pytest-asyncio>=0.21.0", +] + +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" + +[tool.hatch.build.targets.wheel] +packages = ["."] + +[tool.uv] +dev-dependencies = [ + "pytest>=8.3.0", + "pytest-asyncio>=0.21.0", +] \ No newline at end of file diff --git a/projects/unit3/slack-notification/solution/server.py b/projects/unit3/slack-notification/solution/server.py new file mode 100644 index 0000000..d08e046 --- /dev/null +++ b/projects/unit3/slack-notification/solution/server.py @@ -0,0 +1,503 @@ +#!/usr/bin/env python3 +""" +Module 3: Slack Notification Integration - Complete Solution +Combines all MCP primitives (Tools and Prompts) for complete team communication workflows. 
+""" + +import json +import os +import subprocess +import requests +from typing import Dict, Any, Optional +from pathlib import Path + +from mcp.server.fastmcp import FastMCP + +# Initialize the FastMCP server +mcp = FastMCP("pr-agent-slack") + +# PR template directory (shared between starter and solution) +TEMPLATES_DIR = Path(__file__).parent.parent.parent / "templates" + +# File where webhook server stores events +EVENTS_FILE = Path(__file__).parent / "github_events.json" + + +# ===== Tools from Modules 1 & 2 (Complete with output limiting) ===== + +@mcp.tool() +async def analyze_file_changes( + base_branch: str = "main", + include_diff: bool = True, + max_diff_lines: int = 500, + working_directory: Optional[str] = None +) -> str: + """Get the full diff and list of changed files in the current git repository. + + Args: + base_branch: Base branch to compare against (default: main) + include_diff: Include the full diff content (default: true) + max_diff_lines: Maximum number of diff lines to include (default: 500) + working_directory: Directory to run git commands in (default: current directory) + """ + try: + # Try to get working directory from roots first + if working_directory is None: + try: + context = mcp.get_context() + roots_result = await context.session.list_roots() + # Get the first root - Claude Code sets this to the CWD + root = roots_result.roots[0] + # FileUrl object has a .path property that gives us the path directly + working_directory = root.uri.path + except Exception as e: + # If we can't get roots, fall back to current directory + pass + + # Use provided working directory or current directory + cwd = working_directory if working_directory else os.getcwd() + # Get list of changed files + files_result = subprocess.run( + ["git", "diff", "--name-status", f"{base_branch}...HEAD"], + capture_output=True, + text=True, + check=True, + cwd=cwd + ) + + # Get diff statistics + stat_result = subprocess.run( + ["git", "diff", "--stat", 
f"{base_branch}...HEAD"], + capture_output=True, + text=True, + cwd=cwd + ) + + # Get the actual diff if requested + diff_content = "" + truncated = False + if include_diff: + diff_result = subprocess.run( + ["git", "diff", f"{base_branch}...HEAD"], + capture_output=True, + text=True, + cwd=cwd + ) + diff_lines = diff_result.stdout.split('\n') + + # Check if we need to truncate + if len(diff_lines) > max_diff_lines: + diff_content = '\n'.join(diff_lines[:max_diff_lines]) + diff_content += f"\n\n... Output truncated. Showing {max_diff_lines} of {len(diff_lines)} lines ..." + diff_content += "\n... Use max_diff_lines parameter to see more ..." + truncated = True + else: + diff_content = diff_result.stdout + + # Get commit messages for context + commits_result = subprocess.run( + ["git", "log", "--oneline", f"{base_branch}..HEAD"], + capture_output=True, + text=True, + cwd=cwd + ) + + analysis = { + "base_branch": base_branch, + "files_changed": files_result.stdout, + "statistics": stat_result.stdout, + "commits": commits_result.stdout, + "diff": diff_content if include_diff else "Diff not included (set include_diff=true to see full diff)", + "truncated": truncated, + "total_diff_lines": len(diff_lines) if include_diff else 0 + } + + return json.dumps(analysis, indent=2) + + except subprocess.CalledProcessError as e: + return json.dumps({"error": f"Git error: {e.stderr}"}) + except Exception as e: + return json.dumps({"error": str(e)}) + + +@mcp.tool() +async def get_pr_templates() -> str: + """List available PR templates with their content.""" + templates = [] + + # Define default templates + default_templates = { + "bug.md": "Bug Fix", + "feature.md": "Feature", + "docs.md": "Documentation", + "refactor.md": "Refactor", + "test.md": "Test", + "performance.md": "Performance", + "security.md": "Security" + } + + for filename, template_type in default_templates.items(): + template_path = TEMPLATES_DIR / filename + + # Read template content + content = 
template_path.read_text() + + templates.append({ + "filename": filename, + "type": template_type, + "content": content + }) + + return json.dumps(templates, indent=2) + + +@mcp.tool() +async def suggest_template(changes_summary: str, change_type: str) -> str: + """Let Claude analyze the changes and suggest the most appropriate PR template. + + Args: + changes_summary: Your analysis of what the changes do + change_type: The type of change you've identified (bug, feature, docs, refactor, test, etc.) + """ + + # Get available templates + templates_response = await get_pr_templates() + templates = json.loads(templates_response) + + # Map change types to template files + type_mapping = { + "bug": "bug.md", + "fix": "bug.md", + "feature": "feature.md", + "enhancement": "feature.md", + "docs": "docs.md", + "documentation": "docs.md", + "refactor": "refactor.md", + "cleanup": "refactor.md", + "test": "test.md", + "testing": "test.md", + "performance": "performance.md", + "optimization": "performance.md", + "security": "security.md" + } + + # Find matching template + template_file = type_mapping.get(change_type.lower(), "feature.md") + selected_template = next( + (t for t in templates if t["filename"] == template_file), + templates[0] # Default to first template if no match + ) + + suggestion = { + "recommended_template": selected_template, + "reasoning": f"Based on your analysis: '{changes_summary}', this appears to be a {change_type} change.", + "template_content": selected_template["content"], + "usage_hint": "Claude can help you fill out this template based on the specific changes in your PR." + } + + return json.dumps(suggestion, indent=2) + + +@mcp.tool() +async def get_recent_actions_events(limit: int = 10) -> str: + """Get recent GitHub Actions events received via webhook. 
+ + Args: + limit: Maximum number of events to return (default: 10) + """ + # Read events from file + if not EVENTS_FILE.exists(): + return json.dumps([]) + + with open(EVENTS_FILE, 'r') as f: + events = json.load(f) + + # Return most recent events + recent = events[-limit:] + return json.dumps(recent, indent=2) + + +@mcp.tool() +async def get_workflow_status(workflow_name: Optional[str] = None) -> str: + """Get the current status of GitHub Actions workflows. + + Args: + workflow_name: Optional specific workflow name to filter by + """ + # Read events from file + if not EVENTS_FILE.exists(): + return json.dumps({"message": "No GitHub Actions events received yet"}) + + with open(EVENTS_FILE, 'r') as f: + events = json.load(f) + + if not events: + return json.dumps({"message": "No GitHub Actions events received yet"}) + + # Filter for workflow events + workflow_events = [ + e for e in events + if e.get("workflow_run") is not None + ] + + if workflow_name: + workflow_events = [ + e for e in workflow_events + if e["workflow_run"].get("name") == workflow_name + ] + + # Group by workflow and get latest status + workflows = {} + for event in workflow_events: + run = event["workflow_run"] + name = run["name"] + if name not in workflows or run["updated_at"] > workflows[name]["updated_at"]: + workflows[name] = { + "name": name, + "status": run["status"], + "conclusion": run.get("conclusion"), + "run_number": run["run_number"], + "updated_at": run["updated_at"], + "html_url": run["html_url"] + } + + return json.dumps(list(workflows.values()), indent=2) + + +# ===== New Module 3: Slack Integration Tools ===== + +@mcp.tool() +async def send_slack_notification(message: str) -> str: + """Send a formatted notification to the team Slack channel. 
+ + Args: + message: The message to send to Slack (supports Slack markdown) + """ + webhook_url = os.getenv("SLACK_WEBHOOK_URL") + if not webhook_url: + return "Error: SLACK_WEBHOOK_URL environment variable not set" + + try: + # Prepare the payload + payload = { + "text": message + } + + # Send POST request to Slack webhook + response = requests.post( + webhook_url, + json=payload, + timeout=10 + ) + + # Check if request was successful + if response.status_code == 200: + return "✅ Message sent successfully to Slack" + else: + return f"❌ Failed to send message. Status: {response.status_code}, Response: {response.text}" + + except requests.exceptions.Timeout: + return "❌ Request timed out. Check your internet connection and try again." + except requests.exceptions.ConnectionError: + return "❌ Connection error. Check your internet connection and webhook URL." + except Exception as e: + return f"❌ Error sending message: {str(e)}" + + +# ===== New Module 3: Slack Formatting Prompts ===== + +@mcp.prompt() +async def format_ci_failure_alert(): + """Create a Slack alert for CI/CD failures with rich formatting.""" + return """Format this GitHub Actions failure as a Slack message using ONLY Slack markdown syntax: + +❌ *CI Failed* - [Repository Name] + +> Brief summary of what failed + +*Details:* +• Workflow: `workflow_name` +• Branch: `branch_name` +• Commit: `commit_hash` + +*Next Steps:* +• + +CRITICAL: Use EXACT Slack link format: <https://full-url|Link Text> +Examples: +- CORRECT: <https://github.com/user/repo|Repository> +- WRONG: [Repository](https://github.com/user/repo) +- WRONG: https://github.com/user/repo + +Other Slack formats: +- *text* for bold (NOT **text**) +- `text` for code +- > text for quotes +- • for bullets""" + + +@mcp.prompt() +async def format_ci_success_summary(): + """Create a Slack message celebrating successful deployments.""" + return """Format this successful GitHub Actions run as a Slack message using ONLY Slack markdown syntax: + +✅ *Deployment Successful* - [Repository Name] + +> Brief summary of what was 
deployed + +*Changes:* +• Key feature or fix 1 +• Key feature or fix 2 + +*Links:* +• + +CRITICAL: Use EXACT Slack link format: <https://full-url|Link Text> +Examples: +- CORRECT: <https://github.com/user/repo|Repository> +- WRONG: [Repository](https://github.com/user/repo) +- WRONG: https://github.com/user/repo + +Other Slack formats: +- *text* for bold (NOT **text**) +- `text` for code +- > text for quotes +- • for bullets""" + + +# ===== Prompts from Module 2 (Complete) ===== + +@mcp.prompt() +async def analyze_ci_results(): + """Analyze recent CI/CD results and provide insights.""" + return """Please analyze the recent CI/CD results from GitHub Actions: + +1. First, call get_recent_actions_events() to fetch the latest CI/CD events +2. Then call get_workflow_status() to check current workflow states +3. Identify any failures or issues that need attention +4. Provide actionable next steps based on the results + +Format your response as: +## CI/CD Status Summary +- **Overall Health**: [Good/Warning/Critical] +- **Failed Workflows**: [List any failures with links] +- **Successful Workflows**: [List recent successes] +- **Recommendations**: [Specific actions to take] +- **Trends**: [Any patterns you notice]""" + + +@mcp.prompt() +async def create_deployment_summary(): + """Generate a deployment summary for team communication.""" + return """Create a deployment summary for team communication: + +1. Check workflow status with get_workflow_status() +2. Look specifically for deployment-related workflows +3. 
Note the deployment outcome, timing, and any issues + +Format as a concise message suitable for Slack: + +🚀 **Deployment Update** +- **Status**: [✅ Success / ❌ Failed / ⏳ In Progress] +- **Environment**: [Production/Staging/Dev] +- **Version/Commit**: [If available from workflow data] +- **Duration**: [If available] +- **Key Changes**: [Brief summary if available] +- **Issues**: [Any problems encountered] +- **Next Steps**: [Required actions if failed] + +Keep it brief but informative for team awareness.""" + + +@mcp.prompt() +async def generate_pr_status_report(): + """Generate a comprehensive PR status report including CI/CD results.""" + return """Generate a comprehensive PR status report: + +1. Use analyze_file_changes() to understand what changed +2. Use get_workflow_status() to check CI/CD status +3. Use suggest_template() to recommend the appropriate PR template +4. Combine all information into a cohesive report + +Create a detailed report with: + +## 📋 PR Status Report + +### 📝 Code Changes +- **Files Modified**: [Count by type - .py, .js, etc.] +- **Change Type**: [Feature/Bug/Refactor/etc.] +- **Impact Assessment**: [High/Medium/Low with reasoning] +- **Key Changes**: [Bullet points of main modifications] + +### 🔄 CI/CD Status +- **All Checks**: [✅ Passing / ❌ Failing / ⏳ Running] +- **Test Results**: [Pass rate, failed tests if any] +- **Build Status**: [Success/Failed with details] +- **Code Quality**: [Linting, coverage if available] + +### 📌 Recommendations +- **PR Template**: [Suggested template and why] +- **Next Steps**: [What needs to happen before merge] +- **Reviewers**: [Suggested reviewers based on files changed] + +### ⚠️ Risks & Considerations +- [Any deployment risks] +- [Breaking changes] +- [Dependencies affected]""" + + +@mcp.prompt() +async def troubleshoot_workflow_failure(): + """Help troubleshoot a failing GitHub Actions workflow.""" + return """Help troubleshoot failing GitHub Actions workflows: + +1. 
Use get_recent_actions_events() to find recent failures +2. Use get_workflow_status() to see which workflows are failing +3. Analyze the failure patterns and timing +4. Provide systematic troubleshooting steps + +Structure your response as: + +## 🔧 Workflow Troubleshooting Guide + +### ❌ Failed Workflow Details +- **Workflow Name**: [Name of failing workflow] +- **Failure Type**: [Test/Build/Deploy/Lint] +- **First Failed**: [When did it start failing] +- **Failure Rate**: [Intermittent or consistent] + +### 🔍 Diagnostic Information +- **Error Patterns**: [Common error messages or symptoms] +- **Recent Changes**: [What changed before failures started] +- **Dependencies**: [External services or resources involved] + +### 💡 Possible Causes (ordered by likelihood) +1. **[Most Likely]**: [Description and why] +2. **[Likely]**: [Description and why] +3. **[Possible]**: [Description and why] + +### ✅ Suggested Fixes +**Immediate Actions:** +- [ ] [Quick fix to try first] +- [ ] [Second quick fix] + +**Investigation Steps:** +- [ ] [How to gather more info] +- [ ] [Logs or data to check] + +**Long-term Solutions:** +- [ ] [Preventive measure] +- [ ] [Process improvement] + +### 📚 Resources +- [Relevant documentation links] +- [Similar issues or solutions]""" + + +if __name__ == "__main__": + # Run MCP server normally + print("Starting PR Agent Slack MCP server...") + print("Make sure to set SLACK_WEBHOOK_URL environment variable") + print("To receive GitHub webhooks, run the webhook server separately:") + print(" python webhook_server.py") + mcp.run() \ No newline at end of file diff --git a/projects/unit3/slack-notification/solution/test_server.py b/projects/unit3/slack-notification/solution/test_server.py new file mode 100644 index 0000000..0bb24fd --- /dev/null +++ b/projects/unit3/slack-notification/solution/test_server.py @@ -0,0 +1,193 @@ +#!/usr/bin/env python3 +""" +Unit tests for Module 1: Basic MCP Server +""" + +import json +import pytest +import asyncio +from 
pathlib import Path +from unittest.mock import patch, MagicMock, AsyncMock +from server import ( + mcp, + analyze_file_changes, + get_pr_templates, + suggest_template, + create_default_template, + TEMPLATES_DIR +) + + +class TestAnalyzeFileChanges: + """Test the analyze_file_changes tool.""" + + @pytest.mark.asyncio + async def test_analyze_with_diff(self): + """Test analyzing changes with full diff included.""" + mock_result = MagicMock() + mock_result.stdout = "M\tfile1.py\nA\tfile2.py\n" + mock_result.stderr = "" + + with patch('subprocess.run') as mock_run: + mock_run.return_value = mock_result + + result = await analyze_file_changes("main", include_diff=True) + + assert isinstance(result, str) + data = json.loads(result) + assert data["base_branch"] == "main" + assert "files_changed" in data + assert "statistics" in data + assert "commits" in data + assert "diff" in data + + @pytest.mark.asyncio + async def test_analyze_without_diff(self): + """Test analyzing changes without diff content.""" + mock_result = MagicMock() + mock_result.stdout = "M\tfile1.py\n" + + with patch('subprocess.run') as mock_run: + mock_run.return_value = mock_result + + result = await analyze_file_changes("main", include_diff=False) + + data = json.loads(result) + assert "Diff not included" in data["diff"] + + @pytest.mark.asyncio + async def test_analyze_git_error(self): + """Test handling git command errors.""" + with patch('subprocess.run') as mock_run: + mock_run.side_effect = Exception("Git not found") + + result = await analyze_file_changes("main", True) + + # The server reports failures as a JSON object with an "error" key + data = json.loads(result) + assert "error" in data + + +class TestPRTemplates: + """Test PR template management.""" + + @pytest.mark.asyncio + async def test_get_templates(self, tmp_path, monkeypatch): + """Test getting available templates.""" + # Use temporary directory for templates + monkeypatch.setattr('server.TEMPLATES_DIR', tmp_path) + + result = await get_pr_templates() + + templates = json.loads(result) + assert len(templates) > 0 + assert 
any(t["type"] == "Bug Fix" for t in templates) + assert any(t["type"] == "Feature" for t in templates) + assert all("content" in t for t in templates) + + def test_create_default_template(self, tmp_path): + """Test creating default template files.""" + template_path = tmp_path / "test.md" + + create_default_template(template_path, "Bug Fix") + + assert template_path.exists() + content = template_path.read_text() + assert "## Bug Fix" in content + assert "Description" in content + assert "Root Cause" in content + + +class TestSuggestTemplate: + """Test template suggestion based on analysis.""" + + @pytest.mark.asyncio + async def test_suggest_bug_fix(self, tmp_path, monkeypatch): + """Test suggesting bug fix template.""" + monkeypatch.setattr('server.TEMPLATES_DIR', tmp_path) + + # Create templates first + await get_pr_templates() + + result = await suggest_template( + "Fixed null pointer exception in user service", + "bug" + ) + + suggestion = json.loads(result) + assert suggestion["recommended_template"]["filename"] == "bug.md" + assert "Bug Fix" in suggestion["recommended_template"]["type"] + assert "reasoning" in suggestion + + @pytest.mark.asyncio + async def test_suggest_feature(self, tmp_path, monkeypatch): + """Test suggesting feature template.""" + monkeypatch.setattr('server.TEMPLATES_DIR', tmp_path) + + await get_pr_templates() + + result = await suggest_template( + "Added new authentication method for API", + "feature" + ) + + suggestion = json.loads(result) + assert suggestion["recommended_template"]["filename"] == "feature.md" + + @pytest.mark.asyncio + async def test_suggest_with_type_variations(self, tmp_path, monkeypatch): + """Test template suggestion with various type names.""" + monkeypatch.setattr('server.TEMPLATES_DIR', tmp_path) + + await get_pr_templates() + + # Test variations + for change_type, expected_file in [ + ("fix", "bug.md"), + ("enhancement", "feature.md"), + ("documentation", "docs.md"), + ("cleanup", "refactor.md"), + ("testing", 
"test.md"), + ("optimization", "performance.md") + ]: + result = await suggest_template(f"Some {change_type} work", change_type) + suggestion = json.loads(result) + assert suggestion["recommended_template"]["filename"] == expected_file + + +class TestIntegration: + """Integration tests for the complete workflow.""" + + @pytest.mark.asyncio + async def test_full_workflow(self, tmp_path, monkeypatch): + """Test the complete workflow from analysis to suggestion.""" + monkeypatch.setattr('server.TEMPLATES_DIR', tmp_path) + + # Mock git commands + with patch('subprocess.run') as mock_run: + mock_run.return_value = MagicMock( + stdout="M\tsrc/main.py\nM\ttests/test_main.py\n", + stderr="" + ) + + # 1. Analyze changes + analysis_result = await analyze_file_changes("main", True) + + # 2. Get templates + templates_result = await get_pr_templates() + + # 3. Suggest template based on analysis + suggestion_result = await suggest_template( + "Updated main functionality and added tests", + "feature" + ) + + # Verify results + assert all(isinstance(r, str) for r in [analysis_result, templates_result, suggestion_result]) + + suggestion = json.loads(suggestion_result) + assert "recommended_template" in suggestion + assert "template_content" in suggestion + assert suggestion["recommended_template"]["type"] == "Feature" + + +if __name__ == "__main__": + pytest.main([__file__, "-v"]) \ No newline at end of file diff --git a/projects/unit3/slack-notification/solution/uv.lock b/projects/unit3/slack-notification/solution/uv.lock new file mode 100644 index 0000000..4ba2a12 --- /dev/null +++ b/projects/unit3/slack-notification/solution/uv.lock @@ -0,0 +1,1070 @@ +version = 1 +requires-python = ">=3.10" + +[[package]] +name = "aiohappyeyeballs" +version = "2.6.1" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohappyeyeballs/2.6.1/aiohappyeyeballs-2.6.1.tar.gz", hash = "sha256:c3f9d0113123803ccadfdf3f0faa505bc78e6a72d1cc4806cbd719826e943558" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohappyeyeballs/2.6.1/aiohappyeyeballs-2.6.1-py3-none-any.whl", hash = "sha256:f349ba8f4b75cb25c99c5c2d84e997e485204d2902a9597802b0371f09331fb8" }, +] + +[[package]] +name = "aiohttp" +version = "3.11.18" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "aiohappyeyeballs" }, + { name = "aiosignal" }, + { name = "async-timeout", marker = "python_full_version < '3.11'" }, + { name = "attrs" }, + { name = "frozenlist" }, + { name = "multidict" }, + { name = "propcache" }, + { name = "yarl" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18.tar.gz", hash = "sha256:ae856e1138612b7e412db63b7708735cff4d38d0399f6a5435d3dac2669f558a" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:96264854fedbea933a9ca4b7e0c745728f01380691687b7365d18d9e977179c4" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:9602044ff047043430452bc3a2089743fa85da829e6fc9ee0025351d66c332b6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5691dc38750fcb96a33ceef89642f139aa315c8a193bbd42a0c33476fd4a1609" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:554c918ec43f8480b47a5ca758e10e793bd7410b83701676a4782672d670da55" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8a4076a2b3ba5b004b8cffca6afe18a3b2c5c9ef679b4d1e9859cf76295f8d4f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:767a97e6900edd11c762be96d82d13a1d7c4fc4b329f054e88b57cdc21fded94" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f0ddc9337a0fb0e727785ad4f41163cc314376e82b31846d3835673786420ef1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f414f37b244f2a97e79b98d48c5ff0789a0b4b4609b17d64fa81771ad780e415" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:fdb239f47328581e2ec7744ab5911f97afb10752332a6dd3d98e14e429e1a9e7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:f2c50bad73ed629cc326cc0f75aed8ecfb013f88c5af116f33df556ed47143eb" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:0a8d8f20c39d3fa84d1c28cdb97f3111387e48209e224408e75f29c6f8e0861d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = 
"sha256:106032eaf9e62fd6bc6578c8b9e6dc4f5ed9a5c1c7fb2231010a1b4304393421" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:b491e42183e8fcc9901d8dcd8ae644ff785590f1727f76ca86e731c61bfe6643" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:ad8c745ff9460a16b710e58e06a9dec11ebc0d8f4dd82091cefb579844d69868" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-win32.whl", hash = "sha256:8e57da93e24303a883146510a434f0faf2f1e7e659f3041abc4e3fb3f6702a9f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp310-cp310-win_amd64.whl", hash = "sha256:cc93a4121d87d9f12739fc8fab0a95f78444e571ed63e40bfc78cd5abe700ac9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:427fdc56ccb6901ff8088544bde47084845ea81591deb16f957897f0f0ba1be9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:2c828b6d23b984255b85b9b04a5b963a74278b7356a7de84fda5e3b76866597b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:5c2eaa145bb36b33af1ff2860820ba0589e165be4ab63a49aebfd0981c173b66" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3d518ce32179f7e2096bf4e3e8438cf445f05fedd597f252de9f54c728574756" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0700055a6e05c2f4711011a44364020d7a10fbbcd02fbf3e30e8f7e7fddc8717" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8bd1cde83e4684324e6ee19adfc25fd649d04078179890be7b29f76b501de8e4" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:73b8870fe1c9a201b8c0d12c94fe781b918664766728783241a79e0468427e4f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:25557982dd36b9e32c0a3357f30804e80790ec2c4d20ac6bcc598533e04c6361" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:7e889c9df381a2433802991288a61e5a19ceb4f61bd14f5c9fa165655dcb1fd1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:9ea345fda05bae217b6cce2acf3682ce3b13d0d16dd47d0de7080e5e21362421" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:9f26545b9940c4b46f0a9388fd04ee3ad7064c4017b5a334dd450f616396590e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:3a621d85e85dccabd700294494d7179ed1590b6d07a35709bb9bd608c7f5dd1d" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:9c23fd8d08eb9c2af3faeedc8c56e134acdaf36e2117ee059d7defa655130e5f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:d9e6b0e519067caa4fd7fb72e3e8002d16a68e84e62e7291092a5433763dc0dd" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-win32.whl", hash = "sha256:122f3e739f6607e5e4c6a2f8562a6f476192a682a52bda8b4c6d4254e1138f4d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp311-cp311-win_amd64.whl", hash = "sha256:e6f3c0a3a1e73e88af384b2e8a0b9f4fb73245afd47589df2afcab6b638fa0e6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:63d71eceb9cad35d47d71f78edac41fcd01ff10cacaa64e473d1aec13fa02df2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:d1929da615840969929e8878d7951b31afe0bac883d84418f92e5755d7b49508" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:7d0aebeb2392f19b184e3fdd9e651b0e39cd0f195cdb93328bd124a1d455cd0e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3849ead845e8444f7331c284132ab314b4dac43bfae1e3cf350906d4fff4620f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", 
hash = "sha256:5e8452ad6b2863709f8b3d615955aa0807bc093c34b8e25b3b52097fe421cb7f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3b8d2b42073611c860a37f718b3d61ae8b4c2b124b2e776e2c10619d920350ec" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:40fbf91f6a0ac317c0a07eb328a1384941872f6761f2e6f7208b63c4cc0a7ff6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:44ff5625413fec55216da5eaa011cf6b0a2ed67a565914a212a51aa3755b0009" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:7f33a92a2fde08e8c6b0c61815521324fc1612f397abf96eed86b8e31618fdb4" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:11d5391946605f445ddafda5eab11caf310f90cdda1fd99865564e3164f5cff9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:3cc314245deb311364884e44242e00c18b5896e4fe6d5f942e7ad7e4cb640adb" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:0f421843b0f70740772228b9e8093289924359d306530bcd3926f39acbe1adda" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-musllinux_1_2_s390x.whl", hash = 
"sha256:e220e7562467dc8d589e31c1acd13438d82c03d7f385c9cd41a3f6d1d15807c1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ab2ef72f8605046115bc9aa8e9d14fd49086d405855f40b79ed9e5c1f9f4faea" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-win32.whl", hash = "sha256:12a62691eb5aac58d65200c7ae94d73e8a65c331c3a86a2e9670927e94339ee8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp312-cp312-win_amd64.whl", hash = "sha256:364329f319c499128fd5cd2d1c31c44f234c58f9b96cc57f743d16ec4f3238c8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:474215ec618974054cf5dc465497ae9708543cbfc312c65212325d4212525811" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:6ced70adf03920d4e67c373fd692123e34d3ac81dfa1c27e45904a628567d804" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:2d9f6c0152f8d71361905aaf9ed979259537981f47ad099c8b3d81e0319814bd" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a35197013ed929c0aed5c9096de1fc5a9d336914d73ab3f9df14741668c0616c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:540b8a1f3a424f1af63e0af2d2853a759242a1769f9f1ab053996a392bd70118" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f9e6710ebebfce2ba21cee6d91e7452d1125100f41b906fb5af3da8c78b764c1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f8af2ef3b4b652ff109f98087242e2ab974b2b2b496304063585e3d78de0b000" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:28c3f975e5ae3dbcbe95b7e3dcd30e51da561a0a0f2cfbcdea30fc1308d72137" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:c28875e316c7b4c3e745172d882d8a5c835b11018e33432d281211af35794a93" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:13cd38515568ae230e1ef6919e2e33da5d0f46862943fcda74e7e915096815f3" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:0e2a92101efb9f4c2942252c69c63ddb26d20f46f540c239ccfa5af865197bb8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:e6d3e32b8753c8d45ac550b11a1090dd66d110d4ef805ffe60fa61495360b3b2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:ea4cf2488156e0f281f93cc2fd365025efcba3e2d217cbe3df2840f8c73db261" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9d4df95ad522c53f2b9ebc07f12ccd2cb15550941e11a5bbc5ddca2ca56316d7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-win32.whl", hash = "sha256:cdd1bbaf1e61f0d94aced116d6e95fe25942f7a5f42382195fd9501089db5d78" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiohttp/3.11.18/aiohttp-3.11.18-cp313-cp313-win_amd64.whl", hash = "sha256:bdd619c27e44382cf642223f11cfd4d795161362a5a1fc1fa3940397bc89db01" }, +] + +[[package]] +name = "aiosignal" +version = "1.3.2" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "frozenlist" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiosignal/1.3.2/aiosignal-1.3.2.tar.gz", hash = "sha256:a8c255c66fafb1e499c9351d0bf32ff2d8a0321595ebac3b93713656d2436f54" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/aiosignal/1.3.2/aiosignal-1.3.2-py2.py3-none-any.whl", hash = "sha256:45cde58e409a301715980c2b01d0c28bdde3770d8290b5eb2173759d9acb31a5" }, +] + +[[package]] +name = "annotated-types" +version = "0.7.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/annotated-types/0.7.0/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/annotated-types/0.7.0/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53" }, +] + +[[package]] +name = "anyio" +version = "4.9.0" +source = { registry = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "exceptiongroup", marker = "python_full_version < '3.11'" }, + { name = "idna" }, + { name = "sniffio" }, + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/anyio/4.9.0/anyio-4.9.0.tar.gz", hash = "sha256:673c0c244e15788651a4ff38710fea9675823028a6f08a5eda409e0c9840a028" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/anyio/4.9.0/anyio-4.9.0-py3-none-any.whl", hash = "sha256:9f76d541cad6e36af7beb62e978876f3b41e3e04f2c1fbf0884604c0a9c4d93c" }, +] + +[[package]] +name = "async-timeout" +version = "5.0.1" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/async-timeout/5.0.1/async_timeout-5.0.1.tar.gz", hash = "sha256:d9321a7a3d5a6a5e187e824d2fa0793ce379a202935782d555d6e9d2735677d3" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/async-timeout/5.0.1/async_timeout-5.0.1-py3-none-any.whl", hash = "sha256:39e3809566ff85354557ec2398b55e096c8364bacac9405a7a1fa429e77fe76c" }, +] + +[[package]] +name = "attrs" +version = "25.3.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/attrs/25.3.0/attrs-25.3.0.tar.gz", hash = "sha256:75d7cefc7fb576747b2c81b4442d4d4a1ce0900973527c011d1030fd3bf4af1b" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/attrs/25.3.0/attrs-25.3.0-py3-none-any.whl", hash = "sha256:427318ce031701fea540783410126f03899a97ffc6f61596ad581ac2e40e3bc3" }, +] + +[[package]] +name = "certifi" +version = "2025.4.26" +source = { registry = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/certifi/2025.4.26/certifi-2025.4.26.tar.gz", hash = "sha256:0a816057ea3cdefcef70270d2c515e4506bbc954f417fa5ade2021213bb8f0c6" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/certifi/2025.4.26/certifi-2025.4.26-py3-none-any.whl", hash = "sha256:30350364dfe371162649852c63336a15c70c6510c2ad5015b21c2345311805f3" }, +] + +[[package]] +name = "charset-normalizer" +version = "3.4.2" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2.tar.gz", hash = "sha256:5baececa9ecba31eff645232d59845c07aa030f0c81ee70184a90d35099a0e63" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:7c48ed483eb946e6c04ccbe02c6b4d1d48e51944b6db70f697e089c193404941" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b2d318c11350e10662026ad0eb71bb51c7812fc8590825304ae0bdd4ac283acd" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9cbfacf36cb0ec2897ce0ebc5d08ca44213af24265bd56eca54bee7923c48fd6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:18dd2e350387c87dabe711b86f83c9c78af772c748904d372ade190b5c7c9d4d" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8075c35cd58273fee266c58c0c9b670947c19df5fb98e7b66710e04ad4e9ff86" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5bf4545e3b962767e5c06fe1738f951f77d27967cb2caa64c28be7c4563e162c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:7a6ab32f7210554a96cd9e33abe3ddd86732beeafc7a28e9955cdf22ffadbab0" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:b33de11b92e9f75a2b545d6e9b6f37e398d86c3e9e9653c4864eb7e89c5773ef" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:8755483f3c00d6c9a77f490c17e6ab0c8729e39e6390328e42521ef175380ae6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:68a328e5f55ec37c57f19ebb1fdc56a248db2e3e9ad769919a58672958e8f366" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:21b2899062867b0e1fde9b724f8aecb1af14f2778d69aacd1a5a1853a597a5db" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp310-cp310-win32.whl", hash = 
"sha256:e8082b26888e2f8b36a042a58307d5b917ef2b1cacab921ad3323ef91901c71a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp310-cp310-win_amd64.whl", hash = "sha256:f69a27e45c43520f5487f27627059b64aaf160415589230992cec34c5e18a509" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:be1e352acbe3c78727a16a455126d9ff83ea2dfdcbc83148d2982305a04714c2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aa88ca0b1932e93f2d961bf3addbb2db902198dca337d88c89e1559e066e7645" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d524ba3f1581b35c03cb42beebab4a13e6cdad7b36246bd22541fa585a56cccd" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28a1005facc94196e1fb3e82a3d442a9d9110b8434fc1ded7a24a2983c9888d8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fdb20a30fe1175ecabed17cbf7812f7b804b8a315a25f24678bcdf120a90077f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0f5d9ed7f254402c9e7d35d2f5972c9bbea9040e99cd2861bd77dc68263277c7" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:efd387a49825780ff861998cd959767800d54f8308936b21025326de4b5a42b9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:f0aa37f3c979cf2546b73e8222bbfa3dc07a641585340179d768068e3455e544" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:e70e990b2137b29dc5564715de1e12701815dacc1d056308e2b17e9095372a82" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:0c8c57f84ccfc871a48a47321cfa49ae1df56cd1d965a09abe84066f6853b9c0" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:6b66f92b17849b85cad91259efc341dce9c1af48e2173bf38a85c6329f1033e5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp311-cp311-win32.whl", hash = "sha256:daac4765328a919a805fa5e2720f3e94767abd632ae410a9062dff5412bae65a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp311-cp311-win_amd64.whl", hash = "sha256:e53efc7c7cee4c1e70661e2e112ca46a575f90ed9ae3fef200f2a25e954f4b28" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:0c29de6a1a95f24b9a1aa7aefd27d2487263f00dfd55a77719b530788f75cff7" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cddf7bd982eaa998934a91f69d182aec997c6c468898efe6679af88283b498d3" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:fcbe676a55d7445b22c10967bceaaf0ee69407fbe0ece4d032b6eb8d4565982a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d41c4d287cfc69060fa91cae9683eacffad989f1a10811995fa309df656ec214" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4e594135de17ab3866138f496755f302b72157d115086d100c3f19370839dd3a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cf713fe9a71ef6fd5adf7a79670135081cd4431c2943864757f0fa3a65b1fafd" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:a370b3e078e418187da8c3674eddb9d983ec09445c99a3a263c2011993522981" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:a955b438e62efdf7e0b7b52a64dc5c3396e2634baa62471768a64bc2adb73d5c" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:7222ffd5e4de8e57e03ce2cef95a4c43c98fcb72ad86909abdfc2c17d227fc1b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:bee093bf902e1d8fc0ac143c88902c3dfc8941f7ea1d6a8dd2bcb786d33db03d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:dedb8adb91d11846ee08bec4c8236c8549ac721c245678282dcb06b221aab59f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp312-cp312-win32.whl", hash = "sha256:db4c7bf0e07fc3b7d89ac2a5880a6a8062056801b83ff56d8464b70f65482b6c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp312-cp312-win_amd64.whl", hash = "sha256:5a9979887252a82fefd3d3ed2a8e3b937a7a809f65dcb1e068b090e165bbe99e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:926ca93accd5d36ccdabd803392ddc3e03e6d4cd1cf17deff3b989ab8e9dbcf0" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:eba9904b0f38a143592d9fc0e19e2df0fa2e41c3c3745554761c5f6447eedabf" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3fddb7e2c84ac87ac3a947cb4e66d143ca5863ef48e4a5ecb83bd48619e4634e" }, 
+ { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:98f862da73774290f251b9df8d11161b6cf25b599a66baf087c1ffe340e9bfd1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c9379d65defcab82d07b2a9dfbfc2e95bc8fe0ebb1b176a3190230a3ef0e07c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e635b87f01ebc977342e2697d05b56632f5f879a4f15955dfe8cef2448b51691" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:1c95a1e2902a8b722868587c0e1184ad5c55631de5afc0eb96bc4b0d738092c0" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:ef8de666d6179b009dce7bcb2ad4c4a779f113f12caf8dc77f0162c29d20490b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:32fc0341d72e0f73f80acb0a2c94216bd704f4f0bce10aedea38f30502b271ff" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:289200a18fa698949d2b39c671c2cc7a24d44096784e76614899a7ccf2574b7b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash 
= "sha256:4a476b06fbcf359ad25d34a057b7219281286ae2477cc5ff5e3f70a246971148" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp313-cp313-win32.whl", hash = "sha256:aaeeb6a479c7667fbe1099af9617c83aaca22182d6cf8c53966491a0f1b7ffb7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp313-cp313-win_amd64.whl", hash = "sha256:aa6af9e7d59f9c12b33ae4e9450619cf2488e2bbe9b44030905877f0b2324980" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/charset-normalizer/3.4.2/charset_normalizer-3.4.2-py3-none-any.whl", hash = "sha256:7f56930ab0abd1c45cd15be65cc741c28b1c9a34876ce8c17a2fa107810c0af0" }, +] + +[[package]] +name = "click" +version = "8.1.8" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/click/8.1.8/click-8.1.8.tar.gz", hash = "sha256:ed53c9d8990d83c2a27deae68e4ee337473f6330c040a31d4225c9574d16096a" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/click/8.1.8/click-8.1.8-py3-none-any.whl", hash = "sha256:63c132bbbed01578a06712a2d1f497bb62d9c1c0d329b7903a866228027263b2" }, +] + +[[package]] +name = "colorama" +version = "0.4.6" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/colorama/0.4.6/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/colorama/0.4.6/colorama-0.4.6-py2.py3-none-any.whl", hash = 
"sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6" }, +] + +[[package]] +name = "exceptiongroup" +version = "1.3.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/exceptiongroup/1.3.0/exceptiongroup-1.3.0.tar.gz", hash = "sha256:b241f5885f560bc56a59ee63ca4c6a8bfa46ae4ad651af316d4e81817bb9fd88" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/exceptiongroup/1.3.0/exceptiongroup-1.3.0-py3-none-any.whl", hash = "sha256:4d111e6e0c13d0644cad6ddaa7ed0261a0b36971f6d23e7ec9b4b9097da78a10" }, +] + +[[package]] +name = "frozenlist" +version = "1.6.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0.tar.gz", hash = "sha256:b99655c32c1c8e06d111e7f41c06c29a5318cb1835df23a45518e02a47c63b68" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:e6e558ea1e47fd6fa8ac9ccdad403e5dd5ecc6ed8dda94343056fa4277d5c65e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:f4b3cd7334a4bbc0c472164f3744562cb72d05002cc6fcf58adb104630bbc352" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9799257237d0479736e2b4c01ff26b5c7f7694ac9692a426cb717f3dc02fff9b" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f3a7bb0fe1f7a70fb5c6f497dc32619db7d2cdd53164af30ade2f34673f8b1fc" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:36d2fc099229f1e4237f563b2a3e0ff7ccebc3999f729067ce4e64a97a7f2869" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f27a9f9a86dcf00708be82359db8de86b80d029814e6693259befe82bb58a106" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:75ecee69073312951244f11b8627e3700ec2bfe07ed24e3a685a5979f0412d24" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f2c7d5aa19714b1b01a0f515d078a629e445e667b9da869a3cd0e6fe7dec78bd" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:69bbd454f0fb23b51cadc9bdba616c9678e4114b6f9fa372d462ff2ed9323ec8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:7daa508e75613809c7a57136dec4871a21bca3080b3a8fc347c50b187df4f00c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-musllinux_1_2_armv7l.whl", hash = 
"sha256:89ffdb799154fd4d7b85c56d5fa9d9ad48946619e0eb95755723fffa11022d75" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:920b6bd77d209931e4c263223381d63f76828bec574440f29eb497cf3394c249" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:d3ceb265249fb401702fce3792e6b44c1166b9319737d21495d3611028d95769" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:52021b528f1571f98a7d4258c58aa8d4b1a96d4f01d00d51f1089f2e0323cb02" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:0f2ca7810b809ed0f1917293050163c7654cefc57a49f337d5cd9de717b8fad3" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-win32.whl", hash = "sha256:0e6f8653acb82e15e5443dba415fb62a8732b68fe09936bb6d388c725b57f812" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp310-cp310-win_amd64.whl", hash = "sha256:f1a39819a5a3e84304cd286e3dc62a549fe60985415851b3337b6f5cc91907f1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ae8337990e7a45683548ffb2fee1af2f1ed08169284cd829cdd9a7fa7470530d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:8c952f69dd524558694818a461855f35d36cc7f5c0adddce37e962c85d06eac0" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:8f5fef13136c4e2dee91bfb9a44e236fff78fc2cd9f838eddfc470c3d7d90afe" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:716bbba09611b4663ecbb7cd022f640759af8259e12a6ca939c0a6acd49eedba" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:7b8c4dc422c1a3ffc550b465090e53b0bf4839047f3e436a34172ac67c45d595" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b11534872256e1666116f6587a1592ef395a98b54476addb5e8d352925cb5d4a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1c6eceb88aaf7221f75be6ab498dc622a151f5f88d536661af3ffc486245a626" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:62c828a5b195570eb4b37369fcbbd58e96c905768d53a44d13044355647838ff" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e1c6bd2c6399920c9622362ce95a7d74e7f9af9bfec05fff91b8ce4b9647845a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = 
"sha256:49ba23817781e22fcbd45fd9ff2b9b8cdb7b16a42a4851ab8025cae7b22e96d0" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:431ef6937ae0f853143e2ca67d6da76c083e8b1fe3df0e96f3802fd37626e606" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:9d124b38b3c299ca68433597ee26b7819209cb8a3a9ea761dfe9db3a04bba584" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:118e97556306402e2b010da1ef21ea70cb6d6122e580da64c056b96f524fbd6a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:fb3b309f1d4086b5533cf7bbcf3f956f0ae6469664522f1bde4feed26fba60f1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:54dece0d21dce4fdb188a1ffc555926adf1d1c516e493c2914d7c370e454bc9e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-win32.whl", hash = "sha256:654e4ba1d0b2154ca2f096bed27461cf6160bc7f504a7f9a9ef447c293caf860" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp311-cp311-win_amd64.whl", hash = "sha256:3e911391bffdb806001002c1f860787542f45916c3baf764264a52765d5a5603" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:c5b9e42ace7d95bf41e19b87cec8f262c41d3510d8ad7514ab3862ea2197bfb1" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:ca9973735ce9f770d24d5484dcb42f68f135351c2fc81a7a9369e48cf2998a29" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6ac40ec76041c67b928ca8aaffba15c2b2ee3f5ae8d0cb0617b5e63ec119ca25" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:95b7a8a3180dfb280eb044fdec562f9b461614c0ef21669aea6f1d3dac6ee576" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:c444d824e22da6c9291886d80c7d00c444981a72686e2b59d38b285617cb52c8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bb52c8166499a8150bfd38478248572c924c003cbb45fe3bcd348e5ac7c000f9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b35298b2db9c2468106278537ee529719228950a5fdda686582f68f247d1dc6e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d108e2d070034f9d57210f22fefd22ea0d04609fc97c5f7f5a686b3471028590" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:4e1be9111cb6756868ac242b3c2bd1f09d9aea09846e4f5c23715e7afb647103" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:94bb451c664415f02f07eef4ece976a2c65dcbab9c2f1705b7031a3a75349d8c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:d1a686d0b0949182b8faddea596f3fc11f44768d1f74d4cad70213b2e139d821" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:ea8e59105d802c5a38bdbe7362822c522230b3faba2aa35c0fa1765239b7dd70" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:abc4e880a9b920bc5020bf6a431a6bb40589d9bca3975c980495f63632e8382f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:9a79713adfe28830f27a3c62f6b5406c37376c892b05ae070906f07ae4487046" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:9a0318c2068e217a8f5e3b85e35899f5a19e97141a45bb925bb357cfe1daf770" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-win32.whl", hash = "sha256:853ac025092a24bb3bf09ae87f9127de9fe6e0c345614ac92536577cf956dfcc" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp312-cp312-win_amd64.whl", hash = "sha256:2bdfe2d7e6c9281c6e55523acd6c2bf77963cb422fdc7d142fb0cb6621b66878" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:1d7fb014fe0fbfee3efd6a94fc635aeaa68e5e1720fe9e57357f2e2c6e1a647e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:01bcaa305a0fdad12745502bfd16a1c75b14558dabae226852f9159364573117" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:8b314faa3051a6d45da196a2c495e922f987dc848e967d8cfeaee8a0328b1cd4" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:da62fecac21a3ee10463d153549d8db87549a5e77eefb8c91ac84bb42bb1e4e3" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:d1eb89bf3454e2132e046f9599fbcf0a4483ed43b40f545551a39316d0201cd1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d18689b40cb3936acd971f663ccb8e2589c45db5e2c5f07e0ec6207664029a9c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e67ddb0749ed066b1a03fba812e2dcae791dd50e5da03be50b6a14d0c1a9ee45" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = 
"sha256:fc5e64626e6682638d6e44398c9baf1d6ce6bc236d40b4b57255c9d3f9761f1f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:437cfd39564744ae32ad5929e55b18ebd88817f9180e4cc05e7d53b75f79ce85" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:62dd7df78e74d924952e2feb7357d826af8d2f307557a779d14ddf94d7311be8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:a66781d7e4cddcbbcfd64de3d41a61d6bdde370fc2e38623f30b2bd539e84a9f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:482fe06e9a3fffbcd41950f9d890034b4a54395c60b5e61fae875d37a699813f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:e4f9373c500dfc02feea39f7a56e4f543e670212102cc2eeb51d3a99c7ffbde6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:e69bb81de06827147b7bfbaeb284d85219fa92d9f097e32cc73675f279d70188" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:7613d9977d2ab4a9141dde4a149f4357e4065949674c5649f920fec86ecb393e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-win32.whl", hash = "sha256:4def87ef6d90429f777c9d9de3961679abf938cb6b7b63d4a7eb8a268babfce4" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313-win_amd64.whl", hash = "sha256:37a8a52c3dfff01515e9bbbee0e6063181362f9de3db2ccf9bc96189b557cbfd" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:46138f5a0773d064ff663d273b309b696293d7a7c00a0994c5c13a5078134b64" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:f88bc0a2b9c2a835cb888b32246c27cdab5740059fb3688852bf91e915399b91" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:777704c1d7655b802c7850255639672e90e81ad6fa42b99ce5ed3fbf45e338dd" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:85ef8d41764c7de0dcdaf64f733a27352248493a85a80661f3c678acd27e31f2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:da5cb36623f2b846fb25009d9d9215322318ff1c63403075f812b3b2876c8506" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cbb56587a16cf0fb8acd19e90ff9924979ac1431baea8681712716a8337577b0" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c6154c3ba59cda3f954c6333025369e42c3acd0c6e8b6ce31eb5c5b8116c07e0" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2e8246877afa3f1ae5c979fe85f567d220f86a50dc6c493b9b7d8191181ae01e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7b0f6cce16306d2e117cf9db71ab3a9e8878a28176aeaf0dbe35248d97b28d0c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:1b8e8cd8032ba266f91136d7105706ad57770f3522eac4a111d77ac126a25a9b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:e2ada1d8515d3ea5378c018a5f6d14b4994d4036591a52ceaf1a1549dec8e1ad" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:cdb2c7f071e4026c19a3e32b93a09e59b12000751fc9b0b7758da899e657d215" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:03572933a1969a6d6ab509d509e5af82ef80d4a5d4e1e9f2e1cdd22c77a3f4d2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:77effc978947548b676c54bbd6a08992759ea6f410d4987d69feea9cd0919911" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:a2bda8be77660ad4089caf2223fdbd6db1858462c4b85b67fbfa22102021e497" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-win32.whl", hash = "sha256:a4d96dc5bcdbd834ec6b0f91027817214216b5b30316494d2b1aebffb87c534f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-cp313-cp313t-win_amd64.whl", hash = "sha256:e18036cb4caa17ea151fd5f3d70be9d354c99eb8cf817a3ccde8a7873b074348" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/frozenlist/1.6.0/frozenlist-1.6.0-py3-none-any.whl", hash = "sha256:535eec9987adb04701266b92745d6cdcef2e77669299359c3009c3404dd5d191" }, +] + +[[package]] +name = "h11" +version = "0.16.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/h11/0.16.0/h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/h11/0.16.0/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86" }, +] + +[[package]] +name = "httpcore" +version = "1.0.9" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "certifi" }, + { name = "h11" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpcore/1.0.9/httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpcore/1.0.9/httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55" }, +] + +[[package]] +name = "httpx" +version = "0.28.1" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } 
+dependencies = [ + { name = "anyio" }, + { name = "certifi" }, + { name = "httpcore" }, + { name = "idna" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpx/0.28.1/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpx/0.28.1/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad" }, +] + +[[package]] +name = "httpx-sse" +version = "0.4.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpx-sse/0.4.0/httpx-sse-0.4.0.tar.gz", hash = "sha256:1e81a3a3070ce322add1d3529ed42eb5f70817f45ed6ec915ab753f961139721" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpx-sse/0.4.0/httpx_sse-0.4.0-py3-none-any.whl", hash = "sha256:f329af6eae57eaa2bdfd962b42524764af68075ea87370a2de920af5341e318f" }, +] + +[[package]] +name = "idna" +version = "3.10" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/idna/3.10/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/idna/3.10/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3" }, +] + +[[package]] +name = "iniconfig" +version = "2.1.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/iniconfig/2.1.0/iniconfig-2.1.0.tar.gz", hash = 
"sha256:3abbd2e30b36733fee78f9c7f7308f2d0050e88f0087fd25c2645f63c773e1c7" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/iniconfig/2.1.0/iniconfig-2.1.0-py3-none-any.whl", hash = "sha256:9deba5723312380e77435581c6bf4935c94cbfab9b1ed33ef8d238ea168eb760" }, +] + +[[package]] +name = "markdown-it-py" +version = "3.0.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "mdurl" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/markdown-it-py/3.0.0/markdown-it-py-3.0.0.tar.gz", hash = "sha256:e3f60a94fa066dc52ec76661e37c851cb232d92f9886b15cb560aaada2df8feb" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/markdown-it-py/3.0.0/markdown_it_py-3.0.0-py3-none-any.whl", hash = "sha256:355216845c60bd96232cd8d8c40e8f9765cc86f46880e43a8fd22dc1a1a8cab1" }, +] + +[[package]] +name = "mcp" +version = "1.9.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "anyio" }, + { name = "httpx" }, + { name = "httpx-sse" }, + { name = "pydantic" }, + { name = "pydantic-settings" }, + { name = "python-multipart" }, + { name = "sse-starlette" }, + { name = "starlette" }, + { name = "uvicorn", marker = "sys_platform != 'emscripten'" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/mcp/1.9.0/mcp-1.9.0.tar.gz", hash = "sha256:905d8d208baf7e3e71d70c82803b89112e321581bcd2530f9de0fe4103d28749" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/mcp/1.9.0/mcp-1.9.0-py3-none-any.whl", hash = "sha256:9dfb89c8c56f742da10a5910a1f64b0d2ac2c3ed2bd572ddb1cfab7f35957178" }, +] + +[package.optional-dependencies] +cli = [ + { name = "python-dotenv" }, + { name = "typer" }, +] + +[[package]] +name = "mdurl" +version = "0.1.2" +source = { registry = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/mdurl/0.1.2/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/mdurl/0.1.2/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8" }, +] + +[[package]] +name = "multidict" +version = "6.4.4" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "typing-extensions", marker = "python_full_version < '3.11'" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4.tar.gz", hash = "sha256:69ee9e6ba214b5245031b76233dd95408a0fd57fdb019ddcc1ead4790932a8e8" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:4f5f29794ac0e73d2a06ac03fd18870adc0135a9d384f4a306a951188ed02f95" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:c04157266344158ebd57b7120d9b0b35812285d26d0e78193e17ef57bfe2979a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:bb61ffd3ab8310d93427e460f565322c44ef12769f51f77277b4abad7b6f7223" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5e0ba18a9afd495f17c351d08ebbc4284e9c9f7971d715f196b79636a4d0de44" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:9faf1b1dcaadf9f900d23a0e6d6c8eadd6a95795a0e57fcca73acce0eb912065" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a4d1cb1327c6082c4fce4e2a438483390964c02213bc6b8d782cf782c9b1471f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:941f1bec2f5dbd51feeb40aea654c2747f811ab01bdd3422a48a4e4576b7d76a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e5f8a146184da7ea12910a4cec51ef85e44f6268467fb489c3caf0cd512f29c2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:232b7237e57ec3c09be97206bfb83a0aa1c5d7d377faa019c68a210fa35831f1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:dc388f75a1c00000824bf28b7633e40854f4127ede80512b44c3cfeeea1839a2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:98af87593a666f739d9dba5d0ae86e01b0e1a9cfcd2e30d2d361fbbbd1a9162d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:aff4cafea2d120327d55eadd6b7f1136a8e5a0ecf6fb3b6863e8aca32cd8e50a" }, + { url 
= "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:169c4ba7858176b797fe551d6e99040c531c775d2d57b31bcf4de6d7a669847f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:b9eb4c59c54421a32b3273d4239865cb14ead53a606db066d7130ac80cc8ec93" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7cf3bd54c56aa16fdb40028d545eaa8d051402b61533c21e84046e05513d5780" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f682c42003c7264134bfe886376299db4cc0c6cd06a3295b41b347044bcb5482" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a920f9cf2abdf6e493c519492d892c362007f113c94da4c239ae88429835bad1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:530d86827a2df6504526106b4c104ba19044594f8722d3e87714e847c74a0275" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/multidict/6.4.4/multidict-6.4.4-py3-none-any.whl", hash = "sha256:bd4557071b561a8b3b6075c3ce93cf9bfb6182cb241805c3d66ced3b75eff4ac" }, +] + +[[package]] +name = "packaging" +version = "25.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/packaging/25.0/packaging-25.0.tar.gz", hash = "sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/packaging/25.0/packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484" }, +] + +[[package]] +name = "pluggy" +version = "1.6.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pluggy/1.6.0/pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pluggy/1.6.0/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746" }, +] + +[[package]] +name = "pr-agent-slack" +version = "3.0.0" +source = { editable = "." 
} +dependencies = [ + { name = "aiohttp" }, + { name = "mcp", extra = ["cli"] }, + { name = "requests" }, +] + +[package.optional-dependencies] +dev = [ + { name = "pytest" }, + { name = "pytest-asyncio" }, +] + +[package.dev-dependencies] +dev = [ + { name = "pytest" }, + { name = "pytest-asyncio" }, +] + +[package.metadata] +requires-dist = [ + { name = "aiohttp", specifier = ">=3.10.0,<4.0.0" }, + { name = "mcp", extras = ["cli"], specifier = ">=1.0.0" }, + { name = "pytest", marker = "extra == 'dev'", specifier = ">=8.3.0" }, + { name = "pytest-asyncio", marker = "extra == 'dev'", specifier = ">=0.21.0" }, + { name = "requests", specifier = ">=2.32.0,<3.0.0" }, +] + +[package.metadata.requires-dev] +dev = [ + { name = "pytest", specifier = ">=8.3.0" }, + { name = "pytest-asyncio", specifier = ">=0.21.0" }, +] + +[[package]] +name = "propcache" +version = "0.3.1" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1.tar.gz", hash = "sha256:40d980c33765359098837527e18eddefc9a24cea5b45e078a7f3bb5b032c6ecf" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:f27785888d2fdd918bc36de8b8739f2d6c791399552333721b58193f68ea3e98" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d4e89cde74154c7b5957f87a355bb9c8ec929c167b59c83d90654ea36aeb6180" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:730178f476ef03d3d4d255f0c9fa186cb1d13fd33ffe89d39f2cda4da90ceb71" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:967a8eec513dbe08330f10137eacb427b2ca52118769e82ebcfcab0fba92a649" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5b9145c35cc87313b5fd480144f8078716007656093d23059e8993d3a8fa730f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9e64e948ab41411958670f1093c0a57acfdc3bee5cf5b935671bbd5313bcf229" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:319fa8765bfd6a265e5fa661547556da381e53274bc05094fc9ea50da51bfd46" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c66d8ccbc902ad548312b96ed8d5d266d0d2c6d006fd0f66323e9d8f2dd49be7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:2d219b0dbabe75e15e581fc1ae796109b07c8ba7d25b9ae8d650da582bed01b0" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:cd6a55f65241c551eb53f8cf4d2f4af33512c39da5d9777694e9d9c60872f519" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:9979643ffc69b799d50d3a7b72b5164a2e97e117009d7af6dfdd2ab906cb72cd" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:4cf9e93a81979f1424f1a3d155213dc928f1069d697e4353edb8a5eba67c6259" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:2fce1df66915909ff6c824bbb5eb403d2d15f98f1518e583074671a30fe0c21e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:4d0dfdd9a2ebc77b869a0b04423591ea8823f791293b527dc1bb896c1d6f1136" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-win32.whl", hash = "sha256:1f6cc0ad7b4560e5637eb2c994e97b4fa41ba8226069c9277eb5ea7101845b42" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp310-cp310-win_amd64.whl", hash = "sha256:47ef24aa6511e388e9894ec16f0fbf3313a53ee68402bc428744a367ec55b833" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:7f30241577d2fef2602113b70ef7231bf4c69a97e04693bde08ddab913ba0ce5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:43593c6772aa12abc3af7784bff4a41ffa921608dd38b77cf1dfd7f5c4e71371" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a75801768bbe65499495660b777e018cbe90c7980f07f8aa57d6be79ea6f71da" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:f6f1324db48f001c2ca26a25fa25af60711e09b9aaf4b28488602776f4f9a744" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5cdb0f3e1eb6dfc9965d19734d8f9c481b294b5274337a8cb5cb01b462dcb7e0" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1eb34d90aac9bfbced9a58b266f8946cb5935869ff01b164573a7634d39fbcb5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f35c7070eeec2cdaac6fd3fe245226ed2a6292d3ee8c938e5bb645b434c5f256" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b23c11c2c9e6d4e7300c92e022046ad09b91fd00e36e83c44483df4afa990073" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:3e19ea4ea0bf46179f8a3652ac1426e6dcbaf577ce4b4f65be581e237340420d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:bd39c92e4c8f6cbf5f08257d6360123af72af9f4da75a690bef50da77362d25f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:b0313e8b923b3814d1c4a524c93dfecea5f39fa95601f6a9b1ac96cd66f89ea0" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = 
"sha256:e861ad82892408487be144906a368ddbe2dc6297074ade2d892341b35c59844a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:61014615c1274df8da5991a1e5da85a3ccb00c2d4701ac6f3383afd3ca47ab0a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:71ebe3fe42656a2328ab08933d420df5f3ab121772eef78f2dc63624157f0ed9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-win32.whl", hash = "sha256:58aa11f4ca8b60113d4b8e32d37e7e78bd8af4d1a5b5cb4979ed856a45e62005" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp311-cp311-win_amd64.whl", hash = "sha256:9532ea0b26a401264b1365146c440a6d78269ed41f83f23818d4b79497aeabe7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:f78eb8422acc93d7b69964012ad7048764bb45a54ba7a39bb9e146c72ea29723" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:89498dd49c2f9a026ee057965cdf8192e5ae070ce7d7a7bd4b66a8e257d0c976" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:09400e98545c998d57d10035ff623266927cb784d13dd2b31fd33b8a5316b85b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aa8efd8c5adc5a2c9d3b952815ff8f7710cefdcaf5f2c36d26aff51aeca2f12f" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c2fe5c910f6007e716a06d269608d307b4f36e7babee5f36533722660e8c4a70" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a0ab8cf8cdd2194f8ff979a43ab43049b1df0b37aa64ab7eca04ac14429baeb7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:563f9d8c03ad645597b8d010ef4e9eab359faeb11a0a2ac9f7b4bc8c28ebef25" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fb6e0faf8cb6b4beea5d6ed7b5a578254c6d7df54c36ccd3d8b3eb00d6770277" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:1c5c7ab7f2bb3f573d1cb921993006ba2d39e8621019dffb1c5bc94cdbae81e8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:050b571b2e96ec942898f8eb46ea4bfbb19bd5502424747e83badc2d4a99a44e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:e1c4d24b804b3a87e9350f79e2371a705a188d292fd310e663483af6ee6718ee" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:e4fe2a6d5ce975c117a6bb1e8ccda772d1e7029c1cca1acd209f91d30fa72815" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:feccd282de1f6322f56f6845bf1207a537227812f0a9bf5571df52bb418d79d5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ec314cde7314d2dd0510c6787326bbffcbdc317ecee6b7401ce218b3099075a7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-win32.whl", hash = "sha256:7d2d5a0028d920738372630870e7d9644ce437142197f8c827194fca404bf03b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp312-cp312-win_amd64.whl", hash = "sha256:88c423efef9d7a59dae0614eaed718449c09a5ac79a5f224a8b9664d603f04a3" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:f1528ec4374617a7a753f90f20e2f551121bb558fcb35926f99e3c42367164b8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:dc1915ec523b3b494933b5424980831b636fe483d7d543f7afb7b3bf00f0c10f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a110205022d077da24e60b3df8bcee73971be9575dec5573dd17ae5d81751111" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d249609e547c04d190e820d0d4c8ca03ed4582bcf8e4e160a6969ddfb57b62e5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", 
hash = "sha256:5ced33d827625d0a589e831126ccb4f5c29dfdf6766cac441d23995a65825dcb" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4114c4ada8f3181af20808bedb250da6bae56660e4b8dfd9cd95d4549c0962f7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:975af16f406ce48f1333ec5e912fe11064605d5c5b3f6746969077cc3adeb120" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a34aa3a1abc50740be6ac0ab9d594e274f59960d3ad253cd318af76b996dd654" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:9cec3239c85ed15bfaded997773fdad9fb5662b0a7cbc854a43f291eb183179e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:05543250deac8e61084234d5fc54f8ebd254e8f2b39a16b1dce48904f45b744b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:5cb5918253912e088edbf023788de539219718d3b10aef334476b62d2b53de53" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:f3bbecd2f34d0e6d3c543fdb3b15d6b60dd69970c2b4c822379e5ec8f6f621d5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = 
"sha256:aca63103895c7d960a5b9b044a83f544b233c95e0dcff114389d64d762017af7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:5a0a9898fdb99bf11786265468571e628ba60af80dc3f6eb89a3545540c6b0ef" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-win32.whl", hash = "sha256:3a02a28095b5e63128bcae98eb59025924f121f048a62393db682f049bf4ac24" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313-win_amd64.whl", hash = "sha256:813fbb8b6aea2fc9659815e585e548fe706d6f663fa73dff59a1677d4595a037" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:a444192f20f5ce8a5e52761a031b90f5ea6288b1eef42ad4c7e64fef33540b8f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:0fbe94666e62ebe36cd652f5fc012abfbc2342de99b523f8267a678e4dfdee3c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:f011f104db880f4e2166bcdcf7f58250f7a465bc6b068dc84c824a3d4a5c94dc" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3e584b6d388aeb0001d6d5c2bd86b26304adde6d9bb9bfa9c4889805021b96de" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8a17583515a04358b034e241f952f1715243482fc2c2945fd99a1b03a0bd77d6" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5aed8d8308215089c0734a2af4f2e95eeb360660184ad3912686c181e500b2e7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6d8e309ff9a0503ef70dc9a0ebd3e69cf7b3894c9ae2ae81fc10943c37762458" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b655032b202028a582d27aeedc2e813299f82cb232f969f87a4fde491a233f11" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:9f64d91b751df77931336b5ff7bafbe8845c5770b06630e27acd5dbb71e1931c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:19a06db789a4bd896ee91ebc50d059e23b3639c25d58eb35be3ca1cbe967c3bf" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:bef100c88d8692864651b5f98e871fb090bd65c8a41a1cb0ff2322db39c96c27" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:87380fb1f3089d2a0b8b00f006ed12bd41bd858fabfa7330c954c70f50ed8757" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:e474fc718e73ba5ec5180358aa07f6aded0ff5f2abe700e3115c37d75c947e18" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:17d1c688a443355234f3c031349da69444be052613483f3e4158eef751abcd8a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-win32.whl", hash = "sha256:359e81a949a7619802eb601d66d37072b79b79c2505e6d3fd8b945538411400d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-cp313-cp313t-win_amd64.whl", hash = "sha256:e7fb9a84c9abbf2b2683fa3e7b0d7da4d8ecf139a1c635732a8bda29c5214b0e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/propcache/0.3.1/propcache-0.3.1-py3-none-any.whl", hash = "sha256:9a8ecf38de50a7f518c21568c80f985e776397b902f1ce0b01f799aba1608b40" }, +] + +[[package]] +name = "pydantic" +version = "2.11.4" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "annotated-types" }, + { name = "pydantic-core" }, + { name = "typing-extensions" }, + { name = "typing-inspection" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic/2.11.4/pydantic-2.11.4.tar.gz", hash = "sha256:32738d19d63a226a52eed76645a98ee07c1f410ee41d93b4afbfa85ed8111c2d" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic/2.11.4/pydantic-2.11.4-py3-none-any.whl", hash = "sha256:d9615eaa9ac5a063471da949c8fc16376a84afb5024688b3ff885693506764eb" }, +] + +[[package]] +name = "pydantic-core" +version = "2.33.2" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "typing-extensions" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2.tar.gz", hash = 
"sha256:7cb8bc3605c29176e1b105350d2e6474142d7c1bd1d9327c4a9bdb46bf827acc" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2b3d326aaef0c0399d9afffeb6367d5e26ddc24d351dbc9c636840ac355dc5d8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0e5b2671f05ba48b94cb90ce55d8bdcaaedb8ba00cc5359f6810fc918713983d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0069c9acc3f3981b9ff4cdfaf088e98d83440a4c7ea1bc07460af3d4dc22e72d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d53b22f2032c42eaaf025f7c40c2e3b94568ae077a606f006d206a463bc69572" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0405262705a123b7ce9f0b92f123334d67b70fd1f20a9372b907ce1080c7ba02" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4b25d91e288e2c4e0662b8038a28c6a07eaac3e196cfc4ff69de4ea3db992a1b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6bdfe4b3789761f3bcb4b1ddf33355a71079858958e3a552f16d5af19768fef2" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:efec8db3266b76ef9607c2c4c419bdb06bf335ae433b80816089ea7585816f6a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:031c57d67ca86902726e0fae2214ce6770bbe2f710dc33063187a68744a5ecac" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:f8de619080e944347f5f20de29a975c2d815d9ddd8be9b9b7268e2e3ef68605a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:73662edf539e72a9440129f231ed3757faab89630d291b784ca99237fb94db2b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-win32.whl", hash = "sha256:0a39979dcbb70998b0e505fb1556a1d550a0781463ce84ebf915ba293ccb7e22" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-win_amd64.whl", hash = "sha256:b0379a2b24882fef529ec3b4987cb5d003b9cda32256024e6fe1586ac45fc640" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:4c5b0a576fb381edd6d27f0a85915c6daf2f8138dc5c267a57c08a62900758c7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e799c050df38a639db758c617ec771fd8fb7a5f8eaaa4b27b101f266b216a246" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dc46a01bf8d62f227d5ecee74178ffc448ff4e5197c756331f71efcc66dc980f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a144d4f717285c6d9234a66778059f33a89096dfb9b39117663fd8413d582dcc" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:73cf6373c21bc80b2e0dc88444f41ae60b2f070ed02095754eb5a01df12256de" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3dc625f4aa79713512d1976fe9f0bc99f706a9dee21dfd1810b4bbbf228d0e8a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:881b21b5549499972441da4758d662aeea93f1923f953e9cbaff14b8b9565aef" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bdc25f3681f7b78572699569514036afe3c243bc3059d3942624e936ec93450e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:fe5b32187cbc0c862ee201ad66c30cf218e5ed468ec8dc1cf49dec66e160cc4d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_armv7l.whl", hash = 
"sha256:bc7aee6f634a6f4a95676fcb5d6559a2c2a390330098dba5e5a5f28a2e4ada30" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:235f45e5dbcccf6bd99f9f472858849f73d11120d76ea8707115415f8e5ebebf" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-win32.whl", hash = "sha256:6368900c2d3ef09b69cb0b913f9f8263b03786e5b2a387706c5afb66800efd51" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-win_amd64.whl", hash = "sha256:1e063337ef9e9820c77acc768546325ebe04ee38b08703244c1309cccc4f1bab" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-win_arm64.whl", hash = "sha256:6b99022f1d19bc32a4c2a0d544fc9a76e3be90f0b3f4af413f87d38749300e65" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:a7ec89dc587667f22b6a0b6579c249fca9026ce7c333fc142ba42411fa243cdc" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3c6db6e52c6d70aa0d00d45cdb9b40f0433b96380071ea80b09277dba021ddf7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e61206137cbc65e6d5256e1166f88331d3b6238e082d9f74613b9b765fb9025" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:eb8c529b2819c37140eb51b914153063d27ed88e3bdc31b71198a198e921e011" 
}, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c52b02ad8b4e2cf14ca7b3d918f3eb0ee91e63b3167c32591e57c4317e134f8f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:96081f1605125ba0855dfda83f6f3df5ec90c61195421ba72223de35ccfb2f88" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f57a69461af2a5fa6e6bbd7a5f60d3b7e6cebb687f55106933188e79ad155c1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:572c7e6c8bb4774d2ac88929e3d1f12bc45714ae5ee6d9a788a9fb35e60bb04b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:db4b41f9bd95fbe5acd76d89920336ba96f03e149097365afe1cb092fceb89a1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:fa854f5cf7e33842a892e5c73f45327760bc7bc516339fda888c75ae60edaeb6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:5f483cfb75ff703095c59e365360cb73e00185e01aaea067cd19acffd2ab20ea" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-win32.whl", hash = "sha256:9cb1da0f5a471435a7bc7e439b8a728e8b61e59784b2af70d7c169f8dd8ae290" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-win_amd64.whl", hash = "sha256:f941635f2a3d96b2973e867144fde513665c87f13fe0e193c158ac51bfaaa7b2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-win_arm64.whl", hash = "sha256:cca3868ddfaccfbc4bfb1d608e2ccaaebe0ae628e1416aeb9c4d88c001bb45ab" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:1082dd3e2d7109ad8b7da48e1d4710c8d06c253cbc4a27c1cff4fbcaa97a9e3f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f517ca031dfc037a9c07e748cefd8d96235088b83b4f4ba8939105d20fa1dcd6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0a9f2c9dd19656823cb8250b0724ee9c60a82f3cdf68a080979d13092a3b0fef" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2b0a451c263b01acebe51895bfb0e1cc842a5c666efe06cdf13846c7418caa9a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ea40a64d23faa25e62a70ad163571c0b342b8bf66d5fa612ac0dec4f069d916" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0fb2d542b4d66f9470e8065c5469ec676978d625a8b7a363f07d9a501a9cb36a" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdac5d6ffa1b5a83bca06ffe7583f5576555e6c8b3a91fbd25ea7780f825f7d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:04a1a413977ab517154eebb2d326da71638271477d6ad87a769102f7c2488c56" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:c8e7af2f4e0194c22b5b37205bfb293d166a7344a5b0d0eaccebc376546d77d5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:5c92edd15cd58b3c2d34873597a1e20f13094f59cf88068adb18947df5455b4e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:65132b7b4a1c0beded5e057324b7e16e10910c106d43675d9bd87d4f38dde162" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-win32.whl", hash = "sha256:52fb90784e0a242bb96ec53f42196a17278855b0f31ac7c3cc6f5c1ec4811849" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-win_amd64.whl", hash = "sha256:c083a3bdd5a93dfe480f1125926afcdbf2917ae714bdb80b36d34318b2bec5d9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-win_arm64.whl", hash = "sha256:e80b087132752f6b3d714f041ccf74403799d3b23a72722ea2e6ba2e892555b9" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:61c18fba8e5e9db3ab908620af374db0ac1baa69f0f32df4f61ae23f15e586ac" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95237e53bb015f67b63c91af7518a62a8660376a6a0db19b89acc77a4d6199f5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313t-win_amd64.whl", hash = "sha256:c2fc0a768ef76c15ab9238afa6da7f69895bb5d1ee83aeea2e3509af4472d0b9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5c4aa4e82353f65e548c476b37e64189783aa5384903bfea4f41580f255fddfa" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d946c8bf0d5c24bf4fe333af284c59a19358aa3ec18cb3dc4370080da1e8ad29" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:87b31b6846e361ef83fedb187bb5b4372d0da3f7e28d85415efa92d6125d6e6d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aa9d91b338f2df0508606f7009fde642391425189bba6d8c653afd80fd6bb64e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = 
"sha256:2058a32994f1fde4ca0480ab9d1e75a0e8c87c22b53a3ae66554f9af78f2fe8c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:0e03262ab796d986f978f79c943fc5f620381be7287148b8010b4097f79a39ec" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:1a8695a8d00c73e50bff9dfda4d540b7dee29ff9b8053e38380426a85ef10052" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:fa754d1850735a0b0e03bcffd9d4b4343eb417e47196e4485d9cca326073a42c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:a11c8d26a50bfab49002947d3d237abe4d9e4b5bdc8846a63537b6488e197808" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:dd14041875d09cc0f9308e37a6f8b65f5585cf2598a53aa0123df8b129d481f8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:d87c561733f66531dced0da6e864f44ebf89a8fba55f31407b00c2f7f9449593" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2f82865531efd18d6e07a04a17331af02cb7a651583c418df8266f17a63c6612" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bfb5112df54209d820d7bf9317c7a6c9025ea52e49f46b6a2060104bba37de7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:64632ff9d614e5eecfb495796ad51b0ed98c453e447a76bcbeeb69615079fc7e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:f889f7a40498cc077332c7ab6b4608d296d852182211787d4f3ee377aaae66e8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:de4b83bb311557e439b9e186f733f6c645b9417c84e2eb8203f3f820a4b988bf" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:82f68293f055f51b51ea42fafc74b6aad03e70e191799430b90c13d643059ebb" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:329467cecfb529c925cf2bbd4d60d2c509bc2fb52a20c1045bf09bb70971a9c1" }, +] + +[[package]] +name = "pydantic-settings" +version = "2.9.1" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "pydantic" }, + { name = "python-dotenv" }, + { name = "typing-inspection" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-settings/2.9.1/pydantic_settings-2.9.1.tar.gz", hash = "sha256:c509bf79d27563add44e8446233359004ed85066cd096d8b510f715e6ef5d268" } 
+wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-settings/2.9.1/pydantic_settings-2.9.1-py3-none-any.whl", hash = "sha256:59b4f431b1defb26fe620c71a7d3968a710d719f5f4cdbbdb7926edeb770f6ef" }, +] + +[[package]] +name = "pygments" +version = "2.19.1" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pygments/2.19.1/pygments-2.19.1.tar.gz", hash = "sha256:61c16d2a8576dc0649d9f39e089b5f02bcd27fba10d8fb4dcc28173f7a45151f" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pygments/2.19.1/pygments-2.19.1-py3-none-any.whl", hash = "sha256:9ea1544ad55cecf4b8242fab6dd35a93bbce657034b0611ee383099054ab6d8c" }, +] + +[[package]] +name = "pytest" +version = "8.3.5" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, + { name = "exceptiongroup", marker = "python_full_version < '3.11'" }, + { name = "iniconfig" }, + { name = "packaging" }, + { name = "pluggy" }, + { name = "tomli", marker = "python_full_version < '3.11'" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pytest/8.3.5/pytest-8.3.5.tar.gz", hash = "sha256:f4efe70cc14e511565ac476b57c279e12a855b11f48f212af1080ef2263d3845" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pytest/8.3.5/pytest-8.3.5-py3-none-any.whl", hash = "sha256:c69214aa47deac29fad6c2a4f590b9c4a9fdb16a403176fe154b79c0b4d4d820" }, +] + +[[package]] +name = "pytest-asyncio" +version = "0.26.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "pytest" }, +] +sdist = { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pytest-asyncio/0.26.0/pytest_asyncio-0.26.0.tar.gz", hash = "sha256:c4df2a697648241ff39e7f0e4a73050b03f123f760673956cf0d72a4990e312f" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pytest-asyncio/0.26.0/pytest_asyncio-0.26.0-py3-none-any.whl", hash = "sha256:7b51ed894f4fbea1340262bdae5135797ebbe21d8638978e35d31c6d19f72fb0" }, +] + +[[package]] +name = "python-dotenv" +version = "1.1.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/python-dotenv/1.1.0/python_dotenv-1.1.0.tar.gz", hash = "sha256:41f90bc6f5f177fb41f53e87666db362025010eb28f60a01c9143bfa33a2b2d5" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/python-dotenv/1.1.0/python_dotenv-1.1.0-py3-none-any.whl", hash = "sha256:d7c01d9e2293916c18baf562d95698754b0dbbb5e74d457c45d4f6561fb9d55d" }, +] + +[[package]] +name = "python-multipart" +version = "0.0.20" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/python-multipart/0.0.20/python_multipart-0.0.20.tar.gz", hash = "sha256:8dd0cab45b8e23064ae09147625994d090fa46f5b0d1e13af944c331a7fa9d13" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/python-multipart/0.0.20/python_multipart-0.0.20-py3-none-any.whl", hash = "sha256:8a62d3a8335e06589fe01f2a3e178cdcc632f3fbe0d492ad9ee0ec35aab1f104" }, +] + +[[package]] +name = "requests" +version = "2.32.3" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "certifi" }, + { name = "charset-normalizer" }, + { name = "idna" }, + { name = "urllib3" }, +] +sdist = { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/requests/2.32.3/requests-2.32.3.tar.gz", hash = "sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/requests/2.32.3/requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6" }, +] + +[[package]] +name = "rich" +version = "14.0.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "markdown-it-py" }, + { name = "pygments" }, + { name = "typing-extensions", marker = "python_full_version < '3.11'" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/rich/14.0.0/rich-14.0.0.tar.gz", hash = "sha256:82f1bc23a6a21ebca4ae0c45af9bdbc492ed20231dcb63f297d6d1021a9d5725" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/rich/14.0.0/rich-14.0.0-py3-none-any.whl", hash = "sha256:1c9491e1951aac09caffd42f448ee3d04e58923ffe14993f6e83068dc395d7e0" }, +] + +[[package]] +name = "shellingham" +version = "1.5.4" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/shellingham/1.5.4/shellingham-1.5.4.tar.gz", hash = "sha256:8dbca0739d487e5bd35ab3ca4b36e11c4078f3a234bfce294b0a0291363404de" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/shellingham/1.5.4/shellingham-1.5.4-py2.py3-none-any.whl", hash = "sha256:7ecfff8f2fd72616f7481040475a65b2bf8af90a56c89140852d1120324e8686" }, +] + +[[package]] +name = "sniffio" +version = "1.3.1" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/sniffio/1.3.1/sniffio-1.3.1.tar.gz", hash 
= "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/sniffio/1.3.1/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2" }, +] + +[[package]] +name = "sse-starlette" +version = "2.3.4" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "anyio" }, + { name = "starlette" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/sse-starlette/2.3.4/sse_starlette-2.3.4.tar.gz", hash = "sha256:0ffd6bed217cdbb74a84816437c609278003998b4991cd2e6872d0b35130e4d5" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/sse-starlette/2.3.4/sse_starlette-2.3.4-py3-none-any.whl", hash = "sha256:b8100694f3f892b133d0f7483acb7aacfcf6ed60f863b31947664b6dc74e529f" }, +] + +[[package]] +name = "starlette" +version = "0.46.2" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "anyio" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/starlette/0.46.2/starlette-0.46.2.tar.gz", hash = "sha256:7f7361f34eed179294600af672f565727419830b54b7b084efe44bb82d2fccd5" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/starlette/0.46.2/starlette-0.46.2-py3-none-any.whl", hash = "sha256:595633ce89f8ffa71a015caed34a5b2dc1c0cdb3f0f1fbd1e69339cf2abeec35" }, +] + +[[package]] +name = "tomli" +version = "2.2.1" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1.tar.gz", hash = "sha256:cd45e1dc79c835ce60f7404ec8119f2eb06d38b1deba146f07ced3bbc44505ff" } +wheels = [ + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678e4fa69e4575eb77d103de3df8a895e1591b48e740211bd1067378c69e8249" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:023aa114dd824ade0100497eb2318602af309e5a55595f76b626d6d9f3b7b0a6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ece47d672db52ac607a3d9599a9d48dcb2f2f735c6c2d1f34130085bb12b112a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6972ca9c9cc9f0acaa56a8ca1ff51e7af152a9f87fb64623e31d5c83700080ee" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c954d2250168d28797dd4e3ac5cf812a406cd5a92674ee4c8f123c889786aa8e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8dd28b3e155b80f4d54beb40a441d366adcfe740969820caf156c019fb5c7ec4" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:e59e304978767a54663af13c07b3d1af22ddee3bb2fb0618ca1593e4f593a106" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:33580bccab0338d00994d7f16f4c4ec25b776af3ffaac1ed74e0b3fc95e885a8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-win32.whl", hash = 
"sha256:465af0e0875402f1d226519c9904f37254b3045fc5084697cefb9bdde1ff99ff" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:2d0f2fdd22b02c6d81637a3c95f8cd77f995846af7414c5c4b8d0545afa1bc4b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4a8f6e44de52d5e6c657c9fe83b562f5f4256d8ebbfe4ff922c495620a7f6cea" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8d57ca8095a641b8237d5b079147646153d22552f1c637fd3ba7f4b0b29167a8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e340144ad7ae1533cb897d406382b4b6fede8890a03738ff1683af800d54192" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db2b95f9de79181805df90bedc5a5ab4c165e6ec3fe99f970d0e302f384ad222" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:40741994320b232529c802f8bc86da4e1aa9f413db394617b9a256ae0f9a7f77" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:400e720fe168c0f8521520190686ef8ef033fb19fc493da09779e592861b78c6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:02abe224de6ae62c19f090f68da4e27b10af2b93213d36cf44e6e1c5abd19fdd" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b82ebccc8c8a36f2094e969560a1b836758481f3dc360ce9a3277c65f374285e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-win32.whl", hash = "sha256:889f80ef92701b9dbb224e49ec87c645ce5df3fa2cc548664eb8a25e03127a98" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:7fc04e92e1d624a4a63c76474610238576942d6b8950a2d7f908a340494e67e4" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f4039b9cbc3048b2416cc57ab3bda989a6fcf9b36cf8937f01a6e731b64f80d7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:286f0ca2ffeeb5b9bd4fcc8d6c330534323ec51b2f52da063b11c502da16f30c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a92ef1a44547e894e2a17d24e7557a5e85a9e1d0048b0b5e7541f76c5032cb13" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9316dc65bed1684c9a98ee68759ceaed29d229e985297003e494aa825ebb0281" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e85e99945e688e32d5a35c1ff38ed0b3f41f43fad8df0bdf79f72b2ba7bc5272" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = 
"sha256:ac065718db92ca818f8d6141b5f66369833d4a80a9d74435a268c52bdfa73140" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:d920f33822747519673ee656a4b6ac33e382eca9d331c87770faa3eef562aeb2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a198f10c4d1b1375d7687bc25294306e551bf1abfa4eace6650070a5c1ae2744" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-win32.whl", hash = "sha256:d3f5614314d758649ab2ab3a62d4f2004c825922f9e370b29416484086b264ec" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-win_amd64.whl", hash = "sha256:a38aa0308e754b0e3c67e344754dff64999ff9b513e691d0e786265c93583c69" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-py3-none-any.whl", hash = "sha256:cb55c73c5f4408779d0cf3eef9f762b9c9f147a77de7b258bef0a5628adc85cc" }, +] + +[[package]] +name = "typer" +version = "0.15.4" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "click" }, + { name = "rich" }, + { name = "shellingham" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typer/0.15.4/typer-0.15.4.tar.gz", hash = "sha256:89507b104f9b6a0730354f27c39fae5b63ccd0c95b1ce1f1a6ba0cfd329997c3" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typer/0.15.4/typer-0.15.4-py3-none-any.whl", hash = "sha256:eb0651654dcdea706780c466cf06d8f174405a659ffff8f163cfbfee98c0e173" }, +] + +[[package]] +name = "typing-extensions" +version = "4.13.2" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } 
+sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typing-extensions/4.13.2/typing_extensions-4.13.2.tar.gz", hash = "sha256:e6c81219bd689f51865d9e372991c540bda33a0379d5573cddb9a3a23f7caaef" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typing-extensions/4.13.2/typing_extensions-4.13.2-py3-none-any.whl", hash = "sha256:a439e7c04b49fec3e5d3e2beaa21755cadbbdc391694e28ccdd36ca4a1408f8c" }, +] + +[[package]] +name = "typing-inspection" +version = "0.4.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "typing-extensions" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typing-inspection/0.4.0/typing_inspection-0.4.0.tar.gz", hash = "sha256:9765c87de36671694a67904bf2c96e395be9c6439bb6c87b5142569dcdd65122" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typing-inspection/0.4.0/typing_inspection-0.4.0-py3-none-any.whl", hash = "sha256:50e72559fcd2a6367a19f7a7e610e6afcb9fac940c650290eed893d61386832f" }, +] + +[[package]] +name = "urllib3" +version = "2.4.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/urllib3/2.4.0/urllib3-2.4.0.tar.gz", hash = "sha256:414bc6535b787febd7567804cc015fee39daab8ad86268f1310a9250697de466" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/urllib3/2.4.0/urllib3-2.4.0-py3-none-any.whl", hash = "sha256:4e16665048960a0900c702d4a66415956a584919c03361cac9f1df5c5dd7e813" }, +] + +[[package]] +name = "uvicorn" +version = "0.34.2" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "click" }, + { name = "h11" }, + { name = "typing-extensions", marker = "python_full_version < 
'3.11'" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/uvicorn/0.34.2/uvicorn-0.34.2.tar.gz", hash = "sha256:0e929828f6186353a80b58ea719861d2629d766293b6d19baf086ba31d4f3328" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/uvicorn/0.34.2/uvicorn-0.34.2-py3-none-any.whl", hash = "sha256:deb49af569084536d269fe0a6d67e3754f104cf03aba7c11c40f01aadf33c403" }, +] + +[[package]] +name = "yarl" +version = "1.20.0" +source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } +dependencies = [ + { name = "idna" }, + { name = "multidict" }, + { name = "propcache" }, +] +sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0.tar.gz", hash = "sha256:686d51e51ee5dfe62dec86e4866ee0e9ed66df700d55c828a615640adc885307" } +wheels = [ + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:f1f6670b9ae3daedb325fa55fbe31c22c8228f6e0b513772c2e1c623caa6ab22" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:85a231fa250dfa3308f3c7896cc007a47bc76e9e8e8595c20b7426cac4884c62" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:1a06701b647c9939d7019acdfa7ebbfbb78ba6aa05985bb195ad716ea759a569" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7595498d085becc8fb9203aa314b136ab0516c7abd97e7d74f7bb4eb95042abe" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = 
"sha256:af5607159085dcdb055d5678fc2d34949bd75ae6ea6b4381e784bbab1c3aa195" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:95b50910e496567434cb77a577493c26bce0f31c8a305135f3bda6a2483b8e10" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b594113a301ad537766b4e16a5a6750fcbb1497dcc1bc8a4daae889e6402a634" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:083ce0393ea173cd37834eb84df15b6853b555d20c52703e21fbababa8c129d2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4f1a350a652bbbe12f666109fbddfdf049b3ff43696d18c9ab1531fbba1c977a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:fb0caeac4a164aadce342f1597297ec0ce261ec4532bbc5a9ca8da5622f53867" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:d88cc43e923f324203f6ec14434fa33b85c06d18d59c167a0637164863b8e995" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e52d6ed9ea8fd3abf4031325dc714aed5afcbfa19ee4a89898d663c9976eb487" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:ce360ae48a5e9961d0c730cf891d40698a82804e85f6e74658fb175207a77cb2" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:06d06c9d5b5bc3eb56542ceeba6658d31f54cf401e8468512447834856fb0e61" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:c27d98f4e5c4060582f44e58309c1e55134880558f1add7a87c1bc36ecfade19" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-win32.whl", hash = "sha256:f4d3fa9b9f013f7050326e165c3279e22850d02ae544ace285674cb6174b5d6d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp310-cp310-win_amd64.whl", hash = "sha256:bc906b636239631d42eb8a07df8359905da02704a868983265603887ed68c076" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:fdb5204d17cb32b2de2d1e21c7461cabfacf17f3645e4b9039f210c5d3378bf3" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:eaddd7804d8e77d67c28d154ae5fab203163bd0998769569861258e525039d2a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:634b7ba6b4a85cf67e9df7c13a7fb2e44fa37b5d34501038d174a63eaac25ee2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6d409e321e4addf7d97ee84162538c7258e53792eb7c6defd0c33647d754172e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = 
"sha256:ea52f7328a36960ba3231c6677380fa67811b414798a6e071c7085c57b6d20a9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c8703517b924463994c344dcdf99a2d5ce9eca2b6882bb640aa555fb5efc706a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:077989b09ffd2f48fb2d8f6a86c5fef02f63ffe6b1dd4824c76de7bb01e4f2e2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0acfaf1da020253f3533526e8b7dd212838fdc4109959a2c53cafc6db611bff2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b4230ac0b97ec5eeb91d96b324d66060a43fd0d2a9b603e3327ed65f084e41f8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:0a6a1e6ae21cdd84011c24c78d7a126425148b24d437b5702328e4ba640a8902" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:86de313371ec04dd2531f30bc41a5a1a96f25a02823558ee0f2af0beaa7ca791" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:dd59c9dd58ae16eaa0f48c3d0cbe6be8ab4dc7247c3ff7db678edecbaf59327f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:a0bc5e05f457b7c1994cc29e83b58f540b76234ba6b9648a4971ddc7f6aa52da" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:c9471ca18e6aeb0e03276b5e9b27b14a54c052d370a9c0c04a68cefbd1455eb4" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:40ed574b4df723583a26c04b298b283ff171bcc387bc34c2683235e2487a65a5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-win32.whl", hash = "sha256:db243357c6c2bf3cd7e17080034ade668d54ce304d820c2a58514a4e51d0cfd6" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp311-cp311-win_amd64.whl", hash = "sha256:8c12cd754d9dbd14204c328915e23b0c361b88f3cffd124129955e60a4fbfcfb" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:e06b9f6cdd772f9b665e5ba8161968e11e403774114420737f7884b5bd7bdf6f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:b9ae2fbe54d859b3ade40290f60fe40e7f969d83d482e84d2c31b9bff03e359e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6d12b8945250d80c67688602c891237994d203d42427cb14e36d1a732eda480e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:087e9731884621b162a3e06dc0d2d626e1542a617f65ba7cc7aeab279d55ad33" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = 
"sha256:69df35468b66c1a6e6556248e6443ef0ec5f11a7a4428cf1f6281f1879220f58" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3b2992fe29002fd0d4cbaea9428b09af9b8686a9024c840b8a2b8f4ea4abc16f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4c903e0b42aab48abfbac668b5a9d7b6938e721a6341751331bcd7553de2dcae" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bf099e2432131093cc611623e0b0bcc399b8cddd9a91eded8bfb50402ec35018" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8a7f62f5dc70a6c763bec9ebf922be52aa22863d9496a9a30124d65b489ea672" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:54ac15a8b60382b2bcefd9a289ee26dc0920cf59b05368c9b2b72450751c6eb8" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:25b3bc0763a7aca16a0f1b5e8ef0f23829df11fb539a1b70476dcab28bd83da7" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:b2586e36dc070fc8fad6270f93242124df68b379c3a251af534030a4a33ef594" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:866349da9d8c5290cfefb7fcc47721e94de3f315433613e01b435473be63daa6" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:33bb660b390a0554d41f8ebec5cd4475502d84104b27e9b42f5321c5192bfcd1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:737e9f171e5a07031cbee5e9180f6ce21a6c599b9d4b2c24d35df20a52fabf4b" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-win32.whl", hash = "sha256:839de4c574169b6598d47ad61534e6981979ca2c820ccb77bf70f4311dd2cc64" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp312-cp312-win_amd64.whl", hash = "sha256:3d7dbbe44b443b0c4aa0971cb07dcb2c2060e4a9bf8d1301140a33a93c98e18c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:2137810a20b933b1b1b7e5cf06a64c3ed3b4747b0e5d79c9447c00db0e2f752f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:447c5eadd750db8389804030d15f43d30435ed47af1313303ed82a62388176d3" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:42fbe577272c203528d402eec8bf4b2d14fd49ecfec92272334270b850e9cd7d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:18e321617de4ab170226cd15006a565d0fa0d908f11f724a2c9142d6b2812ab0" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = 
"sha256:4345f58719825bba29895011e8e3b545e6e00257abb984f9f27fe923afca2501" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5d9b980d7234614bc4674468ab173ed77d678349c860c3af83b1fffb6a837ddc" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:af4baa8a445977831cbaa91a9a84cc09debb10bc8391f128da2f7bd070fc351d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:123393db7420e71d6ce40d24885a9e65eb1edefc7a5228db2d62bcab3386a5c0" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ab47acc9332f3de1b39e9b702d9c916af7f02656b2a86a474d9db4e53ef8fd7a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:4a34c52ed158f89876cba9c600b2c964dfc1ca52ba7b3ab6deb722d1d8be6df2" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:04d8cfb12714158abf2618f792c77bc5c3d8c5f37353e79509608be4f18705c9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:7dc63ad0d541c38b6ae2255aaa794434293964677d5c1ec5d0116b0e308031f5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:f9d02b591a64e4e6ca18c5e3d925f11b559c763b950184a64cf47d74d7e41877" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:95fc9876f917cac7f757df80a5dda9de59d423568460fe75d128c813b9af558e" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:bb769ae5760cd1c6a712135ee7915f9d43f11d9ef769cb3f75a23e398a92d384" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-win32.whl", hash = "sha256:70e0c580a0292c7414a1cead1e076c9786f685c1fc4757573d2967689b370e62" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313-win_amd64.whl", hash = "sha256:4c43030e4b0af775a85be1fa0433119b1565673266a70bf87ef68a9d5ba3174c" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:b6c4c3d0d6a0ae9b281e492b1465c72de433b782e6b5001c8e7249e085b69051" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:8681700f4e4df891eafa4f69a439a6e7d480d64e52bf460918f58e443bd3da7d" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:84aeb556cb06c00652dbf87c17838eb6d92cfd317799a8092cee0e570ee11229" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f166eafa78810ddb383e930d62e623d288fb04ec566d1b4790099ae0f31485f1" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = 
"sha256:5d3d6d14754aefc7a458261027a562f024d4f6b8a798adb472277f675857b1eb" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2a8f64df8ed5d04c51260dbae3cc82e5649834eebea9eadfd829837b8093eb00" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4d9949eaf05b4d30e93e4034a7790634bbb41b8be2d07edd26754f2e38e491de" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9c366b254082d21cc4f08f522ac201d0d83a8b8447ab562732931d31d80eb2a5" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:91bc450c80a2e9685b10e34e41aef3d44ddf99b3a498717938926d05ca493f6a" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:9c2aa4387de4bc3a5fe158080757748d16567119bef215bec643716b4fbf53f9" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:d2cbca6760a541189cf87ee54ff891e1d9ea6406079c66341008f7ef6ab61145" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:798a5074e656f06b9fad1a162be5a32da45237ce19d07884d0b67a0aa9d5fdda" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:f106e75c454288472dbe615accef8248c686958c2e7dd3b8d8ee2669770d020f" }, + { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:3b60a86551669c23dc5445010534d2c5d8a4e012163218fc9114e857c0586fdd" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:3e429857e341d5e8e15806118e0294f8073ba9c4580637e59ab7b238afca836f" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-win32.whl", hash = "sha256:65a4053580fe88a63e8e4056b427224cd01edfb5f951498bfefca4052f0ce0ac" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-cp313-cp313t-win_amd64.whl", hash = "sha256:53b2da3a6ca0a541c1ae799c349788d480e5144cac47dba0266c7cb6c76151fe" }, + { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/yarl/1.20.0/yarl-1.20.0-py3-none-any.whl", hash = "sha256:5d0fe6af927a47a230f31e6004621fd0959eaa915fc62acfafa67ff7229a3124" }, +] diff --git a/projects/unit3/slack-notification/solution/webhook_server.py b/projects/unit3/slack-notification/solution/webhook_server.py new file mode 100644 index 0000000..64941d1 --- /dev/null +++ b/projects/unit3/slack-notification/solution/webhook_server.py @@ -0,0 +1,57 @@ +#!/usr/bin/env python3 +""" +Simple webhook server for GitHub Actions events. +Stores events in a JSON file that the MCP server can read. 
+""" + +import json +from datetime import datetime +from pathlib import Path +from aiohttp import web + +# File to store events +EVENTS_FILE = Path(__file__).parent / "github_events.json" + +async def handle_webhook(request): + """Handle incoming GitHub webhook""" + try: + data = await request.json() + + # Create event record + event = { + "timestamp": datetime.utcnow().isoformat(), + "event_type": request.headers.get("X-GitHub-Event", "unknown"), + "action": data.get("action"), + "workflow_run": data.get("workflow_run"), + "check_run": data.get("check_run"), + "repository": data.get("repository", {}).get("full_name"), + "sender": data.get("sender", {}).get("login") + } + + # Load existing events + events = [] + if EVENTS_FILE.exists(): + with open(EVENTS_FILE, 'r') as f: + events = json.load(f) + + # Add new event and keep last 100 + events.append(event) + events = events[-100:] + + # Save events + with open(EVENTS_FILE, 'w') as f: + json.dump(events, f, indent=2) + + return web.json_response({"status": "received"}) + except Exception as e: + return web.json_response({"error": str(e)}, status=400) + +# Create app and add route +app = web.Application() +app.router.add_post('/webhook/github', handle_webhook) + +if __name__ == '__main__': + print("🚀 Starting webhook server on http://localhost:8080") + print("📝 Events will be saved to:", EVENTS_FILE) + print("🔗 Webhook URL: http://localhost:8080/webhook/github") + web.run_app(app, host='localhost', port=8080) \ No newline at end of file diff --git a/projects/unit3/slack-notification/starter/README.md b/projects/unit3/slack-notification/starter/README.md new file mode 100644 index 0000000..b65be7e --- /dev/null +++ b/projects/unit3/slack-notification/starter/README.md @@ -0,0 +1,123 @@ +# Module 3: Slack Notification - Starter + +This starter code extends Modules 1 and 2 with Slack integration, demonstrating how to combine MCP Tools and Prompts for complete team communication workflows. 
+ +## What You'll Implement + +In this module, you'll complete: + +1. **`send_slack_notification` tool** - Send messages to Slack via webhook +2. **`format_ci_failure_alert` prompt** - Format CI failures for Slack +3. **`format_ci_success_summary` prompt** - Format successful deployments for Slack + +## Prerequisites + +- Completed Modules 1 and 2 +- A Slack workspace with webhook permissions +- Environment variable: `SLACK_WEBHOOK_URL` + +## Setup + +1. Install dependencies: + ```bash + uv sync + ``` + +2. Set up Slack webhook: + - Create Slack app at https://api.slack.com/apps + - App name: "MCP Course Notifications" + - Enable incoming webhooks + - Add webhook to workspace + - Export URL: `export SLACK_WEBHOOK_URL="https://hooks.slack.com/services/..."` + +3. Test webhook: + ```bash + curl -X POST -H 'Content-type: application/json' \ + --data '{"text":"Hello from MCP Course!"}' \ + $SLACK_WEBHOOK_URL + ``` + +## Implementation Tasks + +### 1. Complete `send_slack_notification` Tool + +In `server.py`, implement the TODO sections: + +```python +@mcp.tool() +async def send_slack_notification(message: str) -> str: + # TODO: Import requests library at top of file + # TODO: Send POST request to webhook_url + # TODO: Include message in JSON payload: {"text": message} + # TODO: Handle response and return status +``` + +### 2. Test the Prompts + +The formatting prompts are complete - test them with Claude Code: +- Use `format_ci_failure_alert` to create failure messages +- Use `format_ci_success_summary` to create success messages +- Send formatted messages using your `send_slack_notification` tool + +### 3. End-to-End Workflow + +Test the complete integration: + +1. Start webhook server: `python webhook_server.py` +2. Start MCP server: `uv run server.py` +3. Start Cloudflare tunnel: `cloudflared tunnel --url http://localhost:8080` +4. Trigger GitHub Actions +5. Use prompts to format messages +6. 
Send to Slack and verify formatting + +## Expected Slack Output + +**Failure Alert:** +``` +❌ *CI Failed* - mcp-course + +> Tests failed in Module 3 implementation + +*Details:* +• Workflow: `CI` +• Branch: `feature/slack-integration` +• Commit: `abc123f` + +*Next Steps:* +• +• +``` + +## Security Note + +⚠️ **Important**: The `SLACK_WEBHOOK_URL` is a sensitive secret that grants permission to post messages to your Slack channel. Always: +- Store it as an environment variable, never in code +- Never commit webhook URLs to version control +- Treat it like a password + +## Available Tools & Prompts + +This module includes all tools and prompts from Modules 1 & 2, plus: + +**New Tools:** +- `send_slack_notification` - Send messages to Slack + +**New Prompts:** +- `format_ci_failure_alert` - Create failure alerts +- `format_ci_success_summary` - Create success summaries + +**From Previous Modules:** +- All file analysis tools (Module 1) +- All GitHub Actions tools (Module 2) +- All CI/CD analysis prompts (Module 2) + +## Testing Your Implementation + +1. **Tool Test**: Call `send_slack_notification` directly with a test message +2. **Prompt Test**: Use formatting prompts to generate messages, then send them +3. **Integration Test**: Full workflow from GitHub webhook to Slack notification +4. **Format Test**: Verify Slack markdown renders correctly in your channel + +## Next Steps + +Once complete, you'll have built a production-ready MCP server that demonstrates all core MCP primitives working together for real-world team automation! 
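If you get stuck on the `send_slack_notification` TODOs, here is one hedged sketch of the core logic. The function name `send_to_slack` and the injected `post_fn` callable are illustrative assumptions for testability, not part of the course code; in `server.py` you would call `requests.post(webhook_url, json=payload, timeout=10)` directly inside the tool:

```python
import os

def send_to_slack(message: str, post_fn) -> str:
    """Sketch of the tool's core logic.

    post_fn(url, payload) performs the HTTP POST and returns a status code;
    in the real tool this would wrap requests.post(url, json=payload, timeout=10).
    """
    webhook_url = os.getenv("SLACK_WEBHOOK_URL")
    if not webhook_url:
        return "Error: SLACK_WEBHOOK_URL environment variable not set"
    payload = {"text": message}  # Slack incoming webhooks accept {"text": ...} JSON
    status = post_fn(webhook_url, payload)
    if status == 200:
        return "Message sent to Slack successfully"
    return f"Error: Slack webhook returned status {status}"
```

Injecting the POST call lets you unit-test the error paths (missing environment variable, non-200 response) without touching a live webhook.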
\ No newline at end of file diff --git a/projects/unit3/slack-notification/starter/manual_test.md b/projects/unit3/slack-notification/starter/manual_test.md new file mode 100644 index 0000000..e1726fa --- /dev/null +++ b/projects/unit3/slack-notification/starter/manual_test.md @@ -0,0 +1,254 @@ +# Manual Testing Guide - Slack Notification Module + +This guide provides curl-based tests so you can verify your implementation without setting up a full GitHub repository and Actions workflow. + +## Prerequisites + +1. MCP server running: `uv run server.py` +2. Webhook server running: `python webhook_server.py` +3. Slack webhook URL set: `export SLACK_WEBHOOK_URL="https://hooks.slack.com/services/..."` + +## Test 1: Direct Slack Tool Test + +Test your `send_slack_notification` tool directly via Claude Code: + +```bash +# Start your MCP server and connect with Claude Code, then ask: +# "Send a test message to Slack: 'Hello from MCP Course Module 3!'" +``` + +Expected result: Message appears in your Slack channel. + +## Test 2: Simulate GitHub Webhook Events + +### 2a. 
Simulate CI Failure Event + +Send a fake GitHub Actions failure event to your webhook server (the top-level `repository` and `sender` fields are the ones `webhook_server.py` records): + +```bash +curl -X POST http://localhost:8080/webhook/github \ + -H "Content-Type: application/json" \ + -H "X-GitHub-Event: workflow_run" \ + -d '{ + "action": "completed", + "workflow_run": { + "id": 123456789, + "name": "CI", + "status": "completed", + "conclusion": "failure", + "run_number": 42, + "created_at": "2024-01-15T10:30:00Z", + "updated_at": "2024-01-15T10:35:00Z", + "html_url": "https://github.com/user/repo/actions/runs/123456789", + "head_branch": "feature/slack-integration", + "head_sha": "abc123f456789", + "pull_requests": [{ + "number": 42, + "url": "https://api.github.com/repos/user/mcp-course/pulls/42", + "html_url": "https://github.com/user/mcp-course/pull/42" + }] + }, + "repository": { + "name": "mcp-course", + "full_name": "user/mcp-course", + "html_url": "https://github.com/user/mcp-course" + }, + "sender": { + "login": "test-user" + } + }' +``` + +### 2b. Simulate CI Success Event + +```bash +curl -X POST http://localhost:8080/webhook/github \ + -H "Content-Type: application/json" \ + -H "X-GitHub-Event: workflow_run" \ + -d '{ + "action": "completed", + "workflow_run": { + "id": 123456790, + "name": "Deploy", + "status": "completed", + "conclusion": "success", + "run_number": 43, + "created_at": "2024-01-15T11:30:00Z", + "updated_at": "2024-01-15T11:35:00Z", + "html_url": "https://github.com/user/repo/actions/runs/123456790", + "head_branch": "main", + "head_sha": "def456g789012", + "pull_requests": [{ + "number": 43, + "url": "https://api.github.com/repos/user/mcp-course/pulls/43", + "html_url": "https://github.com/user/mcp-course/pull/43" + }] + }, + "repository": { + "name": "mcp-course", + "full_name": "user/mcp-course", + "html_url": "https://github.com/user/mcp-course" + }, + "sender": { + "login": "test-user" + } + }' +``` + +## Test 3: End-to-End Workflow Tests + +After sending the webhook events above, test the complete workflow via Claude Code: + +### 3a.
Test Failure Alert Workflow + +Ask Claude Code: +``` +"Check recent CI events, find any failures, format them as a Slack alert, and send to the team" +``` + +Expected workflow: +1. Claude calls `get_recent_actions_events()` → finds failure event +2. Claude calls `format_ci_failure_alert()` → generates formatted message +3. Claude calls `send_slack_notification()` → sends to Slack + +Expected Slack message: +``` +❌ *CI Failed* - mcp-course + +> CI workflow failed on feature/slack-integration + +*Details:* +• Workflow: `CI` +• Branch: `feature/slack-integration` +• Commit: `abc123f` + +*Next Steps:* +• +• +``` + +### 3b. Test Success Summary Workflow + +Ask Claude Code: +``` +"Check recent CI events, find any successful deployments, format them as a celebration message, and send to the team" +``` + +Expected workflow: +1. Claude calls `get_recent_actions_events()` → finds success event +2. Claude calls `format_ci_success_summary()` → generates formatted message +3. Claude calls `send_slack_notification()` → sends to Slack + +Expected Slack message: +``` +✅ *Deployment Successful* - mcp-course + +> Deploy workflow completed successfully on main + +*Changes:* +• Module 3 Slack integration added +• Team notification system implemented + +*Links:* +• +• +``` + +## Test 4: Error Handling Tests + +### 4a. Test Missing Webhook URL + +```bash +# Temporarily unset the environment variable +unset SLACK_WEBHOOK_URL + +# Ask Claude Code to send a message +# Expected: Error message about missing environment variable +``` + +### 4b. Test Invalid Webhook URL + +```bash +# Set invalid webhook URL +export SLACK_WEBHOOK_URL="https://invalid-webhook-url.com/test" + +# Ask Claude Code to send a message +# Expected: Error message about connection failure +``` + +### 4c. Restore Valid Webhook URL + +```bash +export SLACK_WEBHOOK_URL="https://hooks.slack.com/services/YOUR/ACTUAL/URL" +``` + +## Test 5: Prompt-Only Tests + +Test the formatting prompts without sending to Slack: + +### 5a. 
Test Failure Alert Prompt + +Ask Claude Code: +``` +"Use the format_ci_failure_alert prompt to create a failure message for the recent CI failure, but don't send it to Slack yet" +``` + +### 5b. Test Success Summary Prompt + +Ask Claude Code: +``` +"Use the format_ci_success_summary prompt to create a success message for the recent deployment, but don't send it to Slack yet" +``` + +## Test 6: Integration with Previous Modules + +Test that all previous module functionality still works: + +### 6a. Module 1 Integration + +Ask Claude Code: +``` +"Analyze current file changes, suggest a PR template, then create a Slack message about the PR status" +``` + +### 6b. Module 2 Integration + +Ask Claude Code: +``` +"Check workflow status, analyze the CI results, and create a comprehensive team update for Slack" +``` + +## Verification Checklist + +After running these tests, verify: + +- [ ] Direct Slack tool works (Test 1) +- [ ] Webhook server receives and stores events (Test 2) +- [ ] Failure alert workflow works end-to-end (Test 3a) +- [ ] Success summary workflow works end-to-end (Test 3b) +- [ ] Error handling works properly (Test 4) +- [ ] Prompts work independently (Test 5) +- [ ] Integration with previous modules works (Test 6) +- [ ] Slack messages display with proper formatting +- [ ] All tools and prompts are accessible to Claude Code + +## Troubleshooting + +### Webhook Server Issues +```bash +# Check if webhook server is running +curl http://localhost:8080/health + +# Check stored events +cat github_events.json +``` + +### MCP Server Issues +```bash +# Check if MCP server is responding +# Should see server startup messages when running uv run server.py +``` + +### Slack Issues +```bash +# Test webhook URL directly +curl -X POST -H 'Content-type: application/json' \ + --data '{"text":"Direct webhook test"}' \ + $SLACK_WEBHOOK_URL +``` + +This testing approach lets you validate your implementation without needing to set up a real GitHub repository with Actions 
workflows! \ No newline at end of file diff --git a/projects/unit3/slack-notification/starter/pyproject.toml b/projects/unit3/slack-notification/starter/pyproject.toml new file mode 100644 index 0000000..cd2b2d9 --- /dev/null +++ b/projects/unit3/slack-notification/starter/pyproject.toml @@ -0,0 +1,30 @@ +[project] +name = "pr-agent-slack" +version = "3.0.0" +description = "MCP server with Slack notifications integrating Tools and Prompts" +readme = "README.md" +requires-python = ">=3.10" +dependencies = [ + "mcp[cli]>=1.0.0", + "aiohttp>=3.10.0,<4.0.0", + "requests>=2.32.0,<3.0.0", +] + +[project.optional-dependencies] +dev = [ + "pytest>=8.3.0", + "pytest-asyncio>=0.21.0", +] + +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" + +[tool.hatch.build.targets.wheel] +packages = ["."] + +[tool.uv] +dev-dependencies = [ + "pytest>=8.3.0", + "pytest-asyncio>=0.21.0", +] \ No newline at end of file diff --git a/projects/unit3/slack-notification/starter/server.py b/projects/unit3/slack-notification/starter/server.py new file mode 100644 index 0000000..fc5ca01 --- /dev/null +++ b/projects/unit3/slack-notification/starter/server.py @@ -0,0 +1,467 @@ +#!/usr/bin/env python3 +""" +Module 3: Slack Notification Integration +Combines all MCP primitives (Tools and Prompts) for complete team communication workflows. 
+""" + +import json +import os +import subprocess +from typing import Dict, Any, Optional +from pathlib import Path + +from mcp.server.fastmcp import FastMCP + +# Initialize the FastMCP server +mcp = FastMCP("pr-agent-slack") + +# PR template directory (shared between starter and solution) +TEMPLATES_DIR = Path(__file__).parent.parent.parent / "templates" + +# File where webhook server stores events +EVENTS_FILE = Path(__file__).parent / "github_events.json" + + +# ===== Tools from Modules 1 & 2 (Complete with output limiting) ===== + +@mcp.tool() +async def analyze_file_changes( + base_branch: str = "main", + include_diff: bool = True, + max_diff_lines: int = 500 +) -> str: + """Get the full diff and list of changed files in the current git repository. + + Args: + base_branch: Base branch to compare against (default: main) + include_diff: Include the full diff content (default: true) + max_diff_lines: Maximum number of diff lines to include (default: 500) + """ + try: + # Get list of changed files + files_result = subprocess.run( + ["git", "diff", "--name-status", f"{base_branch}...HEAD"], + capture_output=True, + text=True, + check=True + ) + + # Get diff statistics + stat_result = subprocess.run( + ["git", "diff", "--stat", f"{base_branch}...HEAD"], + capture_output=True, + text=True + ) + + # Get the actual diff if requested + diff_content = "" + truncated = False + if include_diff: + diff_result = subprocess.run( + ["git", "diff", f"{base_branch}...HEAD"], + capture_output=True, + text=True + ) + diff_lines = diff_result.stdout.split('\n') + + # Check if we need to truncate + if len(diff_lines) > max_diff_lines: + diff_content = '\n'.join(diff_lines[:max_diff_lines]) + diff_content += f"\n\n... Output truncated. Showing {max_diff_lines} of {len(diff_lines)} lines ..." + diff_content += "\n... Use max_diff_lines parameter to see more ..." 
+ truncated = True + else: + diff_content = diff_result.stdout + + # Get commit messages for context + commits_result = subprocess.run( + ["git", "log", "--oneline", f"{base_branch}..HEAD"], + capture_output=True, + text=True + ) + + analysis = { + "base_branch": base_branch, + "files_changed": files_result.stdout, + "statistics": stat_result.stdout, + "commits": commits_result.stdout, + "diff": diff_content if include_diff else "Diff not included (set include_diff=true to see full diff)", + "truncated": truncated, + "total_diff_lines": len(diff_lines) if include_diff else 0 + } + + return json.dumps(analysis, indent=2) + + except subprocess.CalledProcessError as e: + return json.dumps({"error": f"Git error: {e.stderr}"}) + except Exception as e: + return json.dumps({"error": str(e)}) + + +@mcp.tool() +async def get_pr_templates() -> str: + """List available PR templates with their content.""" + templates = [] + + # Define default templates + default_templates = { + "bug.md": "Bug Fix", + "feature.md": "Feature", + "docs.md": "Documentation", + "refactor.md": "Refactor", + "test.md": "Test", + "performance.md": "Performance", + "security.md": "Security" + } + + for filename, template_type in default_templates.items(): + template_path = TEMPLATES_DIR / filename + + # Read template content + content = template_path.read_text() + + templates.append({ + "filename": filename, + "type": template_type, + "content": content + }) + + return json.dumps(templates, indent=2) + + +@mcp.tool() +async def suggest_template(changes_summary: str, change_type: str) -> str: + """Let Claude analyze the changes and suggest the most appropriate PR template. + + Args: + changes_summary: Your analysis of what the changes do + change_type: The type of change you've identified (bug, feature, docs, refactor, test, etc.) 
+ """ + + # Get available templates + templates_response = await get_pr_templates() + templates = json.loads(templates_response) + + # Map change types to template files + type_mapping = { + "bug": "bug.md", + "fix": "bug.md", + "feature": "feature.md", + "enhancement": "feature.md", + "docs": "docs.md", + "documentation": "docs.md", + "refactor": "refactor.md", + "cleanup": "refactor.md", + "test": "test.md", + "testing": "test.md", + "performance": "performance.md", + "optimization": "performance.md", + "security": "security.md" + } + + # Find matching template + template_file = type_mapping.get(change_type.lower(), "feature.md") + selected_template = next( + (t for t in templates if t["filename"] == template_file), + templates[0] # Default to first template if no match + ) + + suggestion = { + "recommended_template": selected_template, + "reasoning": f"Based on your analysis: '{changes_summary}', this appears to be a {change_type} change.", + "template_content": selected_template["content"], + "usage_hint": "Claude can help you fill out this template based on the specific changes in your PR." + } + + return json.dumps(suggestion, indent=2) + + +@mcp.tool() +async def get_recent_actions_events(limit: int = 10) -> str: + """Get recent GitHub Actions events received via webhook. + + Args: + limit: Maximum number of events to return (default: 10) + """ + # Read events from file + if not EVENTS_FILE.exists(): + return json.dumps([]) + + with open(EVENTS_FILE, 'r') as f: + events = json.load(f) + + # Return most recent events + recent = events[-limit:] + return json.dumps(recent, indent=2) + + +@mcp.tool() +async def get_workflow_status(workflow_name: Optional[str] = None) -> str: + """Get the current status of GitHub Actions workflows. 
+ + Args: + workflow_name: Optional specific workflow name to filter by + """ + # Read events from file + if not EVENTS_FILE.exists(): + return json.dumps({"message": "No GitHub Actions events received yet"}) + + with open(EVENTS_FILE, 'r') as f: + events = json.load(f) + + if not events: + return json.dumps({"message": "No GitHub Actions events received yet"}) + + # Filter for workflow events + workflow_events = [ + e for e in events + if e.get("workflow_run") is not None + ] + + if workflow_name: + workflow_events = [ + e for e in workflow_events + if e["workflow_run"].get("name") == workflow_name + ] + + # Group by workflow and get latest status + workflows = {} + for event in workflow_events: + run = event["workflow_run"] + name = run["name"] + if name not in workflows or run["updated_at"] > workflows[name]["updated_at"]: + workflows[name] = { + "name": name, + "status": run["status"], + "conclusion": run.get("conclusion"), + "run_number": run["run_number"], + "updated_at": run["updated_at"], + "html_url": run["html_url"] + } + + return json.dumps(list(workflows.values()), indent=2) + + +# ===== New Module 3: Slack Integration Tools ===== + +@mcp.tool() +async def send_slack_notification(message: str) -> str: + """Send a formatted notification to the team Slack channel. + + Args: + message: The message to send to Slack (supports Slack markdown) + """ + webhook_url = os.getenv("SLACK_WEBHOOK_URL") + if not webhook_url: + return "Error: SLACK_WEBHOOK_URL environment variable not set" + + try: + # TODO: Import requests library + # TODO: Send POST request to webhook_url with JSON payload + # TODO: Include the message in the JSON data + # TODO: Handle the response and return appropriate status + + # For now, return a placeholder + return f"TODO: Implement Slack webhook POST request for message: {message[:50]}..." 
+ + except Exception as e: + return f"Error sending message: {str(e)}" + + +# ===== New Module 3: Slack Formatting Prompts ===== + +@mcp.prompt() +async def format_ci_failure_alert(): + """Create a Slack alert for CI/CD failures with rich formatting.""" + return """Format this GitHub Actions failure as a Slack message using ONLY Slack markdown syntax: + +❌ *CI Failed* - [Repository Name] + +> Brief summary of what failed + +*Details:* +• Workflow: `workflow_name` +• Branch: `branch_name` +• Commit: `commit_hash` + +*Next Steps:* +• + +CRITICAL: Use EXACT Slack link format: <url|text> +Examples: +- CORRECT: <https://github.com/user/repo|Repository> +- WRONG: [Repository](https://github.com/user/repo) +- WRONG: https://github.com/user/repo + +Other Slack formats: +- *text* for bold (NOT **text**) +- `text` for code +- > text for quotes +- • for bullets""" + + +@mcp.prompt() +async def format_ci_success_summary(): + """Create a Slack message celebrating successful deployments.""" + return """Format this successful GitHub Actions run as a Slack message using ONLY Slack markdown syntax: + +✅ *Deployment Successful* - [Repository Name] + +> Brief summary of what was deployed + +*Changes:* +• Key feature or fix 1 +• Key feature or fix 2 + +*Links:* +• + +CRITICAL: Use EXACT Slack link format: <url|text> +Examples: +- CORRECT: <https://github.com/user/repo|Repository> +- WRONG: [Repository](https://github.com/user/repo) +- WRONG: https://github.com/user/repo + +Other Slack formats: +- *text* for bold (NOT **text**) +- `text` for code +- > text for quotes +- • for bullets""" + + +# ===== Prompts from Module 2 (Complete) ===== + +@mcp.prompt() +async def analyze_ci_results(): + """Analyze recent CI/CD results and provide insights.""" + return """Please analyze the recent CI/CD results from GitHub Actions: + +1. First, call get_recent_actions_events() to fetch the latest CI/CD events +2. Then call get_workflow_status() to check current workflow states +3. Identify any failures or issues that need attention +4.
Provide actionable next steps based on the results + +Format your response as: +## CI/CD Status Summary +- **Overall Health**: [Good/Warning/Critical] +- **Failed Workflows**: [List any failures with links] +- **Successful Workflows**: [List recent successes] +- **Recommendations**: [Specific actions to take] +- **Trends**: [Any patterns you notice]""" + + +@mcp.prompt() +async def create_deployment_summary(): + """Generate a deployment summary for team communication.""" + return """Create a deployment summary for team communication: + +1. Check workflow status with get_workflow_status() +2. Look specifically for deployment-related workflows +3. Note the deployment outcome, timing, and any issues + +Format as a concise message suitable for Slack: + +🚀 **Deployment Update** +- **Status**: [✅ Success / ❌ Failed / ⏳ In Progress] +- **Environment**: [Production/Staging/Dev] +- **Version/Commit**: [If available from workflow data] +- **Duration**: [If available] +- **Key Changes**: [Brief summary if available] +- **Issues**: [Any problems encountered] +- **Next Steps**: [Required actions if failed] + +Keep it brief but informative for team awareness.""" + + +@mcp.prompt() +async def generate_pr_status_report(): + """Generate a comprehensive PR status report including CI/CD results.""" + return """Generate a comprehensive PR status report: + +1. Use analyze_file_changes() to understand what changed +2. Use get_workflow_status() to check CI/CD status +3. Use suggest_template() to recommend the appropriate PR template +4. Combine all information into a cohesive report + +Create a detailed report with: + +## 📋 PR Status Report + +### 📝 Code Changes +- **Files Modified**: [Count by type - .py, .js, etc.] +- **Change Type**: [Feature/Bug/Refactor/etc.] 
+- **Impact Assessment**: [High/Medium/Low with reasoning] +- **Key Changes**: [Bullet points of main modifications] + +### 🔄 CI/CD Status +- **All Checks**: [✅ Passing / ❌ Failing / ⏳ Running] +- **Test Results**: [Pass rate, failed tests if any] +- **Build Status**: [Success/Failed with details] +- **Code Quality**: [Linting, coverage if available] + +### 📌 Recommendations +- **PR Template**: [Suggested template and why] +- **Next Steps**: [What needs to happen before merge] +- **Reviewers**: [Suggested reviewers based on files changed] + +### ⚠️ Risks & Considerations +- [Any deployment risks] +- [Breaking changes] +- [Dependencies affected]""" + + +@mcp.prompt() +async def troubleshoot_workflow_failure(): + """Help troubleshoot a failing GitHub Actions workflow.""" + return """Help troubleshoot failing GitHub Actions workflows: + +1. Use get_recent_actions_events() to find recent failures +2. Use get_workflow_status() to see which workflows are failing +3. Analyze the failure patterns and timing +4. Provide systematic troubleshooting steps + +Structure your response as: + +## 🔧 Workflow Troubleshooting Guide + +### ❌ Failed Workflow Details +- **Workflow Name**: [Name of failing workflow] +- **Failure Type**: [Test/Build/Deploy/Lint] +- **First Failed**: [When did it start failing] +- **Failure Rate**: [Intermittent or consistent] + +### 🔍 Diagnostic Information +- **Error Patterns**: [Common error messages or symptoms] +- **Recent Changes**: [What changed before failures started] +- **Dependencies**: [External services or resources involved] + +### 💡 Possible Causes (ordered by likelihood) +1. **[Most Likely]**: [Description and why] +2. **[Likely]**: [Description and why] +3. 
**[Possible]**: [Description and why] + +### ✅ Suggested Fixes +**Immediate Actions:** +- [ ] [Quick fix to try first] +- [ ] [Second quick fix] + +**Investigation Steps:** +- [ ] [How to gather more info] +- [ ] [Logs or data to check] + +**Long-term Solutions:** +- [ ] [Preventive measure] +- [ ] [Process improvement] + +### 📚 Resources +- [Relevant documentation links] +- [Similar issues or solutions]""" + + +if __name__ == "__main__": + # Run MCP server normally + print("Starting PR Agent Slack MCP server...") + print("Make sure to set SLACK_WEBHOOK_URL environment variable") + print("To receive GitHub webhooks, run the webhook server separately:") + print("  python webhook_server.py") + mcp.run() \ No newline at end of file diff --git a/projects/unit3/slack-notification/starter/test_server.py b/projects/unit3/slack-notification/starter/test_server.py new file mode 100644 index 0000000..79da4c1 --- /dev/null +++ b/projects/unit3/slack-notification/starter/test_server.py @@ -0,0 +1,157 @@ +#!/usr/bin/env python3 +""" +Unit tests for the Module 1 tools included in this server +Run these tests to validate your implementation +""" + +import json +import pytest +import asyncio +from pathlib import Path +from unittest.mock import patch, MagicMock + +# Import your implemented functions +try: + from server import ( + mcp, + analyze_file_changes, + get_pr_templates, + suggest_template + ) + IMPORTS_SUCCESSFUL = True +except ImportError as e: + IMPORTS_SUCCESSFUL = False + IMPORT_ERROR = str(e) + + +class TestImplementation: + """Test that the required functions are implemented.""" + + def test_imports(self): + """Test that all required functions can be imported.""" + assert IMPORTS_SUCCESSFUL, f"Failed to import required functions: {IMPORT_ERROR if not IMPORTS_SUCCESSFUL else ''}" + assert mcp is not None, "FastMCP server instance not found" + assert callable(analyze_file_changes), "analyze_file_changes should be a callable function" + assert callable(get_pr_templates),
"get_pr_templates should be a callable function" + assert callable(suggest_template), "suggest_template should be a callable function" + + +@pytest.mark.skipif(not IMPORTS_SUCCESSFUL, reason="Imports failed") +class TestAnalyzeFileChanges: + """Test the analyze_file_changes tool.""" + + @pytest.mark.asyncio + async def test_returns_json_string(self): + """Test that analyze_file_changes returns a JSON string.""" + with patch('subprocess.run') as mock_run: + mock_run.return_value = MagicMock(stdout="", stderr="") + + result = await analyze_file_changes() + + assert isinstance(result, str), "Should return a string" + # Should be valid JSON + data = json.loads(result) + assert isinstance(data, dict), "Should return a JSON object" + + @pytest.mark.asyncio + async def test_includes_required_fields(self): + """Test that the result includes expected fields.""" + with patch('subprocess.run') as mock_run: + mock_run.return_value = MagicMock(stdout="M\tfile1.py\n", stderr="") + + result = await analyze_file_changes() + data = json.loads(result) + + # Check for some expected fields (flexible to allow different implementations) + assert any(key in data for key in ["files_changed", "files", "changes", "diff"]), \ + "Result should include file change information" + + +@pytest.mark.skipif(not IMPORTS_SUCCESSFUL, reason="Imports failed") +class TestGetPRTemplates: + """Test the get_pr_templates tool.""" + + @pytest.mark.asyncio + async def test_returns_json_string(self): + """Test that get_pr_templates returns a JSON string.""" + result = await get_pr_templates() + + assert isinstance(result, str), "Should return a string" + # Should be valid JSON + data = json.loads(result) + assert isinstance(data, list), "Should return a JSON array of templates" + + @pytest.mark.asyncio + async def test_returns_templates(self): + """Test that templates are returned.""" + result = await get_pr_templates() + templates = json.loads(result) + + assert len(templates) > 0, "Should return at least one 
template" + + # Check that templates have expected structure + for template in templates: + assert isinstance(template, dict), "Each template should be a dictionary" + # Should have some identifying information + assert any(key in template for key in ["filename", "name", "type", "id"]), \ + "Templates should have an identifier" + + +@pytest.mark.skipif(not IMPORTS_SUCCESSFUL, reason="Imports failed") +class TestSuggestTemplate: + """Test the suggest_template tool.""" + + @pytest.mark.asyncio + async def test_returns_json_string(self): + """Test that suggest_template returns a JSON string.""" + result = await suggest_template( + "Fixed a bug in the authentication system", + "bug" + ) + + assert isinstance(result, str), "Should return a string" + # Should be valid JSON + data = json.loads(result) + assert isinstance(data, dict), "Should return a JSON object" + + @pytest.mark.asyncio + async def test_suggestion_structure(self): + """Test that the suggestion has expected structure.""" + result = await suggest_template( + "Added new feature for user management", + "feature" + ) + suggestion = json.loads(result) + + # Check for some expected fields (flexible to allow different implementations) + assert any(key in suggestion for key in ["template", "recommended_template", "suggestion"]), \ + "Should include a template recommendation" + + +@pytest.mark.skipif(not IMPORTS_SUCCESSFUL, reason="Imports failed") +class TestToolRegistration: + """Test that tools are properly registered with FastMCP.""" + + def test_tools_have_decorators(self): + """Test that tool functions are decorated with @mcp.tool().""" + # In FastMCP, decorated functions should have certain attributes + # This is a basic check that functions exist and are callable + assert hasattr(analyze_file_changes, '__name__'), \ + "analyze_file_changes should be a proper function" + assert hasattr(get_pr_templates, '__name__'), \ + "get_pr_templates should be a proper function" + assert hasattr(suggest_template, 
'__name__'), \ + "suggest_template should be a proper function" + + +if __name__ == "__main__": + if not IMPORTS_SUCCESSFUL: + print(f"❌ Cannot run tests - imports failed: {IMPORT_ERROR}") + print("\nMake sure you've:") + print("1. Implemented all three tool functions") + print("2. Decorated them with @mcp.tool()") + print("3. Installed dependencies with: uv sync") + exit(1) + + # Run tests + pytest.main([__file__, "-v"]) \ No newline at end of file diff --git a/projects/unit3/slack-notification/starter/validate_starter.py b/projects/unit3/slack-notification/starter/validate_starter.py new file mode 100644 index 0000000..e288cc9 --- /dev/null +++ b/projects/unit3/slack-notification/starter/validate_starter.py @@ -0,0 +1,193 @@ +#!/usr/bin/env python3 +""" +Validation script for Module 1 starter code +Ensures the starter template is ready for learners to implement +""" + +import subprocess +import sys +import os +from pathlib import Path + +def test_project_structure(): + """Check that all required files exist.""" + print("Project Structure:") + required_files = [ + "server.py", + "pyproject.toml", + "README.md" + ] + + all_exist = True + for file in required_files: + if Path(file).exists(): + print(f" ✓ {file} exists") + else: + print(f" ✗ {file} missing") + all_exist = False + + return all_exist + +def test_imports(): + """Test that the starter code imports work.""" + try: + # Test importing the server module + import server + print("✓ server.py imports successfully") + + # Check that FastMCP is imported + if hasattr(server, 'mcp'): + print("✓ FastMCP server instance found") + else: + print("✗ FastMCP server instance not found") + return False + + return True + except ImportError as e: + print(f"✗ Import error: {e}") + print(" Please ensure you've installed dependencies: uv sync") + return False + +def test_todos(): + """Check that TODO comments exist for learners.""" + print("\nTODO Comments:") + + with open("server.py", "r") as f: + content = f.read() + + todos 
= []
+    for i, line in enumerate(content.split('\n'), 1):
+        if 'TODO' in line:
+            todos.append((i, line.strip()))
+
+    if todos:
+        print(f"✓ Found {len(todos)} TODO comments for learners:")
+        for line_no, todo in todos[:5]: # Show first 5
+            print(f" Line {line_no}: {todo[:60]}...")
+        if len(todos) > 5:
+            print(f" ... and {len(todos) - 5} more")
+        return True
+    else:
+        print("✗ No TODO comments found - learners need guidance!")
+        return False
+
+def test_starter_runs():
+    """Test that the starter code can at least be executed."""
+    print("\nExecution Test:")
+
+    try:
+        # Try to import and check if server can be initialized
+        import server
+        # If we can import it and it has the right attributes, it should run
+        if hasattr(server, 'mcp') and hasattr(server, 'send_slack_notification'):
+            print("✓ Server imports and initializes correctly")
+            return True
+        else:
+            print("✗ Server missing required components")
+            return False
+
+    except Exception as e:
+        print(f"✗ Failed to initialize server: {e}")
+        return False
+
+def test_dependencies():
+    """Check that pyproject.toml is properly configured."""
+    print("\nDependencies:")
+
+    try:
+        import tomllib
+    except ImportError:
+        import tomli as tomllib
+
+    try:
+        with open("pyproject.toml", "rb") as f:
+            config = tomllib.load(f)
+
+        # Check for required sections
+        if "project" in config and "dependencies" in config["project"]:
+            deps = config["project"]["dependencies"]
+            print(f"✓ Found {len(deps)} dependencies")
+            for dep in deps:
+                print(f" - {dep}")
+        else:
+            print("✗ No dependencies section found")
+            return False
+
+        return True
+    except Exception as e:
+        print(f"✗ Error reading pyproject.toml: {e}")
+        return False
+
+def test_no_implementation():
+    """Ensure starter code doesn't contain the solution."""
+    print("\nImplementation Check:")
+
+    with open("server.py", "r") as f:
+        content = f.read()
+
+    # Check that tool functions are not implemented
+    solution_indicators = [
+        "subprocess.run", # Git commands
+        "json.dumps", # Returning JSON
+        "git diff", # Git operations
+        "template", # Template logic
+    ]
+
+    found_implementations = []
+    for indicator in solution_indicators:
+        if indicator in content.lower():
+            found_implementations.append(indicator)
+
+    if found_implementations:
+        print(f"⚠️ Found possible solution code: {', '.join(found_implementations)}")
+        print(" Make sure these are only in comments/examples")
+        return True # Warning, not failure
+    else:
+        print("✓ No solution implementation found (good!)")
+        return True
+
+def main():
+    """Run all validation checks."""
+    print("Module 1 Starter Code Validation")
+    print("=" * 50)
+
+    # Change to starter directory if needed
+    if Path("validate_starter.py").exists():
+        os.chdir(Path("validate_starter.py").parent)
+
+    tests = [
+        ("Project Structure", test_project_structure),
+        ("Python Imports", test_imports),
+        ("TODO Comments", test_todos),
+        ("Starter Execution", test_starter_runs),
+        ("Dependencies", test_dependencies),
+        ("Clean Starter", test_no_implementation)
+    ]
+
+    results = []
+    for test_name, test_func in tests:
+        print(f"\n{test_name}:")
+        try:
+            results.append(test_func())
+        except Exception as e:
+            print(f"✗ Test failed with error: {e}")
+            results.append(False)
+
+    print("\n" + "=" * 50)
+    passed = sum(results)
+    total = len(results)
+    print(f"Checks passed: {passed}/{total}")
+
+    if passed == total:
+        print("\n✓ Starter code is ready for learners!")
+        print("\nLearners should:")
+        print("1. Run: uv sync")
+        print("2. Follow the TODO comments in server.py")
+        print("3. Test with: uv run pytest test_server.py")
+        print("4. Configure Claude Desktop when ready")
+    else:
+        print("\n✗ Some checks failed. Please review the starter code.")
+        sys.exit(1)
+
+if __name__ == "__main__":
+    main()
\ No newline at end of file
diff --git a/projects/unit3/slack-notification/starter/webhook_server.py b/projects/unit3/slack-notification/starter/webhook_server.py
new file mode 100644
index 0000000..64941d1
--- /dev/null
+++ b/projects/unit3/slack-notification/starter/webhook_server.py
@@ -0,0 +1,57 @@
+#!/usr/bin/env python3
+"""
+Simple webhook server for GitHub Actions events.
+Stores events in a JSON file that the MCP server can read.
+"""
+
+import json
+from datetime import datetime
+from pathlib import Path
+from aiohttp import web
+
+# File to store events
+EVENTS_FILE = Path(__file__).parent / "github_events.json"
+
+async def handle_webhook(request):
+    """Handle incoming GitHub webhook"""
+    try:
+        data = await request.json()
+
+        # Create event record
+        event = {
+            "timestamp": datetime.utcnow().isoformat(),
+            "event_type": request.headers.get("X-GitHub-Event", "unknown"),
+            "action": data.get("action"),
+            "workflow_run": data.get("workflow_run"),
+            "check_run": data.get("check_run"),
+            "repository": data.get("repository", {}).get("full_name"),
+            "sender": data.get("sender", {}).get("login")
+        }
+
+        # Load existing events
+        events = []
+        if EVENTS_FILE.exists():
+            with open(EVENTS_FILE, 'r') as f:
+                events = json.load(f)
+
+        # Add new event and keep last 100
+        events.append(event)
+        events = events[-100:]
+
+        # Save events
+        with open(EVENTS_FILE, 'w') as f:
+            json.dump(events, f, indent=2)
+
+        return web.json_response({"status": "received"})
+    except Exception as e:
+        return web.json_response({"error": str(e)}, status=400)
+
+# Create app and add route
+app = web.Application()
+app.router.add_post('/webhook/github', handle_webhook)
+
+if __name__ == '__main__':
+    print("🚀 Starting webhook server on http://localhost:8080")
+    print("📝 Events will be saved to:", EVENTS_FILE)
+    print("🔗 Webhook URL: http://localhost:8080/webhook/github")
+    web.run_app(app, host='localhost', port=8080)
\ No newline at end of file
diff --git a/projects/unit3/team-guidelines/coding-standards.md b/projects/unit3/team-guidelines/coding-standards.md
new file mode 100644
index 0000000..7462a6f
--- /dev/null
+++ b/projects/unit3/team-guidelines/coding-standards.md
@@ -0,0 +1,33 @@
+# Coding Standards
+
+## Python
+- Use type hints for all function arguments and return values
+- Follow PEP 8 style guide
+- Maximum line length: 100 characters
+- Use descriptive variable names
+- Prefer f-strings over .format() or % formatting
+
+## Git Commits
+- Use conventional commit format: type(scope): description
+- Types: feat, fix, docs, style, refactor, test, chore
+- Keep commit messages under 72 characters
+- Reference issue numbers when applicable (e.g., "Fixes #123")
+
+## Code Organization
+- One class per file for major components
+- Group related functionality into modules
+- Use __init__.py to control public API
+- Keep functions under 50 lines when possible
+
+## Testing
+- All new features must include tests
+- Maintain >80% test coverage
+- Use pytest for Python tests
+- Test edge cases and error conditions
+- Mock external dependencies
+
+## Documentation
+- All public functions need docstrings
+- Use Google-style docstrings
+- Include usage examples for complex functions
+- Keep README files up to date
\ No newline at end of file
diff --git a/projects/unit3/team-guidelines/pr-guidelines.md b/projects/unit3/team-guidelines/pr-guidelines.md
new file mode 100644
index 0000000..67c667a
--- /dev/null
+++ b/projects/unit3/team-guidelines/pr-guidelines.md
@@ -0,0 +1,36 @@
+# PR Guidelines
+
+## PR Size
+- Keep PRs under 500 lines of changes
+- Split large features into multiple PRs
+- One logical change per PR
+- Separate refactoring from feature changes
+
+## PR Description
+- Clearly explain what and why
+- Include screenshots for UI changes
+- List any breaking changes
+- Add testing instructions
+- Reference related issues/tickets
+
+## Review Process
+- At least one approval required
+- Address all review comments
+- Update PR description with changes made
+- Resolve conflicts before requesting review
+- Tag relevant team members
+
+## Before Merging
+- All CI checks must pass
+- Update documentation if needed
+- Verify no sensitive data is exposed
+- Squash commits if necessary
+- Delete feature branch after merge
+
+## PR Title Format
+- Use conventional commit format
+- Be specific and descriptive
+- Examples:
+  - "feat(auth): Add OAuth2 support"
+  - "fix(api): Handle null response in user endpoint"
+  - "docs: Update installation guide"
\ No newline at end of file
diff --git a/projects/unit3/templates/bug.md b/projects/unit3/templates/bug.md
new file mode 100644
index 0000000..0e8f267
--- /dev/null
+++ b/projects/unit3/templates/bug.md
@@ -0,0 +1,18 @@
+## Bug Fix
+
+### Description
+
+
+### Root Cause
+
+
+### Solution
+
+
+### Testing
+- [ ] Added/updated tests
+- [ ] Manually tested the fix
+- [ ] Verified no regressions
+
+### Related Issues
+Fixes #
diff --git a/projects/unit3/templates/docs.md b/projects/unit3/templates/docs.md
new file mode 100644
index 0000000..969104f
--- /dev/null
+++ b/projects/unit3/templates/docs.md
@@ -0,0 +1,13 @@
+## Documentation Update
+
+### Description
+
+
+### Changes
+
+
+### Review Checklist
+- [ ] Grammar and spelling checked
+- [ ] Technical accuracy verified
+- [ ] Examples tested
+- [ ] Links verified
diff --git a/projects/unit3/templates/feature.md b/projects/unit3/templates/feature.md
new file mode 100644
index 0000000..0c63a68
--- /dev/null
+++ b/projects/unit3/templates/feature.md
@@ -0,0 +1,22 @@
+## New Feature
+
+### Description
+
+
+### Motivation
+
+
+### Implementation
+
+
+### Testing
+- [ ] Added unit tests
+- [ ] Added integration tests
+- [ ] Tested edge cases
+
+### Documentation
+- [ ] Updated relevant documentation
+- [ ] Added usage examples
+
+### Breaking Changes
+
diff --git a/projects/unit3/templates/performance.md b/projects/unit3/templates/performance.md
new file mode 100644
index 0000000..3b7095d
--- /dev/null
+++ b/projects/unit3/templates/performance.md
@@ -0,0 +1,15 @@
+## Performance Improvement
+
+### Description
+
+
+### Metrics
+
+
+### Changes
+
+
+### Testing
+- [ ] Benchmarks added/updated
+- [ ] No functionality regression
+- [ ] Tested under various loads
diff --git a/projects/unit3/templates/refactor.md b/projects/unit3/templates/refactor.md
new file mode 100644
index 0000000..0e309f5
--- /dev/null
+++ b/projects/unit3/templates/refactor.md
@@ -0,0 +1,18 @@
+## Code Refactoring
+
+### Description
+
+
+### Motivation
+
+
+### Changes
+
+
+### Testing
+- [ ] All existing tests pass
+- [ ] No functional changes
+- [ ] Performance impact assessed
+
+### Risk Assessment
+
diff --git a/projects/unit3/templates/security.md b/projects/unit3/templates/security.md
new file mode 100644
index 0000000..c943a72
--- /dev/null
+++ b/projects/unit3/templates/security.md
@@ -0,0 +1,18 @@
+## Security Update
+
+### Description
+
+
+### Impact
+
+
+### Solution
+
+
+### Testing
+- [ ] Security tests added
+- [ ] Penetration testing performed
+- [ ] No new vulnerabilities introduced
+
+### References
+
diff --git a/projects/unit3/templates/test.md b/projects/unit3/templates/test.md
new file mode 100644
index 0000000..139478a
--- /dev/null
+++ b/projects/unit3/templates/test.md
@@ -0,0 +1,16 @@
+## Test Update
+
+### Description
+
+
+### Coverage
+- Previous coverage:
+- New coverage:
+
+### Test Types
+- [ ] Unit tests
+- [ ] Integration tests
+- [ ] End-to-end tests
+
+### Related Features
+
diff --git a/units/en/_toctree.yml b/units/en/_toctree.yml
index 4232bbf..600cc53 100644
--- a/units/en/_toctree.yml
+++ b/units/en/_toctree.yml
@@ -43,12 +43,16 @@
   - local: unit2/tiny-agents
     title: Building Tiny Agents with MCP and the Hugging Face Hub
-- title: "3. 
Advanced MCP Development: Custom Workflow Servers" sections: - local: unit3/introduction - title: Coming Soon + title: Building Custom Workflow Servers for Claude Code + - local: unit3/build-mcp-server + title: "Module 1: Build MCP Server" + - local: unit3/github-actions-integration + title: "Module 2: GitHub Actions Integration" + - local: unit3/slack-notification + title: "Module 3: Slack Notification" + - local: unit3/conclusion + title: "Unit 3 Conclusion" -- title: "Bonus Units" - sections: - - local: unit4/introduction - title: Coming Soon diff --git a/units/en/unit3/NEXT_STEPS.md b/units/en/unit3/NEXT_STEPS.md new file mode 100644 index 0000000..dcd1400 --- /dev/null +++ b/units/en/unit3/NEXT_STEPS.md @@ -0,0 +1,128 @@ +# Unit 3 Next Steps - Status Summary + +## Current Status (Module 1: Complete ✅) + +### What's Done +1. **Module 1: Build MCP Server** - FULLY COMPLETE + - ✅ FastMCP implementation with all three tools + - ✅ uv package management with pyproject.toml + - ✅ Complete starter code with TODOs + - ✅ Comprehensive test suite (unit tests + validation scripts) + - ✅ Documentation (README, manual testing guide) + - ✅ MDX file in module directory + - ✅ _toctree.yml updated with module reference + - ✅ All 7 PR templates created + +2. **Unit 3 Foundation** + - ✅ Updated introduction.mdx (removed production language, added MCP primitives) + - ✅ Solution walkthrough document created + - ✅ Implementation plans documented + - ✅ Directory structure for all 5 modules + +### Key Technology Decisions Made +- **Package Manager**: uv (with pyproject.toml) +- **MCP SDK**: FastMCP (not legacy Server API) +- **Python Version**: >=3.10 +- **Testing**: pytest with pytest-asyncio +- **Webhooks**: Cloudflare Tunnel (not ngrok) + +## Immediate Next Steps (Before Committing) + +1. **Git housekeeping**: + ```bash + git add units/en/unit3/implementation_plan_enhancements.md + git add units/en/unit3/solution_walkthrough.md + ``` + +2. 
**Verify Module 1 works**: + ```bash + cd units/en/unit3/module1/solution + uv sync --all-extras + uv run python validate_solution.py + ``` + +3. **Consider adding to _toctree.yml**: + - The solution_walkthrough.md (as a reference doc) + - Future module entries as they're completed + +## Module Implementation Roadmap + +### Module 2: GitHub Actions Integration (Prompts) +**Goal**: Add webhook handling and Prompts +- **Starter**: Copy Module 1 solution +- **Add**: Cloudflare Tunnel setup, webhook endpoint, prompt templates +- **MDX**: Create module2/introduction.mdx +- **_toctree**: Add reference when complete + +### Module 3: Hugging Face Hub Integration +**Goal**: LLM-specific workflows +- **Starter**: Copy Module 2 solution +- **Add**: Hub API integration, model card generation, dataset validation +- **MDX**: Create module3/introduction.mdx +- **_toctree**: Add reference when complete + +### Module 4: Slack Notification (All Primitives) +**Goal**: Complete integration +- **Starter**: Copy Module 3 solution +- **Add**: Slack webhooks, message formatting, full workflow +- **MDX**: Create module4/introduction.mdx +- **_toctree**: Add reference when complete + +## Testing Checklist for Each Module + +When implementing each module: +1. [ ] Create solution with full implementation +2. [ ] Create starter by removing implementation +3. [ ] Write unit tests +4. [ ] Create validation scripts +5. [ ] Test with Claude Desktop +6. [ ] Write module MDX file +7. [ ] Update _toctree.yml +8. [ ] Ensure progressive enhancement from previous module + +## Documentation Updates Needed + +1. **After all modules complete**: + - Update solution_walkthrough.md if needed + - Consider adding troubleshooting guide + - Create unit3/conclusion.mdx + +2. 
**For PR**: + - Ensure commit message follows format + - Update PR description with what's complete + - Note that only Module 1 is implemented + +## Commit Message Suggestion + +``` +feat(unit3): implement Module 1 with FastMCP and uv + +- Complete Module 1: Basic MCP server with PR template tools +- Migrate from legacy Server API to FastMCP +- Replace pip/requirements.txt with uv/pyproject.toml +- Add comprehensive test suite and validation scripts +- Update unit introduction to focus on learning (not production) +- Create module structure for remaining 4 modules + +Module 1 provides hands-on experience with MCP Tools, letting learners +build a PR agent that analyzes git changes and suggests templates. +``` + +## Quality Checklist ✅ + +- [x] All imports use current APIs (FastMCP, not legacy Server) +- [x] No hardcoded paths (all use relative imports) +- [x] Consistent error handling patterns +- [x] Tests are meaningful and pass +- [x] Documentation is clear and accurate +- [x] No "production-ready" claims in Unit 3 +- [x] Free tooling only (no paid services required) +- [x] Works with Claude Desktop + +## Notes for Reviewers + +1. **Only Module 1 is implemented** - Other modules are scaffolded but empty +2. **FastMCP is used throughout** - This is the modern API +3. **uv replaces pip** - Following MCP documentation recommendations +4. **Focus is on learning** - Production concerns deferred to Unit 4 +5. 
**Core MCP primitives covered** - Tools (M1), Prompts (M2), Integration (M3-4) \ No newline at end of file diff --git a/units/en/unit3/build-mcp-server-solution-walkthrough.mdx b/units/en/unit3/build-mcp-server-solution-walkthrough.mdx new file mode 100644 index 0000000..b0829cf --- /dev/null +++ b/units/en/unit3/build-mcp-server-solution-walkthrough.mdx @@ -0,0 +1,444 @@ +# Unit 3 Solution Walkthrough: Building a Pull Request Agent with MCP + +## Overview + +This walkthrough guides you through the complete solution for Unit 3's Pull Request Agent - an MCP server that helps developers create better pull requests by analyzing code changes, monitoring CI/CD pipelines, and automating team communications. The solution demonstrates all three MCP primitives (Tools, Resources, and Prompts) working together in a real-world workflow. + +## Architecture Overview + +The PR Agent consists of interconnected modules that progressively build a complete automation system: + +1. **Build MCP Server** - Basic server with Tools for PR template suggestions +2. **Smart File Analysis** - Enhanced analysis using Resources for project context +3. **GitHub Actions Integration** - CI/CD monitoring with standardized Prompts +4. **Hugging Face Hub Integration** - Model deployment and dataset PR workflows +5. **Slack Notification** - Team communication integrating all MCP primitives + +## Module 1: Build MCP Server + +### What We're Building +A minimal MCP server that analyzes file changes and suggests appropriate PR templates using MCP Tools. + +### Key Components + +#### 1. 
Server Initialization (`server.py`) +```python +# The server registers three essential tools: +# - analyze_file_changes: Returns structured data about changed files +# - get_pr_templates: Lists available templates with metadata +# - suggest_template: Provides intelligent template recommendations +``` + +The server uses the MCP SDK to expose these tools to Claude Code, allowing it to gather information and make intelligent decisions about which PR template to use. + +#### 2. File Analysis Tool +The `analyze_file_changes` tool examines the git diff to identify: +- File types and extensions +- Number of files changed +- Lines added/removed +- Common patterns (tests, configs, docs) + +This structured data enables Claude to understand the nature of the changes without hard-coding decision logic. + +#### 3. Template Management +Templates are stored as markdown files in the `templates/` directory: +- `bug.md` - For bug fixes +- `feature.md` - For new features +- `docs.md` - For documentation updates +- `refactor.md` - For code refactoring + +Each template includes placeholders that Claude can fill based on the analysis. + +### How Claude Uses These Tools + +1. Claude calls `analyze_file_changes` to understand what changed +2. Uses `get_pr_templates` to see available options +3. Calls `suggest_template` with the analysis data +4. Receives a recommendation with reasoning +5. Can customize the template based on specific changes + +### Learning Outcomes +- Understanding tool registration and schemas +- Letting Claude make decisions with structured data +- Separation of data gathering from decision logic + +## Module 2: Smart File Analysis + +### What We're Building +Enhanced file analysis using MCP Resources to provide project context and team guidelines. + +### Key Components + +#### 1. 
Resource Registration +The server exposes four types of resources: +```python +# Resources provide read-only access to: +# - file://templates/ - PR template files +# - file://project-context/ - Coding standards, conventions +# - git://recent-changes/ - Commit history and patterns +# - team://guidelines/ - Review processes and standards +``` + +#### 2. Project Context Resources +The `project-context/` directory contains: +- `coding-standards.md` - Language-specific conventions +- `review-guidelines.md` - What reviewers look for +- `architecture.md` - System design patterns +- `dependencies.md` - Third-party library policies + +Claude can read these to understand project-specific requirements. + +#### 3. Git History Analysis +The `git://recent-changes/` resource provides: +- Recent commit messages and patterns +- Common PR titles and descriptions +- Team member contribution patterns +- Historical template usage + +This helps Claude suggest templates consistent with team practices. + +### How Claude Uses Resources + +1. Reads `team://guidelines/review-process.md` to understand PR requirements +2. Accesses `file://project-context/coding-standards.md` for style guides +3. Analyzes `git://recent-changes/` to match team patterns +4. Combines this context with file analysis for better suggestions + +### Enhanced Decision Making +With resources, Claude can now: +- Suggest templates matching team conventions +- Include project-specific requirements in PRs +- Reference coding standards in descriptions +- Align with historical team practices + +### Learning Outcomes +- Resource URI design and schemas +- Making project knowledge accessible to AI +- Context-aware decision making +- Balancing automation with team standards + +## Module 3: GitHub Actions Integration + +### What We're Building +Real-time CI/CD monitoring using webhooks and standardized prompts for consistent team communication. + +### Key Components + +#### 1. 
Webhook Server +Uses Cloudflare Tunnel to receive GitHub Actions events: +```python +# Webhook endpoint handles: +# - workflow_run events +# - check_run events +# - pull_request status updates +# - deployment notifications +``` + +#### 2. Prompt Templates +Four standardized prompts ensure consistency: +- **"Analyze CI Results"** - Process test failures and build errors +- **"Generate Status Summary"** - Create human-readable status updates +- **"Create Follow-up Tasks"** - Suggest next steps based on results +- **"Draft Team Notification"** - Format updates for different audiences + +#### 3. Event Processing Pipeline +1. Receive webhook from GitHub +2. Parse event data and extract relevant information +3. Use appropriate prompt based on event type +4. Generate standardized response +5. Store for team notification + +### How Claude Uses Prompts + +Example prompt usage: +```python +# When tests fail, Claude uses the "Analyze CI Results" prompt: +prompt_data = { + "event_type": "workflow_run", + "status": "failure", + "failed_jobs": ["unit-tests", "lint"], + "error_logs": "...", + "pr_context": {...} +} + +# Claude generates: +# - Root cause analysis +# - Suggested fixes +# - Impact assessment +# - Next steps +``` + +### Standardized Workflows +Prompts ensure that regardless of who's working: +- CI failures are analyzed consistently +- Status updates follow team formats +- Follow-up actions align with processes +- Notifications contain required information + +### Learning Outcomes +- Webhook integration patterns +- Prompt engineering for consistency +- Event-driven architectures +- Standardizing team workflows + +## Module 4: Hugging Face Hub Integration + +### What We're Building +Integration with Hugging Face Hub for LLM and dataset PRs, adding specialized workflows for teams working with language models. + +### Key Components + +#### 1. 
Hub-Specific Tools +```python +# Tools for Hugging Face workflows: +# - analyze_model_changes: Detect LLM file modifications +# - validate_dataset_format: Check training data compliance +# - generate_model_card: Create/update model documentation +# - suggest_hub_template: PR templates for LLMs/datasets +``` + +#### 2. Hub Resources +```python +# Resources for Hub context: +# - hub://model-cards/ - LLM card templates and examples +# - hub://dataset-formats/ - Training data specifications +# - hub://community-standards/ - Hub community guidelines +# - hub://license-info/ - License compatibility checks +``` + +#### 3. LLM-Specific Prompts +```python +# Prompts for LLM workflows: +# - "Analyze Model Changes" - Understand LLM updates +# - "Generate Benchmark Summary" - Create evaluation metrics +# - "Check Dataset Quality" - Validate training data +# - "Draft Model Card Update" - Update documentation +``` + +### Hub-Specific Workflows + +When a PR modifies LLM files: +1. **Tool**: `analyze_model_changes` detects model architecture changes +2. **Resource**: Reads `hub://model-cards/llm-template.md` +3. **Prompt**: "Generate Benchmark Summary" creates evaluation section +4. **Tool**: `generate_model_card` updates documentation +5. **Resource**: Checks `hub://license-info/` for compatibility + +### Dataset PR Handling +For training data updates: +- Validates format consistency +- Checks data quality metrics +- Updates dataset cards +- Suggests appropriate reviewers + +### Learning Outcomes +- Hugging Face Hub API integration +- LLM-specific PR workflows +- Model and dataset documentation +- Community standards compliance + +## Module 5: Slack Notification + +### What We're Building +Automated team notifications combining Tools, Resources, and Prompts for complete workflow automation. + +### Key Components + +#### 1. 
Communication Tools +```python +# Three tools for team updates: +# - send_slack_message: Post to team channels +# - get_team_members: Identify who to notify +# - track_notification_status: Monitor delivery +``` + +#### 2. Team Resources +```python +# Resources for team data: +# - team://members/ - Developer profiles and preferences +# - slack://channels/ - Channel configurations +# - notification://templates/ - Message formats +``` + +#### 3. Notification Prompts +```python +# Prompts for communication: +# - "Format Team Update" - Style messages appropriately +# - "Choose Communication Channel" - Select right audience +# - "Escalate if Critical" - Handle urgent issues +``` + +### Integration Example + +When CI fails on a critical PR: +1. **Tool**: `get_team_members` identifies the PR author and reviewers +2. **Resource**: `team://members/{user}/preferences` checks notification settings +3. **Prompt**: "Format Team Update" creates appropriate message +4. **Tool**: `send_slack_message` delivers to right channel +5. **Resource**: `notification://templates/ci-failure` ensures consistent format +6. **Prompt**: "Escalate if Critical" determines if additional alerts needed + +### Intelligent Routing +The system considers: +- Team member availability (from calendar resources) +- Notification preferences (email vs Slack) +- Message urgency (based on PR labels) +- Time zones and working hours + +### Learning Outcomes +- Primitive integration patterns +- Complex workflow orchestration +- Balancing automation with human needs +- Production-ready error handling + +## Complete Workflow Example + +Here's how all components work together for a typical PR: + +1. **Developer creates PR** + - GitHub webhook triggers the server + - Tool: `analyze_file_changes` examines the diff + - Resource: Reads team guidelines and project context + - Prompt: Suggests optimal PR template + +2. 
**CI/CD Pipeline Runs** + - Webhook receives workflow events + - Prompt: "Analyze CI Results" processes outcomes + - Resource: Checks team escalation policies + - Tool: Updates PR status in GitHub + +3. **Hugging Face Hub Integration** + - Tool: Detects LLM/dataset changes + - Resource: Reads Hub guidelines + - Prompt: Generates model card updates + - Tool: Validates against Hub standards + +4. **Team Notification** + - Tool: Identifies relevant team members + - Resource: Reads notification preferences + - Prompt: Formats appropriate message + - Tool: Sends via Slack channels + +5. **Follow-up Actions** + - Prompt: "Create Follow-up Tasks" generates next steps + - Tool: Creates GitHub issues if needed + - Resource: Links to documentation + - All primitives work together seamlessly + +## Testing Strategy + +### Unit Tests +Each module includes comprehensive unit tests: +- Tool schema validation +- Resource URI parsing +- Prompt template rendering +- Integration scenarios + +### Integration Tests +End-to-end tests cover: +- Complete PR workflow +- Error recovery scenarios +- Performance under load +- Security validation + +### Test Structure +``` +tests/ +├── unit/ +│ ├── test_tools.py +│ ├── test_resources.py +│ ├── test_prompts.py +│ └── test_integration.py +├── integration/ +│ ├── test_workflow.py +│ ├── test_webhooks.py +│ └── test_notifications.py +└── fixtures/ + ├── sample_events.json + └── mock_responses.json +``` + +## Running the Solution + +### Local Development Setup +1. **Start the MCP server**: `python server.py` +2. **Configure Claude Code**: Add server to MCP settings +3. **Set up Cloudflare Tunnel**: `cloudflared tunnel --url http://localhost:3000` +4. **Configure webhooks**: Add tunnel URL to GitHub repository +5. 
**Test the workflow**: Create a PR and watch the automation + +### Configuration +Simple file-based configuration for easy setup: +- GitHub tokens in `.env` file +- Slack webhooks in config +- Template customization in `templates/` +- All settings in one place + +## Common Patterns and Best Practices + +### Tool Design +- Keep tools focused and single-purpose +- Return structured data for AI interpretation +- Include comprehensive error messages +- Version your tool schemas + +### Resource Organization +- Use clear URI hierarchies +- Implement resource discovery +- Cache frequently accessed resources +- Version control all resources + +### Prompt Engineering +- Make prompts specific but flexible +- Include context and examples +- Test with various inputs +- Maintain prompt libraries + +### Integration Patterns +- Use events for loose coupling +- Implement circuit breakers +- Add retries with backoff +- Monitor all external calls + +## Troubleshooting Guide + +### Common Issues + +1. **Webhook not receiving events** + - Check Cloudflare Tunnel is running + - Verify GitHub webhook configuration + - Confirm secret matches + +2. **Tools not appearing in Claude** + - Validate tool schemas + - Check server registration + - Review MCP connection + +3. **Resources not accessible** + - Verify file permissions + - Check URI formatting + - Confirm resource registration + +4. **Prompts producing inconsistent results** + - Review prompt templates + - Check context provided + - Validate input formatting + +## Next Steps and Extensions + +### Potential Enhancements +1. Add more code analysis tools (complexity, security) +2. Integrate with more communication platforms +3. Add custom workflow definitions +4. 
Implement PR auto-merge capabilities + +### Learning Path +- **Next**: Unit 4 - Deploy this server remotely +- **Advanced**: Custom MCP protocol extensions +- **Expert**: Multi-server orchestration + +## Conclusion + +This PR Agent demonstrates the power of MCP's three primitives working together. Tools provide capabilities, Resources offer context, and Prompts ensure consistency. Combined, they create an intelligent automation system that enhances developer productivity while maintaining team standards. + +The modular architecture ensures each component can be understood, tested, and extended independently, while the integration showcases real-world patterns you'll use in production MCP servers. \ No newline at end of file diff --git a/units/en/unit3/build-mcp-server.mdx b/units/en/unit3/build-mcp-server.mdx new file mode 100644 index 0000000..f5ad14b --- /dev/null +++ b/units/en/unit3/build-mcp-server.mdx @@ -0,0 +1,340 @@ +# Module 1: Build MCP Server + +## The PR Chaos at CodeCraft Studios + +It's your first week at CodeCraft Studios, and you're witnessing something that makes every developer cringe. The team's pull requests look like this: + +- "stuff" +- "more changes" +- "fix" +- "update things" + +Meanwhile, the code review backlog is growing because reviewers can't understand what changed or why. Sarah from the backend team spent 30 minutes trying to figure out what "various improvements" actually meant, while Mike from frontend had to dig through 47 files to understand a "small fix." + +The team knows they need better PR descriptions, but everyone's too busy shipping features to write detailed explanations. They need a solution that helps without slowing them down. + +**Your mission**: Build an intelligent PR Agent that analyzes code changes and suggests helpful descriptions automatically. 
+ +## What You'll Build + +In this first module, you'll create the foundation of CodeCraft Studios' automation system: an MCP server that transforms how the team writes pull requests. This module focuses on core MCP concepts that you'll build upon in Modules 2 and 3. + +## What You Will Learn + +In this foundational module, you'll master: +- **How to create a basic MCP server using FastMCP** - The building blocks for Modules 2 and 3 +- **Implementing MCP Tools for data retrieval and analysis** - The core primitive you'll use throughout Unit 3 +- **Letting Claude make intelligent decisions based on raw data** - A key principle for all MCP development +- **Testing and validating your MCP server** - Essential skills for building reliable tools + +## Overview + +Your PR Agent will solve CodeCraft Studios' problem using a key principle of MCP development: instead of hard-coding rigid rules about what makes a good PR, you'll provide Claude with raw git data and let it intelligently suggest appropriate descriptions. + +This approach works because: +- **Flexible analysis**: Claude can understand context that simple rules miss +- **Natural language**: Suggestions feel human, not robotic +- **Adaptable**: Works for any codebase or coding style + +You'll implement three essential tools that establish patterns for the entire automation system: + +1. **analyze_file_changes** - Retrieves git diff information and changed files (data collection) +2. **get_pr_templates** - Lists available PR templates (resource management) +3. 
**suggest_template** - Allows Claude to recommend the most appropriate template (intelligent decision-making) + +## Getting Started + +### Prerequisites + +- Python 3.10 or higher +- Git installed and a git repository to test with +- uv package manager ([installation guide](https://docs.astral.sh/uv/getting-started/installation/)) + +### Starter Code + +Clone the starter code repository: + +```bash +git clone https://github.com/huggingface/mcp-course.git +``` + +Navigate to the starter code directory: + +```bash +cd mcp-course/projects/unit3/build-mcp-server/starter +``` + +Install dependencies: + + + +You might want to create a virtual environment for this project: + +```bash +uv venv .venv +source .venv/bin/activate # On Windows use: .venv\Scripts\activate +``` + + +```bash +uv sync --all-extras +``` + +### Your Task + +This is your first hands-on MCP development experience! Open `server.py` and implement the three tools following the TODO comments. The starter code provides the basic structure - you need to: + +1. **Implement `analyze_file_changes`** to run git commands and return diff data + - ⚠️ **Important**: You'll likely hit a token limit error (25,000 tokens max per response) + - This is a real-world constraint that teaches proper output management + - See the "Handling Large Outputs" section below for the solution + - ⚠️ **Note**: Git commands will run in the MCP server's directory by default. See "Working Directory Considerations" below for details +2. **Implement `get_pr_templates`** to manage and return PR templates +3. **Implement `suggest_template`** to map change types to templates + +Don't worry about making everything perfect - you'll refine these skills as you progress through the unit. 
+ +### Design Philosophy + +Unlike traditional systems that categorize changes based on file extensions or rigid patterns, your implementation should: + +- Provide Claude with raw git data (diffs, file lists, statistics) +- Let Claude analyze the actual code changes +- Allow Claude to make intelligent template suggestions +- Keep the logic simple - Claude handles the complexity + + + +**MCP Philosophy**: Instead of building complex logic into your tools, provide Claude with rich data and let its intelligence make the decisions. This makes your code simpler and more flexible than traditional rule-based systems. + + + +## Testing Your Implementation + +### 1. Validate Your Code + +Run the validation script to check your implementation: + +```bash +uv run python validate_starter.py +``` + +### 2. Run Unit Tests + +Test your implementation with the provided test suite: + +```bash +uv run pytest test_server.py -v +``` + +### 3. Test with Claude Code + +Configure your server directly in Claude Code: + +```bash +# Add the MCP server to Claude Code +claude mcp add pr-agent -- uv --directory /absolute/path/to/starter run server.py + +# Verify the server is configured +claude mcp list +``` + +Then: +1. Make some changes in a git repository +2. Ask Claude: "Can you analyze my changes and suggest a PR template?" +3. Watch Claude use your tools to provide intelligent suggestions + + +**Common first error**: If you get "MCP tool response exceeds maximum allowed tokens (25000)", this is expected! Large repositories can generate massive diffs. This is a valuable learning moment - see the "Handling Large Outputs" section for the solution. + + +## Common Patterns + +### Tool Implementation Pattern + +```python +@mcp.tool() +async def tool_name(param1: str, param2: bool = True) -> str: + """Tool description for Claude. 
+ + Args: + param1: Description of parameter + param2: Optional parameter with default + """ + # Your implementation + result = {"key": "value"} + return json.dumps(result) +``` + +### Error Handling + +Always handle potential errors gracefully: + +```python +try: + result = subprocess.run(["git", "diff"], capture_output=True, text=True) + return json.dumps({"output": result.stdout}) +except Exception as e: + return json.dumps({"error": str(e)}) +``` + + +**Error Handling**: Always return valid JSON from your tools, even for errors. Claude needs structured data to understand what went wrong and provide helpful responses to users. + + +### Handling Large Outputs (Critical Learning Moment!) + + +**Real-world constraint**: MCP tools have a token limit of 25,000 tokens per response. Large git diffs can easily exceed this limit 10x or more! This is a critical lesson for production MCP development. + + +When implementing `analyze_file_changes`, you'll likely encounter this error: +``` +Error: MCP tool response (262521 tokens) exceeds maximum allowed tokens (25000) +``` + +**Why this happens:** +- A single file change can be thousands of lines +- Enterprise repositories often have massive refactorings +- Git diffs include full context by default +- JSON encoding adds overhead + +This teaches us an important principle: **Always design tools with output limits in mind**. Here's the solution: + +```python +@mcp.tool() +async def analyze_file_changes(base_branch: str = "main", + include_diff: bool = True, + max_diff_lines: int = 500) -> str: + """Analyze file changes with smart output limiting. 
+ + Args: + base_branch: Branch to compare against + include_diff: Whether to include the actual diff + max_diff_lines: Maximum diff lines to include (default 500) + """ + try: + # Get the diff + result = subprocess.run( + ["git", "diff", f"{base_branch}...HEAD"], + capture_output=True, + text=True + ) + + diff_output = result.stdout + diff_lines = diff_output.split('\n') + + # Smart truncation if needed + if len(diff_lines) > max_diff_lines: + truncated_diff = '\n'.join(diff_lines[:max_diff_lines]) + truncated_diff += f"\n\n... Output truncated. Showing {max_diff_lines} of {len(diff_lines)} lines ..." + diff_output = truncated_diff + + # Get summary statistics + stats_result = subprocess.run( + ["git", "diff", "--stat", f"{base_branch}...HEAD"], + capture_output=True, + text=True + ) + + # Get the list of changed files + files_result = subprocess.run( + ["git", "diff", "--name-only", f"{base_branch}...HEAD"], + capture_output=True, + text=True + ) + + return json.dumps({ + "stats": stats_result.stdout, + "total_lines": len(diff_lines), + "diff": diff_output if include_diff else "Use include_diff=true to see diff", + "files_changed": files_result.stdout.strip().split('\n') if files_result.stdout.strip() else [] + }) + + except Exception as e: + return json.dumps({"error": str(e)}) +``` + +**Best practices for large outputs:** +1. **Implement pagination**: Break large results into pages +2. **Add filtering options**: Let users request specific files or directories +3. **Provide summaries first**: Return statistics before full content +4. **Use progressive disclosure**: Start with high-level info, allow drilling down +5. **Set sensible defaults**: Default to reasonable limits that work for most cases + +## Working Directory Considerations + +By default, MCP servers run commands in their installation directory, not in Claude's current working directory. This means your git commands might analyze the wrong repository! + +To solve this, MCP provides [roots](https://modelcontextprotocol.io/docs/concepts/roots) - a way for clients to inform servers about relevant directories. Claude Code automatically provides its working directory as a root. 
+ +Here's how to access it in your tool: + +```python +@mcp.tool() +async def analyze_file_changes(...): + # Get Claude's working directory from roots + context = mcp.get_context() + roots_result = await context.session.list_roots() + + # Extract the path from the FileUrl object + working_dir = roots_result.roots[0].uri.path + + # Use it for all git commands + result = subprocess.run( + ["git", "diff", "--name-status"], + capture_output=True, + text=True, + cwd=working_dir # Run in Claude's directory! + ) +``` + +This ensures your tools operate on the repository Claude is actually working with, not the MCP server's installation location. + +## Troubleshooting + +- **Import errors**: Ensure you've run `uv sync` +- **Git errors**: Make sure you're in a git repository +- **No output**: MCP servers communicate via stdio - test with Claude Desktop +- **JSON errors**: All tools must return valid JSON strings +- **Token limit exceeded**: This is expected with large diffs! Implement output limiting as shown above +- **"Response too large" errors**: Add `max_diff_lines` parameter or set `include_diff=false` +- **Git commands run in wrong directory**: MCP servers run in their installation directory by default, not Claude's working directory. To fix this, use [MCP roots](https://modelcontextprotocol.io/docs/concepts/roots) to access Claude's current directory: + ```python + # Get Claude's working directory from roots + context = mcp.get_context() + roots_result = await context.session.list_roots() + working_dir = roots_result.roots[0].uri.path # FileUrl object has .path property + + # Use it in subprocess calls + subprocess.run(["git", "diff"], cwd=working_dir) + ``` + Claude Code automatically provides its working directory as a root, allowing your MCP server to operate in the correct location. + +## Next Steps + +Congratulations! You've built your first MCP server with Tools - the foundation for everything that follows in Unit 3. 
+ +### What you've accomplished in Module 1: +- **Created MCP Tools** that provide Claude with structured data +- **Implemented the core MCP philosophy** - let Claude make intelligent decisions from raw data +- **Built a practical PR Agent** that can analyze code changes and suggest templates +- **Learned about real-world constraints** - the 25,000 token limit and how to handle it +- **Established testing patterns** with validation scripts and unit tests + +### Key patterns you can reuse: +- **Data collection tools** that gather information from external sources +- **Intelligent analysis** where Claude processes raw data to make decisions +- **Output management** - truncating large responses while preserving usefulness +- **Error handling** that returns structured JSON responses +- **Testing strategies** for MCP server development + +### What to do next: +1. **Review the solution** in `/projects/unit3/build-mcp-server/solution/` to see different implementation approaches +2. **Compare your implementation** with the provided solution - there's no single "right" way to solve the problem +3. **Test your tools thoroughly** - try them with different types of code changes to see how Claude adapts +4. **Move on to Module 2** where you'll add real-time webhook capabilities and learn about MCP Prompts for workflow standardization + +Module 2 will build directly on the server you created here, adding dynamic event handling to complement your static file analysis tools! + +### The story continues... +With your PR Agent working, CodeCraft Studios developers are already writing better pull requests. But next week, you'll face a new challenge: critical CI/CD failures are slipping through unnoticed. Module 2 will add real-time monitoring to catch these issues before they reach production. 
+ +## Additional Resources + +- [MCP Documentation](https://modelcontextprotocol.io/) +- [FastMCP Guide](https://modelcontextprotocol.io/quickstart/server) +- Solution walkthrough: `unit3/build-mcp-server-solution-walkthrough.md` \ No newline at end of file diff --git a/units/en/unit3/conclusion.mdx b/units/en/unit3/conclusion.mdx new file mode 100644 index 0000000..16654a4 --- /dev/null +++ b/units/en/unit3/conclusion.mdx @@ -0,0 +1,147 @@ +# Unit 3 Conclusion: The CodeCraft Studios Transformation + +## Mission Accomplished! + +Congratulations! You've successfully transformed CodeCraft Studios from a chaotic startup into a well-oiled development machine. Let's see how far you've come: + +### Before Your Automation System: +- ❌ PRs with descriptions like "stuff" and "fix" +- ❌ Critical bugs reaching production undetected +- ❌ Teams working in silos, duplicating effort +- ❌ Weekend debugging sessions for already-fixed issues + +### After Your Automation System: +- ✅ Clear, helpful PR descriptions that save reviewers time +- ✅ Real-time CI/CD monitoring that catches failures immediately +- ✅ Smart team notifications that keep everyone informed +- ✅ Developers focused on building features, not fighting process problems + +The CodeCraft Studios team now has a complete automation system that demonstrates what's possible when you combine MCP's flexibility with Claude's intelligence. 
+ +## How You Solved Each Challenge + +Your three-module journey tackled real problems that every development team faces: + +### Module 1: Solved the PR Chaos +*"Help developers write better pull requests without slowing them down"* +- **PR Agent** with intelligent file analysis +- **Core MCP concepts**: Tools, data collection, and Claude integration +- **Design philosophy**: Provide raw data, let Claude make intelligent decisions +- **Result**: Clear PR descriptions that help reviewers understand changes + +### Module 2: Caught the Silent Failures +*"Never let another critical bug slip through unnoticed"* +- **Webhook server** for capturing GitHub Actions events +- **MCP Prompts** for standardized workflow guidance +- **Event storage system** using simple JSON files +- **Result**: Real-time CI/CD monitoring that prevents production issues + +### Module 3: Bridged the Communication Gap +*"Keep the whole team informed about what's happening"* +- **Slack integration** for team notifications +- **Message formatting** using Claude's intelligence +- **Tools + Prompts combination** for powerful automation +- **Result**: Smart notifications that eliminate information silos + +## Key MCP Concepts You've Learned + +### MCP Primitives +- **Tools**: For data access and external API calls +- **Prompts**: For consistent workflow guidance and formatting +- **Integration patterns**: How Tools and Prompts work together + +### Architecture Patterns +- **Separation of concerns**: MCP server vs webhook server +- **File-based event storage**: Simple, reliable, testable +- **Claude as the intelligence layer**: Making decisions from raw data + +### Development Best Practices +- **Error handling**: Returning structured JSON even for failures +- **Security**: Environment variables for sensitive credentials +- **Testing**: Validation scripts and manual testing workflows + +## Real-World Applications + +The patterns you've learned can be applied to many automation scenarios: + + + +**Beyond 
CI/CD**: The Tools + Prompts pattern works for customer support automation, content moderation, data analysis workflows, and any scenario where you need intelligent processing of external data. + + + +### Common Patterns from Unit 3 +1. **Data Collection** → Tools that gather information +2. **Intelligent Analysis** → Claude processes the data +3. **Formatted Output** → Prompts guide consistent presentation +4. **External Integration** → Tools interact with APIs and services + +## Next Steps + +### Immediate Actions +1. **Experiment** with your workflow automation - try different GitHub events +2. **Extend** the system with additional integrations (Discord, email, etc.) +3. **Share** your MCP server with teammates for real project use + +### Advanced Exploration +- **Scale up**: Handle multiple repositories or teams +- **Add persistence**: Use databases for larger event volumes +- **Create dashboards**: Build web interfaces for your automation +- **Explore other MCP clients**: Beyond Claude Code and Claude Desktop + +### Community Involvement +- **Contribute** to the MCP ecosystem with your own servers +- **Share patterns** you discover with the community +- **Build on** existing MCP servers and extend their capabilities + +## Key Takeaways + + + +**MCP Philosophy**: The most effective MCP servers don't try to be smart - they provide Claude with rich, structured data and let Claude's intelligence do the heavy lifting. This makes your code simpler and more flexible. 
+ + + +### Technical Insights +- **Simple is powerful**: JSON file storage can handle many use cases +- **Claude as orchestrator**: Let Claude coordinate between your tools +- **Prompts for consistency**: Use prompts to ensure reliable output formats + +### Development Insights +- **Start small**: Build one tool at a time, test thoroughly +- **Think in workflows**: Design tools that work well together +- **Plan for humans**: Your automation should help teams, not replace them + +## Resources for Continued Learning + +### MCP Documentation +- [Official MCP Protocol](https://modelcontextprotocol.io/) +- [Python SDK Reference](https://github.com/modelcontextprotocol/python-sdk) +- [FastMCP Framework](https://gofastmcp.com/) + +### Community Resources +- [MCP Server Directory](https://modelcontextprotocol.io/servers) +- [Example Implementations](https://github.com/modelcontextprotocol) +- [Community Discord](https://discord.gg/modelcontextprotocol) + +--- + +## The CodeCraft Studios Success Story + +Three weeks ago, CodeCraft Studios was struggling with: +- Unclear pull requests causing review delays +- Critical bugs slipping into production +- Teams working in isolation and duplicating effort + +Today, they have an intelligent automation system that: +- **Helps developers** write clear, helpful PR descriptions automatically +- **Monitors CI/CD pipelines** and alerts the team to issues immediately +- **Keeps everyone informed** with smart, contextual team notifications + +You've built more than just an MCP server - you've created a solution that transforms how development teams work together. + +## Your MCP Journey Continues + +The patterns you learned at CodeCraft Studios can solve countless other automation challenges. Whether you're building customer service tools, data analysis pipelines, or any system that needs intelligent processing, you now have the foundation to create powerful, adaptive solutions with MCP. 
+ +The future of intelligent automation is in your hands. What will you build next? 🚀 \ No newline at end of file diff --git a/units/en/unit3/github-actions-integration.mdx b/units/en/unit3/github-actions-integration.mdx new file mode 100644 index 0000000..673064a --- /dev/null +++ b/units/en/unit3/github-actions-integration.mdx @@ -0,0 +1,250 @@ +# Module 2: GitHub Actions Integration + +## The Silent Failures Strike + +Week 2 at CodeCraft Studios. Your PR Agent from Module 1 is already helping developers write better pull requests - Sarah's latest PR had a clear description that saved Mike 20 minutes of investigation time. The team is thrilled! + +But then disaster strikes. + +A critical bug reaches production on Friday afternoon. The payment system is down, customers are complaining, and the team scrambles to investigate. After two stressful hours, they discover the root cause: a test failure in Tuesday's CI run that nobody noticed. + +"How did we miss this?" asks the team lead, scrolling through GitHub Actions. "The tests clearly failed, but with 47 repositories and dozens of daily commits, who has time to check every build?" + +The team realizes they need real-time visibility into their CI/CD pipeline, but manually checking GitHub Actions across all their projects isn't scalable. They need automation that watches for problems and alerts them immediately. + +**Your mission**: Extend your MCP server with webhook capabilities to monitor GitHub Actions and never let another failure slip through unnoticed. + +## What You'll Build + +This module bridges the gap between static file analysis (Module 1) and dynamic team notifications (Module 3). You'll add real-time capabilities that transform your PR Agent into a comprehensive development monitoring system. 
+ +Building on the foundation you created in Module 1, you'll add: +- **Webhook server** to receive GitHub Actions events +- **New tools** for monitoring CI/CD status +- **MCP Prompts** that provide consistent workflow patterns +- **Real-time integration** with your GitHub repository + +## Learning Objectives + +By the end of this module, you'll understand: +1. How to run a webhook server alongside an MCP server +2. How to receive and process GitHub webhooks +3. How to create MCP Prompts for standardized workflows +4. How to use Cloudflare Tunnel for local webhook testing + +## Prerequisites + +You'll build directly on your work from Module 1, so make sure you have: +- **Completed Module 1: Build MCP Server** - You'll be extending that same codebase +- **Basic understanding of GitHub Actions** - You should know what CI/CD workflows are +- **A GitHub repository with Actions enabled** - Even a simple workflow file works fine +- **Cloudflare Tunnel (cloudflared) installed** - This will expose your local webhook server to GitHub + +## Key Concepts + +### MCP Prompts + +Prompts are reusable templates that guide Claude through complex workflows. Unlike Tools (which Claude calls automatically), Prompts are user-initiated and provide structured guidance. + +Example use cases: +- Analyzing CI/CD results consistently +- Creating standardized deployment summaries +- Troubleshooting failures systematically + +### Webhook Integration + +Your setup will run two services: +1. The MCP server (communicates with Claude) +2. A webhook server on port 8080 (receives GitHub events) + +This allows Claude to react to real-time CI/CD events! + + + +**Architecture Insight**: Running separate services for MCP communication and webhook handling is a clean separation of concerns. The webhook server handles HTTP complexity while your MCP server focuses on data analysis and Claude integration. 
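The starter code ships a complete `webhook_server.py`, so you won't write the receiver yourself. Still, as a rough standard-library sketch (the real implementation may differ), the second service boils down to "accept a POST, append it to a JSON log":

```python
import json
from datetime import datetime, timezone
from http.server import BaseHTTPRequestHandler, HTTPServer
from pathlib import Path

EVENTS_FILE = Path("github_events.json")  # the log your MCP tools will read

def record_event(event_type: str, payload: dict) -> None:
    """Append one webhook event, with a timestamp, to the shared JSON log."""
    events = json.loads(EVENTS_FILE.read_text()) if EVENTS_FILE.exists() else []
    events.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "event_type": event_type,
        "payload": payload,
    })
    EVENTS_FILE.write_text(json.dumps(events, indent=2))

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # GitHub sends the event name in a header and the details as a JSON body
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        record_event(self.headers.get("X-GitHub-Event", "unknown"), payload)
        self.send_response(200)
        self.end_headers()

# To run the receiver: HTTPServer(("", 8080), WebhookHandler).serve_forever()
```

Because the receiver only appends to a file, the MCP server never touches HTTP at all — it just reads `github_events.json`.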
+ + + +## Project Structure + +``` +github-actions-integration/ +├── starter/ # Your starting point +│ ├── server.py # Module 1 code + TODOs +│ ├── pyproject.toml +│ └── README.md +└── solution/ # Complete implementation + ├── server.py # Full webhook + prompts + ├── pyproject.toml + └── README.md +``` + +## Implementation Steps + +### Step 1: Set Up and Run Webhook Server + +Unlike Module 1 where you worked with existing files, this module introduces real-time event handling. The starter code includes: +- **Your Module 1 implementation** - All your existing PR analysis tools +- **A complete webhook server** (`webhook_server.py`) - Ready to receive GitHub events + +1. Install dependencies (same as Module 1): + ```bash + uv sync + ``` + +2. Start the webhook server (in a separate terminal): + ```bash + python webhook_server.py + ``` + +This server will receive GitHub webhooks and store them in `github_events.json`. + +**How webhook event storage works:** +- Each incoming GitHub webhook (push, pull request, workflow completion, etc.) is appended to the JSON file +- Events are stored with timestamps, making it easy to find recent activity +- The file acts as a simple event log that your MCP tools can read and analyze +- No database required - everything is stored in a simple, readable JSON format + +### Step 2: Connect to Event Storage + +Now you'll connect your MCP server (from Module 1) to the webhook data. This is much simpler than handling HTTP requests directly - the webhook server does all the heavy lifting and stores events in a JSON file. + +Add the path to read webhook events: + +```python +# File where webhook server stores events +EVENTS_FILE = Path(__file__).parent / "github_events.json" +``` + +The webhook server handles all the HTTP details - you just need to read the JSON file! This separation of concerns keeps your MCP server focused on what it does best. + + + +**Development Tip**: Working with files instead of HTTP requests makes testing much easier. 
You can manually add events to `github_events.json` to test your tools without setting up webhooks. + + + +### Step 3: Add GitHub Actions Tools + +Just like in Module 1 where you created tools for file analysis, you'll now create tools for CI/CD analysis. These tools will work alongside your existing PR analysis tools, giving Claude a complete view of both code changes and build status. + + +**Note**: The starter code already includes the output limiting fix from Module 1, so you won't encounter token limit errors. Focus on the new concepts in this module! + + +Implement two new tools: + +1. **`get_recent_actions_events`**: + - Read from `EVENTS_FILE` + - Return the most recent events (up to limit) + - Return empty list if file doesn't exist + +2. **`get_workflow_status`**: + - Read all events from file + - Filter for workflow_run events + - Group by workflow name and show latest status + +These tools let Claude analyze your CI/CD pipeline. + +### Step 4: Create MCP Prompts + +Now you'll add your first MCP Prompts! Unlike Tools (which Claude calls automatically), Prompts are templates that help users interact with Claude consistently. Think of them as "conversation starters" that guide Claude through complex workflows. + +While Module 1 focused on Tools for data access, this module introduces Prompts for workflow guidance. + +Implement four prompts that demonstrate different workflow patterns: + +1. **`analyze_ci_results`**: Comprehensive CI/CD analysis +2. **`create_deployment_summary`**: Team-friendly updates +3. **`generate_pr_status_report`**: Combined code + CI report +4. **`troubleshoot_workflow_failure`**: Systematic debugging + +Each prompt should return a string with clear instructions for Claude to follow. + +### Step 5: Test with Cloudflare Tunnel + +Now for the exciting part - testing your expanded MCP server with real GitHub events! You'll run multiple services together, just like in a real development environment. + +1. 
Start your MCP server (same command as Module 1): + ```bash + uv run server.py + ``` + +2. In another terminal, start Cloudflare Tunnel: + ```bash + cloudflared tunnel --url http://localhost:8080 + ``` + +3. Configure the GitHub webhook with the tunnel URL + +4. Test with Claude Code using the prompts + +## Exercises + +### Exercise 1: Custom Workflow Prompt +Create a new prompt that helps with PR reviews by combining: +- Code changes from Module 1 tools +- CI/CD status from Module 2 tools +- A checklist format for reviewers + +### Exercise 2: Event Filtering +Enhance `get_workflow_status` to: +- Filter by workflow conclusion (success/failure) +- Group by repository +- Show time since last run + +### Exercise 3: Notification System +Add a tool that: +- Tracks which events have been "seen" +- Highlights new failures +- Suggests which team member to notify + +## Common Issues + +### Webhook Not Receiving Events +- Ensure Cloudflare Tunnel is running +- Check GitHub webhook settings (should show recent deliveries) +- Verify the payload URL includes `/webhook/github` + +### Prompt Not Working +- FastMCP prompts simply return strings +- Make sure your function is decorated with `@mcp.prompt()` + +### Webhook Server Issues +- Ensure webhook_server.py is running in a separate terminal +- Check that port 8080 is free: `lsof -i :8080` +- The events file will be created automatically when the first event is received + +## Next Steps + +Excellent work! You've successfully added real-time capabilities to your MCP server. 
You now have a system that can: + +- **Analyze code changes** (from Module 1) +- **Monitor CI/CD events in real-time** (from this module) +- **Use MCP Prompts** to provide consistent workflow guidance +- **Handle webhook events** through a clean file-based architecture + +### Key achievements in Module 2: +- Built your first webhook integration +- Learned MCP Prompts for workflow standardization +- Created tools that work with real-time data +- Established patterns for event-driven automation + +### What to do next: +1. **Review the solution** in `/projects/unit3/github-actions-integration/solution/` to see different implementation approaches +2. **Experiment with your prompts** - try using them for different types of GitHub events +3. **Test the integration** - combine your Module 1 file analysis tools with Module 2 event monitoring in a single conversation with Claude +4. **Move on to Module 3** - where you'll complete the automation pipeline by adding team notifications through Slack integration + +Module 3 will bring everything together into a complete workflow that your team can actually use! + +### The story continues... +Your monitoring system is working! CodeCraft Studios now catches CI/CD failures in real-time, and the team feels much more confident about their deployments. But next week brings a new challenge: information silos are causing duplicate work and missed opportunities. Module 3 will complete the automation system with intelligent team notifications that keep everyone in the loop. 
+ +## Additional Resources + +- [MCP Prompts Documentation](https://modelcontextprotocol.io/docs/concepts/prompts) +- [GitHub Webhooks Guide](https://docs.github.com/en/developers/webhooks-and-events) +- [Cloudflare Tunnel Documentation](https://developers.cloudflare.com/cloudflare-one/connections/connect-apps) \ No newline at end of file diff --git a/units/en/unit3/implementation_plan.md b/units/en/unit3/implementation_plan.md new file mode 100644 index 0000000..80045d1 --- /dev/null +++ b/units/en/unit3/implementation_plan.md @@ -0,0 +1,159 @@ +# Unit 3 Implementation Plan Enhancements: Complete MCP Primitives Coverage + +## Overview + +This document enhances the existing [implementation_plan.md](./implementation_plan.md) to ensure comprehensive coverage of the core MCP primitives (Tools and Prompts) while maintaining the solid foundation and timeline already established. + +## MCP Primitives Integration Strategy + +Rather than redesigning the entire unit, we'll enhance each existing module to naturally incorporate MCP primitives without changing the core learning goals or structure. + +## Enhanced Module Structure + +### Module 1: Basic Workflow Server (30 min) +**Existing Goal**: "I want Claude Code to help me create better PRs" +**+ MCP Primitive**: **Tools** + +**Current Plan**: +- Minimal MCP server with PR template suggestion +- Simple rule-based template selection (file extension → template type) + +**Enhancements**: +- **Tool**: `analyze_file_changes` - Returns structured data about changed files for Claude to analyze +- **Tool**: `get_pr_templates` - Lists available PR templates with metadata +- **Tool**: `suggest_template` - Provides template recommendation based on file analysis + +**What Claude Does**: Uses tools to gather file data, then intelligently decides which template to recommend and explains why. + +**Learning Focus**: Tool registration, schema definition, and letting Claude make smart decisions with structured data. 
+ +--- + +### Module 2: GitHub Actions Integration (45 min) +**Existing Goal**: "Tell me when my tests pass/fail" +**+ MCP Primitive**: **Prompts** + +**Current Plan**: +- Local webhook receiver using Cloudflare Tunnel +- GitHub Actions event parsing and real-time CI/CD status + +**Enhancements**: +- **Prompt**: "Analyze CI Results" - Standardized prompt for processing GitHub Actions outcomes +- **Prompt**: "Generate Status Summary" - Consistent format for CI/CD status updates +- **Prompt**: "Create Follow-up Tasks" - Generate next steps based on CI results +- **Prompt**: "Draft Team Notification" - Standardized team communication about CI events + +**What Claude Does**: Uses prompts to consistently analyze CI results and generate standardized team communications. + +**Learning Focus**: Prompt templates, workflow consistency, and reusable team processes. + +--- + +### Module 3: Team Communication (45 min) +**Existing Goal**: "Update my team automatically" +**+ Integration**: **All Three Primitives Working Together** + +**Current Plan**: +- Slack webhook integration for notifications +- Smart message formatting based on CI results + +**Enhancements**: +- **Tools**: `send_slack_message`, `get_team_members`, `track_notification_status` +- **Resources**: `team://members/`, `slack://channels/`, `notification://templates/` +- **Prompts**: "Format Team Update", "Choose Communication Channel", "Escalate if Critical" + +**What Claude Does**: Combines tools (Slack API), resources (team data), and prompts (message formatting) for complete workflow automation. + +**Learning Focus**: Primitive integration, workflow orchestration, and production patterns. 
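To make the prompt side concrete: an MCP prompt is essentially reusable instruction text. A hypothetical "Format Team Update" template could be generated like this — a sketch under our own naming, not code from the module:

```python
def format_team_update_prompt(event: dict) -> str:
    """Hypothetical 'Format Team Update' prompt body.

    The returned text is instructions for Claude, not the final Slack
    message; consistency comes from everyone reusing the same template.
    """
    return (
        "Format a short Slack update for this CI event.\n"
        f"Repository: {event.get('repository', 'unknown')}\n"
        f"Conclusion: {event.get('conclusion', 'unknown')}\n"
        "Rules: use Slack markdown, keep it under three sentences, "
        "and end with a clear next step for the team."
    )

print(format_team_update_prompt({"repository": "acme/app", "conclusion": "failure"}))
```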
+ +--- + +### Module 4: Polish & Integration (30 min) +**Existing Goal**: "Make it production-ready (locally)" +**+ Orchestration**: **Complete Workflow Demonstration** + +**Current Plan**: +- Error handling and logging +- Configuration management +- Complete Claude Code workflow demonstration + +**Enhancements**: +- **Showcase**: End-to-end workflow using all primitives +- **Demo**: "Create PR → Analyze Changes → Monitor CI → Notify Team" +- **Testing**: Validate all primitives work together seamlessly +- **Documentation**: How each primitive contributes to the workflow + +**What Claude Does**: Demonstrates the complete team workflow automation with intelligent decision-making at each step. + +**Learning Focus**: System integration, error handling, and preparing for Unit 4 deployment. + +## Primitive Distribution + +| Module | Primary Primitive | Secondary | Learning Outcome | +|--------|------------------|-----------|------------------| +| 1 | **Tools** | - | Claude can call functions to get structured data | +| 2 | **Prompts** | Tools | Claude can follow standardized workflows consistently | +| 3 | **Integration** | Tools & Prompts | All primitives work together for complex automation | +| 4 | **Orchestration** | Tools & Prompts | Production-ready workflow with proper error handling | + +## Implementation Benefits + +### Maintains Existing Strengths +- ✅ **Same timeline** - 3 hours total, module breakdown unchanged +- ✅ **Same learning goals** - Each module still has clear, practical objectives +- ✅ **Same progression** - Local toy → production pipeline approach +- ✅ **Same technology choices** - Cloudflare Tunnel, GitHub Actions, etc. 
+ +### Adds MCP Depth +- ✅ **Complete coverage** - Core MCP primitives (Tools and Prompts) with real examples +- ✅ **Natural integration** - Primitives enhance existing modules rather than replace them +- ✅ **Progressive complexity** - Tools → Prompts → Integration +- ✅ **Real-world patterns** - How to combine primitives effectively + +### Educational Enhancements +- **Advanced MCP concepts** - Beyond basic server building from Unit 2 +- **Primitive synergy** - How tools, resources, and prompts work together +- **Workflow standardization** - Using prompts for team consistency +- **Context awareness** - Resources make Claude team and project aware + +## Quiz Enhancement Areas for @burtenshaw + +### Additional Quiz Topics +- **Tools vs Resources vs Prompts** - When to use each primitive +- **Resource URI patterns** - Designing discoverable resource schemas +- **Prompt engineering** - Creating effective workflow templates +- **Primitive integration** - Combining all three for complex workflows + +### Sample Questions +- "How would you expose team coding standards to Claude?" (Resources) +- "What's the difference between a tool and a prompt?" (Concepts) +- "How do you make workflow processes consistent across team members?" (Prompts) + +## Implementation Notes + +### Code Structure +- Each module's starter code includes a framework for the new primitive +- Solutions demonstrate both the existing functionality AND the primitive integration +- No breaking changes to existing module goals or timelines + +### Testing Strategy +- Test each primitive individually within modules +- Test primitive integration in Module 4 +- Validate the complete end-to-end workflow in Module 4 + +### Documentation +- Each module explains the primitive it introduces +- Show how the primitive enhances the existing functionality +- Provide examples of other use cases for each primitive + +## Next Steps + +1. **Enhance Module 1** - Add proper Tools implementation to existing file analysis +2.
**Create Prompts** - Develop workflow prompt templates for Module 2 CI analysis +3. **Design Resources** - Create resource schemas for Module 3 team context +4. **Integration Testing** - Ensure all primitives work together in Module 4 +5. **Documentation** - Update module READMEs with primitive explanations + +--- + +*This enhancement maintains the existing solid plan while ensuring learners get comprehensive MCP primitives education through practical workflow automation.* \ No newline at end of file diff --git a/units/en/unit3/introduction.mdx b/units/en/unit3/introduction.mdx index d3c9b29..7417970 100644 --- a/units/en/unit3/introduction.mdx +++ b/units/en/unit3/introduction.mdx @@ -1,3 +1,77 @@ -# Coming Soon +# Advanced MCP Development: Building Custom Workflow Servers for Claude Code -This will be another use case that dives deeper into the MCP protocol and how to use it in more complex ways. \ No newline at end of file +Welcome to Unit 3! In this unit, we'll build a practical MCP server that enhances Claude Code with custom development workflows while learning the core MCP primitives.
+ +## What You'll Build + +**PR Agent Workflow Server** - An MCP server that demonstrates how to make Claude Code team-aware and workflow-intelligent: + +- **Smart PR Management**: Automatic PR template selection based on code changes using MCP Tools +- **CI/CD Monitoring**: Track GitHub Actions with Cloudflare Tunnel and standardized Prompts +- **Team Communication**: Slack notifications demonstrating all MCP primitives working together + +## Real-World Case Study + +We'll implement a practical scenario every development team faces: + +**Before**: Developer manually creates PRs, waits for Actions to complete, manually checks results, remembers to notify team members + +**After**: Claude Code connected to your workflow server can intelligently: +- Suggest the right PR template based on changed files +- Monitor GitHub Actions runs and provide formatted summaries +- Automatically notify team via Slack when deployments succeed/fail +- Guide developers through team-specific review processes based on Actions results + +## Key Learning Outcomes + +1. **Core MCP Primitives**: Master Tools and Prompts through practical examples +2. **MCP Server Development**: Build a functional server with proper structure and error handling +3. **GitHub Actions Integration**: Use Cloudflare Tunnel to receive webhooks and process CI/CD events +4. **Hugging Face Hub Workflows**: Create specialized workflows for LLM development teams +5. **Multi-System Integration**: Connect GitHub, Slack, and Hugging Face Hub through MCP +6. **Claude Code Enhancement**: Make Claude understand your team's specific workflows + +## MCP Primitives in Action + +This unit provides hands-on experience with the core MCP primitives: + +- **Tools** (Module 1): Functions Claude can call to analyze files and suggest templates +- **Prompts** (Module 2): Standardized workflows for consistent team processes +- **Integration** (Module 3): All primitives working together for complex automation + +## Module Structure + +1. 
**Module 1: Build MCP Server** - Create a basic server with Tools for PR template suggestions +2. **Module 2: GitHub Actions Integration** - Monitor CI/CD with Cloudflare Tunnel and Prompts +3. **Module 3: Slack Notification** - Team communication integrating all MCP primitives + +## Prerequisites + +Before starting this unit, ensure you have: + +- Completion of Units 1 and 2 +- Basic familiarity with GitHub Actions and webhook concepts +- Access to a GitHub repository for testing (can be a personal test repo) +- A Slack workspace where you can create webhook integrations + +### Claude Code Installation and Setup + +This unit requires Claude Code to test your MCP server integration. + + + +**Installation Required:** This unit requires Claude Code for testing MCP server integration with AI workflows. + + + +**Quick Setup:** + +Follow the [official installation guide](https://docs.anthropic.com/en/docs/claude-code/getting-started) to install Claude Code and complete authentication. The key steps are installing via npm, navigating to your project directory, and running `claude` to authenticate through console.anthropic.com. + +Once installed, you'll use Claude Code throughout this unit to test your MCP server and interact with the workflow automation you build. + + +**New to Claude Code?** If you encounter any setup issues, the [troubleshooting guide](https://docs.anthropic.com/en/docs/claude-code/troubleshooting) covers common installation and authentication problems. + + +By the end of this unit, you'll have built a complete MCP server that demonstrates how to transform Claude Code into a powerful team development assistant, with hands-on experience using the core MCP primitives.
\ No newline at end of file diff --git a/units/en/unit3/mcp-server-hf-course-in-action.png b/units/en/unit3/mcp-server-hf-course-in-action.png new file mode 100644 index 0000000..1b15335 Binary files /dev/null and b/units/en/unit3/mcp-server-hf-course-in-action.png differ diff --git a/units/en/unit3/slack-notification.mdx b/units/en/unit3/slack-notification.mdx new file mode 100644 index 0000000..01ad12b --- /dev/null +++ b/units/en/unit3/slack-notification.mdx @@ -0,0 +1,359 @@ +# Module 3: Slack Notification + +## The Communication Gap Crisis + +Week 3 at CodeCraft Studios. Your automation system is already transforming how the team works: +- **PR Agent** (Module 1): Developers are writing clear, helpful pull request descriptions +- **CI/CD Monitor** (Module 2): The team catches test failures immediately, preventing bugs from reaching production + +The team is feeling much more confident... until Monday morning brings a new crisis. + +The frontend team (Emma and Jake) spent the entire weekend debugging a nasty API integration issue. They tried everything: checked their network calls, validated request formats, even rewrote the error handling. Finally, at 2 AM Sunday, they discovered the backend team had fixed this exact issue on Friday and deployed the fix to staging - but forgot to announce it. + +"We wasted 12 hours solving a problem that was already fixed!" Emma says, frustrated. + +Meanwhile, the design team finished the new user onboarding flow illustrations last week, but the frontend team didn't know they were ready. Those beautiful assets are still sitting unused while the team ships a temporary design. + +The team realizes they have an information silo problem. Everyone's working hard, but they're not communicating effectively about what's happening when. + +**Your mission**: Complete the automation system with intelligent Slack notifications that keep the whole team informed about important developments automatically. 
+ +## What You'll Build + +This final module completes the CodeCraft Studios transformation. You'll integrate Tools and Prompts to create a smart notification system that sends formatted Slack messages about CI/CD events, demonstrating how all MCP primitives work together in a real-world scenario. + +Building on the foundation from Modules 1 and 2, you'll add the final piece of the puzzle: +- **Slack webhook tool** for sending messages to your team channel +- **Two notification prompts** that intelligently format CI events +- **Complete integration** showing all MCP primitives working together + +## Learning Objectives + +By the end of this module, you'll understand: +1. How to integrate external APIs with MCP Tools +2. How to combine Tools and Prompts for complete workflows +3. How to format rich messages using Slack markdown +4. How all MCP primitives work together in practice + +## Prerequisites + +You'll need everything from the previous modules plus: +- **Completed Modules 1 and 2** - This module directly extends your existing MCP server +- **A Slack workspace** where you can create incoming webhooks (personal workspaces work fine) +- **Basic understanding of REST APIs** - You'll be making HTTP requests to Slack's webhook endpoints + +## Key Concepts + +### MCP Integration Pattern + +This module demonstrates the complete workflow: +1. **Events** → GitHub Actions webhook (from Module 2) +2. **Prompts** → Format events into readable messages +3. **Tools** → Send formatted messages to Slack +4.
**Result** → Professional team notifications + +### Slack Markdown Formatting + +You'll use [Slack's markdown](https://api.slack.com/reference/surfaces/formatting) for rich messages: +- [`*bold text*`](https://api.slack.com/reference/surfaces/formatting#visual-styles) for emphasis +- [`_italic text_`](https://api.slack.com/reference/surfaces/formatting#visual-styles) for details +- [`` `code blocks` ``](https://api.slack.com/reference/surfaces/formatting#inline-code) for technical info +- [`> quoted text`](https://api.slack.com/reference/surfaces/formatting#quotes) for summaries +- [Emoji](https://api.slack.com/reference/surfaces/formatting#emoji): ✅ ❌ 🚀 ⚠️ +- [Links](https://api.slack.com/reference/surfaces/formatting#linking-urls): `<https://example.com|Link text>` + +## Project Structure + +``` +slack-notification/ +├── starter/ # Your starting point +│ ├── server.py # Modules 1+2 code + TODOs +│ ├── webhook_server.py # From Module 2 +│ ├── pyproject.toml +│ └── README.md +└── solution/ # Complete implementation + ├── server.py # Full Slack integration + ├── webhook_server.py + └── README.md +``` + +## Implementation Steps + +### Step 1: Set Up Slack Integration (10 min) + +1. Create a Slack webhook: + - Go to [Slack API Apps](https://api.slack.com/apps) + - Create new app → "From scratch" ([Creating an app guide](https://api.slack.com/authentication/basics#creating)) + - App Name: "MCP Course Notifications" + - Choose your workspace + - Go to "Features" → "[Incoming Webhooks](https://api.slack.com/messaging/webhooks)" + - [Activate incoming webhooks](https://api.slack.com/messaging/webhooks#enable_webhooks) + - Click "Add New Webhook to Workspace" + - Choose channel and authorize ([Webhook setup guide](https://api.slack.com/messaging/webhooks#getting_started)) + - Copy the webhook URL + +2.
Test webhook works (following [webhook posting examples](https://api.slack.com/messaging/webhooks#posting_with_webhooks)): + ```bash + curl -X POST -H 'Content-type: application/json' \ + --data '{"text":"Hello from MCP Course!"}' \ + YOUR_WEBHOOK_URL + ``` + +3. Set environment variable: + ```bash + export SLACK_WEBHOOK_URL="https://hooks.slack.com/services/YOUR/WEBHOOK/URL" + ``` + + **⚠️ Security Note**: The webhook URL is a sensitive secret that grants permission to post messages to your Slack channel. Always: + - Store it as an environment variable, never hardcode it in your code + - Never commit webhook URLs to version control (add to .gitignore) + - Treat it like a password - anyone with this URL can send messages to your channel + + +**Security Alert**: Webhook URLs are sensitive credentials! Anyone with your webhook URL can send messages to your Slack channel. Always store them as environment variables and never commit them to version control. + + +### Step 2: Add Slack Tool (15 min) + +Now that you have a working webhook, you'll add a new MCP tool to your existing server.py from Module 2. This tool will handle sending notifications to Slack by making HTTP requests to the webhook URL. + + +**Note**: The starter code includes all improvements from Modules 1 & 2 (output limiting, webhook handling). Focus on the new Slack integration! 
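Before wiring the tool, you can sanity-check the configuration from Python without ever exposing the secret. A small sketch — the helper name is ours, not part of the course code:

```python
import os

def describe_webhook_config(url):
    """Report webhook status without leaking the credential itself."""
    if not url:
        return "SLACK_WEBHOOK_URL is not set - export it before starting the server"
    # Show only the last few characters so logs never contain the secret
    return f"Webhook configured (URL ends in ...{url[-4:]})"

print(describe_webhook_config(os.getenv("SLACK_WEBHOOK_URL")))
```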
+ + +Add this tool to your server.py: + +**`send_slack_notification`**: +- Takes a message string parameter +- Reads webhook URL from environment variable +- Sends POST request to Slack webhook +- Returns success/failure message +- Handles basic error cases + +```python +import os +import requests + +@mcp.tool() +def send_slack_notification(message: str) -> str: + """Send a formatted notification to the team Slack channel.""" + webhook_url = os.getenv("SLACK_WEBHOOK_URL") + if not webhook_url: + return "Error: SLACK_WEBHOOK_URL environment variable not set" + + try: + # TODO: Send POST request to webhook_url + # TODO: Include message in JSON payload + # TODO: Handle response and return status + pass + except Exception as e: + return f"Error sending message: {str(e)}" +``` + +### Step 3: Create Formatting Prompts (15 min) + +Next, you'll add MCP Prompts to your server - this is where the magic happens! These prompts will work with Claude to automatically format your GitHub webhook data into well-structured Slack messages. Remember from Module 1 that Prompts provide reusable instructions that Claude can use consistently. + +Implement two prompts that generate Slack-formatted messages: + +1. **`format_ci_failure_alert`**: + ```python + @mcp.prompt() + def format_ci_failure_alert() -> str: + """Create a Slack alert for CI/CD failures.""" + return """Format this GitHub Actions failure as a Slack message: + + Use this template: + ❌ *CI Failed* - [Repository Name] + + > Brief summary of what failed + + *Details:* + • Workflow: `workflow_name` + • Branch: `branch_name` + • Commit: `commit_hash` + + *Next Steps:* + • + • + + Use Slack markdown formatting and keep it concise for quick team scanning.""" + ``` +2.
**`format_ci_success_summary`**: + ```python + @mcp.prompt() + def format_ci_success_summary() -> str: + """Create a Slack message celebrating successful deployments.""" + return """Format this successful GitHub Actions run as a Slack message: + + Use this template: + ✅ *Deployment Successful* - [Repository Name] + + > Brief summary of what was deployed + + *Changes:* + • Key feature or fix 1 + • Key feature or fix 2 + + *Links:* + • + • + + Keep it celebratory but informative. Use Slack markdown formatting.""" + ``` + +### Step 4: Test Complete Workflow (10 min) + +Now comes the exciting part - testing your complete MCP workflow! You'll have all three components working together: webhook capture from Module 2, prompt formatting from this module, and Slack notifications. + +1. Start all services (just like in Module 2, but now with Slack integration): + ```bash + # Terminal 1: Start webhook server + python webhook_server.py + + # Terminal 2: Start MCP server + uv run server.py + + # Terminal 3: Start Cloudflare Tunnel + cloudflared tunnel --url http://localhost:8080 + ``` + +2. Test the complete integration with Claude Code: + - **Configure GitHub webhook** with tunnel URL (same as Module 2) + - **Push changes** to trigger GitHub Actions + - **Ask Claude** to check recent events and format them using your prompts + - **Let Claude send** the formatted message using your Slack tool + - **Verify** notifications appear in your Slack channel + +### Step 5: Verify Integration (5 min) + +You can test your implementation without setting up a real GitHub repository! See `manual_test.md` for curl commands that simulate GitHub webhook events. 
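For instance, a simulated `workflow_run` failure event could use a payload along these lines — a hypothetical minimal shape, since real GitHub events carry many more fields; match whatever your Module 2 webhook server expects:

```json
{
  "action": "completed",
  "workflow_run": {
    "name": "CI",
    "head_branch": "feature/slack-integration",
    "conclusion": "failure",
    "html_url": "https://github.com/your-org/your-repo/actions/runs/123"
  },
  "repository": {
    "full_name": "your-org/your-repo"
  }
}
```

Saving a payload like this to a file and POSTing it with curl to your local webhook server mimics a real GitHub delivery.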
+ +**Understanding the webhook event flow:** +- Your webhook server (from Module 2) captures GitHub events and stores them in `github_events.json` +- Your MCP tools read from this file to get recent CI/CD activity +- Claude uses your formatting prompts to create readable messages +- Your Slack tool sends the formatted messages to your team channel +- This creates a complete pipeline: GitHub → Local Storage → Claude Analysis → Slack Notification + +**Quick Test Workflow:** +1. Use curl to send fake GitHub events to your webhook server +2. Ask Claude to check recent events and format them +3. Send formatted messages to Slack +4. Verify everything works end-to-end + +**Manual Testing Alternative:** For a complete testing experience without GitHub setup, follow the step-by-step curl commands in `manual_test.md`. + +## Example Workflow in Claude Code + +``` +User: "Check recent CI events and notify the team about any failures" + +Claude: +1. Uses get_recent_actions_events (from Module 2) +2. Finds a workflow failure +3. Uses format_ci_failure_alert prompt to create message +4. Uses send_slack_notification tool to deliver it +5. 
Reports back: "Sent failure alert to #dev-team channel" +``` + +## Expected Slack Message Output + +**Failure Alert:** +``` +❌ *CI Failed* - mcp-course + +> Tests failed in Module 3 implementation + +*Details:* +• Workflow: `CI` +• Branch: `feature/slack-integration` +• Commit: `abc123f` + +*Next Steps:* +• +• +``` + +**Success Summary:** +``` +✅ *Deployment Successful* - mcp-course + +> Module 3 Slack integration deployed to staging + +*Changes:* +• Added team notification system +• Integrated MCP Tools and Prompts + +*Links:* +• +• +``` + +## Common Issues + +### Webhook URL Issues +- Verify the environment variable is set correctly +- Test webhook directly with curl before integrating +- Ensure Slack app has proper permissions + +### Message Formatting +- [Slack markdown](https://api.slack.com/reference/surfaces/formatting) differs from GitHub markdown +- Test message formatting manually before automating +- Handle special characters in commit messages properly ([formatting reference](https://api.slack.com/reference/surfaces/formatting#escaping)) + +### Network Errors +- Add basic timeout handling to webhook requests ([webhook error handling](https://api.slack.com/messaging/webhooks#handling_errors)) +- Return meaningful error messages from the tool +- Check internet connectivity if requests fail + +## Key Takeaways + +You've now built a complete MCP workflow that demonstrates: +- **Tools** for external API integration (Slack webhooks) +- **Prompts** for intelligent message formatting +- **Integration** of all MCP primitives working together +- **Real-world application** that teams can actually use + +This shows the power of MCP for building practical development automation tools! + + + +**Key Learning**: You've now built a complete MCP workflow that combines Tools (for external API calls) with Prompts (for consistent formatting). This pattern of Tools + Prompts is fundamental to advanced MCP development and can be applied to many other automation scenarios. 
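For reference, the Step 2 tool body can be completed along these lines, incorporating the timeout handling mentioned under Common Issues. This is a sketch using only the standard library — the starter code uses `requests`, and the function name and messages here are illustrative:

```python
import json
import os
import urllib.request

def post_to_slack(message: str, timeout: float = 10.0) -> str:
    """Send a message to the Slack webhook with basic timeout and error handling."""
    webhook_url = os.getenv("SLACK_WEBHOOK_URL")
    if not webhook_url:
        return "Error: SLACK_WEBHOOK_URL environment variable not set"
    try:
        request = urllib.request.Request(
            webhook_url,
            data=json.dumps({"text": message}).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        # Slack answers an incoming-webhook POST with HTTP 200 and the body "ok"
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return f"Message sent to Slack (status {response.status})"
    except Exception as error:
        return f"Error sending message: {error}"
```

Returning an error string instead of raising keeps the tool's contract simple for Claude: every call yields a human-readable status it can report back.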
+ + + +## Next Steps + +Congratulations! You've completed the final module of Unit 3 and built a complete end-to-end automation system. Your journey through all three modules has given you hands-on experience with: + +- **Module 1**: MCP Tools and intelligent data analysis +- **Module 2**: Real-time webhooks and MCP Prompts +- **Module 3**: External API integration and workflow completion + +### What to do next: +1. **Test your complete system** - Try triggering real GitHub events and watch the full pipeline work +2. **Experiment with customization** - Modify the Slack message formats or add new notification types +3. **Review the Unit 3 Conclusion** - Reflect on everything you've learned and explore next steps +4. **Share your success** - Show teammates how MCP can automate your development workflows + +You now have a solid foundation for building intelligent automation systems with MCP! + +### The transformation is complete! +CodeCraft Studios has gone from chaotic development to a well-oiled machine. The automation system you built handles: +- **Smart PR descriptions** that help reviewers understand changes +- **Real-time CI/CD monitoring** that catches failures before they reach production +- **Intelligent team notifications** that keep everyone informed automatically + +The team can now focus on building great products instead of fighting process problems. And you've learned advanced MCP patterns that you can apply to any automation challenge! 
+ +## Additional Resources + +- [Slack Incoming Webhooks Documentation](https://api.slack.com/messaging/webhooks) +- [Slack Message Formatting Guide](https://api.slack.com/reference/surfaces/formatting) +- [MCP Tools Documentation](https://modelcontextprotocol.io/docs/concepts/tools) +- [MCP Prompts Guide](https://modelcontextprotocol.io/docs/concepts/prompts) \ No newline at end of file diff --git a/units/en/unit4/introduction.mdx b/units/en/unit4/introduction.mdx deleted file mode 100644 index 5eed14d..0000000 --- a/units/en/unit4/introduction.mdx +++ /dev/null @@ -1,5 +0,0 @@ -# Coming Soon - -This unit will be a collaboration with partners from the AI community. - -If you're building tools for MCP, please reach out to us and we'll add you to the unit. Open a [discussion](https://huggingface.co/spaces/mcp-course/README/discussions) on the hub organization. \ No newline at end of file