# Contributing to TerraVision

Thank you for your interest in contributing to TerraVision! This document provides guidelines and best practices for contributing to the project.
## Table of Contents

- Code of Conduct
- Getting Started
- Development Workflow
- Code Standards
- Testing Requirements
- Pull Request Process
- AI-Assisted Development
- Architecture Guidelines
## Code of Conduct

By participating in this project, you agree to maintain a respectful and collaborative environment for all contributors.
## Getting Started

### Prerequisites

Ensure you have the following installed:

- Python 3.11+
- Terraform v1.x (v0.x is not supported)
- Graphviz (provides the `dot` command)
- Git
- Poetry (recommended) or pip
- Ollama (for local AI testing)
- AWS account (for Bedrock AI testing)
### Installation

```shell
# Clone the repository
git clone https://github.com/patrickchugh/terravision.git
cd terravision

# Install dependencies with Poetry (recommended)
poetry install
poetry shell

# Install pre-commit hooks
pre-commit install

# Verify installation
python terravision.py --help
```

### Running Tests

```shell
# Run all tests
poetry run pytest tests -v

# Run non-slow tests (for quick validation)
poetry run pytest -m "not slow"

# Run a specific test file
poetry run pytest tests/test_provider_detection.py -v

# Run with coverage
poetry run pytest tests --cov=modules
```

## Development Workflow

- Fork the repository and create a feature branch from `main`
- Make your changes following the code standards below
- Write tests for new functionality
- Run the test suite to ensure all tests pass
- Format your code with Black
- Submit a pull request to the `main` branch
### Branch Naming

Use descriptive branch names:

- `feature/add-azure-support`
- `fix/security-group-parsing`
- `docs/update-readme`
- `refactor/provider-detection`
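For example, creating a feature branch (a throwaway sketch using a temporary repository so it can run anywhere):

```shell
# Create a scratch repository and a descriptively named branch
tmp="$(mktemp -d)"
cd "$tmp"
git init -q
git checkout -q -b feature/add-azure-support
git branch --show-current   # → feature/add-azure-support
```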
### Commit Messages

Write clear, descriptive commit messages:

```text
Add support for GCP Cloud Run resources

- Implement Cloud Run handler in resource_handlers_gcp.py
- Add Cloud Run icons to resource_classes/gcp/
- Update cloud_config_gcp.py with special resources
- Add integration tests for Cloud Run
```
## Code Standards

TerraVision uses Black for code formatting with the following configuration:

- Line length: 88 characters (Black's default)
- Follow PEP 8 conventions
- Use type hints where appropriate
- Write docstrings for public functions
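If these settings are pinned explicitly, the `pyproject.toml` fragment would look like the following (a sketch; since 88 characters is Black's default, the actual repo may omit the key entirely):

```toml
[tool.black]
line-length = 88
```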
```shell
# Check formatting (what CI runs)
poetry run black --check -v modules

# Auto-format code
poetry run black modules

# Run all pre-commit hooks
pre-commit run --all-files
```

### Module Organization

- Provider-specific code goes in `modules/config/cloud_config_<provider>.py` and `modules/resource_handlers_<provider>.py`
- Shared utilities belong in `modules/helpers.py`
- Icons are stored in `resource_classes/<provider>/`
- Tests mirror the module structure in `tests/`
### Import Ordering

Use isort conventions (automatically applied by pre-commit):

1. Standard library imports
2. Third-party imports
3. Local application imports
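A file following this grouping might begin like this (illustrative; the third-party and local examples are placeholders shown as comments so the snippet stays self-contained):

```python
# 1) Standard library imports
import json

# 2) Third-party imports would follow, e.g.:
# import click

# 3) Local application imports come last, e.g.:
# from modules import helpers


def to_json(config: dict) -> str:
    """Tiny helper that only needs the stdlib import above."""
    return json.dumps(config, sort_keys=True)


print(to_json({"provider": "aws"}))  # → {"provider": "aws"}
```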
## Testing Requirements

- All new features must include tests
- Bug fixes should include regression tests
- Aim for >80% code coverage on new code
**Unit tests** (`tests/*_unit_test.py`):

```python
def test_provider_detection():
    # detect_providers is assumed to be importable from the provider
    # detection module (modules/provider_detector.py)
    tfdata = {"all_resource": [{"type": "aws_instance"}]}
    result = detect_providers(tfdata)
    assert "aws" in result["providers"]
```

**Integration tests** (`tests/integration_test.py`):
- Test the full pipeline with real Terraform code
- Mark them as slow with `@pytest.mark.slow` so they can be excluded via `-m "not slow"`
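As a sketch of what that looks like (the test name and body here are hypothetical; the point is that pytest records the mark on the function, which `-m "not slow"` then uses to deselect it):

```python
import pytest


# Hypothetical slow-marked integration test (name and body are placeholders).
@pytest.mark.slow
def test_full_pipeline():
    # A real test would run terraform init/plan and build the full graph.
    assert True


# The decorator stores the mark on the function object:
print([m.name for m in test_full_pipeline.pytestmark])  # → ['slow']
```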
**Provider tests** (`tests/test_<provider>_*.py`):
- Validate provider-specific functionality
- Test config loading and resource handlers
## Pull Request Process

Before submitting a PR, ensure:

```shell
# All tests pass
poetry run pytest tests -v

# Code is formatted
poetry run black --check -v modules

# Pre-commit hooks pass
pre-commit run --all-files
```

- Code follows Black formatting standards
- All tests pass locally
- New tests added for new functionality
- Documentation updated (README.md, CLAUDE.md if architecture changed)
- Pre-commit hooks pass
- No unnecessary dependencies added
- AI assistance disclosed (see below)
### PR Template

```markdown
## Description
Brief description of changes

## Type of Change
- [ ] Bug fix
- [ ] New feature
- [ ] Breaking change
- [ ] Documentation update

## Testing
Describe testing performed

## AI Assistance
- Tools used: [e.g., Claude Code, GitHub Copilot]
- Model: [e.g., Claude Sonnet 4.5, GPT-4]
- Scope: [e.g., "Generated test cases", "Refactored function X"]

## Checklist
- [ ] Tests pass
- [ ] Code formatted with Black
- [ ] Documentation updated
```

### Review Process

- Major revamps or new ideas not discussed with a maintainer first will be rejected
- Automated CI runs must pass on all PRs (Black check, pytest suite)
- Maintainer review of the changed files is required before merge
- Address review feedback promptly
- Keep PRs focused and reasonably sized: implement one specific change per PR so rollbacks are easier
## AI-Assisted Development

TerraVision welcomes the use of AI tools for development! AI assistance can accelerate development, improve code quality, and help with documentation.
### Disclosure Requirements

All AI use for coding must be disclosed in pull requests. Please include:
- Tools used: Name of AI tool(s) (e.g., Claude Code, GitHub Copilot, ChatGPT, Cursor)
- Model and version: Specific model used (e.g., Claude Sonnet 4.5, GPT-4 Turbo, Llama 3.1 70B)
- Scope of assistance: What the AI helped with (e.g., "Generated test fixtures", "Refactored VPC handler", "Wrote docstrings")
An example disclosure:

```markdown
## AI Assistance
- **Tools**: Claude Code CLI
- **Model**: Claude Sonnet 4.5 (claude-sonnet-4-5-20250929)
- **Scope**:
  - Generated initial implementation of Azure resource handlers
  - Created unit tests for provider detection
  - Assisted with debugging Terraform graph parsing logic
```

### Why Disclosure Matters

- Transparency: Helps maintainers understand the development process
- Quality: Allows reviewers to pay extra attention to AI-generated code
- Learning: Helps the community learn what AI tools work well for this project
- Best practices: Establishes a culture of responsible AI use
### Best Practices

- Always review and test AI-generated code thoroughly
- Ensure AI-generated code follows TerraVision's architecture patterns
- Verify that AI suggestions align with provider-specific conventions
- Use AI to augment your skills, not replace understanding
## Architecture Guidelines

### Adding a New Provider

When adding support for a new cloud provider:

1. Create `modules/config/cloud_config_<provider>.py` with required constants: `PROVIDER_PREFIX`, `ICON_LIBRARY`, `SPECIAL_RESOURCES`, etc.
2. Create `modules/resource_handlers_<provider>.py` with handler functions
3. Add the provider prefix to `PROVIDER_PREFIXES` in `provider_detector.py`
4. Add icons to `resource_classes/<provider>/`
5. Add tests in `tests/test_provider_detection.py` and `tests/test_<provider>_resources.py`
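Prefix-based detection can be sketched roughly as follows (illustrative only; the actual `provider_detector.py` implementation, its constants, and its fallback behavior may differ):

```python
# Hypothetical prefix table; the real PROVIDER_PREFIXES may differ.
PROVIDER_PREFIXES = {"aws_": "aws", "azurerm_": "azure", "google_": "gcp"}


def detect_provider(tfdata: dict) -> str:
    """Count resource-type prefix matches and return the dominant provider."""
    counts: dict[str, int] = {}
    for resource in tfdata.get("all_resource", []):
        for prefix, provider in PROVIDER_PREFIXES.items():
            if resource["type"].startswith(prefix):
                counts[provider] = counts.get(provider, 0) + 1
    # Assumed fallback when no prefix matches
    return max(counts, key=counts.get) if counts else "aws"


tfdata = {"all_resource": [{"type": "google_compute_instance"},
                           {"type": "google_storage_bucket"}]}
print(detect_provider(tfdata))  # → gcp
```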
### Dynamic Configuration Loading

TerraVision uses dynamic provider detection and configuration loading:

```python
# Detect provider from resource prefixes
provider = get_primary_provider_or_default(tfdata)

# Load provider-specific configuration
config = load_config(provider)

# Use provider-specific constants
icons = config.ICON_LIBRARY
special_resources = config.SPECIAL_RESOURCES
```

Special resources (VPCs, security groups, networks) follow a handler pattern:
```python
# In cloud_config_<provider>.py
SPECIAL_RESOURCES = {
    "provider_resource_type": "handler_function_name",
}

# In resource_handlers_<provider>.py
def handler_function_name(tfdata, resource):
    # Transform the graph data for this resource, then return it
    return tfdata
```

### Debugging

When debugging issues:
```shell
# Generate debug output
python terravision.py draw --source <path> --debug

# Inspect tfdata.json
cat tfdata.json | jq '.graphdict'

# Replay from a debug file (skips terraform init/plan)
python terravision.py draw --source tfdata.json
```

### Repository Structure

```text
terravision/
├── modules/
│   ├── config/                  # Provider-specific configurations
│   │   ├── cloud_config_aws.py
│   │   ├── cloud_config_azure.py
│   │   └── cloud_config_gcp.py
│   ├── resource_handlers_*.py   # Provider resource handlers
│   ├── resource_transformers.py # Core graph transformers shared by all providers
│   ├── provider_detector.py     # Provider detection logic
│   ├── config_loader.py         # Dynamic config loading
│   ├── graphmaker.py            # Core graph construction
│   ├── drawing.py               # Graphviz rendering
│   └── helpers.py               # Utility functions
├── resource_classes/            # Icon libraries by provider
│   ├── aws/
│   ├── azure/
│   ├── gcp/
│   └── generic/
├── tests/                       # Test suite
├── terravision.py               # Main CLI entry point
└── pyproject.toml               # Poetry dependencies
```
## Getting Help

- Documentation: See the `docs/` folder for more detailed info
- Issues: Search existing issues or create a new one
- Discussions: Use GitHub Discussions for questions
- Examples: Check `tests/fixtures/` for example Terraform code
## License

By contributing to TerraVision, you agree that your contributions will be licensed under the same license as the project.
Thank you for contributing to TerraVision!