# Contributing to AutoFaker

Thank you for your interest in contributing to AutoFaker! This document provides guidelines and information for contributors to help maintain code quality and consistency.
## Table of Contents
- Code of Conduct
- Getting Started
- Development Setup
- Code Style and Patterns
- Testing Guidelines
- Pull Request Guidelines
- Documentation Guidelines
- Issue Reporting
## Code of Conduct
Please be respectful and considerate in all interactions. We aim to maintain a welcoming and inclusive environment for all contributors.
## Getting Started
- Fork the repository on GitHub
- Clone your fork locally:

  ```bash
  git clone https://github.com/your-username/autofaker.git
  cd autofaker
  ```

- Add the upstream repository:

  ```bash
  git remote add upstream https://github.com/christianhelle/autofaker.git
  ```
## Development Setup

### Prerequisites
- Python 3.10 or higher
- pip package manager
- Install development dependencies:

  ```bash
  make prepare
  ```

- Install the package in development mode:

  ```bash
  python setup.py develop --user
  ```

Note: If you encounter permission errors with the Makefile targets, you can run the commands individually:

```bash
pip install -r requirements.txt
python setup.py develop --user
```
The Makefile provides the following targets:

- `make prepare`: Install development dependencies
- `make debug`: Install package in development mode
- `make test`: Run all tests with coverage
- `make package`: Build distribution packages
- `make all`: Run complete build pipeline
## Code Style and Patterns
When contributing, please follow the established patterns in the codebase:
Follow the existing patterns for creating data generators:

```python
# Example from existing codebase
class ExampleGenerator:
    def create(self, t: type, use_fake_data: bool = False):
        # Implementation here
        pass
```

When adding new decorators, follow the existing pattern:
```python
def example_decorator(*types: object, use_fake_data: bool = False):
    def decorator(function):
        def wrapper(*args):
            # Implementation following existing pattern
            pass
        return wrapper
    return decorator
```

Use both unittest and pytest patterns as established:
```python
import unittest

from autofaker import Autodata, autodata

class ExampleTestCase(unittest.TestCase):
    def test_example_functionality(self):
        # Test implementation
        self.assertIsNotNone(result)

    @autodata(str, int)
    def test_with_decorator(self, text, number):
        # Test using decorator pattern
        self.assertIsInstance(text, str)
        self.assertIsInstance(number, int)
```

Use type hints consistently, following existing patterns:
```python
from typing import List, Optional, Union

def create_example(data_type: type, count: int = 3) -> List[object]:
    # Implementation with proper type hints
    pass
```

Include docstrings for public methods and classes:
```python
def create_anonymous_data(data_type: type) -> object:
    """
    Creates anonymous data for the specified type.

    Args:
        data_type: The type to create anonymous data for

    Returns:
        An instance of the specified type with anonymous data
    """
    pass
```

Follow these naming conventions:

- Use snake_case for functions and variables
- Use PascalCase for classes
- Use descriptive names that reflect functionality
- Follow existing naming patterns in the codebase
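As an illustration of these conventions, here is a small sketch; the class and method names are hypothetical and not part of the AutoFaker codebase:

```python
import random
import string

# PascalCase for the class; snake_case for methods and variables
class AnonymousTextGenerator:
    """Hypothetical generator used only to illustrate naming conventions."""

    def create_random_text(self, text_length: int = 8) -> str:
        # Descriptive names: text_length states exactly what the parameter controls
        return "".join(random.choice(string.ascii_lowercase) for _ in range(text_length))
```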
Organize imports following the existing pattern:
- Standard library imports
- Third-party imports
- Local imports
```python
import unittest
from dataclasses import dataclass
from typing import List

import pandas

from autofaker import Autodata
```

## Testing Guidelines

- All new functionality must include comprehensive tests
- Maintain or improve existing test coverage
- Tests should cover both success and failure scenarios
Organize tests into the following categories:
- Unit Tests: Use unittest.TestCase for traditional unit testing
- Decorator Tests: Use the @autodata and @fakedata patterns
- Integration Tests: Test complete workflows
- Exception Tests: Test error conditions and edge cases
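For the exception category, a test can assert that invalid input raises the expected error. The function below is a hypothetical stand-in, not part of AutoFaker, kept inline so the sketch is self-contained:

```python
import unittest

def parse_positive(value: str) -> int:
    """Hypothetical helper used only to illustrate exception testing."""
    number = int(value)  # raises ValueError for non-numeric input
    if number <= 0:
        raise ValueError("value must be positive")
    return number

class ParsePositiveTestCase(unittest.TestCase):
    def test_non_numeric_input_raises_value_error(self):
        # Failure scenario: the error condition is the expected outcome
        with self.assertRaises(ValueError):
            parse_positive("not a number")

    def test_zero_raises_value_error(self):
        # Edge case: the boundary value is rejected
        with self.assertRaises(ValueError):
            parse_positive("0")

    def test_positive_input_is_parsed(self):
        # Success scenario alongside the failure cases
        self.assertEqual(parse_positive("42"), 42)
```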
Run the test suite with:

```bash
# Run all tests
python -m pytest tests/ -v --cov --cov-report=term-missing

# Alternative: use make if you have proper permissions
make test

# Run specific test file
python -m pytest tests/test_specific_feature.py -v

# Run with coverage
python -m pytest tests/ -v --cov --cov-report=term-missing
```

- Place tests in the `tests/` directory
- Name test files with the `test_` prefix
- Group related tests in the same file
- Use descriptive test method names
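Putting these organization rules together, a new test file might look like the following sketch; the function under test is hypothetical and inlined here only to keep the example runnable:

```python
# tests/test_text_utils.py -- file name uses the test_ prefix
import unittest

def reverse_words(text: str) -> str:
    """Hypothetical function under test, inlined to keep the sketch self-contained."""
    return " ".join(reversed(text.split()))

class ReverseWordsTestCase(unittest.TestCase):
    """Groups all tests related to reverse_words in one file."""

    def test_reverses_word_order(self):
        # Descriptive name: states the behavior being verified
        self.assertEqual(reverse_words("hello world"), "world hello")

    def test_single_word_is_returned_unchanged(self):
        self.assertEqual(reverse_words("hello"), "hello")
```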
## Pull Request Guidelines

Pull Request descriptions must be thorough and include:
- Summary: Clear, concise description of what the PR accomplishes
- Motivation: Why this change is needed
- Changes Made: Detailed list of all changes, including:
- New features added
- Bug fixes implemented
- Code refactoring performed
- Documentation updates
- Testing: Description of how the changes were tested
- Breaking Changes: Any backwards incompatible changes
- Additional Notes: Any other relevant information
Use this template for Pull Request descriptions:

```markdown
## Summary
[Provide a clear summary of the changes]

## Motivation
[Explain why this change is needed]

## Changes Made
- [ ] Feature A: Description of feature A
- [ ] Bug fix B: Description of bug fix B
- [ ] Documentation updates for X
- [ ] Added tests for Y

## Testing
- [ ] Added unit tests for new functionality
- [ ] Verified existing tests pass
- [ ] Tested on Python versions: [list versions]
- [ ] Manual testing performed: [describe scenarios]

## Breaking Changes
[List any breaking changes or write "None"]

## Additional Notes
[Any other relevant information]
```

Before submitting a Pull Request:

- Update Documentation: If your changes affect user-facing functionality
- Run Tests: Ensure all tests pass locally
- Check Coverage: Verify test coverage is maintained
- Update README: If adding new features or changing existing behavior
- Add Examples: Include usage examples for new features
To submit your changes:

- Create a feature branch from main:

  ```bash
  git checkout -b feature/your-feature-name
  ```

- Make your changes following the established patterns

- Test your changes thoroughly:

  ```bash
  python -m pytest tests/ -v --cov --cov-report=term-missing
  ```

- Commit with descriptive messages:

  ```bash
  git commit -m "Add feature X with comprehensive tests"
  ```

- Push to your fork:

  ```bash
  git push origin feature/your-feature-name
  ```

- Create a Pull Request with a verbose description
## Documentation Guidelines

All changes that affect user-facing functionality must include README updates:
- New Features: Add examples and documentation
- API Changes: Update existing examples
- New Supported Types: Update the "Supported data types" section
- Installation Changes: Update installation instructions
Follow these documentation standards:
- Use clear, concise language
- Include code examples for new features
- Ensure examples are tested and work correctly
- Update both README.md and docs/pypi.md if applicable
When adding new features, include examples like:

```python
# Create anonymous data using new feature
from autofaker import Autodata

result = Autodata.create_new_feature(YourClass)
print(f'Result: {result}')
```

## Issue Reporting

When reporting issues:
- Use Issue Templates: Follow any provided templates
- Provide Context: Include Python version, OS, and package version
- Include Examples: Provide minimal reproducible examples
- Check Existing Issues: Search for similar issues first
When contributing changes, keep these best practices in mind:
- Small, Focused Changes: Keep PRs focused on single features or fixes
- Maintain Backwards Compatibility: Avoid breaking existing functionality
- Follow Existing Patterns: Consistency is key
- Test Thoroughly: Include comprehensive tests
- Document Changes: Update documentation for user-facing changes
If you need help or have questions:
- Check existing documentation and issues
- Create an issue with the "question" label
- Be specific about what you're trying to accomplish
Thank you for contributing to AutoFaker! Your contributions help make this library better for everyone.