diff --git a/.github/copilot-instructions.md b/.github/copilot-instructions.md new file mode 100644 index 00000000..85e66869 --- /dev/null +++ b/.github/copilot-instructions.md @@ -0,0 +1,128 @@ +### Guidelines for Using GitHub Copilot in Code Generation + +1. **Clear and Readable Code:** + - Code must be generated according to strict conventions, such as camelCase or snake_case, depending on the programming language. This is crucial for ensuring consistency, facilitating comprehension, and improving the maintainability of the code, particularly in large-scale projects. + - Detailed and informative annotations are mandatory, with a focus on explaining complex logic and algorithms. These annotations should strike an ideal balance between completeness and conciseness, enabling team members to collaborate efficiently and onboard new team members quickly. + - Functions and methods must be designed to maximize modularity. Each module should serve a specific responsibility, enhancing reusability and significantly simplifying bug fixes or extensions. 
Avoiding overly nested functions or methods helps to limit cyclomatic complexity. + +2. **Security Measures:** + - Generated code must not contain known vulnerabilities, such as SQL injection, buffer overflows, or hardcoded credentials. Proactively applying security measures, such as using prepared statements and avoiding vulnerable APIs, is mandatory. + - All inputs must be thoroughly validated before being processed within the application. This includes both server-side and client-side validation. Additionally, error handling must be robust, providing clear messages and logging mechanisms that enable developers to quickly locate and resolve issues. + - Use frameworks and libraries that enforce security automatically, such as ORMs for database interactions and modern cryptographic libraries. This minimizes the risk of human errors and promotes best practices. + +3. **Performance Optimization:** + - Code should be written with algorithmic efficiency in mind. This includes avoiding redundant iterations and using efficient data structures like hashmaps or balanced trees, depending on the situation. + - Balancing readability and optimization is crucial, especially in critical applications such as real-time systems. Code must remain understandable for human reviewers without compromising performance. + - Future scalability should be considered when making design decisions. This includes anticipating peak loads, efficiently managing system resources, and integrating load-balancing solutions when necessary. + +4. **Adherence to Best Practices:** + - Consistency in style and implementation within a project is essential. This includes following language-specific conventions, using linting tools to prevent stylistic errors, and avoiding unconventional coding practices that could cause confusion. + - Applying proven principles such as SOLID (Single Responsibility, Open/Closed, Liskov Substitution, Interface Segregation, Dependency Inversion) and DRY (Don't Repeat Yourself) is mandatory. These principles ensure robust and maintainable designs. + - Avoid implementing inefficient or outdated methodologies, as these limit the flexibility and expandability of future development cycles. + +5. **Copyright and Licensing:** + - Copilot must not generate code that infringes on copyrights. All generated code must fall under a permissive license unless stated otherwise. This prevents legal conflicts and ensures the integrity of the project. + - All dependencies and libraries used must be thoroughly documented. This includes specifying licensing requirements and ensuring compliance with these licenses to avoid legal risks. + +6. **Usability:** + - User interfaces, both CLI and GUI, must be intuitive and easy to use. Unnecessary complexity should be avoided, focusing on clear navigation and accessible features. + - Error handling in user interfaces should aim for user-friendly messages that inform the user about the nature of the error and provide practical solutions. This significantly enhances the overall user experience. + - Systematic implementation of internationalization (i18n) is essential to make the application accessible to a global audience. This includes supporting multiple languages and respecting regional differences in date formats, currencies, and other cultural norms. + +7. **Compatibility and Sustainability:** + - Generated code must remain up-to-date with the latest versions of programming languages and frameworks while maintaining backward compatibility. 
This promotes the sustainability of the codebase. + - Modularity should be central to the design, allowing future changes or extensions to be implemented easily without requiring significant refactoring. + - Version control using tools like Git, combined with automated CI/CD pipelines, must be applied to ensure a consistent and reliable codebase. + +8. **Documentation and Educational Value:** + - Each function must be accompanied by clear and concise documentation describing its functionality and limitations. This includes adding example implementations for practical application. + - Project documentation, such as README files, must be detailed and provide clear guidelines for installation, usage, and troubleshooting. This facilitates adoption by new users and developers. + - Regular updates and maintenance of documentation are essential to keep it synchronized with the evolution of the project. + +9. **Minimization of Dependencies:** + - External libraries should only be used when absolutely necessary. Overuse of dependencies increases the risk of security vulnerabilities and compatibility issues. + - Core functionality must remain independent of external resources, ensuring the application’s robustness in various environments. + +10. **Ethical Responsibility:** + - Code must not be generated for applications that are unethical or harmful, such as malware or invasive surveillance. + - Risky patterns and potential security issues must be explicitly flagged with warning annotations to ensure developers are aware of the implications. + - Promoting ethics and social responsibility must be an integral part of the development culture, with attention to minimizing harmful impacts and maximizing positive societal contributions. + diff --git a/.github/workflows/build-check.yaml b/.github/workflows/build-check.yaml index abef53c5..e6573691 100644 --- a/.github/workflows/build-check.yaml +++ b/.github/workflows/build-check.yaml @@ -1,94 +1,171 @@ -name: Check if ISO can be built +name: Validate and Test Build on: pull_request: branches: - main + - dev workflow_dispatch: schedule: - cron: '0 0 * * *' +env: + DOCKER_BUILDKIT: 1 + PACMAN_CACHE: /tmp/pacman-cache + WORKSPACE: /workdir + BUILD_DIR: /workdir/workdir + OUTPUT_DIR: /workdir/out + jobs: - build: + validate: runs-on: ubuntu-latest + steps: + - name: Checkout Repository + uses: actions/checkout@v4 + + - name: Validate package list + run: | + # Check if package list exists + if [ ! -f packages.x86_64 ]; then + echo "::error::packages.x86_64 file not found" + exit 1 + fi + + # Check for duplicate packages + sort packages.x86_64 | uniq -d > duplicates.txt + if [ -s duplicates.txt ]; then + echo "::error::Duplicate packages found:" + cat duplicates.txt + exit 1 + fi + + # Validate package names exist in Arch repos + docker run --rm -v "${{ github.workspace }}/packages.x86_64:/packages.x86_64:ro" archlinux:latest bash -c " + set -euo pipefail + pacman -Syu --noconfirm + while read -r pkg; do + [[ \$pkg =~ ^# ]] && continue + [[ -z \$pkg ]] && continue + if ! 
pacman -Si \$pkg >/dev/null 2>&1; then + echo \"::error::Package not found: \$pkg\" + exit 1 + fi + done < /packages.x86_64 + " + security-scan: + runs-on: ubuntu-latest steps: - name: Checkout Repository uses: actions/checkout@v4 + - name: Run Security Scan + uses: aquasecurity/trivy-action@master + with: + scan-type: 'fs' + ignore-unfixed: true + format: 'sarif' + output: 'trivy-results.sarif' + severity: 'CRITICAL,HIGH' + + - name: Upload Scan Results + uses: github/codeql-action/upload-sarif@v2 + if: always() + with: + sarif_file: 'trivy-results.sarif' + + test-build: + needs: [validate, security-scan] + runs-on: ubuntu-latest + timeout-minutes: 120 + + steps: + - name: Checkout Repository + uses: actions/checkout@v4 + + - name: Cache Pacman packages + uses: actions/cache@v3 + with: + path: ${{ env.PACMAN_CACHE }} + key: pacman-${{ runner.os }}-${{ github.sha }} + restore-keys: | + pacman-${{ runner.os }}- + - name: Set up Arch Linux Container run: | - docker run --privileged --name arch-container -d -v ${{ github.workspace }}:/workdir archlinux:latest sleep infinity + mkdir -p ${{ env.PACMAN_CACHE }} + docker run --privileged --name arch-container -d \ + -v ${{ github.workspace }}:${{ env.WORKSPACE }} \ + -v ${{ env.PACMAN_CACHE }}:/var/cache/pacman/pkg \ + archlinux:latest sleep infinity - - name: Build ISO in Arch Container + - name: Install Dependencies run: | docker exec arch-container bash -c " - pacman -Syu --noconfirm && - pacman -S --noconfirm git archiso grub && - cd /workdir && - mkarchiso -v -w workdir/ -o out/ . + set -euo pipefail + pacman -Syu --noconfirm + pacman -S --noconfirm --needed git archiso grub " - - name: Rename ISO to Arch.iso + - name: Test Build + id: build run: | docker exec arch-container bash -c " - iso_file=\$(ls /workdir/out/*.iso 2>/dev/null | head -n 1) && - [ -n \"\$iso_file\" ] && mv \$iso_file /workdir/out/Arch.iso || echo 'No ISO file found.' + set -euo pipefail + cd ${{ env.WORKSPACE }} + rm -rf ${{ env.BUILD_DIR }} ${{ env.OUTPUT_DIR }} + mkdir -p ${{ env.BUILD_DIR }} ${{ env.OUTPUT_DIR }} + mkarchiso -v -w ${{ env.BUILD_DIR }} -o ${{ env.OUTPUT_DIR }} . " - - name: Copy ISO to Host + - name: Verify ISO run: | - docker cp arch-container:/workdir/out/Arch.iso ${{ github.workspace }}/ || echo 'Failed to copy ISO to host.' 
- - - name: Get current date - id: date - run: echo "date=$(date +'%Y-%m-%d')" >> $GITHUB_ENV + docker exec arch-container bash -c " + set -euo pipefail + cd ${{ env.OUTPUT_DIR }} + + # Check if ISO exists + iso_count=\$(ls -1 *.iso 2>/dev/null | wc -l) + if [ \$iso_count -eq 0 ]; then + echo '::error::No ISO file found' + exit 1 + elif [ \$iso_count -gt 1 ]; then + echo '::error::Multiple ISO files found' + exit 1 + fi + + iso_file=\$(ls *.iso) + + # Check ISO size (minimum 500MB) + size=\$(stat -c%s \"\$iso_file\") + if [ \$size -lt 524288000 ]; then + echo \"::error::ISO file too small: \$((\$size / 1024 / 1024))MB\" + exit 1 + fi + + # Verify ISO checksum + sha256sum \"\$iso_file\" > checksum.sha256 + sha256sum -c checksum.sha256 || { + echo '::error::ISO checksum verification failed' + exit 1 + } + " - - name: Create GitHub Release - id: create_release - uses: actions/create-release@v1.1.4 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - tag_name: v${{ github.run_id }}-release - release_name: "Arch Linux Release" - body: "Arch Linux ISO built on ${{ steps.date.outputs.date }}" - draft: false - prerelease: false - - - name: Upload ISO to GitHub Release - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_release.outputs.upload_url }} - asset_path: ${{ github.workspace }}/Arch.iso - asset_name: Arch.iso - asset_content_type: application/octet-stream - - - name: Delete GitHub Release - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - run: | - release_id=$(curl -s \ - -H "Authorization: token $GITHUB_TOKEN" \ - -H "Accept: application/vnd.github.v3+json" \ - https://api.github.com/repos/${{ github.repository }}/releases/tags/v${{ github.run_id }}-release | jq -r .id) && - curl -X DELETE \ - -H "Authorization: token $GITHUB_TOKEN" \ - -H "Accept: application/vnd.github.v3+json" \ - https://api.github.com/repos/${{ github.repository }}/releases/$release_id - - - name: Delete Git Tag - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + - name: Clean Up + if: always() run: | - curl -X DELETE \ - -H "Authorization: token $GITHUB_TOKEN" \ - -H "Accept: application/vnd.github.v3+json" \ - https://api.github.com/repos/${{ github.repository }}/git/refs/tags/v${{ github.run_id }}-release + if docker ps -a | grep -q arch-container; then + docker stop arch-container || true + docker rm -f arch-container || true + fi + sudo rm -rf ${{ env.BUILD_DIR }} ${{ env.OUTPUT_DIR }} - - name: Clean Up + - name: Report Status + if: always() run: | - docker stop arch-container || echo 'Failed to stop the container.' - docker rm arch-container || echo 'Failed to remove the container.' 
+ if [ "${{ job.status }}" = "success" ]; then + echo "βœ… Build check passed successfully" + else + echo "❌ Build check failed" + exit 1 + fi diff --git a/.github/workflows/build.yaml b/.github/workflows/build.yaml index 1caedaf5..0b4bc509 100644 --- a/.github/workflows/build.yaml +++ b/.github/workflows/build.yaml @@ -4,81 +4,177 @@ on: workflow_dispatch: schedule: - cron: '0 0 * * *' # Run the workflow every day at midnight + push: + branches: + - main + - dev + paths-ignore: + - '**.md' + - '.gitignore' + +env: + DOCKER_BUILDKIT: 1 + ISO_FILENAME: Arch.iso jobs: build: - runs-on: ubuntu-latest # Use a standard runner + runs-on: ubuntu-latest + timeout-minutes: 120 # Set a timeout to prevent hung builds steps: - name: Checkout Repository uses: actions/checkout@v4 + - name: Set up environment variables + id: env + run: | + echo "DATE=$(date +'%Y-%m-%d')" >> $GITHUB_ENV + echo "VERSION=$(date +'%Y.%m.%d')" >> $GITHUB_ENV + echo "CACHE_KEY=$(date +'%Y-%m')" >> $GITHUB_ENV + echo "WORKSPACE=${GITHUB_WORKSPACE}" >> $GITHUB_ENV + + - name: Create Cache Directories + run: | + sudo mkdir -p /tmp/pacman-cache + sudo chmod 777 /tmp/pacman-cache + # Ensure the directory is empty to prevent tar errors + sudo rm -rf /tmp/pacman-cache/* + + - name: Cache Pacman packages + uses: actions/cache@v3 + with: + path: /tmp/pacman-cache + key: pacman-${{ runner.os }}-${{ env.CACHE_KEY }} + restore-keys: | + pacman-${{ runner.os }}- + - name: Set up Arch Linux Container run: | - docker run --privileged --name arch-container -d -v ${{ github.workspace }}:/workdir archlinux:latest sleep infinity + docker run --privileged --name arch-container -d \ + -v ${{ env.WORKSPACE }}:/workdir \ + -v /tmp/pacman-cache:/var/cache/pacman/pkg \ + archlinux:latest sleep infinity - - name: Build ISO in Arch Container + - name: Initialize Container run: | - set -e docker exec arch-container bash -c " - pacman -Syu --noconfirm && - pacman -S --noconfirm git archiso grub && - cd /workdir && - mkarchiso -v -w workdir/ -o out/ . + set -euo pipefail + + # Update package database and system + pacman -Syu --noconfirm + + # Install required packages + pacman -S --noconfirm --needed \ + git \ + archiso \ + grub \ + curl \ + jq \ + gnupg \ + make \ + sudo + + # Verify installation + command -v mkarchiso >/dev/null 2>&1 || { + echo '::error::mkarchiso not found' + exit 1 + } " - - name: Rename ISO to Arch.iso + - name: Build ISO + id: build run: | - set -e docker exec arch-container bash -c " - iso_file=\$(ls /workdir/out/*.iso 2>/dev/null | head -n 1) && - [ -n \"\$iso_file\" ] && mv \$iso_file /workdir/out/Arch.iso || echo 'No ISO file found.' + set -euo pipefail + cd /workdir + + # Cleanup any previous builds + rm -rf workdir/ out/ + mkdir -p out/ + + # Build the ISO with verbose output + mkarchiso -v -w workdir/ -o out/ . 2>&1 | tee build.log || { + echo '::error::ISO build failed!' + tail -n 50 build.log + exit 1 + } + + # Verify ISO was created + [ -f out/*.iso ] || { + echo '::error::ISO file not found after build' + exit 1 + } " - - name: List ISO files + - name: Generate Checksums run: | - docker exec arch-container bash -c "ls -l /workdir/out/" || echo 'Failed to list files.' 
+ docker exec arch-container bash -c " + set -euo pipefail + cd /workdir/out + + # Generate checksums + for iso in *.iso; do + sha256sum \"\$iso\" > \"\${iso}.sha256sum\" + sha512sum \"\$iso\" > \"\${iso}.sha512sum\" + done + " - - name: Copy ISO to Host + - name: Rename and Move ISO run: | - docker cp arch-container:/workdir/out/Arch.iso ${{ github.workspace }}/ || echo 'Failed to copy ISO to host.' + docker exec arch-container bash -c " + set -euo pipefail + cd /workdir/out + + for f in *.iso; do + newname=\"arch-linux-no-beeps-${{ env.VERSION }}.iso\" + mv \"\$f\" \"\$newname\" + mv \"\$f.sha256sum\" \"\$newname.sha256sum\" + mv \"\$f.sha512sum\" \"\$newname.sha512sum\" + done + " - - name: Upload ISO Artifact - uses: actions/upload-artifact@v3 - with: - name: Arch.iso - path: ${{ github.workspace }}/Arch.iso - - - name: Get current date - id: date - run: echo "DATE=$(date +'%Y-%m-%d')" >> $GITHUB_ENV - - # Create a release on GitHub using GITHUB_TOKEN - - name: Create GitHub Release - id: create_release # Adding an ID to reference the release step - uses: actions/create-release@v1.1.4 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + + - name: Create Release + id: create_release + uses: softprops/action-gh-release@v1 + if: github.ref == 'refs/heads/main' with: - tag_name: "v${{ github.run_id }}-release" - release_name: "Arch Linux Release" + tag_name: v${{ env.VERSION }} + name: "Arch Linux No Beeps v${{ env.VERSION }}" body: | - This release contains the Arch Linux ISO built on ${{ env.DATE }}. + πŸš€ Arch Linux ISO without system beeps (build ${{ env.DATE }}) + + ### Changes + - Automatic daily build + - System beeps disabled + - ISO SHA256 and SHA512 checksums added + + ### Download + - Download the ISO and verify checksums before use + + ### Checksums + SHA256 and SHA512 checksums are available in the uploaded files. draft: false prerelease: false - - # Upload the ISO to the GitHub release with a specific, predictable name - - name: Upload ISO to GitHub Release - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_release.outputs.upload_url }} - asset_path: ${{ github.workspace }}/Arch.iso - asset_name: Arch.iso - asset_content_type: application/octet-stream + files: | + ${{ env.WORKSPACE }}/out/*.iso + ${{ env.WORKSPACE }}/out/*.sha*sum - name: Clean Up + if: always() run: | - docker stop arch-container || echo 'Failed to stop the container.' - docker rm arch-container || echo 'Failed to remove the container.' 
\ No newline at end of file + if docker ps -a | grep -q arch-container; then + docker stop arch-container || true + docker rm -f arch-container || true + fi + sudo rm -rf workdir/ out/ /tmp/pacman-cache/* + + - name: Upload Build Logs on Failure + if: failure() + uses: actions/upload-artifact@v4 # Upgrade to v4 + with: + name: build-logs + path: | + ${{ env.WORKSPACE }}/build.log + retention-days: 5 + compression-level: 9 # Maximum compression for logs \ No newline at end of file diff --git a/.github/workflows/update-docs.yaml b/.github/workflows/update-docs.yaml new file mode 100644 index 00000000..1d572a76 --- /dev/null +++ b/.github/workflows/update-docs.yaml @@ -0,0 +1,132 @@ +name: Update Documentation + +on: + push: + branches: + - main + - dev + paths: + - 'packages.x86_64' + - 'airootfs/**' + - '.github/workflows/**' + workflow_dispatch: + +jobs: + update-docs: + runs-on: ubuntu-latest + permissions: + contents: write + pull-requests: write + + steps: + - name: Checkout Repository + uses: actions/checkout@v4 + + - name: Set up Python + uses: actions/setup-python@v4 + with: + python-version: '3.x' + + - name: Install Dependencies + run: | + python -m pip install --upgrade pip + pip install PyYAML markdown + + - name: Generate Package List Documentation + run: | + echo "## πŸ“¦ Installed Packages" > packages_doc.md + echo "" >> packages_doc.md + echo "This ISO contains the following packages:" >> packages_doc.md + echo "" >> packages_doc.md + echo "| Package | Description |" >> packages_doc.md + echo "|---------|-------------|" >> packages_doc.md + + # Set up Docker container for package info + docker run --name arch-container -d archlinux:latest sleep infinity + docker exec arch-container pacman -Sy + + while read -r pkg; do + # Skip comments and empty lines + [[ "$pkg" =~ ^#.*$ ]] && continue + [[ -z "$pkg" ]] && continue + + # Get package description + desc=$(docker exec arch-container bash -c "pacman -Si $pkg 2>/dev/null | grep Description | cut -d: -f2- || echo 'No description available'") + echo "| \`$pkg\` | ${desc:-No description available} |" >> packages_doc.md + done < packages.x86_64 + + docker stop arch-container + docker rm arch-container + + - name: Update README + run: | + # Backup current README + cp README.md README.md.bak + + # Update package section in README + awk ' + /## πŸ“¦ Installed Packages/,/##/ { next } + /## πŸ“¦ Installed Packages/ { + print + system("cat packages_doc.md") + next + } + { print } + ' README.md.bak > README.md + + # Clean up + rm README.md.bak packages_doc.md + + - name: Generate Workflow Documentation + run: | + echo "## πŸ”„ Automated Workflows" > workflows_doc.md + echo "" >> workflows_doc.md + echo "This project uses the following GitHub Actions workflows:" >> workflows_doc.md + echo "" >> workflows_doc.md + + for workflow in .github/workflows/*.yaml; do + name=$(grep "^name:" "$workflow" | head -n1 | cut -d: -f2- | xargs) + echo "### $name" >> workflows_doc.md + echo "" >> workflows_doc.md + echo "File: \`$(basename "$workflow")\`" >> workflows_doc.md + echo "" >> workflows_doc.md + + # Extract description based on triggers + echo "Triggered by:" >> workflows_doc.md + if grep -q "workflow_dispatch:" "$workflow"; then + echo "- πŸ”˜ Manual trigger" >> workflows_doc.md + fi + if grep -q "schedule:" "$workflow"; then + cron=$(grep -A1 "schedule:" "$workflow" | grep "cron:" | cut -d"'" -f2) + echo "- ⏰ Scheduled: \`$cron\`" >> workflows_doc.md + fi + if grep -q "push:" "$workflow"; then + echo "- πŸ“€ Push to repository" >> 
workflows_doc.md + fi + if grep -q "pull_request:" "$workflow"; then + echo "- πŸ”„ Pull request" >> workflows_doc.md + fi + echo "" >> workflows_doc.md + done + + - name: Create Pull Request + uses: peter-evans/create-pull-request@v5 + with: + token: ${{ secrets.GITHUB_TOKEN }} + commit-message: "docs: update automated documentation" + title: "πŸ“š Documentation Update" + body: | + πŸ”„ Automated documentation update + + This PR includes: + - Updated package list + - Updated workflow documentation + - General documentation improvements + + This PR was automatically generated by the update-docs workflow. + branch: update-documentation + base: ${{ github.ref_name }} + labels: | + documentation + automated + draft: false \ No newline at end of file diff --git a/.github/workflows/update-packages.yaml b/.github/workflows/update-packages.yaml new file mode 100644 index 00000000..68a04ab6 --- /dev/null +++ b/.github/workflows/update-packages.yaml @@ -0,0 +1,125 @@ +name: Update Packages + +on: + schedule: + - cron: '0 2 * * *' # Run at 2 AM UTC daily + workflow_dispatch: # Allow manual trigger + +jobs: + update-packages: + runs-on: ubuntu-latest + permissions: + contents: write + pull-requests: write + + steps: + - name: Checkout Repository + uses: actions/checkout@v4 + with: + ref: dev + + - name: Set up environment variables + id: env + run: | + echo "WORKSPACE=${GITHUB_WORKSPACE}" >> $GITHUB_ENV + echo "CACHE_KEY=$(date +'%Y-%m')" >> $GITHUB_ENV + + - name: Create Cache Directory + run: | + sudo mkdir -p /tmp/pacman-cache + sudo chmod 777 /tmp/pacman-cache + + - name: Cache Pacman packages + uses: actions/cache@v3 + with: + path: /tmp/pacman-cache + key: pacman-${{ env.CACHE_KEY }} + restore-keys: | + pacman- + + - name: Set up Docker + run: | + docker run --privileged --name arch-container -d \ + -v ${{ env.WORKSPACE }}:/workdir \ + -v /tmp/pacman-cache:/var/cache/pacman/pkg \ + archlinux:latest sleep infinity + + - name: Initialize Container + run: | + docker exec arch-container bash -c " + set -euo pipefail + cd /workdir + + # Update package database + pacman -Sy --noconfirm + + # Install required packages + pacman -S --noconfirm --needed curl jq + " + + - name: Check for Package Updates + id: check-updates + run: | + docker exec arch-container bash -c " + set -euo pipefail + cd /workdir + + # Create temporary files in workspace + touch current-packages.txt updates.txt + + # Get current packages + grep -v '^#' packages.x86_64 | grep -v '^$' > current-packages.txt + + # Initialize pacman + pacman -Sy + + # Process each package + while read -r pkg; do + if pacman -Si \"\$pkg\" >/dev/null 2>&1; then + current_ver=\$(pacman -Si \"\$pkg\" | grep Version | head -n1 | awk '{print \$3}') + echo \"\$pkg \$current_ver\" >> updates.txt + else + echo \"Warning: Package \$pkg not found in repositories\" + fi + done < current-packages.txt + + # Check if we have updates + if [ -s updates.txt ]; then + echo 'updates_available=true' >> \$GITHUB_OUTPUT + echo 'Found updates:' + cat updates.txt + else + echo 'updates_available=false' >> \$GITHUB_OUTPUT + echo 'No updates found' + fi + " + + - name: Create Pull Request + if: steps.check-updates.outputs.updates_available == 'true' + uses: peter-evans/create-pull-request@v5 + with: + token: ${{ secrets.GITHUB_TOKEN }} + commit-message: "chore: update package versions" + title: "πŸ“¦ Automatic Package Updates" + body: | + πŸ”„ Automatic package update + + The following packages have been updated to their latest version: + ``` + $(cat updates.txt) + ``` + + This 
PR was automatically generated by the update-packages workflow. + branch: package-updates + base: dev + labels: | + automated + dependencies + draft: false + + - name: Clean Up + if: always() + run: | + docker stop arch-container || true + docker rm arch-container || true + sudo rm -rf /tmp/pacman-cache/* \ No newline at end of file diff --git a/workflows_doc.md b/workflows_doc.md new file mode 100644 index 00000000..d6b7bf7f --- /dev/null +++ b/workflows_doc.md @@ -0,0 +1,49 @@ +## πŸ”„ Automated Workflows + +This project uses the following GitHub Actions workflows: + +### Validate and Test Build + +File: `build-check.yaml` + +Triggered by: +- πŸ”˜ Manual trigger +- ⏰ Scheduled: `0 0 * * *` +- πŸ”„ Pull request + +### Build ISO + +File: `build.yaml` + +Triggered by: +- πŸ”˜ Manual trigger +- ⏰ Scheduled: `0 0 * * *` +- πŸ“€ Push to repository + +### Check to make sure Dockerfile works + +File: `dockerfile-check.yaml` + +Triggered by: +- πŸ”˜ Manual trigger +- ⏰ Scheduled: `` +- πŸ”„ Pull request + +### Update Documentation + +File: `update-docs.yaml` + +Triggered by: +- πŸ”˜ Manual trigger +- ⏰ Scheduled: `" -f2)` +- πŸ“€ Push to repository +- πŸ”„ Pull request + +### Update Packages + +File: `update-packages.yaml` + +Triggered by: +- πŸ”˜ Manual trigger +- ⏰ Scheduled: `0 2 * * *` +
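
The `Scheduled:` entries in the generated workflows_doc.md above come from the `grep -A1 "schedule:" | grep "cron:" | cut -d"'" -f2` extraction in update-docs.yaml; because that step scans raw text, it can also match the extraction script's own source and emit fragments such as `" -f2)` instead of a cron expression. A more robust alternative is to parse each workflow's YAML rather than grepping it. The sketch below is a minimal, hedged illustration, not the project's actual step: it assumes PyYAML is available (the update-docs workflow already runs `pip install PyYAML`), shows only the trigger-extraction part, and simply mirrors the loop and output file of the existing "Generate Workflow Documentation" step.

```bash
# Sketch only: extract each workflow's triggers by parsing the YAML with PyYAML
# instead of grepping raw text. Assumes `pip install PyYAML` has already run
# (update-docs.yaml installs it); loop and output file mirror the existing step.
for workflow in .github/workflows/*.yaml; do
  python3 - "$workflow" >> workflows_doc.md <<'PYEOF'
import sys
import yaml

with open(sys.argv[1]) as f:
    doc = yaml.safe_load(f) or {}

# PyYAML (YAML 1.1) parses the bare key `on` as the boolean True, so check both keys.
triggers = doc.get("on", doc.get(True)) or {}

if "workflow_dispatch" in triggers:
    print("- πŸ”˜ Manual trigger")
if isinstance(triggers, dict):
    for entry in triggers.get("schedule") or []:
        print(f"- ⏰ Scheduled: `{entry.get('cron', '')}`")
if "push" in triggers:
    print("- πŸ“€ Push to repository")
if "pull_request" in triggers:
    print("- πŸ”„ Pull request")
PYEOF
done
```

Because the triggers are read from the parsed `on:` mapping rather than from raw text, the script never matches its own `schedule:`/`cron:` strings, so self-referential fragments cannot leak into the generated documentation.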