Commit 5c4b8cd

Merge branch 'master' into dependabot/pip/Solutions/Open-Systems/DataConnectors/aiohttp-3.13.3

2 parents: a4e605e + c5902e0

1,121 files changed: +31,539 additions, -85,434 deletions

.github/workflows/aws-s3-bundle-update.yaml (new file)

Lines changed: 106 additions & 0 deletions

```yaml
name: AWS-S3 DataConnector Bundle Auto-Update
run-name: Auto-updating AWS-S3 bundles for ${{ github.event.pull_request.head.ref }}

on:
  pull_request:
    branches:
      - master
    paths:
      # Trigger when any of these files in AWS-S3 directory change
      - 'DataConnectors/AWS-S3/*.ps1'
      - 'DataConnectors/AWS-S3/*.py'
      - 'DataConnectors/AWS-S3/*.md'
      - 'DataConnectors/AWS-S3/CloudFormation/**'
      - 'DataConnectors/AWS-S3/Enviornment/**'
      - 'DataConnectors/AWS-S3/Utils/**'
      # Don't trigger on zip file changes (to avoid recursion)
      - '!DataConnectors/AWS-S3/*.zip'
      # Don't trigger on bundle automation documentation changes (not bundled)
      - '!DataConnectors/AWS-S3/BUNDLE_AUTOMATION.md'

  # Allow manual workflow dispatch for testing
  workflow_dispatch:

jobs:
  auto-update-bundles:
    # Security: Block workflow execution on forked repositories
    if: ${{ !github.event.pull_request.head.repo.fork }}
    runs-on: ubuntu-latest
    permissions:
      contents: write
      pull-requests: write

    steps:
      - name: Generate a token
        id: generate_token
        uses: actions/create-github-app-token@v1
        with:
          app-id: ${{ secrets.APPLICATION_ID }}
          private-key: ${{ secrets.APPLICATION_PRIVATE_KEY }}

      - name: Checkout PR branch with sparse checkout
        uses: actions/checkout@v4
        with:
          token: ${{ steps.generate_token.outputs.token }}
          ref: ${{ github.event.pull_request.head.ref }}
          fetch-depth: 2 # Just need HEAD and parent for git diff
          persist-credentials: false # Security: Don't persist credentials after checkout
          sparse-checkout: |
            DataConnectors/AWS-S3
            .script
          sparse-checkout-cone-mode: false

      - name: Restore bundling script from base branch
        run: |
          # Security: Use trusted script from base branch to prevent malicious PR modifications
          # Fetch the base branch to ensure we have the reference
          git fetch origin ${{ github.base_ref || 'master' }}:refs/remotes/origin/${{ github.base_ref || 'master' }}
          git checkout origin/${{ github.base_ref || 'master' }} -- .script/bundleAwsS3Scripts.sh
          chmod +x .script/bundleAwsS3Scripts.sh

      - name: Check if auto-update needed
        id: check_update
        run: |
          # Skip if this commit already updated bundles (prevent loops)
          if git log -1 --name-only | grep -q "ConfigAwsS3DataConnectorScripts.*\.zip"; then
            echo "skip=true" >> $GITHUB_OUTPUT
            echo "Bundles already updated in latest commit"
          else
            echo "skip=false" >> $GITHUB_OUTPUT
          fi

      - name: Update bundles
        if: steps.check_update.outputs.skip != 'true'
        run: |
          .script/bundleAwsS3Scripts.sh

      - name: Commit updated bundles
        if: steps.check_update.outputs.skip != 'true'
        env:
          GITHUB_TOKEN: ${{ steps.generate_token.outputs.token }}
        run: |
          git config --local user.email "action@github.com"
          git config --local user.name "GitHub Action"

          # Configure remote with token for push (needed due to persist-credentials: false)
          git remote set-url origin https://x-access-token:${GITHUB_TOKEN}@github.com/${{ github.repository }}.git

          # Stage zip files
          git add DataConnectors/AWS-S3/ConfigAwsS3DataConnectorScripts*.zip

          # Check if there are changes to commit
          if ! git diff --cached --quiet; then
            git commit -m "Auto-update AWS-S3 DataConnector bundles

          - Updated ConfigAwsS3DataConnectorScripts.zip
          - Updated ConfigAwsS3DataConnectorScriptsGov.zip
          - Changes triggered by: ${{ github.event.pull_request.head.sha }}

          [skip ci]"

            git push origin ${{ github.event.pull_request.head.ref }}

            echo "✅ Successfully updated and committed bundle files"
          else
            echo "ℹ️ No bundle changes detected"
          fi
```

.github/workflows/update-solutions-analyzer.yml

Lines changed: 14 additions & 15 deletions

```diff
@@ -8,7 +8,7 @@ on:
       - 'Solutions/**/*.json'
       - 'Solutions/**/Parsers/**/*.yaml'
       - 'Solutions/**/Parsers/**/*.yml'
-      - 'Tools/Solutions Analyzer/solution_connector_tables.py'
+      - 'Tools/Solutions Analyzer/map_solutions_connectors_tables.py'
   workflow_dispatch: # Allow manual trigger
   schedule:
     # Run weekly on Monday at 2 AM UTC to catch any missed changes
@@ -39,17 +39,15 @@ jobs:
       - name: Run Solutions Analyzer
         run: |
           cd "Tools/Solutions Analyzer"
-          python solution_connector_tables.py
+          python map_solutions_connectors_tables.py

-      - name: Generate Connector Documentation
-        run: |
-          cd "Tools/Solutions Analyzer"
-          python generate_connector_docs.py
+      # Note: Documentation generation removed - docs are now hosted in a separate repo
+      # See: https://github.com/oshezaf/sentinelninja/tree/main/Solutions%20Docs

       - name: Check for changes
         id: check_changes
         run: |
-          if git diff --quiet "Tools/Solutions Analyzer/solutions_connectors_tables_mapping.csv" "Tools/Solutions Analyzer/solutions_connectors_tables_issues_and_exceptions_report.csv" "Tools/Solutions Analyzer/connector-docs/"; then
+          if git diff --quiet "Tools/Solutions Analyzer/solutions_connectors_tables_mapping.csv" "Tools/Solutions Analyzer/solutions_connectors_tables_issues_and_exceptions_report.csv"; then
             echo "changed=false" >> $GITHUB_OUTPUT
           else
             echo "changed=true" >> $GITHUB_OUTPUT
@@ -61,27 +59,27 @@
         uses: peter-evans/create-pull-request@v6
         with:
           token: ${{ secrets.GITHUB_TOKEN }}
-          commit-message: 'chore: Update Solutions Analyzer CSV files and documentation'
+          commit-message: 'chore: Update Solutions Analyzer CSV files'
           branch: solutions-analyzer-update
           delete-branch: true
-          title: 'chore: Update Solutions Analyzer CSV files and documentation'
+          title: 'chore: Update Solutions Analyzer CSV files'
           body: |
             ## Automated Solutions Analyzer Update

             This PR contains automated updates to:
             - Solutions connector-to-tables mapping CSV
             - Solutions issues and exceptions report CSV
-            - Connector documentation files

             Generated by the Solutions Analyzer workflow.

+            **Note:** Documentation is now hosted separately at https://github.com/oshezaf/sentinelninja
+
             **Triggered by:** ${{ github.event_name }}
             **Workflow run:** ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}
-          labels: automated, documentation
+          labels: automated
           add-paths: |
             Tools/Solutions Analyzer/solutions_connectors_tables_mapping.csv
             Tools/Solutions Analyzer/solutions_connectors_tables_issues_and_exceptions_report.csv
-            Tools/Solutions Analyzer/connector-docs/

       - name: Enable auto-merge
         if: steps.check_changes.outputs.changed == 'true' && steps.create_pr.outputs.pull-request-number != ''
@@ -95,16 +93,17 @@
         run: |
           echo "### Solutions Analyzer Pull Request Created :white_check_mark:" >> $GITHUB_STEP_SUMMARY
           echo "" >> $GITHUB_STEP_SUMMARY
-          echo "A pull request has been created with updated CSV files and documentation." >> $GITHUB_STEP_SUMMARY
+          echo "A pull request has been created with updated CSV files." >> $GITHUB_STEP_SUMMARY
           echo "" >> $GITHUB_STEP_SUMMARY
           echo "**Modified files:**" >> $GITHUB_STEP_SUMMARY
           echo "- Tools/Solutions Analyzer/solutions_connectors_tables_mapping.csv" >> $GITHUB_STEP_SUMMARY
           echo "- Tools/Solutions Analyzer/solutions_connectors_tables_issues_and_exceptions_report.csv" >> $GITHUB_STEP_SUMMARY
-          echo "- Tools/Solutions Analyzer/connector-docs/" >> $GITHUB_STEP_SUMMARY
+          echo "" >> $GITHUB_STEP_SUMMARY
+          echo "**Note:** Documentation is hosted at https://github.com/oshezaf/sentinelninja" >> $GITHUB_STEP_SUMMARY

       - name: No changes summary
         if: steps.check_changes.outputs.changed == 'false'
         run: |
           echo "### Solutions Analyzer :information_source:" >> $GITHUB_STEP_SUMMARY
           echo "" >> $GITHUB_STEP_SUMMARY
-          echo "No changes detected. CSV files and documentation are already up-to-date." >> $GITHUB_STEP_SUMMARY
+          echo "No changes detected. CSV files are already up-to-date." >> $GITHUB_STEP_SUMMARY
```
DataConnectors/AWS-S3/BUNDLE_AUTOMATION.md (new file)

Lines changed: 192 additions & 0 deletions

# AWS-S3 DataConnector Scripts Bundle Automation

## Overview

The AWS-S3 DataConnector scripts are automatically bundled into zip files whenever changes are made to the source files. This automation ensures that the distributed zip files are always up-to-date with the latest script changes.

## Automated Bundles

Two main zip files are automatically maintained:

1. **ConfigAwsS3DataConnectorScripts.zip** - For Commercial Azure
   - Contains: `ConfigAwsComToAzureCom.zip` and `ConfigAwsGovToAzureCom.zip`
   - Includes both `CloudWatchLambdaFunction.py` and `CloudWatchLambdaFunction_V2.py`

2. **ConfigAwsS3DataConnectorScriptsGov.zip** - For Government Azure
   - Contains: `ConfigAwsComToAzureGov.zip` and `ConfigAwsGovToAzureGov.zip`
   - Includes only `CloudWatchLambdaFunction.py` (V1)
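To verify the nested layout of the generated bundles locally, you can list each archive's contents; a minimal sketch, assuming the standard `unzip` utility is on PATH:

```bash
# List the top level of each bundle; each should contain two nested ConfigAws*.zip files
for bundle in DataConnectors/AWS-S3/ConfigAwsS3DataConnectorScripts.zip \
              DataConnectors/AWS-S3/ConfigAwsS3DataConnectorScriptsGov.zip; do
  echo "== $bundle =="
  unzip -l "$bundle"
done
```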
## How It Works

### GitHub Actions Workflow

The automation is implemented via a GitHub Actions workflow (`.github/workflows/aws-s3-bundle-update.yaml`) that:

1. **Triggers automatically** on:
   - **Pull Requests** targeting the `master` branch
   - When changes affect:
     - `*.ps1` files in the AWS-S3 directory
     - `*.py` files in the AWS-S3 directory
     - `*.md` files in the AWS-S3 directory
     - Files in the `CloudFormation/`, `Enviornment/`, or `Utils/` subdirectories

2. **Auto-Update Mode**:
   - Runs the bundling script to regenerate zip files
   - Automatically commits updated bundles to the PR branch
   - Includes a `[skip ci]` flag to prevent workflow recursion
   - Developers don't need to manually update bundles - it's handled automatically
   - Commits are made by the GitHub Action bot with a clear description

3. **Prevents recursion** by:
   - Excluding zip file changes from triggering the workflow
   - Checking whether the latest commit already contains zip updates
   - Using the `[skip ci]` flag in auto-commit messages
### Bundling Script

The `.script/bundleAwsS3Scripts.sh` script uses intelligent, dynamic bundling:

- **Dynamic File Detection**: Automatically detects changed files using `git diff` (see the sketch after this list)
  - Respects `GITHUB_BASE_REF` in CI/CD environments
  - Falls back to `HEAD~1` for local execution
  - Filters out `.zip` files and documentation automatically
- **Intelligent Updates**: Extracts existing zip files and only replaces modified files
  - Uses `cmp -s` to compare file contents
  - Preserves unchanged files to minimize bundle changes
- **Variant Handling**: Automatically manages differences between Commercial and Government bundles
  - Commercial: Includes both Lambda V1 and V2
  - Government: Includes only Lambda V1
- **Nested Structure**: Creates the proper nested zip file structure
- **Fallback Safety**: If no changes are detected, bundles all relevant files to ensure completeness
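The committed implementation lives in `.script/bundleAwsS3Scripts.sh`; a rough sketch of the detection behavior described above might look like this (the function body here is illustrative, not the actual script):

```bash
# Illustrative approximation of the script's change detection
get_changed_files() {
  local base
  if [ -n "$GITHUB_BASE_REF" ]; then
    base="origin/$GITHUB_BASE_REF"   # CI/CD: compare against the PR's base branch
  else
    base="HEAD~1"                    # Local: compare against the previous commit
  fi
  git diff --name-only "$base" -- DataConnectors/AWS-S3 \
    | grep -v '\.zip$' \
    | grep -v 'BUNDLE_AUTOMATION\.md$'
}
```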
## Files Included in Bundles

The bundling script uses **dynamic file detection** to automatically determine which files to include:

### Dynamic Detection Process

1. **Changed Files Detection**: The script uses `git diff` to detect files that have been modified in the AWS-S3 directory
2. **Automatic Filtering**: Excludes `.zip` files and `BUNDLE_AUTOMATION.md` from the bundle
3. **Fallback Mechanism**: If no changes are detected, all relevant files in the AWS-S3 directory are bundled (approximated in the sketch below)

### File Types Included

The script automatically bundles:

- **PowerShell scripts** (`*.ps1`) - Configuration and connector scripts
- **Python files** (`*.py`) - Lambda functions
- **Markdown documentation** (`*.md`) - Policy and usage documentation
- **CloudFormation templates** - Infrastructure-as-code definitions
- **Utility scripts** - Helper functions and shared code in the `Utils/` directory
- **Environment configuration** - Settings in the `Enviornment/` directory
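The fallback path can be approximated with a `find` over the monitored file types; this is a sketch only, since the real script's exact selection may differ:

```bash
# Fallback: when git diff reports no changes, collect every bundle-relevant file
find DataConnectors/AWS-S3 \
  \( -name '*.ps1' -o -name '*.py' -o -name '*.md' \
     -o -path '*/CloudFormation/*' -o -path '*/Utils/*' -o -path '*/Enviornment/*' \) \
  -type f ! -name 'BUNDLE_AUTOMATION.md' ! -name '*.zip'
```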
### Bundle Variants

**Commercial Azure Bundles** (`ConfigAwsS3DataConnectorScripts.zip`):
- Include both `CloudWatchLambdaFunction.py` and `CloudWatchLambdaFunction_V2.py`
- Contain two nested zips: `ConfigAwsComToAzureCom.zip` and `ConfigAwsGovToAzureCom.zip`

**Government Azure Bundles** (`ConfigAwsS3DataConnectorScriptsGov.zip`):
- Include only `CloudWatchLambdaFunction.py` (V1)
- Contain two nested zips: `ConfigAwsComToAzureGov.zip` and `ConfigAwsGovToAzureGov.zip`
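The "only replaces modified files" behavior described under **Bundling Script** above can be pictured as follows; a hedged sketch assuming `unzip`, `zip`, and `cmp` are available (the helper name is hypothetical):

```bash
# Replace a file inside an existing bundle only when its contents changed
update_in_bundle() {
  local bundle="$1" src="$2"   # e.g. ConfigAwsComToAzureCom.zip, CloudWatchLambdaFunction.py
  local workdir
  workdir="$(mktemp -d)"
  unzip -q "$bundle" -d "$workdir"
  if ! cmp -s "$src" "$workdir/$(basename "$src")"; then
    # zip -j replaces (or adds) the entry without rebuilding the whole archive
    zip -j -q "$bundle" "$src"
    echo "updated $(basename "$src") in $bundle"
  fi
  rm -rf "$workdir"
}
```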
### Adding New Files

Simply add or modify files in the `DataConnectors/AWS-S3/` directory. The bundling script will automatically detect and include them - no manual configuration needed!
## Manual Bundle Generation

If needed, you can manually regenerate the bundles:

```bash
# From the repository root
.script/bundleAwsS3Scripts.sh
```

Or trigger the workflow manually:

1. Go to the Actions tab in the GitHub repository
2. Select the "AWS-S3 DataConnector Bundle Auto-Update" workflow
3. Click "Run workflow"
4. Select the branch and click "Run workflow"
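If you use the GitHub CLI, the same manual dispatch can be done from a terminal (branch name is a placeholder):

```bash
# Equivalent of the "Run workflow" button, using the GitHub CLI
gh workflow run "AWS-S3 DataConnector Bundle Auto-Update" --ref <your-branch>
```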
## Troubleshooting

### Bundles not auto-updated in PR

If bundles aren't automatically updated:

1. Check the GitHub Actions tab to see if the workflow ran
2. Verify your changes are in monitored paths (`*.ps1`, `*.py`, `*.md`, `CloudFormation/`, `Enviornment/`, `Utils/`)
3. If the workflow succeeded but no commit appeared, the bundles may already be up-to-date
4. Manually trigger the workflow from the Actions tab if needed

### Manual bundle update needed

If you prefer to update bundles manually, or the workflow fails:

1. Run the bundling script locally:

   ```bash
   .script/bundleAwsS3Scripts.sh
   ```

2. Commit the updated zip files:

   ```bash
   git add DataConnectors/AWS-S3/*.zip
   git commit -m "Update AWS-S3 bundles"
   git push
   ```
### Workflow doesn't trigger

- Ensure changes are in the monitored paths (see above)
- Check that the PR targets the `master` branch
- Verify the workflow file exists and is valid YAML
- Check that zip files weren't the only changes (they're excluded from triggers)

### Recursion issues

If the workflow triggers itself repeatedly:

- Check that the auto-commit message includes `[skip ci]`
- Verify the workflow doesn't trigger on zip file changes
- Review the `check_update` step logic in the workflow
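To check locally whether the loop guard would fire, the workflow's own test can be reproduced verbatim:

```bash
# Same check the workflow runs: did the latest commit already touch the bundles?
if git log -1 --name-only | grep -q "ConfigAwsS3DataConnectorScripts.*\.zip"; then
  echo "Bundles already updated in latest commit - workflow skips rebundling"
else
  echo "Workflow would proceed to rebundle"
fi
```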
## Development Notes

### Adding New Files to Bundles

**No configuration needed!** The bundling script uses dynamic file detection:

1. Simply add or modify files in the `DataConnectors/AWS-S3/` directory
2. The script automatically detects changes via `git diff`
3. New files are automatically included in the next bundle generation

The script intelligently handles:
- New PowerShell scripts (`*.ps1`)
- New Python files (`*.py`)
- New documentation (`*.md`)
- New files in the `CloudFormation/`, `Enviornment/`, or `Utils/` subdirectories

### Modifying Bundle Structure

To change which files go in which bundle variant (Commercial vs. Government):

1. Edit the `create_nested_zip` function in `.script/bundleAwsS3Scripts.sh` (a sketch of its shape follows this list)
2. Adjust the logic for the `lambda_version` parameter
3. Test locally: `.script/bundleAwsS3Scripts.sh`
4. Commit your changes
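For orientation, the variant split might look roughly like the following. This is an assumed shape, not the committed implementation; the argument order, staging approach, and file lists are guesses:

```bash
# Hypothetical sketch of create_nested_zip; the real function is in .script/bundleAwsS3Scripts.sh
create_nested_zip() {
  local out_zip="$1" lambda_version="$2"; shift 2   # remaining args: files to bundle
  local staging
  staging="$(mktemp -d)"
  cp "$@" "$staging/"
  cp DataConnectors/AWS-S3/CloudWatchLambdaFunction.py "$staging/"
  if [ "$lambda_version" = "v2" ]; then
    # Commercial bundles also carry the V2 Lambda; Government bundles do not
    cp DataConnectors/AWS-S3/CloudWatchLambdaFunction_V2.py "$staging/"
  fi
  (cd "$staging" && zip -q -r "$OLDPWD/$out_zip" .)
  rm -rf "$staging"
}
```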
### Understanding Dynamic Detection

The script's `get_changed_files()` function:

- Compares the current branch against the base branch (in PRs) or the last commit (locally)
- Automatically filters out `.zip` files and `BUNDLE_AUTOMATION.md`
- Falls back to including all files if no changes are detected
- Works seamlessly in both CI/CD and local development environments
## Benefits

- **Consistency**: Bundles are always in sync with source files
- **Automation**: No manual zip file creation needed
- **Transparency**: All changes are tracked in Git
- **Reliability**: Automated testing ensures bundles are created correctly
- **Documentation**: Clear process for maintenance and updates
Binary file not shown (7 KB)

Binary file not shown (7.07 KB)

DataConnectors/AWS-S3/README.md

Lines changed: 3 additions & 0 deletions

```diff
@@ -64,3 +64,6 @@ The `ConfigAwsConnector.ps1` script has two parameters:
 - `-LogPath` specifies a custom path to create the script activity log file.
 - `-AwsLogType` specifies the AWS log type to configure. Valid options are: "VPC", "CloudTrail", "GuardDuty". If this parameter is specified, the user will not be prompted for this information.

+## Script Bundle Automation
+
+The configuration scripts are automatically bundled into zip files (`ConfigAwsS3DataConnectorScripts.zip` and `ConfigAwsS3DataConnectorScriptsGov.zip`) whenever changes are made to the source files. This automation ensures that the distributed bundles are always up-to-date with the latest script versions. For more information about the bundling process, see [BUNDLE_AUTOMATION.md](BUNDLE_AUTOMATION.md).
```

Solutions/CTM360/Data Connectors/CBS/requirements.txt

Lines changed: 1 addition & 1 deletion

```diff
@@ -3,4 +3,4 @@
 # Manually managing azure-functions-worker may cause unexpected issues

 azure-functions==1.6.0
-requests==2.31.0
+requests==2.32.4
```
