Commit 2b09cf8 (2 parents: f6f247c + 5e131f5)

Merge remote-tracking branch 'upstream/master'

# Conflicts:
#   Tools/Solutions Analyzer/solutions_connectors_tables_mapping.csv

File tree

519 files changed (+286,692 / -33,836 lines)

.github/copilot-instructions.md

Lines changed: 82 additions & 0 deletions
@@ -0,0 +1,82 @@
# GitHub Copilot Instructions for Azure-Sentinel Repository

## Solutions Analyzer Tools

When working with the Solutions Analyzer tools in `Tools/Solutions Analyzer/`:

### Output Locations

There are THREE different output scenarios - **never confuse them**:

1. **Default (development):** CSVs are written to `Tools/Solutions Analyzer/` in the current branch
   - This is the normal case when developing/testing
   - **Never generate documentation here**

2. **Output worktree (publishing CSVs):** `C:\Users\ofshezaf\GitHub\Azure-Sentinel-solution-analyzer-output\Tools\Solutions Analyzer`
   - Only use this when **specifically requested** to "publish CSVs to the output branch"
   - This is a separate git worktree for the CSV output branch
   - **Only CSVs go here, never documentation**

3. **Documentation output:** `C:\Users\ofshezaf\GitHub\sentinelninja\Solutions Docs`
   - This is where generated markdown documentation goes
   - This is in a **separate repository** (sentinelninja)
   - Empty the target folder before generating new docs

### Key Rules

- **Never generate docs locally** in the Azure-Sentinel repository
- **Generate docs only in the sentinelninja repo** when asked or needed
- **For official CSV releases**, generate CSVs **only** in the solution analyzer output worktree
- Always use the `--output-dir` flag when running `generate_connector_docs.py`

### Running Scripts

#### Mapper Script

```powershell
cd "Tools/Solutions Analyzer"
python map_solutions_connectors_tables.py
```

**Note:** Do NOT truncate or filter the output (e.g., do not pipe through `Select-Object`). The script prints timestamped progress messages to the console that the user needs to see. Run with `isBackground: false` and `timeout: 0` so the full output is visible.

#### Documentation Generator

```powershell
python generate_connector_docs.py --output-dir "C:\Users\ofshezaf\GitHub\sentinelninja\Solutions Docs" --skip-input-generation
```

**IMPORTANT:** Never run without the `--output-dir` flag.

**IMPORTANT:** Do NOT truncate or filter the output (e.g., do not pipe through `Select-Object`). Run with `isBackground: false` and `timeout: 0` so the full output is visible to the user.

**IMPORTANT:** Always use `--skip-input-generation` unless you specifically need to regenerate the input CSVs (mapper + collect_table_info). Without this flag, the doc generator will re-run those scripts automatically, which is slow and unnecessary if the CSVs are already up to date.

**IMPORTANT:** Run the mapper script before generating docs if:
- The mapper script itself was modified, OR
- Any override YAML file in the `overrides/` folder was modified (including adding, editing, or removing `additional_connectors` entries), OR
- You specifically need to refresh the CSV data, OR
- You are explicitly asked to run the mapper
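The "run the mapper first" rules above are a plain logical OR of four conditions. A hypothetical Python predicate (the function and argument names are invented for illustration; the real workflow is a manual judgment call) makes the rule explicit:

```python
# Hypothetical predicate mirroring the "run the mapper first" rules above.
# All names here are invented for illustration only.
def should_run_mapper(mapper_script_modified: bool,
                      overrides_modified: bool,
                      need_fresh_csvs: bool,
                      explicitly_requested: bool) -> bool:
    """Return True when the mapper must run before generating docs."""
    return (mapper_script_modified
            or overrides_modified
            or need_fresh_csvs
            or explicitly_requested)
```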
### Caching and Logging

- **Cache:** `.cache/` folder for analysis caching
- **Logs:** `.logs/` folder for log files

**Log file:** `Tools/Solutions Analyzer/.logs/map_solutions_connectors_tables.log`

Use `--force-refresh` with these types when modifying analysis logic:
- `asim` - ASIM parser analysis
- `parsers` - Non-ASIM parser analysis
- `solutions` - Solution content analysis
- `standalone` - Standalone content item analysis
- `marketplace` - Marketplace availability check (requires network)
- `tables` - Table reference info (requires network)
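Only the last two cache types above hit the network, which matters when deciding whether a refresh can run offline. A small illustrative mapping (the type names come from the list above; the dict and helper are assumptions, not part of the tooling):

```python
# Illustrative only: maps each --force-refresh type from the list above
# to whether refreshing it requires network access.
FORCE_REFRESH_TYPES = {
    "asim": False,         # ASIM parser analysis
    "parsers": False,      # non-ASIM parser analysis
    "solutions": False,    # solution content analysis
    "standalone": False,   # standalone content item analysis
    "marketplace": True,   # marketplace availability check (network)
    "tables": True,        # table reference info (network)
}

def needs_network(refresh_types):
    """Return True if any requested refresh type requires network access."""
    return any(FORCE_REFRESH_TYPES[t] for t in refresh_types)
```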
### Script Documentation

**Before updating a script:** Always review the relevant script documentation in `Tools/Solutions Analyzer/docs/` first.

**When updating a script**, update the corresponding script doc to reflect:
- Any script parameters added or changed
- Any output file changes, including changes to CSV files (new columns, renamed columns, removed columns)
- Any changes to analysis methods or logic
- Update the primary readme.md if needed, and add the change to the changelog. Do not add a new version entry if the previous version, as recorded in the changelog, has not been committed yet.

.github/workflows/runAsimSchemaAndDataTesters.yaml

Lines changed: 41 additions & 1 deletion
```diff
@@ -43,6 +43,14 @@ jobs:
     outputs:
       approved: ${{ steps.check-approval.outputs.approved }}
     steps:
+      - name: Checkout pull request branch
+        if: github.event.pull_request != null
+        uses: actions/checkout@v3
+        with:
+          ref: ${{ github.event.pull_request.head.sha }}
+          repository: ${{ github.event.pull_request.head.repo.full_name }}
+          fetch-depth: 0
+
       - name: Check if PR needs approval
         id: check-approval
         run: |
```
```diff
@@ -96,7 +104,39 @@ jobs:
             echo "needs_approval=false" >> $GITHUB_OUTPUT
             echo "comment_needed=false" >> $GITHUB_OUTPUT
           fi
-
+
+      - name: Prevent fork modifications to test infrastructure
+        if: github.event.pull_request.head.repo.fork == true
+        shell: bash
+        run: |
+          set -euo pipefail
+
+          log_info() { echo "ℹ️ $1"; }
+          log_error() { echo "❌ $1"; }
+          log_success() { echo "✅ $1"; }
+
+          log_info "Checking for modifications to asimParsersTest folder in fork PR..."
+
+          # We are currently checked out at the fork's HEAD SHA (actions/checkout did that).
+          # Add the base repo as a remote and fetch the base branch, so we can diff reliably.
+          git remote remove upstream 2>/dev/null || true
+          git remote add upstream "https://github.com/${{ github.repository }}.git"
+
+          log_info "Fetching base branch ${{ github.event.pull_request.base.ref }} from upstream..."
+          git fetch --no-tags --prune upstream "${{ github.event.pull_request.base.ref }}"
+
+          # Diff base branch (FETCH_HEAD) -> current HEAD (fork head)
+          modified_files="$(git diff --name-only "FETCH_HEAD...HEAD")"
+
+          if echo "$modified_files" | grep -E "\.script/tests/asimParsersTest/" > /dev/null; then
+            log_error "Fork PRs cannot modify the asimParsersTest test infrastructure folder"
+            log_error "Modified test files detected:"
+            echo "$modified_files" | grep "\.script/tests/asimParsersTest/" | sed 's/^/ - /'
+            exit 1
+          fi
+
+          log_success "No modifications to asimParsersTest folder detected"

       - name: Comment on fork PR for approval guidance
         if: |
           always() &&
```
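The core of the guard above is a substring check over the changed paths, mirroring the `grep` (note the pattern is unanchored, so it matches the protected folder anywhere in a path). A minimal Python sketch, with an invented helper name:

```python
# Sketch of the guard's core check: flag any changed path that touches
# the protected test folder, mirroring the unanchored grep in the step above.
PROTECTED = ".script/tests/asimParsersTest/"

def find_protected_changes(modified_files):
    """Return the subset of changed paths under the protected folder."""
    return [path for path in modified_files if PROTECTED in path]
```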

.gitignore

Lines changed: 1 addition & 0 deletions
```diff
@@ -346,3 +346,4 @@ Hunting Queries/DeployedQueries.json
 .script/**/*.js.map
 .script/**/*.d.ts
 .script/**/*.d.ts.map
+/.vscode
```

.script/package-automation/catalogAPI.ps1

Lines changed: 22 additions & 0 deletions
```diff
@@ -25,6 +25,28 @@ function GetCatalogDetails($offerId)
         return $null;
     }
     else {
+        # Handle case where multiple offers are returned with same OfferId
+        if ($offerDetails -is [System.Object[]] -and $offerDetails.Count -gt 1)
+        {
+            Write-Host "Multiple offers found for offerId $offerId. Matching by publisherId from baseMetadata."
+            $matched = $offerDetails | Where-Object { $_.publisherId -eq $baseMetadata.publisherId }
+            if ($null -ne $matched)
+            {
+                if ($matched -is [System.Object[]])
+                {
+                    $offerDetails = $matched[0]
+                }
+                else
+                {
+                    $offerDetails = $matched
+                }
+            }
+            else
+            {
+                Write-Host "No offer matched publisherId '$($baseMetadata.publisherId)'. Defaulting to first offer."
+                $offerDetails = $offerDetails[0]
+            }
+        }
         Write-Host "CatalogAPI Details found for offerId $offerId"
         return $offerDetails;
     }
```
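The disambiguation above (prefer the offer whose `publisherId` matches `baseMetadata`, else fall back to the first) can be sketched in Python. The function name is invented, and the dict keys mirror the PowerShell property names; note the PowerShell scalar-vs-array branching collapses naturally here because a list comprehension always yields a list:

```python
# Python sketch of the offer-selection logic above (illustrative only).
def pick_offer(offers, publisher_id):
    """Given catalog offers sharing an offerId, prefer the one whose
    publisherId matches; otherwise fall back to the first offer."""
    if not isinstance(offers, list):
        return offers  # single offer: nothing to disambiguate
    matched = [o for o in offers if o.get("publisherId") == publisher_id]
    return matched[0] if matched else offers[0]
```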

.script/tests/KqlvalidationsTests/CustomFunctions/AWSCloudTrail.json

Lines changed: 4 additions & 0 deletions
```diff
@@ -146,6 +146,10 @@
       "name": "EC2RoleDelivery",
       "type": "String"
     },
+    {
+      "name": "UserIdentityAccessKeyId",
+      "type": "String"
+    },
     {
       "name": "Session*",
       "type": "String"
```
Lines changed: 57 additions & 0 deletions
@@ -0,0 +1,57 @@
```json
{
  "Name": "VersasecCmsErrorLogs",
  "Properties": [
    { "name": "TimeGenerated", "type": "datetime" },
    { "name": "EventVendor", "type": "string" },
    { "name": "EventProduct", "type": "string" },
    { "name": "CmsErrorID", "type": "real" },
    { "name": "ErrorCode", "type": "string" },
    { "name": "CmsErrorIDStrg", "type": "string" },
    { "name": "ErrorId", "type": "real" },
    { "name": "ComputerName", "type": "string" },
    { "name": "ClientId", "type": "string" },
    { "name": "ErrorMessage", "type": "string" },
    { "name": "TargetUsername", "type": "real" },
    { "name": "SupportTicket", "type": "string" },
    { "name": "TicketReference", "type": "string" }
  ]
}
```
Lines changed: 39 additions & 0 deletions
@@ -0,0 +1,39 @@
```json
{
  "Name": "VersasecCmsErrorLogs_CL",
  "Properties": [
    { "name": "TimeGenerated", "type": "datetime" },
    { "name": "CmsErrorID", "type": "real" },
    { "name": "CmsErrorIDCode", "type": "string" },
    { "name": "CmsErrorIDStrg", "type": "string" },
    { "name": "ID", "type": "real" },
    { "name": "ComputerName", "type": "string" },
    { "name": "CLID", "type": "string" },
    { "name": "ErrorStrg", "type": "string" },
    { "name": "UserID", "type": "real" },
    { "name": "SupportTicket", "type": "string" },
    { "name": "TicketRef", "type": "string" }
  ]
}
```
Lines changed: 53 additions & 0 deletions
@@ -0,0 +1,53 @@
```json
{
  "Name": "VersasecCmsSysLogs",
  "Properties": [
    { "name": "TimeGenerated", "type": "datetime" },
    { "name": "EventVendor", "type": "string" },
    { "name": "EventProduct", "type": "string" },
    { "name": "EventId", "type": "real" },
    { "name": "EventResult", "type": "string" },
    { "name": "ActivitySummary", "type": "string" },
    { "name": "SyslogID", "type": "real" },
    { "name": "ComputerName", "type": "string" },
    { "name": "TargetUsername", "type": "string" },
    { "name": "Parameter", "type": "string" },
    { "name": "UserID", "type": "real" },
    { "name": "TicketReference", "type": "string" }
  ]
}
```
Lines changed: 45 additions & 0 deletions
@@ -0,0 +1,45 @@
```json
{
  "Name": "VersasecCmsSysLogs_CL",
  "Properties": [
    { "name": "TimeGenerated", "type": "datetime" },
    { "name": "SyslogID", "type": "real" },
    { "name": "SyslogIDCode", "type": "string" },
    { "name": "SyslogIDStrg", "type": "string" },
    { "name": "ID", "type": "real" },
    { "name": "ComputerName", "type": "string" },
    { "name": "CLID", "type": "string" },
    { "name": "Param1", "type": "string" },
    { "name": "UserID", "type": "real" },
    { "name": "TicketRef", "type": "string" }
  ]
}
```

0 commit comments
