Commit ef23a7c

Learn Editor: Update work-with-stix-objects-indicators.md
1 parent dd012ee commit ef23a7c

1 file changed

+8 -8 lines changed

articles/sentinel/work-with-stix-objects-indicators.md

Lines changed: 8 additions & 8 deletions
@@ -23,12 +23,12 @@ For more information about threat intelligence in Microsoft Sentinel, see [Threa
 > [!IMPORTANT]
 > Microsoft Sentinel will ingest all threat intelligence into the new `ThreatIntelIndicators` and `ThreatIntelObjects` tables, while continuing to ingest the same data into the legacy `ThreatIntelligenceIndicator` table until July 31, 2025.
 > **Be sure to update your custom queries, analytics and detection rules, workbooks, and automation to use the new tables by July 31, 2025.** After this date, Microsoft Sentinel will stop ingesting data to the legacy `ThreatIntelligenceIndicator` table. We're updating all out-of-the-box threat intelligence solutions in Content hub to leverage the new tables.
-> We've made some important updates to the our data re-publishing.
-> 1. Previously, data was divided and re-published to Log Analytics over a **12-day period**. Now, **all data** is re-published every **7 days** at the beginning of each week. You can identify this data in the `ThreatIntelIndicators` and `ThreatIntelObjects` tables by checking if `SourceSystem` equals `LogARepublisher`.
-> 2. The new tables now support additional columns, including the `Data` column, which contains the full data object used in advanced hunting scenarios. If these columns don't align with your scenario, learn more about filtering out [columns](#transform-away-columns-sent-to-log-analytics) and [rows](#transform-away-rows-sent-to-log-analytics) before ingestion to Log Analytics.
-> 3. To optimize ingestion to Log Analytics, key-value pairs with no data will be excluded. Additionally, some fields within the `Data` column—such as `description` and `pattern`—will be truncated if they exceed 1,000 characters.
-> For more details on the updated schema and how it may affect your usage, see [ThreatIntelIndicators](/azure/azure-monitor/reference/tables/threatintelindicators) and [ThreatIntelObjects](/azure/azure-monitor/reference/tables/threatintelobjects).
->
+> We introduced important updates to the data republishing processes.
+> 1. Previously, data was divided and republished to Log Analytics over a **12-day period**. Now, **all data** is republished every **7 days** at the beginning of each week. You can identify this data in the `ThreatIntelIndicators` and `ThreatIntelObjects` tables by checking if `SourceSystem` equals `LogARepublisher`.
+> 2. The new tables now support more columns, including the `Data` column, which contains the full data object used in advanced hunting scenarios. If these columns don't align with your scenario, learn more about filtering out [columns](#transform-away-columns-sent-to-log-analytics) and [rows](#transform-away-rows-sent-to-log-analytics) before ingestion to Log Analytics.
+> 3. To optimize ingestion to Log Analytics, key-value pairs with no data are excluded. Additionally, some fields within the `Data` column—such as `description` and `pattern`—are truncated if they exceed 1,000 characters.
+> For more information on the updated schema and how it might affect your usage, see [ThreatIntelIndicators](/azure/azure-monitor/reference/tables/threatintelindicators) and [ThreatIntelObjects](/azure/azure-monitor/reference/tables/threatintelobjects).
+
 ## Identify threat actors associated with specific threat indicators
 
 This query is an example of how to correlate threat indicators, such as IP addresses, with threat actors:
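Point 1 of the updated note above says republished rows can be recognized by `SourceSystem` equal to `LogARepublisher`. A minimal sketch of that check, not part of this commit (the daily binning and ordering are only illustrative):

```Kusto
// Count rows written by the weekly republishing job, per day.
// 'LogARepublisher' is the SourceSystem value called out in the note above.
ThreatIntelIndicators
| where SourceSystem == "LogARepublisher"
| summarize RepublishedRows = count() by bin(TimeGenerated, 1d)
| order by TimeGenerated asc
```

The same filter works against `ThreatIntelObjects`, since the note applies the `SourceSystem` value to both tables.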
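Point 3 says fields inside the `Data` column, such as `description` and `pattern`, are truncated when they exceed 1,000 characters. A hedged spot check for indicators at that limit; the query isn't part of the article, and the exact shape of `Data` and the `Id` column are assumptions:

```Kusto
// Find indicators whose STIX description sits at the 1,000-character truncation boundary.
// parse_json(tostring(...)) keeps this working whether Data arrives as a string or as dynamic JSON.
ThreatIntelIndicators
| extend StixDescription = tostring(parse_json(tostring(Data)).description)
| where strlen(StixDescription) >= 1000
| project TimeGenerated, Id, StixDescription
| take 10
```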
@@ -150,15 +150,15 @@ ThreatIntelIndicators
 [Transformations in Azure Monitor](/azure/azure-monitor/data-collection/data-collection-transformations) allow you to filter or modify incoming data before it's stored in a Log Analytics workspace. They're implemented as a Kusto Query Language (KQL) statement in a [data collection rule (DCR)](/azure/azure-monitor/data-collection/data-collection-rule-overview). Learn more about how to [create workspace transformations](/azure/azure-monitor/data-collection/data-collection-transformations-create?tabs=portal#create-workspace-transformation-dcr) and the [cost for transformations](/azure/azure-monitor/data-collection/data-collection-transformations#cost-for-transformations).
 
 ### Transform away columns sent to Log Analytics
-The `ThreatIntelIndicator` and `ThreatIntelObjects` tables include a Data column that contains the full original STIX object. If this column is not relevant to your use case, you can filter it out before ingestion using the following KQL statement:
+The `ThreatIntelIndicator` and `ThreatIntelObjects` tables include a `Data` column that contains the full original STIX object. If this column isn't relevant to your use case, you can filter it out before ingestion using the following KQL statement:
 
 ```Kusto
 source
 | project-away Data
 ```
 
 ### Transform away rows sent to Log Analytics
-The `ThreatIntelIndicators` table always receives at least one row for each unexpired indicator. In some cases, the STIX pattern cannot be parsed into key/value pairs. When this happens, the indicator is still sent to Log Analytics, but only the raw, unparsed pattern is included—allowing users to build custom analytics if needed. If these rows are not useful for your scenario, you can filter them out before ingestion using the following KQL statement:
+The `ThreatIntelIndicators` table always receives at least one row for each unexpired indicator. In some cases, the STIX pattern can't be parsed into key/value pairs. When this happens, the indicator is still sent to Log Analytics, but only the raw, unparsed pattern is included—allowing users to build custom analytics if needed. If these rows aren't useful for your scenario, you can filter them out before ingestion using the following KQL statement:
 
 ```Kusto
 source
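Once a transformation like the `project-away Data` statement above is attached to the workspace DCR, newly ingested rows should arrive without that column. A minimal verification sketch, not part of this commit, assuming the standard `TimeGenerated` column and that older rows may still carry `Data`:

```Kusto
// Confirm the column transform is in effect for recently ingested rows.
// Rows ingested before the transform was deployed can still contain Data.
ThreatIntelIndicators
| where TimeGenerated > ago(1d)
| summarize RowsWithData = countif(isnotempty(Data)), TotalRows = count()
```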
