
Commit 8a35d64

Merge pull request #241957 from guywi-ms/migrate-to-LI-API
Migrate to Log Ingestion api
2 parents afd054e + 9cf19bc commit 8a35d64

1 file changed: +74 −29 lines changed

articles/azure-monitor/logs/custom-logs-migrate.md

---
title: Migrate from the HTTP Data Collector API to the Log Ingestion API
description: Migrate from the legacy Azure Monitor Data Collector API to the Log Ingestion API, which provides more processing power and greater flexibility.
author: guywi-ms
ms.author: guywild
ms.reviewer: ivankh
ms.topic: how-to
ms.date: 05/23/2023
---

# Migrate from the HTTP Data Collector API to the Log Ingestion API to send data to Azure Monitor Logs

The Azure Monitor [Log Ingestion API](../logs/logs-ingestion-api-overview.md) provides more processing power and greater flexibility in ingesting logs and [managing tables](../logs/manage-logs-tables.md) than the legacy [HTTP Data Collector API](../logs/data-collector-api.md). This article describes the differences between the Data Collector API and the Log Ingestion API and provides guidance and best practices for migrating to the new Log Ingestion API.

> [!NOTE]
> As a Microsoft MVP, [Morten Waltorp Knudsen](https://mortenknudsen.net/) contributed to and provided material feedback for this article. For an example of how you can automate the setup and ongoing use of the Log Ingestion API, see Morten's publicly available [AzLogDcrIngestPS PowerShell module](https://github.com/KnudsenMorten/AzLogDcrIngestPS).

## Advantages of the Log Ingestion API

The Log Ingestion API provides the following advantages over the Data Collector API:

- Supports [transformations](../essentials/data-collection-transformations.md), which enable you to modify the data before it's ingested into the destination table, including filtering and data manipulation.
- Lets you send data to multiple destinations.
- Enables you to manage the destination table schema, including column names and whether to add new columns to the destination table when the source data schema changes.

## Prerequisites

The migration procedure described in this article assumes you have:

- A Log Analytics workspace where you have at least [contributor rights](manage-access.md#azure-rbac).
- [Permissions to create data collection rules](../essentials/data-collection-rule-overview.md#permissions) in the Log Analytics workspace.
- [An Azure AD application to authenticate API calls](../logs/tutorial-logs-ingestion-portal.md#create-azure-ad-application) or any other Resource Manager authentication scheme.
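
If you authenticate with an Azure AD application, each Log Ingestion API call carries a bearer token obtained through the client credentials flow. The following is a minimal sketch of the token request, not prescriptive guidance: the tenant ID, app ID, and secret are placeholders, and the scope shown is the one the Logs ingestion tutorials use:

```rest
POST https://login.microsoftonline.com/{tenantId}/oauth2/v2.0/token
Content-Type: application/x-www-form-urlencoded

client_id={appId}&client_secret={appSecret}&scope=https%3A%2F%2Fmonitor.azure.com%2F%2F.default&grant_type=client_credentials
```

The `access_token` in the response goes into the `Authorization` header of subsequent ingestion calls.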

## Create new resources required for the Log Ingestion API

The Log Ingestion API requires you to create two new types of resources, which the HTTP Data Collector API doesn't require:

- [Data collection endpoints](../essentials/data-collection-endpoint-overview.md), from which the data you collect is ingested into the pipeline for processing.
- [Data collection rules](../essentials/data-collection-rule-overview.md), which define [data transformations](../essentials/data-collection-transformations.md) and the destination table to which the data is ingested.
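
To create these resources programmatically rather than in the Azure portal, you call the Azure Resource Manager API. The following is a minimal sketch of a data collection rule that also demonstrates a transformation; the stream name `Custom-MyTable_CL`, the destination name `myWorkspace`, the column set, and the filter query are hypothetical placeholders rather than values from this article:

```rest
PUT https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Insights/dataCollectionRules/{dcrName}?api-version=2022-06-01

{
  "location": "{region}",
  "properties": {
    "dataCollectionEndpointId": "{dataCollectionEndpointResourceId}",
    "streamDeclarations": {
      "Custom-MyTable_CL": {
        "columns": [
          { "name": "TimeGenerated", "type": "datetime" },
          { "name": "RawData", "type": "string" }
        ]
      }
    },
    "destinations": {
      "logAnalytics": [
        {
          "workspaceResourceId": "{workspaceResourceId}",
          "name": "myWorkspace"
        }
      ]
    },
    "dataFlows": [
      {
        "streams": [ "Custom-MyTable_CL" ],
        "destinations": [ "myWorkspace" ],
        "transformKql": "source | where RawData !has 'DEBUG'",
        "outputStream": "Custom-MyTable_CL"
      }
    ]
  }
}
```

The `transformKql` query runs on incoming records before they're written to the destination table, which is where the filtering and data manipulation described under the advantages above is applied.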

## Migrate existing custom tables or create new tables

If you have an existing custom table to which you currently send data using the Data Collector API, you can:

- Migrate the table to continue ingesting data into the same table using the Log Ingestion API.
- Maintain the existing table and data and set up a new table into which you ingest data using the Log Ingestion API. You can then delete the old table when you're ready.

The second, side-by-side option is preferred, especially if you need to make changes to the existing table: changes to existing data types and multiple schema changes to existing Data Collector API custom tables can lead to errors.

This table summarizes considerations to keep in mind for each option:

||Table migration|Side-by-side implementation|
|-|-|-|
|**Table and column naming**|Reuse the existing table name.<br>Column naming options:<br>- Use new column names and define a transformation to direct incoming data to the newly named column.<br>- Continue using the old names.|Set the new table name freely.<br>Adjust integrations, dashboards, and alerts before switching to the new table.|
|**Migration procedure**|One-off table migration. It's not possible to roll back a migrated table.|Migration can be done gradually, per table.|
|**Post-migration**|You can continue to ingest data into existing columns, except custom columns, using the HTTP Data Collector API.<br>Ingest data into new columns using the Log Ingestion API only.|Data in the old table is available until the end of the retention period.<br>When you first set up a new table or make schema changes, it can take 10-15 minutes for the data changes to start appearing in the destination table.|

To convert a table that uses the Data Collector API to data collection rules and the Log Ingestion API, issue this API call against the table:

```rest
POST https://management.azure.com/subscriptions/{subscriptionId}/resourcegroups/{resourceGroupName}/providers/Microsoft.OperationalInsights/workspaces/{workspaceName}/tables/{tableName}/migrate?api-version=2021-12-01-preview
```

This call is idempotent, so it has no effect if the table has already been converted.

The API call enables all DCR-based custom logs features on the table. The Data Collector API continues to ingest data into existing columns, but it won't create any new columns, and any previously defined [custom fields](../logs/custom-fields.md) will no longer be populated. Another way to migrate an existing table to data collection rules, without necessarily using the Log Ingestion API, is to apply a [workspace transformation](../logs/tutorial-workspace-transformations-portal.md) to the table.

> [!IMPORTANT]
> - Column names must start with a letter and can consist of up to 45 alphanumeric characters and the characters `_` and `-`.
> - The following are reserved column names: `Type`, `TenantId`, `resource`, `resourceid`, `resourcename`, `resourcetype`, `subscriptionid`, `tenanted`.
> - Custom columns you add to an Azure table must have the suffix `_CF`.
> - If you update the table schema in your Log Analytics workspace, you must also update the input stream definition in the data collection rule to ingest data into new or modified columns.

## Call the Log Ingestion API

The Log Ingestion API lets you send up to 1 MB of compressed or uncompressed data per call. If you need to send more than 1 MB of data, you can send multiple calls in parallel. This is a change from the Data Collector API, which lets you send up to 32 MB of data per call.

For information about how to call the Log Ingestion API, see [Log Ingestion REST API call](../logs/logs-ingestion-api-overview.md#rest-api-call).
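
For orientation, a call to the Log Ingestion API takes roughly the following shape. This is a sketch rather than the authoritative format (see the link above): the data collection endpoint URI, DCR immutable ID, stream name, bearer token, and example record are all placeholders:

```rest
POST {dataCollectionEndpointURI}/dataCollectionRules/{dcrImmutableId}/streams/Custom-MyTable_CL?api-version=2023-01-01
Authorization: Bearer {token}
Content-Type: application/json

[
  {
    "TimeGenerated": "2023-05-23T10:00:00Z",
    "RawData": "Example log line"
  }
]
```

Because each call is limited to 1 MB, you split a larger payload into several such arrays and send them as parallel calls.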

## Modify table schemas and data collection rules based on changes to the source data object

While the Data Collector API automatically adjusts the destination table schema when the source data object schema changes, the Log Ingestion API doesn't. This ensures that you don't collect new data into columns that you didn't intend to create.

When the source data schema changes, you can:

- [Modify destination table schemas](../logs/create-custom-table.md) and [data collection rules](../essentials/data-collection-rule-edit.md) to align with source data schema changes.
- [Define a transformation](../essentials/data-collection-transformations.md) in the data collection rule to send the new data into existing columns in the destination table, as shown in the sketch after this list.
- Leave the destination table and data collection rule unchanged. In this case, you won't ingest the new data.
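
For example, suppose the source starts sending an extra `level` field that you want to fold into the existing `RawData` column instead of creating a new column. Continuing the hypothetical names from the earlier sketch, you'd add the new column to the input stream declaration and adjust only the transformation, then resubmit the full rule definition with a `PUT`, since a data collection rule is replaced as a whole when edited:

```rest
PUT https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Insights/dataCollectionRules/{dcrName}?api-version=2022-06-01

{
  "location": "{region}",
  "properties": {
    "dataCollectionEndpointId": "{dataCollectionEndpointResourceId}",
    "streamDeclarations": {
      "Custom-MyTable_CL": {
        "columns": [
          { "name": "TimeGenerated", "type": "datetime" },
          { "name": "RawData", "type": "string" },
          { "name": "level", "type": "string" }
        ]
      }
    },
    "destinations": {
      "logAnalytics": [
        { "workspaceResourceId": "{workspaceResourceId}", "name": "myWorkspace" }
      ]
    },
    "dataFlows": [
      {
        "streams": [ "Custom-MyTable_CL" ],
        "destinations": [ "myWorkspace" ],
        "transformKql": "source | extend RawData = strcat(level, ': ', RawData) | project-away level",
        "outputStream": "Custom-MyTable_CL"
      }
    ]
  }
}
```

The destination table schema stays unchanged, so no table migration or schema update is needed for this kind of change.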

> [!NOTE]
> You can't reuse a column name with a data type that's different from the original data type defined for the column.

## Next steps
