articles/backup/backup-azure-monitor-alert-faq.md
ms.date: 07/08/2019
# Azure Backup Monitoring Alert - FAQ
This article answers common questions about Azure Backup monitoring and reporting.
## Configure Azure Backup reports
### How do I check if reporting data has started flowing into a Log Analytics (LA) Workspace?

Go to the LA Workspace you configured, open the **Logs** menu item, and run the query `CoreAzureBackup | take 1`. If a record is returned, data has started flowing into the workspace. The initial data push may take up to 24 hours.
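A slightly broader version of this check counts arriving records per table. This is a sketch, not from the original article: it assumes the `AddonAzureBackupJobs` table referenced later in this article is also populated, and uses the standard Log Analytics `union` operator and `Type` column:

````Kusto
// Count arriving records per backup table to confirm the data flow
union CoreAzureBackup, AddonAzureBackupJobs
| summarize RecordCount = count() by Type
````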
### What is the frequency of data pushes to an LA Workspace?

The diagnostic data from the vault is pumped to the Log Analytics workspace with some lag. Every event arrives at the Log Analytics workspace 20 to 30 minutes after it's pushed from the Recovery Services vault. Here are further details about the lag:
* Across all solutions, the backup service's built-in alerts are pushed as soon as they're created, so they usually appear in the Log Analytics workspace after 20 to 30 minutes.
* Across all solutions, on-demand backup jobs and restore jobs are pushed as soon as they finish.
* For all solutions except SQL backup, scheduled backup jobs are pushed as soon as they finish.
* For SQL backup, because log backups can occur every 15 minutes, information for all the completed scheduled backup jobs, including logs, is batched and pushed every 6 hours.
* Across all solutions, other information such as the backup item, policy, recovery points, storage, and so on, is pushed at least once per day.
* A change in the backup configuration (such as changing policy or editing policy) triggers a push of all related backup information.
### How long can I retain reporting data?

After you create an LA Workspace, you can choose to retain data for a maximum of 2 years. By default, an LA Workspace retains data for 31 days.
### Will I see all my data in reports after I configure the LA Workspace?

All the data generated after you configure diagnostics settings is pushed to the LA Workspace and is available in reports. In-progress jobs aren't pushed for reporting. After a job finishes or fails, it's sent to reports.
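Because only finished jobs are pushed, a query over job records should show terminal states only. A minimal sketch, with the table and column names taken from the queries later in this article:

````Kusto
// Distribution of job states; in-progress jobs aren't expected here
AddonAzureBackupJobs
| summarize JobCount = count() by JobStatus
````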
### Can I view reports across vaults and subscriptions?

Yes, you can view reports across vaults and subscriptions, as well as regions. Your data may reside in a single LA Workspace or a group of LA Workspaces.
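When data spans a group of LA Workspaces, the standard Log Analytics `workspace()` function can stitch them together in a single query. A sketch with hypothetical workspace names:

````Kusto
// Combine job records from two workspaces (workspace names are placeholders)
union workspace("contoso-ws-eastus").AddonAzureBackupJobs,
      workspace("contoso-ws-westus").AddonAzureBackupJobs
| summarize JobCount = count() by JobStatus
````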
### Can I view reports across tenants?

If you're an [Azure Lighthouse](https://azure.microsoft.com/services/azure-lighthouse/) user with delegated access to your customers' subscriptions or LA Workspaces, you can use Backup Reports to view data across all your tenants.
### How long does it take for the Azure Backup agent job status to reflect in the portal?
> [!NOTE]
> Data from Azure VM backups, the Azure Backup agent, System Center Data Protection Manager, SQL backups in Azure VMs, and Azure Files share backups is pumped to the Log Analytics workspace through diagnostic settings. Support for Microsoft Azure Backup Server (MABS) will be added soon.
To monitor and report at scale, you need the capabilities of two Azure services. *Diagnostic settings* send data from multiple Azure Resource Manager resources to another resource. *Log Analytics* generates custom alerts, where you can use action groups to define other notification channels.

The following sections detail how to use Log Analytics to monitor Azure Backup at scale.
### Configure diagnostic settings

Azure Resource Manager resources, such as the Recovery Services vault, record information about scheduled operations and user-triggered operations as diagnostic data.

In the monitoring section, select **Diagnostic settings**, and specify the target for the Recovery Services vault's diagnostic data. You can target a Log Analytics workspace from another subscription. To monitor vaults across subscriptions in a single place, select the same Log Analytics workspace for multiple Recovery Services vaults. To channel all the information that's related to Azure Backup to the Log Analytics workspace, choose **AzureDiagnostics** in the toggle that appears, and select the **AzureBackupReport** event.
> [!IMPORTANT]
> After you finish the configuration, wait 24 hours for the initial data push to finish. After that initial push, all the events are pushed as described later in this article, in the [frequency section](#diagnostic-data-update-frequency).
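After the initial push completes, the **AzureBackupReport** category chosen above can be spot-checked in the workspace. A minimal sketch, using the same filter that appears in the queries later in this article:

````Kusto
// Sample a few raw diagnostic records for the backup report category
AzureDiagnostics
| where Category == "AzureBackupReport"
| take 5
````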
### Deploy a solution to the Log Analytics workspace

> [!IMPORTANT]
> We've released an updated, multi-view [template](https://azure.microsoft.com/resources/templates/101-backup-la-reporting/) for LA-based monitoring and reporting in Azure Backup. Users of the [earlier solution](https://azure.microsoft.com/resources/templates/101-backup-oms-monitoring/) will continue to see it in their workspaces even after deploying the new solution. However, the old solution may provide inaccurate results due to minor schema changes, so deploy the new template instead.

After the data is inside the Log Analytics workspace, [deploy a GitHub template](https://azure.microsoft.com/resources/templates/101-backup-la-reporting/) to Log Analytics to visualize the data. To properly identify the workspace, make sure you give it the same resource group, workspace name, and workspace location. Then install this template on the workspace.
### View Azure Backup data by using Log Analytics

- **Azure Monitor**: In the **Insights** section, select **More**, and then choose the relevant workspace.
- **Log Analytics workspaces**: Select the relevant workspace, and then under **General**, select **Workspace summary**.


When you select any of the overview tiles, you can view further information. Here are some of the reports you'll see:
55
-
56
-
- Non Log Backup Jobs
57
-
58
-

59
-
60
-
- Alerts from Azure Resources Backup
61
-
62
-

63
-
64
-
Similarly, by clicking on the other tiles, you will be able to see reports on Restore Jobs, Cloud Storage, Backup Items, Alerts from on-premises Resources Backup, and Log Backup Jobs.
65
-
66
-
These graphs are provided with the template. You can edit the graphs or add more graphs if you need to.
### Create alerts by using Log Analytics
In Azure Monitor, you can create your own alerts in a Log Analytics workspace. In the workspace, you use *Azure action groups* to select your preferred notification mechanism.
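As a hedged example of such an alert query, the following sketch fires on recent failed backup jobs; the table and column names follow the queries in this article, and the 1-day window is an illustrative choice:

````Kusto
// Failed backup jobs in the last day; attach this to an alert rule with an action group
AddonAzureBackupJobs
| where JobOperation == "Backup" and JobStatus == "Failed"
| where TimeGenerated > ago(1d)
````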
The default graphs give you Kusto queries for basic scenarios on which you can build your alerts.

- All successful backup jobs
````Kusto
AddonAzureBackupJobs
| where JobOperation=="Backup"
| where JobStatus=="Completed"
````
- All failed backup jobs

````Kusto
AddonAzureBackupJobs
| where JobOperation=="Backup"
| where JobStatus=="Failed"
````