articles/backup/sap-hana-faq-backup-azure-vm.yml (13 additions, 3 deletions)
@@ -1,10 +1,12 @@
 ### YamlMime:FAQ
 metadata:
-  title: FAQ - Back up SAP HANA databases on Azure VMs
-  description: In this article, discover answers to common questions about backing up SAP HANA databases using the Azure Backup service.
+  title: FAQ — Back up SAP HANA databases on Azure VMs
+  description: This article provides answers to common questions about backing up SAP HANA databases using the Azure Backup service.
   ms.topic: article
   ms.service: backup
-  ms.date: 09/27/2021
+  ms.date: 02/01/2022
+  author: v-amallick
+  ms.author: v-amallick
 title: Frequently asked questions – Back up SAP HANA databases on Azure VMs
 summary: This article answers common questions about backing up SAP HANA databases using the Azure Backup service.
@@ -63,6 +65,14 @@ sections:
    answer: |
      Azure Backup doesn't set an explicit retention period on the Auto-heal full backups. This backup is retained as long as you retain the dependent Delta (differential or incremental) and Log backups. Once you delete the last dependent backup on this Auto-heal backup, the Auto-heal backup is also deleted.

  - question: |
      Can a full backup and a log backup run in parallel?
    answer: |
      Yes, a full backup and a log backup can run in parallel. This can happen in one of the following ways:

      - **Full backup is in progress and a log backup is triggered**: The log backup should succeed irrespective of an ongoing full backup, unless the full backup is a remedial full backup triggered to handle an LSN chain break.
      - **Log backup is in progress and a full backup is triggered**: Both backups should run in parallel and succeed.

  - question: |
      Are future databases automatically added for backup?
@@ -4,7 +4,7 @@ description: How to use the new data export to export your IoT data to Azure and
 services: iot-central
 author: dominicbetts
 ms.author: dobett
-ms.date: 10/20/2021
+ms.date: 01/31/2022
 ms.topic: how-to
 ms.service: iot-central
 ms.custom: contperf-fy21q1, contperf-fy21q3
@@ -84,6 +84,53 @@ IoT Central exports data in near real time to a database table in the Azure Data
To query the exported data in the Azure Data Explorer portal, navigate to the database and select **Query**.

### Connection options

Azure Data Explorer destinations let you configure the connection with a *service principal* or a [managed identity](../../active-directory/managed-identities-azure-resources/overview.md).

Managed identities are more secure because:

- You don't store the credentials for your resource in your IoT Central application.
- The credentials are automatically tied to the lifetime of your IoT Central application.
- Managed identities automatically rotate their security keys regularly.

IoT Central currently uses [system-assigned managed identities](../../active-directory/managed-identities-azure-resources/overview.md#managed-identity-types).

When you configure a managed identity, the configuration includes a *scope* and a *role*:

- The scope defines where you can use the managed identity.
- The role defines what permissions the IoT Central application is granted in the destination service.

This article shows how to create a managed identity using the Azure CLI. You can also use the Azure portal to create a managed identity.

# [Webhook](#tab/webhook)

For webhook destinations, IoT Central exports data in near real time. The data in the message body is in the same format as for Event Hubs and Service Bus.

### Create a webhook destination

You can export data to a publicly available HTTP webhook endpoint. You can create a test webhook endpoint using [RequestBin](https://requestbin.net/). RequestBin throttles requests when the request limit is reached:

1. Open [RequestBin](https://requestbin.net/).
1. Create a new RequestBin and copy the **Bin URL**. You use this URL when you test your data export.

To create the webhook destination in IoT Central on the **Create new destination** page:

1. Enter a **Destination name**.

1. Select **Webhook** as the destination type.

1. Paste the callback URL for your webhook endpoint. You can optionally configure webhook authorization and add custom headers.

    - For **OAuth2.0**, only the client credentials flow is supported. When you save the destination, IoT Central communicates with your OAuth provider to retrieve an authorization token. This token is attached to the `Authorization` header for every message sent to this destination.
    - For **Authorization token**, you can specify a token value that's directly attached to the `Authorization` header for every message sent to this destination.
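The `Authorization` header behavior described above is easy to exercise with a local test receiver. This Python sketch is not part of the IoT Central docs; the port and token value are invented for illustration. It accepts exported messages only when the header matches a configured token:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical token; in IoT Central this would be the value entered in the
# destination's "Authorization token" field.
EXPECTED_AUTH = "Bearer my-test-token"

def is_authorized(authorization_header):
    """True when the incoming Authorization header matches the configured token."""
    return authorization_header == EXPECTED_AUTH

class ExportReceiver(BaseHTTPRequestHandler):
    def do_POST(self):
        # IoT Central attaches the token to the Authorization header of every message.
        if not is_authorized(self.headers.get("Authorization")):
            self.send_response(401)
            self.end_headers()
            return
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)  # message body, same format as Event Hubs
        print(f"received {len(body)} bytes")
        self.send_response(200)
        self.end_headers()

# To listen locally: HTTPServer(("", 8080), ExportReceiver).serve_forever()
```

Pointing a destination at a receiver like this makes it easy to see exactly which headers and payloads IoT Central sends.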
For webhook destinations, IoT Central exports data in near real time. The data in the message body is in the same format as for Event Hubs and Service Bus.

### Create an Azure Data Explorer destination

If you don't have an existing Azure Data Explorer database to export to, follow these steps. You have two choices to create an Azure Data Explorer database:

- Create a new Azure Data Explorer cluster and database. To learn more, see the [Azure Data Explorer quickstart](/azure/data-explorer/create-cluster-database-portal). Make a note of the cluster URI and the name of the database you create; you need these values in the following steps.
- Create a new Azure Synapse Data Explorer pool and database. To learn more, see the [Azure Data Explorer quickstart](../../synapse-analytics/get-started-analyze-data-explorer.md). Make a note of the pool URI and the name of the database you create; you need these values in the following steps.

To configure the managed identity that enables your IoT Central application to securely export data to your Azure resource:

1. Create a managed identity for your IoT Central application to use to connect to your database. Use the Azure Cloud Shell to run the following command:

    ```azurecli
    az iot central app identity assign --name {your IoT Central app name} \
        --resource-group {resource group name} \
        --system-assigned
    ```

    Make a note of the `principalId` and `tenantId` output by the command. You use these values in the following step.
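If you script these steps, the identity command's JSON output can be parsed to capture those two values. A sketch, assuming the output shape shown below (the GUIDs are placeholders, not real values):

```python
import json

# Illustrative output shape from `az iot central app identity assign`;
# your actual principalId and tenantId values will differ.
cli_output = """
{
  "principalId": "00000000-0000-0000-0000-000000000001",
  "tenantId": "00000000-0000-0000-0000-000000000002",
  "type": "SystemAssigned"
}
"""

identity = json.loads(cli_output)
principal_id = identity["principalId"]  # passed later as --principal-id
tenant_id = identity["tenantId"]        # passed later as --tenant-id
print(principal_id, tenant_id)
```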
1. Configure the database permissions to allow connections from your IoT Central application. Use the Azure Cloud Shell to run the following command:

    ```azurecli
    az kusto database-principal-assignment create --cluster-name {name of your cluster} \
        --database-name {name of your database} \
        --resource-group {resource group name} \
        --principal-assignment-name {name of your IoT Central application} \
        --principal-id {principal id from the previous step} \
        --principal-type App --role Admin \
        --tenant-id {tenant id from the previous step}
    ```

    > [!TIP]
    > If you're using Azure Synapse, see [`az synapse kusto database-principal-assignment`](/cli/azure/synapse/kusto/database-principal-assignment).
1. Create a table in your database with a suitable schema for the data you're exporting. The following example query creates a table called `smartvitalspatch`. To learn more, see [Transform data inside your IoT Central application for export](howto-transform-data-internally.md):

    ```kusto
    .create table smartvitalspatch (
        EnqueuedTime:datetime,
        Message:string,
        Application:string,
        Device:string,
        Simulated:boolean,
        Template:string,
        Module:string,
        Component:string,
        Capability:string,
        Value:dynamic
    )
    ```
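For a quick sanity check of payloads against this schema, you can build a record with one key per column. The values below are invented; only the key names come from the table definition above:

```python
import json
from datetime import datetime, timezone

# Example record shaped to match the smartvitalspatch table; all values invented.
record = {
    "EnqueuedTime": datetime(2022, 1, 31, tzinfo=timezone.utc).isoformat(),
    "Message": "telemetry",
    "Application": "my-iotc-app",       # hypothetical application name
    "Device": "sim-device-01",
    "Simulated": True,
    "Template": "smart-vitals-patch",
    "Module": "",
    "Component": "health",
    "Capability": "HeartRate",
    "Value": {"HeartRate": 72},         # the dynamic column can hold nested JSON
}

payload = json.dumps(record)
print(payload)
```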
1. (Optional) To speed up ingesting data into your Azure Data Explorer database:

    1. Navigate to the **Configurations** page for your Azure Data Explorer cluster. Then enable the **Streaming ingestion** option.
    1. Run the following query to alter the table policy to enable streaming ingestion:
To create the Azure Data Explorer destination in IoT Central on the **Create new destination** page:

1. Enter a **Destination name**.

1. Select **Azure Data Explorer** as the destination type.

1. Enter your Azure Data Explorer cluster or pool URL, database name, and table name. Select **System-assigned managed identity** as the authorization type.

    > [!TIP]
    > The cluster URL for a standalone Azure Data Explorer looks like `https://<ClusterName>.<AzureRegion>.kusto.windows.net`. The cluster URL for an Azure Synapse Data Explorer pool looks like `https://<DataExplorerPoolName>.<SynapseWorkspaceName>.kusto.azuresynapse.net`.
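These two URL shapes can be composed mechanically; the names in this sketch are placeholders, not real resources:

```python
# Compose the two destination URL shapes described in the tip above.
def standalone_cluster_url(cluster_name: str, azure_region: str) -> str:
    return f"https://{cluster_name}.{azure_region}.kusto.windows.net"

def synapse_pool_url(pool_name: str, workspace_name: str) -> str:
    return f"https://{pool_name}.{workspace_name}.kusto.azuresynapse.net"

# Placeholder names for illustration only.
print(standalone_cluster_url("mycluster", "westeurope"))
print(synapse_pool_url("mypool", "myworkspace"))
```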
    :::image type="content" source="media/howto-export-data/export-destination-managed.png" alt-text="Screenshot of Azure Data Explorer export destination.":::
articles/vpn-gateway/troubleshoot-vpn-with-azure-diagnostics.md (2 additions, 2 deletions)
@@ -171,7 +171,7 @@ The official document
## <a name="P2SDiagnosticLog"></a>P2SDiagnosticLog

-The last available table for VPN diagnostics is **P2SDiagnosticLog**. This table traces the activity for Point to Site.
+The last available table for VPN diagnostics is **P2SDiagnosticLog**. This table traces the activity for Point to Site (only IKEv2 and OpenVPN protocols).

Here's a sample query for reference.

@@ -199,4 +199,4 @@ Also, whenever a client connects via IKEv2 or OpenVPN Point to Site, the tab

## Next Steps

To configure alerts on tunnel resource logs, see [Set up alerts on VPN Gateway resource logs](vpn-gateway-howto-setup-alerts-virtual-network-gateway-log.md).