You can use the data lake connector to send data from Azure IoT MQ Preview broker to a data lake, like Azure Data Lake Storage Gen2 (ADLSv2), Microsoft Fabric OneLake, and Azure Data Explorer. The connector subscribes to MQTT topics and ingests the messages into Delta tables in the Data Lake Storage account.

| Feature | Support level |
| --- | --- |
| Send data to Azure Data Lake Storage Gen2 | Supported |
| Send data to local storage | Supported |
| Send data to Microsoft Fabric OneLake | Supported |
| Use SAS token for authentication | Supported |
| Use managed identity for authentication | Supported |
| Delta format | Supported |
| Parquet format | Supported |
| JSON message payload | Supported |
| Create new container if it doesn't exist | Supported |
| Signed types support | Supported |
| Unsigned types support | Not supported |
## Prerequisites
- A Data Lake Storage account in Azure with a container and a folder for your data. For more information about creating a Data Lake Storage account, use one of the following quickstart options:
- Microsoft Fabric OneLake quickstart:
    - [Create a workspace](/fabric/get-started/create-workspaces) since the default *my workspace* isn't supported.
    - [Create a lakehouse](/fabric/onelake/create-lakehouse-onelake).
- Azure Data Lake Storage Gen2 quickstart:
    - [Create a storage account to use with Azure Data Lake Storage Gen2](/azure/storage/blobs/create-data-lake-storage-account).
- Azure Data Explorer cluster:
    - Follow the **Full cluster** steps in the [Quickstart: Create an Azure Data Explorer cluster and database](/azure/data-explorer/create-cluster-and-database?tabs=full).
- An IoT MQ MQTT broker. For more information on how to deploy an IoT MQ MQTT broker, see [Quickstart: Deploy Azure IoT Operations Preview to an Arc-enabled Kubernetes cluster](../get-started/quickstart-deploy.md).
Configure the data lake connector to send data to an Azure Data Explorer endpoint using managed identity.
1. Ensure that the steps in the prerequisites are met, including creating a full Azure Data Explorer cluster. The "free cluster" option isn't supported.
1. After the cluster is created, create a database to store your data.
1. You can create a table for the given data via the Azure portal and create columns manually, or you can use [KQL](/azure/data-explorer/kusto/management/create-table-command) in the query tab. For example:
    ```kql
    .create table thermostat (
        externalAssetId: string,
        assetName: string,
        CurrentTemperature: real,
        Pressure: real,
        MqttTopic: string,
        Timestamp: datetime
    )
    ```
Enable streaming ingestion on your table and database. You can run the command in the query tab.
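As an illustration, a table-level streaming ingestion policy can be enabled with a KQL management command like the following; the table name `thermostat` matches the example above, so adjust it for your own table:

```kql
// Enable the streaming ingestion policy on the example table
.alter table thermostat policy streamingingestion enable
```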
A *DataLakeConnector* is a Kubernetes custom resource that defines the configuration and properties of a data lake connector instance. A data lake connector ingests data from MQTT topics into Delta tables in a Data Lake Storage account.
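As a rough illustration only, a *DataLakeConnector* resource is declared as YAML and applied to the Kubernetes cluster. The exact schema depends on your IoT MQ version, so treat every field name and value below as an assumption to verify against the reference documentation:

```yaml
# Hypothetical sketch of a DataLakeConnector custom resource.
# Field names and values are illustrative assumptions, not the authoritative schema.
apiVersion: mq.iotoperations.azure.com/v1beta1
kind: DataLakeConnector
metadata:
  name: my-datalake-connector
  namespace: azure-iot-operations
spec:
  protocol: v5        # MQTT protocol version used to subscribe to the broker
  instances: 1        # number of connector replicas
  target:
    datalakeStorage:
      endpoint: https://<account>.blob.core.windows.net   # placeholder endpoint
      authentication:
        systemAssignedManagedIdentity:
          audience: https://storage.azure.com
```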