The default values for the **parameters.json** file come from your project settings. If you want to deploy to another environment, replace the values accordingly.
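For instance, a minimal sketch of an edited **parameters.json** for a second environment (the `$schema` and `contentVersion` properties are standard for ARM parameter files; the parameter names shown are illustrative and must match the ones your project generated):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "StreamAnalyticsJobName": { "value": "myasajob-staging" },
    "Location": { "value": "West US 2" }
  }
}
```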
@@ -251,7 +252,7 @@ To deploy your Stream Analytics project using ARM templates, follow these steps:
For more information about deploying resources with ARM templates, see [Deploy with a Resource Manager template file and Azure PowerShell](https://aka.ms/armdeploytemplate).
-## Next steps
+## Related content
* [Continuous integration and Continuous deployment for Azure Stream Analytics](cicd-overview.md)
* [Set up CI/CD pipeline for Stream Analytics job using Azure Pipelines](set-up-cicd-pipeline.md)
articles/stream-analytics/power-bi-output.md (+8 −7)
@@ -1,11 +1,12 @@
---
title: Power BI output from Azure Stream Analytics
-description: This article describes how to output data from Azure Stream Analytics to Power BI.
+description: This article describes how to output data from Azure Stream Analytics to Power BI for a rich visualization experience of analysis results.
author: AliciaLiMicrosoft
ms.author: ali
ms.service: azure-stream-analytics
-ms.topic: conceptual
-ms.date: 07/20/2023
+ms.topic: concept-article
+ms.date: 01/23/2025
+# Customer intent: I want to learn how to send output from a Stream Analytics job to Power BI.
---
# Power BI output from Azure Stream Analytics
@@ -44,7 +45,7 @@ Azure Stream Analytics creates a Power BI dataset and table schema for the user
Power BI uses the first-in, first-out (FIFO) retention policy. Data is collected in a table until it hits 200,000 rows.
> [!NOTE]
-> We do not recommend using multiple outputs to write to the same dataset because it can cause several issues. Each output tries to create the Power BI dataset independently which can result in multiple datasets with the same name. Additionally, if the outputs don't have consistent schemas, the dataset changes the schema on each write, which leads to too many schema change requests. Even if these issues are avoided, multiple outputs will be less performant than a single merged output.
+> We don't recommend using multiple outputs to write to the same dataset because it can cause several issues. Each output tries to create the Power BI dataset independently, which can result in multiple datasets with the same name. Additionally, if the outputs don't have consistent schemas, the dataset changes the schema on each write, which leads to too many schema change requests. Even if these issues are avoided, multiple outputs are less performant than a single merged output.
### Convert a data type from Stream Analytics to Power BI
@@ -58,7 +59,7 @@ This table covers the data type conversions from [Stream Analytics data types](/
| nvarchar(max) | String |
| datetime | Datetime |
| float | Double |
-| Record array | String type, constant value `IRecord` or `IArray`|
+| Record array | String type, constant value `IRecord`, or `IArray`|
### Update the schema
@@ -83,7 +84,7 @@ You can use the following equation to compute the value to give your window in s
For example:
* You have 1,000 devices sending data at one-second intervals.
-* You're using the Power BI Pro SKU that supports 1,000,000 rows per hour.
+* You're using the Power BI Pro Stock Keeping Unit (SKU) that supports 1,000,000 rows per hour.
* You want to publish average data per device to Power BI.
As a result, the equation becomes:
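In rough terms, the window size in seconds is the input rate divided by the rows per second the SKU allows: 1,000,000 rows per hour works out to about 278 rows per second, and 1,000 devices at one event per second need 1,000 ÷ 278 ≈ 3.6 seconds, so a four-second tumbling window keeps the output under the limit. A minimal query sketch under those assumptions (the `input` alias and the `Reading` and `EventTime` fields are hypothetical):

```SQL
-- One averaged row per device per window: 1,000 rows every 4 seconds,
-- about 900,000 rows per hour, under the Power BI Pro 1,000,000-row limit.
SELECT
    DeviceID,
    AVG(Reading) AS AverageReading
INTO PowerBIOutput
FROM input TIMESTAMP BY EventTime
GROUP BY
    DeviceID,
    TumblingWindow(second, 4)
```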
@@ -113,7 +114,7 @@ Similarly, if a job starts after the token has expired, an error occurs and the
After the authorization has been refreshed with Power BI, a green alert appears in the authorization area to reflect that the issue has been resolved. To overcome this limitation, it's recommended to [use Managed Identity to authenticate your Azure Stream Analytics job to Power BI](powerbi-output-managed-identity.md).
-## Next steps
+## Related content
* [Use Managed Identity to authenticate your Azure Stream Analytics job to Power BI](powerbi-output-managed-identity.md)
* [Quickstart: Create a Stream Analytics job by using the Azure portal](stream-analytics-quick-create-portal.md)
articles/stream-analytics/stream-analytics-parsing-json.md (+18 −18)
@@ -1,19 +1,19 @@
---
title: Parsing JSON and AVRO in Azure Stream Analytics
-description: This article describes how to operate on complex data types like arrays, JSON, CSV formatted data.
+description: This article describes how to operate on complex data types like arrays, JSON, and CSV formatted data when using Azure Stream Analytics.
ms.service: azure-stream-analytics
author: an-emma
ms.author: raan
-ms.topic: conceptual
-ms.date: 05/25/2023
-ms.custom:
+ms.topic: concept-article
+ms.date: 01/23/2025
+# Customer intent: I want to know about Azure Stream Analytics support for parsing JSON and AVRO data.
---
# Parse JSON and Avro data in Azure Stream Analytics
-Azure Stream Analytics support processing events in CSV, JSON, and Avro data formats. Both JSON and Avro data can be structured and contain some complex types such as nested objects (records) and arrays.
+The Azure Stream Analytics service supports processing events in CSV, JSON, and Avro data formats. Both JSON and Avro data can be structured and contain some complex types such as nested objects (records) and arrays.
>[!NOTE]
->AVRO files created by Event Hub Capture use a specific format that requires you to use the *custom deserializer* feature. For more information, see [Read input in any format using .NET custom deserializers](./custom-deserializer-examples.md).
+>AVRO files created by Event Hubs Capture use a specific format that requires you to use the *custom deserializer* feature. For more information, see [Read input in any format using .NET custom deserializers](./custom-deserializer-examples.md).
@@ -44,7 +44,7 @@ Record data types are used to represent JSON and Avro arrays when corresponding
```
### Access nested fields in known schema
-Use dot notation (.) to easily access nested fields directly from your query. For example, this query selects the Latitude and Longitude coordinates under the Location property in the preceding JSON data. The dot notation can be used to navigate multiple levels as shown below.
+Use dot notation (.) to easily access nested fields directly from your query. For example, this query selects the Latitude and Longitude coordinates under the Location property in the preceding JSON data. The dot notation can be used to navigate multiple levels as shown in the following snippet:
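A minimal sketch of such a query, assuming the sample event's `Location` record carries `Lat` and `Long` fields (adjust the names to your payload):

```SQL
-- Dot notation reaches into the nested Location record.
SELECT
    DeviceID,
    Location.Lat,
    Location.Long
FROM input
```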
```SQL
SELECT
@@ -82,9 +82,9 @@ The result is:
### Access nested fields when property name is a variable
-Use the [GetRecordPropertyValue](/stream-analytics-query/getrecordpropertyvalue-azure-stream-analytics) function if the property name is a variable. This allows for building dynamic queries without hardcoding property names.
+Use the [GetRecordPropertyValue](/stream-analytics-query/getrecordpropertyvalue-azure-stream-analytics) function if the property name is a variable. It allows for building dynamic queries without hardcoding property names.
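As a quick illustration before the join scenario that follows, here's a minimal sketch (assuming the sample event's `SensorReadings` record; a literal property name stands in for what would normally be a variable or a joined column):

```SQL
-- Look up a record property by name at query time instead of with dot notation.
SELECT
    DeviceID,
    GetRecordPropertyValue(SensorReadings, 'Humidity') AS Humidity
FROM input
```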
-For example, imagine the sample data stream needs **to be joined with reference data** containing thresholds for each device sensor. A snippet of such reference data is shown below.
+For example, imagine the sample data stream needs **to be joined with reference data** containing thresholds for each device sensor. A sample of such reference data follows:
```json
{
@@ -99,7 +99,7 @@ For example, imagine the sample data stream needs **to be joined with reference
}
```
-The goal here is to join our sample dataset from the top of the article to that reference data, and output one event for each sensor measure above its threshold. That means our single event above can generate multiple output events if multiple sensors are above their respective thresholds, thanks to the join. To achieve similar results without a join, see the section below.
+The goal here is to join our sample dataset from the top of the article to that reference data, and output one event for each sensor measure above its threshold. That means our single event above can generate multiple output events if multiple sensors are above their respective thresholds, thanks to the join. To achieve similar results without a join, see the next section.
```SQL
SELECT
@@ -119,8 +119,8 @@ WHERE
The result is:
|DeviceID|SensorName|AlertMessage|
-|-|-|-|
-|12345|Humidity|Alert : Sensor above threshold|
+| - | - | - |
+|12345|Humidity| Alert: Sensor above threshold|
### Convert record fields into separate events
@@ -165,7 +165,7 @@ SELECT DeviceID, PropertyValue AS Humidity INTO HumidityOutput FROM Stage0 WHERE
```
### Parse JSON record in SQL reference data
-When using Azure SQL Database as reference data in your job, it's possible to have a column that has data in JSON format. An example is shown below.
+When using Azure SQL Database as reference data in your job, it's possible to have a column that has data in JSON format. An example follows:
|DeviceID|Data|
|-|-|
@@ -180,7 +180,7 @@ return JSON.parse(string);
}
```
-You can then create a step in your Stream Analytics query as shown below to access the fields of your JSON records.
+You can then create a step in your Stream Analytics query as shown here to access the fields of your JSON records.
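A minimal sketch of such a step, assuming the `DeviceID`/`Data` reference table above is exposed as an input named `sqlRefInput` and the JavaScript UDF is registered as `parseJson` (the `Version` field is illustrative):

```SQL
-- Materialize the parsed record in a step, then reach into it with dot notation.
WITH ParsedData AS
(
    SELECT DeviceID, udf.parseJson(Data) AS DeviceData
    FROM sqlRefInput
)
SELECT DeviceID, DeviceData.Version
FROM ParsedData
```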
```SQL
WITH parseJson as
@@ -198,9 +198,9 @@ You can then create a step in your Stream Analytics query as shown below to acce
## Array data types
-Array data types are an ordered collection of values. Some typical operations on array values are detailed below. These examples use the functions [GetArrayElement](/stream-analytics-query/getarrayelement-azure-stream-analytics), [GetArrayElements](/stream-analytics-query/getarrayelements-azure-stream-analytics), [GetArrayLength](/stream-analytics-query/getarraylength-azure-stream-analytics), and the [APPLY](/stream-analytics-query/apply-azure-stream-analytics) operator.
+Array data types are an ordered collection of values. Some typical operations on array values are detailed here. These examples use the functions [GetArrayElement](/stream-analytics-query/getarrayelement-azure-stream-analytics), [GetArrayElements](/stream-analytics-query/getarrayelements-azure-stream-analytics), [GetArrayLength](/stream-analytics-query/getarraylength-azure-stream-analytics), and the [APPLY](/stream-analytics-query/apply-azure-stream-analytics) operator.
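For instance, a minimal sketch using `GetArrayElement` and `GetArrayLength` (this assumes `CustomSensor03` sits at the top level of the event shown next; adjust the path to your payload, and note that array indexes are zero-based):

```SQL
-- Pull the first element out of the array and count how many readings it holds.
SELECT
    DeviceID,
    GetArrayElement(CustomSensor03, 0) AS FirstReading,
    GetArrayLength(CustomSensor03) AS ReadingCount
FROM input
```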
-Here's an example of a event. Both `CustomSensor03` and `SensorMetadata` are of type **array**:
+Here's an example of an event. Both `CustomSensor03` and `SensorMetadata` are of type **array**:
```json
{
@@ -294,7 +294,7 @@ The result is:
|12345|Manufacturer|ABC|
|12345|Version|1.2.45|
-If the extracted fields need to appear in columns, it is possible to pivot the dataset using the [WITH](/stream-analytics-query/with-azure-stream-analytics) syntax in addition to the [JOIN](/stream-analytics-query/join-azure-stream-analytics) operation. That join requires a [time boundary](/stream-analytics-query/join-azure-stream-analytics#BKMK_DateDiff) condition that prevents duplication:
+If the extracted fields need to appear in columns, it's possible to pivot the dataset using the [WITH](/stream-analytics-query/with-azure-stream-analytics) syntax in addition to the [JOIN](/stream-analytics-query/join-azure-stream-analytics) operation. That join requires a [time boundary](/stream-analytics-query/join-azure-stream-analytics#BKMK_DateDiff) condition that prevents duplication:
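To make the time boundary concrete, here's a minimal sketch of the pivoting join (assuming a `Stage0` step like the one used earlier that emits one `PropertyName`/`PropertyValue` pair per row; `DATEDIFF ... BETWEEN 0 AND 0` restricts the self-join to rows that carry the same timestamp):

```SQL
-- Pivot two name/value rows from the same event back into columns.
SELECT
    V.DeviceID,
    M.PropertyValue AS Manufacturer,
    V.PropertyValue AS Version
FROM Stage0 V
JOIN Stage0 M
    ON V.DeviceID = M.DeviceID
    AND DATEDIFF(second, V, M) BETWEEN 0 AND 0
WHERE
    V.PropertyName = 'Version'
    AND M.PropertyName = 'Manufacturer'
```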
```SQL
WITH DynamicCTE AS (
@@ -322,5 +322,5 @@ The result is:
|-|-|-|-|-|
|12345|47|122|1.2.45|ABC|
-## See Also
+## Related content
[Data Types in Azure Stream Analytics](/stream-analytics-query/data-types-azure-stream-analytics)