
Commit ba6aafb

Merge pull request #293471 from spelluru/asafreshness01232
Stream Analytics - freshness
2 parents 1254562 + edf7d82 commit ba6aafb

File tree

3 files changed: +31 -29 lines changed


articles/stream-analytics/cicd-tools.md

Lines changed: 5 additions & 4 deletions
@@ -1,11 +1,12 @@
 ---
-title: Automate builds, tests, and deployments of an Azure Stream Analytics job using CI/CD tools
+title: Automate builds, tests, and deployments using CI/CD tools
 description: This article describes how to use Azure Stream Analytics CI/CD tools to auto build, test, and deploy an Azure Stream Analytics project.
 author: alexlzx
 ms.author: zhenxilin
 ms.service: azure-stream-analytics
 ms.topic: how-to
-ms.date: 03/08/2023
+ms.date: 01/23/2025
+# Customer intent: I want to know how to automate builds, tests, and deployments of Azure Stream Analytics projects using CI/CD tools.
 ---
 
 # Automate builds, tests, and deployments of a Stream Analytics project
@@ -53,7 +54,7 @@ azure-streamanalytics-cicd build --v2 --project ./asaproj.json --outputPath ./De
 If the project is built successfully, you see two JSON files created under the output folder:
 
 * ARM template file: `[ProjectName].JobTemplate.json`
-* ARM parameter file: `[ProjectName].JobTemplate.parameters.json`
+* Azure Resource Manager parameter file: `[ProjectName].JobTemplate.parameters.json`
 
 The default values for **parameters.json** file come from your project settings. If you want to deploy to another environment, replace the values accordingly.

@@ -251,7 +252,7 @@ To deploy your Stream Analytics project using ARM templates, follow these steps:
 
 For more information about deploying resources with ARM templates, see [Deploy with a Resource Manager template file and Azure PowerShell](https://aka.ms/armdeploytemplate).
 
-## Next steps
+## Related content
 
 * [Continuous integration and Continuous deployment for Azure Stream Analytics](cicd-overview.md)
 * [Set up CI/CD pipeline for Stream Analytics job using Azure Pipelines](set-up-cicd-pipeline.md)

articles/stream-analytics/power-bi-output.md

Lines changed: 8 additions & 7 deletions
@@ -1,11 +1,12 @@
 ---
 title: Power BI output from Azure Stream Analytics
-description: This article describes how to output data from Azure Stream Analytics to Power BI.
+description: This article describes how to output data from Azure Stream Analytics to Power BI for a rich visualization experience of analysis results.
 author: AliciaLiMicrosoft
 ms.author: ali
 ms.service: azure-stream-analytics
-ms.topic: conceptual
-ms.date: 07/20/2023
+ms.topic: concept-article
+ms.date: 01/23/2025
+# Customer intent: I want to learn how to send output from a Stream Analytics job to Power BI.
 ---
 
 # Power BI output from Azure Stream Analytics
@@ -44,7 +45,7 @@ Azure Stream Analytics creates a Power BI dataset and table schema for the user
 Power BI uses the first-in, first-out (FIFO) retention policy. Data is collected in a table until it hits 200,000 rows.
 
 > [!NOTE]
-> We do not recommend using multiple outputs to write to the same dataset because it can cause several issues. Each output tries to create the Power BI dataset independently which can result in multiple datasets with the same name. Additionally, if the outputs don't have consistent schemas, the dataset changes the schema on each write, which leads to too many schema change requests. Even if these issues are avoided, multiple outputs will be less performant than a single merged output.
+> We don't recommend using multiple outputs to write to the same dataset because it can cause several issues. Each output tries to create the Power BI dataset independently, which can result in multiple datasets with the same name. Additionally, if the outputs don't have consistent schemas, the dataset changes the schema on each write, which leads to too many schema change requests. Even if these issues are avoided, multiple outputs are less performant than a single merged output.
 
 ### Convert a data type from Stream Analytics to Power BI
 
@@ -58,7 +59,7 @@ This table covers the data type conversions from [Stream Analytics data types](/
 | nvarchar(max) | String |
 | datetime | Datetime |
 | float | Double |
-| Record array | String type, constant value `IRecord` or `IArray` |
+| Record array | String type, constant value `IRecord`, or `IArray` |
 
 ### Update the schema
 
@@ -83,7 +84,7 @@ You can use the following equation to compute the value to give your window in s
 For example:
 
 * You have 1,000 devices sending data at one-second intervals.
-* You're using the Power BI Pro SKU that supports 1,000,000 rows per hour.
+* You're using the Power BI Pro Stock Keeping Unit (SKU) that supports 1,000,000 rows per hour.
 * You want to publish the amount of average data per device to Power BI.
 
 As a result, the equation becomes:
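The rendered equation itself falls outside this hunk. For context, here's a worked version of the arithmetic implied by the example's numbers; this is an illustrative sketch of the sizing logic, not the article's own formula:

```latex
% 1,000 devices at one row per second produce 1,000 x 3,600 = 3,600,000 rows/hour.
% To stay within the assumed 1,000,000 rows/hour allowance, the aggregation
% window W (in seconds) must satisfy:
W \;\ge\; \frac{1000 \times 3600}{1\,000\,000} \;=\; 3.6
\quad\Rightarrow\quad W = 4 \text{ seconds, rounded up}
```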
@@ -113,7 +114,7 @@ Similarly, if a job starts after the token has expired, an error occurs and the
 
 After the authorization has been refreshed with Power BI, a green alert appears in the authorization area to reflect that the issue has been resolved. To overcome this limitation, it's recommended to [use Managed Identity to authenticate your Azure Stream Analytics job to Power BI](powerbi-output-managed-identity.md)
 
-## Next steps
+## Related content
 
 * [Use Managed Identity to authenticate your Azure Stream Analytics job to Power BI](powerbi-output-managed-identity.md)
 * [Quickstart: Create a Stream Analytics job by using the Azure portal](stream-analytics-quick-create-portal.md)

articles/stream-analytics/stream-analytics-parsing-json.md

Lines changed: 18 additions & 18 deletions
@@ -1,19 +1,19 @@
 ---
 title: Parsing JSON and AVRO in Azure Stream Analytics
-description: This article describes how to operate on complex data types like arrays, JSON, CSV formatted data.
+description: This article describes how to operate on complex data types like arrays, JSON, CSV formatted data when using Azure Stream Analytics.
 ms.service: azure-stream-analytics
 author: an-emma
 ms.author: raan
-ms.topic: conceptual
-ms.date: 05/25/2023
-ms.custom:
+ms.topic: concept-article
+ms.date: 01/23/2025
+# Customer intent: I want to know about Azure Stream Analytics support for parsing JSON and AVRO data.
 ---
 # Parse JSON and Avro data in Azure Stream Analytics
 
-Azure Stream Analytics support processing events in CSV, JSON, and Avro data formats. Both JSON and Avro data can be structured and contain some complex types such as nested objects (records) and arrays.
+The Azure Stream Analytics service supports processing events in CSV, JSON, and Avro data formats. Both JSON and Avro data can be structured and contain some complex types such as nested objects (records) and arrays.
 
 >[!NOTE]
->AVRO files created by Event Hub Capture use a specific format that requires you to use the *custom deserializer* feature. For more information, see [Read input in any format using .NET custom deserializers](./custom-deserializer-examples.md).
+>AVRO files created by Event Hubs Capture use a specific format that requires you to use the *custom deserializer* feature. For more information, see [Read input in any format using .NET custom deserializers](./custom-deserializer-examples.md).
 
 
 
@@ -44,7 +44,7 @@ Record data types are used to represent JSON and Avro arrays when corresponding
 ```
 
 ### Access nested fields in known schema
-Use dot notation (.) to easily access nested fields directly from your query. For example, this query selects the Latitude and Longitude coordinates under the Location property in the preceding JSON data. The dot notation can be used to navigate multiple levels as shown below.
+Use dot notation (.) to easily access nested fields directly from your query. For example, this query selects the Latitude and Longitude coordinates under the Location property in the preceding JSON data. The dot notation can be used to navigate multiple levels as shown in the following snippet:
 
 ```SQL
 SELECT
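The query body is truncated at the hunk boundary. As context, a minimal sketch of the dot-notation pattern the sentence describes; the field names (`Location.Latitude`, `Location.Longitude`) and the `input` source are assumptions taken from the sentence, not copied from the full file:

```SQL
SELECT
    DeviceID,
    Location.Latitude,   -- dot notation reaches one level into the Location record
    Location.Longitude
FROM input
```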
@@ -82,9 +82,9 @@ The result is:
 
 ### Access nested fields when property name is a variable
 
-Use the [GetRecordPropertyValue](/stream-analytics-query/getrecordpropertyvalue-azure-stream-analytics) function if the property name is a variable. This allows for building dynamic queries without hardcoding property names.
+Use the [GetRecordPropertyValue](/stream-analytics-query/getrecordpropertyvalue-azure-stream-analytics) function if the property name is a variable. It allows for building dynamic queries without hardcoding property names.
 
-For example, imagine the sample data stream needs **to be joined with reference data** containing thresholds for each device sensor. A snippet of such reference data is shown below.
+For example, imagine the sample data stream needs **to be joined with reference data** containing thresholds for each device sensor. A snippet of such reference data is shown in the following example.
 
 ```json
 {
@@ -99,7 +99,7 @@ For example, imagine the sample data stream needs **to be joined with reference
9999
}
100100
```
101101

102-
The goal here is to join our sample dataset from the top of the article to that reference data, and output one event for each sensor measure above its threshold. That means our single event above can generate multiple output events if multiple sensors are above their respective thresholds, thanks to the join. To achieve similar results without a join, see the section below.
102+
The goal here's to join our sample dataset from the top of the article to that reference data, and output one event for each sensor measure above its threshold. That means our single event above can generate multiple output events if multiple sensors are above their respective thresholds, thanks to the join. To achieve similar results without a join, see the following example:
103103

104104
```SQL
105105
SELECT
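The join query is also truncated here. A sketch of the pattern the paragraph describes, with the stream input (`input`) and reference input (`thresholds`) names assumed for illustration:

```SQL
SELECT
    input.DeviceID,
    thresholds.SensorName,
    'Alert: Sensor above threshold' AS AlertMessage
FROM input                -- streaming events carrying a SensorReadings record
JOIN thresholds           -- reference data: one row per DeviceID/SensorName/Value
    ON input.DeviceID = thresholds.DeviceID
WHERE
    -- the property to read is chosen at runtime from the reference row
    GetRecordPropertyValue(input.SensorReadings, thresholds.SensorName) > thresholds.Value
```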
@@ -119,8 +119,8 @@ WHERE
 The result is:
 
 |DeviceID|SensorName|AlertMessage|
-|-|-|-|
-|12345|Humidity|Alert : Sensor above threshold|
+| - | - | - |
+| 12345 | Humidity | Alert: Sensor above threshold |
 
 ### Convert record fields into separate events
 
@@ -165,7 +165,7 @@ SELECT DeviceID, PropertyValue AS Humidity INTO HumidityOutput FROM Stage0 WHERE
 ```
 
 ### Parse JSON record in SQL reference data
-When using Azure SQL Database as reference data in your job, it's possible to have a column that has data in JSON format. An example is shown below.
+When using Azure SQL Database as reference data in your job, it's possible to have a column that has data in JSON format. An example is shown in the following table:
 
 |DeviceID|Data|
 |-|-|
@@ -180,7 +180,7 @@ return JSON.parse(string);
 }
 ```
 
-You can then create a step in your Stream Analytics query as shown below to access the fields of your JSON records.
+You can then create a step in your Stream Analytics query as shown here to access the fields of your JSON records.
 
 ```SQL
 WITH parseJson as
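The `WITH` step is truncated at the hunk boundary. A sketch of how such a step could look, assuming the JavaScript UDF above is registered under the alias `udf.parseJson`, the reference input is named `referenceInput`, and the table has the `DeviceID` and `Data` columns shown earlier; the nested field name is a placeholder:

```SQL
WITH parseJson AS
(
    SELECT
        DeviceID,
        udf.parseJson(Data) AS DeviceData   -- the JavaScript UDF returns a record
    FROM referenceInput
)
SELECT
    DeviceID,
    DeviceData.SomeNestedField   -- dot notation now works on the parsed record
FROM parseJson
```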
@@ -198,9 +198,9 @@ You can then create a step in your Stream Analytics query as shown below to acce
 
 ## Array data types
 
-Array data types are an ordered collection of values. Some typical operations on array values are detailed below. These examples use the functions [GetArrayElement](/stream-analytics-query/getarrayelement-azure-stream-analytics), [GetArrayElements](/stream-analytics-query/getarrayelements-azure-stream-analytics), [GetArrayLength](/stream-analytics-query/getarraylength-azure-stream-analytics), and the [APPLY](/stream-analytics-query/apply-azure-stream-analytics) operator.
+Array data types are an ordered collection of values. Some typical operations on array values are detailed here. These examples use the functions [GetArrayElement](/stream-analytics-query/getarrayelement-azure-stream-analytics), [GetArrayElements](/stream-analytics-query/getarrayelements-azure-stream-analytics), [GetArrayLength](/stream-analytics-query/getarraylength-azure-stream-analytics), and the [APPLY](/stream-analytics-query/apply-azure-stream-analytics) operator.
 
-Here's an example of a event. Both `CustomSensor03` and `SensorMetadata` are of type **array**:
+Here's an example of an event. Both `CustomSensor03` and `SensorMetadata` are of type **array**:
 
 ```json
 {
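The sample event is truncated by the hunk. For context, a minimal sketch of two of the array functions named in this paragraph, using the two array fields the text mentions; the exact shapes of those fields are assumptions:

```SQL
SELECT
    GetArrayLength(SensorMetadata) AS MetadataCount,      -- number of elements in the array
    GetArrayElement(CustomSensor03, 0) AS FirstReading    -- zero-based element access
FROM input
```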
@@ -294,7 +294,7 @@ The result is:
 |12345|Manufacturer|ABC|
 |12345|Version|1.2.45|
 
-If the extracted fields need to appear in columns, it is possible to pivot the dataset using the [WITH](/stream-analytics-query/with-azure-stream-analytics) syntax in addition to the [JOIN](/stream-analytics-query/join-azure-stream-analytics) operation. That join requires a [time boundary](/stream-analytics-query/join-azure-stream-analytics#BKMK_DateDiff) condition that prevents duplication:
+If the extracted fields need to appear in columns, it's possible to pivot the dataset using the [WITH](/stream-analytics-query/with-azure-stream-analytics) syntax in addition to the [JOIN](/stream-analytics-query/join-azure-stream-analytics) operation. That join requires a [time boundary](/stream-analytics-query/join-azure-stream-analytics#BKMK_DateDiff) condition that prevents duplication:
 
 ```SQL
 WITH DynamicCTE AS (
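The CTE body is truncated here. Assuming `DynamicCTE` yields one `(DeviceID, PropertyName, PropertyValue)` row per extracted property, as the result table above suggests, a sketch of the self-join pivot with the time-boundary condition the sentence mentions:

```SQL
SELECT
    M.DeviceID,
    M.PropertyValue AS Manufacturer,
    V.PropertyValue AS Version
FROM DynamicCTE M
JOIN DynamicCTE V
    ON M.DeviceID = V.DeviceID
    AND DATEDIFF(second, M, V) BETWEEN 0 AND 0   -- time boundary: pair only same-timestamp rows
WHERE
    M.PropertyName = 'Manufacturer'
    AND V.PropertyName = 'Version'
```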
@@ -322,5 +322,5 @@ The result is:
 |-|-|-|-|-|
 |12345|47|122|1.2.45|ABC|
 
-## See Also
+## Related content
 [Data Types in Azure Stream Analytics](/stream-analytics-query/data-types-azure-stream-analytics)
