Commit 43a3bb8

committed
Review, edit, Acrolynx, Learn Linter
1 parent 3e2a50b commit 43a3bb8

1 file changed: +18 additions, -18 deletions

articles/stream-analytics/stream-analytics-parsing-json.md

Lines changed: 18 additions & 18 deletions
@@ -1,19 +1,19 @@
 ---
 title: Parsing JSON and AVRO in Azure Stream Analytics
-description: This article describes how to operate on complex data types like arrays, JSON, CSV formatted data.
+description: This article describes how to operate on complex data types like arrays, JSON, CSV formatted data when using Azure Stream Analytics.
 ms.service: azure-stream-analytics
 author: an-emma
 ms.author: raan
-ms.topic: conceptual
-ms.date: 05/25/2023
-ms.custom:
+ms.topic: concept-article
+ms.date: 01/23/2025
+# Customer intent: I want to know about Azure Stream Analytics support for parsing JSON and AVRO data.
 ---
 # Parse JSON and Avro data in Azure Stream Analytics
 
-Azure Stream Analytics support processing events in CSV, JSON, and Avro data formats. Both JSON and Avro data can be structured and contain some complex types such as nested objects (records) and arrays.
+The Azure Stream Analytics service supports processing events in CSV, JSON, and Avro data formats. Both JSON and Avro data can be structured and contain some complex types such as nested objects (records) and arrays.
 
 >[!NOTE]
->AVRO files created by Event Hub Capture use a specific format that requires you to use the *custom deserializer* feature. For more information, see [Read input in any format using .NET custom deserializers](./custom-deserializer-examples.md).
+>AVRO files created by Event Hubs Capture use a specific format that requires you to use the *custom deserializer* feature. For more information, see [Read input in any format using .NET custom deserializers](./custom-deserializer-examples.md).
 
 
@@ -44,7 +44,7 @@ Record data types are used to represent JSON and Avro arrays when corresponding
 ```
 
 ### Access nested fields in known schema
-Use dot notation (.) to easily access nested fields directly from your query. For example, this query selects the Latitude and Longitude coordinates under the Location property in the preceding JSON data. The dot notation can be used to navigate multiple levels as shown below.
+Use dot notation (.) to easily access nested fields directly from your query. For example, this query selects the Latitude and Longitude coordinates under the Location property in the preceding JSON data. The dot notation can be used to navigate multiple levels as shown in the following snippet:
 
 ```SQL
 SELECT
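The query in the hunk above is truncated by the diff view. As a hedged sketch only, not the article's exact snippet: given the `Location` record with `Latitude` and `Longitude` fields that the changed sentence describes, and assuming an input alias named `Input` (not shown in this diff), a dot-notation query looks like:

```SQL
SELECT
    DeviceID,
    Location.Latitude,
    Location.Longitude
FROM Input
```

Dot notation navigates one level per dot, so a hypothetical deeper field such as `Location.Building.Floor` follows the same pattern.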
@@ -82,9 +82,9 @@ The result is:
 
 ### Access nested fields when property name is a variable
 
-Use the [GetRecordPropertyValue](/stream-analytics-query/getrecordpropertyvalue-azure-stream-analytics) function if the property name is a variable. This allows for building dynamic queries without hardcoding property names.
+Use the [GetRecordPropertyValue](/stream-analytics-query/getrecordpropertyvalue-azure-stream-analytics) function if the property name is a variable. It allows for building dynamic queries without hardcoding property names.
 
-For example, imagine the sample data stream needs **to be joined with reference data** containing thresholds for each device sensor. A snippet of such reference data is shown below.
+For example, imagine the sample data stream needs **to be joined with reference data** containing thresholds for each device sensor. A snippet of such reference data follows.
 
 ```json
 {
@@ -99,7 +99,7 @@ For example, imagine the sample data stream needs **to be joined with reference
 }
 ```
 
-The goal here is to join our sample dataset from the top of the article to that reference data, and output one event for each sensor measure above its threshold. That means our single event above can generate multiple output events if multiple sensors are above their respective thresholds, thanks to the join. To achieve similar results without a join, see the section below.
+The goal here is to join our sample dataset from the top of the article to that reference data, and output one event for each sensor measure above its threshold. That means our single event above can generate multiple output events if multiple sensors are above their respective thresholds, thanks to the join. To achieve similar results without a join, see the following example:
 
 ```SQL
 SELECT
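The joined query is also cut off by the diff view. As a hedged sketch of the `GetRecordPropertyValue` pattern the changed sentences describe, assuming a stream input `input` carrying a `SensorReadings` record and a reference input `thresholds` with `DeviceId`, `SensorName`, and `Value` columns (all names illustrative):

```SQL
SELECT
    input.DeviceID,
    thresholds.SensorName,
    'Alert: Sensor above threshold' AS AlertMessage
FROM input
JOIN thresholds
    ON input.DeviceID = thresholds.DeviceId
WHERE
    -- Property name comes from the reference row, not a hardcoded literal
    GetRecordPropertyValue(input.SensorReadings, thresholds.SensorName) > thresholds.Value
```

Because the reference data carries one row per sensor, a single event joins against every sensor row and can emit one alert per sensor above its threshold.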
@@ -119,8 +119,8 @@ WHERE
 The result is:
 
 |DeviceID|SensorName|AlertMessage|
-|-|-|-|
-|12345|Humidity|Alert : Sensor above threshold|
+| - | - | - |
+| 12345 | Humidity | Alert: Sensor above threshold |
 
 ### Convert record fields into separate events
 
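The section renamed above, "Convert record fields into separate events", emits one event per record property in the full article. A hedged sketch of that pattern using `GetRecordProperties` with `CROSS APPLY` (input and record names are illustrative; the article's exact steps aren't visible in this diff):

```SQL
-- One output event per property of the SensorReadings record
SELECT
    i.DeviceID,
    r.PropertyName,
    r.PropertyValue
FROM input AS i
CROSS APPLY GetRecordProperties(i.SensorReadings) AS r
```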
@@ -165,7 +165,7 @@ SELECT DeviceID, PropertyValue AS Humidity INTO HumidityOutput FROM Stage0 WHERE
 ```
 
 ### Parse JSON record in SQL reference data
-When using Azure SQL Database as reference data in your job, it's possible to have a column that has data in JSON format. An example is shown below.
+When using Azure SQL Database as reference data in your job, it's possible to have a column that has data in JSON format. An example follows:
 
 |DeviceID|Data|
 |-|-|
@@ -180,7 +180,7 @@ return JSON.parse(string);
 }
 ```
 
-You can then create a step in your Stream Analytics query as shown below to access the fields of your JSON records.
+You can then create a step in your Stream Analytics query as shown here to access the fields of your JSON records.
 
 ```SQL
 WITH parseJson as
@@ -198,9 +198,9 @@ You can then create a step in your Stream Analytics query as shown below to acce
 
 ## Array data types
 
-Array data types are an ordered collection of values. Some typical operations on array values are detailed below. These examples use the functions [GetArrayElement](/stream-analytics-query/getarrayelement-azure-stream-analytics), [GetArrayElements](/stream-analytics-query/getarrayelements-azure-stream-analytics), [GetArrayLength](/stream-analytics-query/getarraylength-azure-stream-analytics), and the [APPLY](/stream-analytics-query/apply-azure-stream-analytics) operator.
+Array data types are an ordered collection of values. Some typical operations on array values are detailed here. These examples use the functions [GetArrayElement](/stream-analytics-query/getarrayelement-azure-stream-analytics), [GetArrayElements](/stream-analytics-query/getarrayelements-azure-stream-analytics), [GetArrayLength](/stream-analytics-query/getarraylength-azure-stream-analytics), and the [APPLY](/stream-analytics-query/apply-azure-stream-analytics) operator.
 
-Here's an example of a event. Both `CustomSensor03` and `SensorMetadata` are of type **array**:
+Here's an example of an event. Both `CustomSensor03` and `SensorMetadata` are of type **array**:
 
 ```json
 {
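A hedged sketch of the array functions listed in the changed paragraph, against the `CustomSensor03` array the event sample names (input alias illustrative):

```SQL
-- First element and length of an array
SELECT
    DeviceID,
    GetArrayElement(CustomSensor03, 0) AS FirstReading,
    GetArrayLength(CustomSensor03) AS ReadingCount
FROM input
```

`GetArrayElements` plus `CROSS APPLY` goes further and expands an array into one event per element, each exposing `ArrayIndex` and `ArrayValue` fields.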
@@ -294,7 +294,7 @@ The result is:
 |12345|Manufacturer|ABC|
 |12345|Version|1.2.45|
 
-If the extracted fields need to appear in columns, it is possible to pivot the dataset using the [WITH](/stream-analytics-query/with-azure-stream-analytics) syntax in addition to the [JOIN](/stream-analytics-query/join-azure-stream-analytics) operation. That join requires a [time boundary](/stream-analytics-query/join-azure-stream-analytics#BKMK_DateDiff) condition that prevents duplication:
+If the extracted fields need to appear in columns, it's possible to pivot the dataset using the [WITH](/stream-analytics-query/with-azure-stream-analytics) syntax in addition to the [JOIN](/stream-analytics-query/join-azure-stream-analytics) operation. That join requires a [time boundary](/stream-analytics-query/join-azure-stream-analytics#BKMK_DateDiff) condition that prevents duplication:
 
 ```SQL
 WITH DynamicCTE AS (
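The pivot query itself is truncated by the diff view. A hedged sketch of the WITH-plus-JOIN pattern the changed sentence describes, assuming an earlier step `Stage0` that yields `DeviceID`, `PropertyName`, `PropertyValue` rows like the result table above (step and field names illustrative):

```SQL
WITH Versions AS (
    SELECT DeviceID, PropertyValue AS Version
    FROM Stage0
    WHERE PropertyName = 'Version'
)
SELECT
    s.DeviceID,
    s.PropertyValue AS Manufacturer,
    v.Version
FROM Stage0 AS s
JOIN Versions AS v
    ON s.DeviceID = v.DeviceID
    AND DATEDIFF(millisecond, s, v) = 0  -- time boundary keeps the join from duplicating rows
WHERE s.PropertyName = 'Manufacturer'
```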
@@ -322,5 +322,5 @@ The result is:
 |-|-|-|-|-|
 |12345|47|122|1.2.45|ABC|
 
-## See Also
+## Related content
 [Data Types in Azure Stream Analytics](/stream-analytics-query/data-types-azure-stream-analytics)

0 commit comments
