articles/stream-analytics/stream-analytics-parsing-json.md (11 additions, 12 deletions)
@@ -2,10 +2,10 @@
 title: Parsing JSON and AVRO in Azure Stream Analytics
 description: This article describes how to operate on complex data types like arrays, JSON, CSV formatted data.
 ms.service: stream-analytics
-author: ajetasin
-ms.author: ajetasi
+author: an-emma
+ms.author: raan
 ms.topic: conceptual
-ms.date: 01/29/2020
+ms.date: 05/25/2023
 ms.custom: devx-track-js
 ---
 # Parse JSON and Avro data in Azure Stream Analytics
@@ -14,12 +14,11 @@ Azure Stream Analytics support processing events in CSV, JSON, and Avro data for
 
 >[!NOTE]
 >AVRO files created by Event Hub Capture use a specific format that requires you to use the *custom deserializer* feature. For more information, see [Read input in any format using .NET custom deserializers](./custom-deserializer-examples.md).
->
->Stream Analytics AVRO deserialization does not support Map type. Stream Analytics can't read EventHub capture blobs because EventHub capture uses map.
+
 
 
 ## Record data types
-Record data types are used to represent JSON and Avro arrays when corresponding formats are used in the input data streams. These examples demonstrate a sample sensor, which is reading input events in JSON format. Here is example of a single event:
+Record data types are used to represent JSON and Avro arrays when corresponding formats are used in the input data streams. These examples demonstrate a sample sensor that reads input events in JSON format. Here's an example of a single event:
 
 ```json
 {
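For context (this sketch is not part of the diff): nested record fields in such an event are typically read either with dot notation or with [GetRecordPropertyValue](/stream-analytics-query/getrecordpropertyvalue-azure-stream-analytics). The sample event is truncated by the hunk above, so `SensorReadings`, `Temperature`, `Humidity`, and the `input` source name are assumed for illustration:

```SQL
-- Sketch only: 'SensorReadings', its properties, and 'input' are assumed names.
SELECT
    DeviceID,
    SensorReadings.Temperature AS Temperature,                       -- dot notation for a known property
    GetRecordPropertyValue(SensorReadings, 'Humidity') AS Humidity   -- lookup by property name
FROM input
```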
@@ -106,7 +105,7 @@ The goal here is to join our sample dataset from the top of the article to that
 SELECT
     input.DeviceID,
     thresholds.SensorName,
-    "Alert: Sensor above threshold"AS AlertMessage
+    "Alert: Sensor above threshold" AS AlertMessage
 FROM input -- stream input
 JOIN thresholds -- reference data input
 ON
@@ -170,8 +169,8 @@ When using Azure SQL Database as reference data in your job, it's possible to ha
 
 |DeviceID|Data|
 |-|-|
-|12345|{"key": "value1"}|
-|54321|{"key": "value2"}|
+|12345|{"key": "value1"}|
+|54321|{"key": "value2"}|
 
 You can parse the JSON record in the *Data* column by writing a simple JavaScript user-defined function.
 
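For context (not part of the diff): the UDF body and the query step that follows it fall outside the hunks shown here. As a rough sketch, a JavaScript UDF registered under the alias `parseJson` (an assumed name) would be invoked with the `UDF.` prefix, and the record it returns can then be navigated with dot notation:

```SQL
-- Sketch only: 'parseJson' is an assumed UDF alias and 'input' an assumed
-- source name. JavaScript UDFs are always called with the UDF. prefix.
WITH ParsedData AS (
    SELECT DeviceID, UDF.parseJson(Data) AS ParsedJson
    FROM input
)
SELECT DeviceID, ParsedJson.key AS KeyValue   -- e.g. "value1" for DeviceID 12345
FROM ParsedData
```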
@@ -201,7 +200,7 @@ You can then create a step in your Stream Analytics query as shown below to acce
 
 Array data types are an ordered collection of values. Some typical operations on array values are detailed below. These examples use the functions [GetArrayElement](/stream-analytics-query/getarrayelement-azure-stream-analytics), [GetArrayElements](/stream-analytics-query/getarrayelements-azure-stream-analytics), [GetArrayLength](/stream-analytics-query/getarraylength-azure-stream-analytics), and the [APPLY](/stream-analytics-query/apply-azure-stream-analytics) operator.
 
-Here is an example of a single event. Both `CustomSensor03` and `SensorMetadata` are of type **array**:
+Here's an example of a single event. Both `CustomSensor03` and `SensorMetadata` are of type **array**:
 
 ```json
 {
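For context (not part of the diff), a minimal sketch of two of the array functions named above, applied to the truncated event; `input` is an assumed source name:

```SQL
-- Sketch only: GetArrayElement uses a zero-based index;
-- GetArrayLength returns the number of elements.
SELECT
    DeviceID,
    GetArrayLength(CustomSensor03) AS SensorCount,
    GetArrayElement(CustomSensor03, 0) AS FirstReading
FROM input
```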
@@ -295,7 +294,7 @@ The result is:
 |12345|Manufacturer|ABC|
 |12345|Version|1.2.45|
 
-If the extracted fields need to appear in columns, it is possible to pivot the dataset using the [WITH](/stream-analytics-query/with-azure-stream-analytics) syntax in addition to the [JOIN](/stream-analytics-query/join-azure-stream-analytics) operation. That join will require a [time boundary](/stream-analytics-query/join-azure-stream-analytics#BKMK_DateDiff) condition that prevents duplication:
+If the extracted fields need to appear in columns, it is possible to pivot the dataset using the [WITH](/stream-analytics-query/with-azure-stream-analytics) syntax in addition to the [JOIN](/stream-analytics-query/join-azure-stream-analytics) operation. That join requires a [time boundary](/stream-analytics-query/join-azure-stream-analytics#BKMK_DateDiff) condition that prevents duplication:
 
 ```SQL
 WITH DynamicCTE AS (
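For context (not part of the diff): the pivot query is truncated at its first line by the hunk above, so as a minimal sketch of the time-boundary condition such a self-join needs; `PropertyName` and `PropertyValue` are assumed column names for the flattened key/value rows produced by `DynamicCTE`:

```SQL
-- Sketch only: joins between streaming steps require a DATEDIFF time boundary
-- in the ON clause; BETWEEN 0 AND 0 keeps only same-timestamp matches.
SELECT
    m.DeviceID,
    m.PropertyValue AS Manufacturer,
    v.PropertyValue AS Version
FROM DynamicCTE m
JOIN DynamicCTE v
    ON m.DeviceID = v.DeviceID
    AND DATEDIFF(minute, m, v) BETWEEN 0 AND 0
WHERE m.PropertyName = 'Manufacturer'
  AND v.PropertyName = 'Version'
```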
@@ -324,4 +323,4 @@ The result is:
 |12345|47|122|1.2.45|ABC|
 
 ## See Also
-[Data Types in Azure Stream Analytics](/stream-analytics-query/data-types-azure-stream-analytics)
+[Data Types in Azure Stream Analytics](/stream-analytics-query/data-types-azure-stream-analytics)