Commit 9697cb6

Merge pull request #239315 from an-emma/patch-15
removed one limitation
2 parents e8345e5 + 6630c29 commit 9697cb6

File tree

1 file changed: +11 −12 lines changed

articles/stream-analytics/stream-analytics-parsing-json.md

Lines changed: 11 additions & 12 deletions
````diff
@@ -2,10 +2,10 @@
 title: Parsing JSON and AVRO in Azure Stream Analytics
 description: This article describes how to operate on complex data types like arrays, JSON, CSV formatted data.
 ms.service: stream-analytics
-author: ajetasin
-ms.author: ajetasi
+author: an-emma
+ms.author: raan
 ms.topic: conceptual
-ms.date: 01/29/2020
+ms.date: 05/25/2023
 ms.custom: devx-track-js
 ---
 # Parse JSON and Avro data in Azure Stream Analytics
````
````diff
@@ -14,12 +14,11 @@ Azure Stream Analytics support processing events in CSV, JSON, and Avro data for
 
 >[!NOTE]
 >AVRO files created by Event Hub Capture use a specific format that requires you to use the *custom deserializer* feature. For more information, see [Read input in any format using .NET custom deserializers](./custom-deserializer-examples.md).
->
->Stream Analytics AVRO deserialization does not support Map type. Stream Analytics can't read EventHub capture blobs because EventHub capture uses map.
+
 
 
 ## Record data types
-Record data types are used to represent JSON and Avro arrays when corresponding formats are used in the input data streams. These examples demonstrate a sample sensor, which is reading input events in JSON format. Here is example of a single event:
+Record data types are used to represent JSON and Avro arrays when corresponding formats are used in the input data streams. These examples demonstrate a sample sensor, which is reading input events in JSON format. Here's example of a single event:
 
 ```json
 {
````
````diff
@@ -106,7 +105,7 @@ The goal here is to join our sample dataset from the top of the article to that
 SELECT
     input.DeviceID,
     thresholds.SensorName,
-    "Alert : Sensor above threshold" AS AlertMessage
+    "Alert: Sensor above threshold" AS AlertMessage
 FROM input -- stream input
 JOIN thresholds -- reference data input
 ON
````
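The hunk above tidies the alert string emitted by a stream-to-reference-data join. Purely as an illustration of the per-event logic that SQL expresses (the field names `Reading` and `Value` are assumptions, not part of the article; Stream Analytics evaluates this in SQL, not JavaScript):

```javascript
// Sketch of the join-and-alert logic from the SQL in the diff above.
// Illustrative only: real Stream Analytics jobs do this with JOIN in SQL.
function buildAlerts(readings, thresholds) {
    // Index reference data by sensor name, like the JOIN ... ON condition.
    const limits = new Map(thresholds.map(t => [t.SensorName, t.Value]));
    return readings
        .filter(r => limits.has(r.SensorName) && r.Reading > limits.get(r.SensorName))
        .map(r => ({
            DeviceID: r.DeviceID,
            SensorName: r.SensorName,
            AlertMessage: "Alert: Sensor above threshold"
        }));
}
```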
````diff
@@ -170,8 +169,8 @@ When using Azure SQL Database as reference data in your job, it's possible to ha
 
 |DeviceID|Data|
 |-|-|
-|12345|{"key" : "value1"}|
-|54321|{"key" : "value2"}|
+|12345|{"key": "value1"}|
+|54321|{"key": "value2"}|
 
 You can parse the JSON record in the *Data* column by writing a simple JavaScript user-defined function.
 
````
````diff
@@ -201,7 +200,7 @@ You can then create a step in your Stream Analytics query as shown below to acce
 
 Array data types are an ordered collection of values. Some typical operations on array values are detailed below. These examples use the functions [GetArrayElement](/stream-analytics-query/getarrayelement-azure-stream-analytics), [GetArrayElements](/stream-analytics-query/getarrayelements-azure-stream-analytics), [GetArrayLength](/stream-analytics-query/getarraylength-azure-stream-analytics), and the [APPLY](/stream-analytics-query/apply-azure-stream-analytics) operator.
 
-Here is an example of a single event. Both `CustomSensor03` and `SensorMetadata` are of type **array**:
+Here's an example of a event. Both `CustomSensor03` and `SensorMetadata` are of type **array**:
 
 ```json
 {
````
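The array functions named in that context line have straightforward JavaScript analogues. As a sketch only (the event shape below is a hypothetical stand-in, not the article's exact sample):

```javascript
// Hypothetical event with two array-typed fields, mirroring the article's shape.
const event = {
    DeviceId: "12345",
    CustomSensor03: [12, 15, 16],
    SensorMetadata: [
        { SensorName: "Manufacturer", Value: "ABC" },
        { SensorName: "Version", Value: "1.2.45" }
    ]
};

const first = event.CustomSensor03[0];     // ~ GetArrayElement(CustomSensor03, 0)
const count = event.CustomSensor03.length; // ~ GetArrayLength(CustomSensor03)
// ~ CROSS APPLY GetArrayElements(SensorMetadata): one row per array element.
const rows = event.SensorMetadata.map(m => ({ DeviceId: event.DeviceId, ...m }));
```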
````diff
@@ -295,7 +294,7 @@ The result is:
 |12345|Manufacturer|ABC|
 |12345|Version|1.2.45|
 
-If the extracted fields need to appear in columns, it is possible to pivot the dataset using the [WITH](/stream-analytics-query/with-azure-stream-analytics) syntax in addition to the [JOIN](/stream-analytics-query/join-azure-stream-analytics) operation. That join will require a [time boundary](/stream-analytics-query/join-azure-stream-analytics#BKMK_DateDiff) condition that prevents duplication:
+If the extracted fields need to appear in columns, it is possible to pivot the dataset using the [WITH](/stream-analytics-query/with-azure-stream-analytics) syntax in addition to the [JOIN](/stream-analytics-query/join-azure-stream-analytics) operation. That join requires a [time boundary](/stream-analytics-query/join-azure-stream-analytics#BKMK_DateDiff) condition that prevents duplication:
 
 ```SQL
 WITH DynamicCTE AS (
````
````diff
@@ -324,4 +323,4 @@ The result is:
 |12345|47|122|1.2.45|ABC|
 
 ## See Also
-[Data Types in Azure Stream Analytics](/stream-analytics-query/data-types-azure-stream-analytics)
+[Data Types in Azure Stream Analytics](/stream-analytics-query/data-types-azure-stream-analytics)
````
