
Commit 54e6257

Update stream-analytics-troubleshoot-input.md
1 parent 56d4c47 commit 54e6257

1 file changed

articles/stream-analytics/stream-analytics-troubleshoot-input.md

Lines changed: 8 additions & 21 deletions
@@ -6,7 +6,7 @@ ms.author: sidram
 ms.reviewer: mamccrea
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 12/07/2018
+ms.date: 03/27/2020
 ms.custom: seodec18
 ---

@@ -18,43 +18,31 @@ This page describes common issues with input connections and how to troubleshoot
 1. Test your connectivity. Verify connectivity to inputs and outputs by using the **Test Connection** button for each input and output.

 2. Examine your input data.
-
-1. To verify that input data is flowing into Event Hub, use [Service Bus Explorer](https://code.msdn.microsoft.com/windowsapps/Service-Bus-Explorer-f2abca5a) to connect to Azure Event Hub (if Event Hub input is used).

 1. Use the [**Sample Data**](stream-analytics-sample-data-input.md) button for each input. Download the input sample data.

 1. Inspect the sample data to understand the shape of the data--that is, the schema and [data types](https://docs.microsoft.com/stream-analytics-query/data-types-azure-stream-analytics).
+
+1. Check [Event Hub metrics](../event-hubs/event-hubs-metrics-azure-monitor.md) to ensure events are being sent. Message metrics should be greater than zero if Event Hubs is receiving messages. A sketch of checking these metrics programmatically follows these steps.

 3. Ensure that you have selected a time range in the input preview. Choose **Select time range**, and then enter a sample duration before testing your query.
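
The metrics check in step 2 can also be done programmatically. The following is an editorial sketch, not part of this commit, using the Azure.Monitor.Query library with a placeholder resource ID; `IncomingMessages` is the Event Hubs ingress metric it assumes you want.

```csharp
using System;
using Azure.Identity;
using Azure.Monitor.Query;

// Placeholder: the full ARM resource ID of your Event Hubs namespace.
string resourceId =
    "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.EventHub/namespaces/<ns>";

var client = new MetricsQueryClient(new DefaultAzureCredential());

// Query the IncomingMessages metric for the namespace; totals above zero
// mean Event Hubs is receiving events.
var result = await client.QueryResourceAsync(resourceId, new[] { "IncomingMessages" });

foreach (var metric in result.Value.Metrics)
    foreach (var series in metric.TimeSeries)
        foreach (var point in series.Values)
            Console.WriteLine($"{point.TimeStamp:u} total={point.Total}");
```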

 ## Malformed input events cause deserialization errors
+
 Deserialization issues are caused when the input stream of your Stream Analytics job contains malformed messages. For example, a malformed message could be caused by a missing parenthesis or a brace in a JSON object, or an incorrect timestamp format in the time field.
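
To make "malformed" concrete, the sketch below (editorial, not part of this commit) parses the same JSON event with and without its closing brace; System.Text.Json rejects the truncated one with a `JsonException`, which is the class of input a Stream Analytics job drops and warns about.

```csharp
using System;
using System.Text.Json;

// The same event, well formed and with its closing brace missing.
const string wellFormed = "{\"deviceId\":\"sensor-1\",\"temperature\":21.5}";
const string malformed  = "{\"deviceId\":\"sensor-1\",\"temperature\":21.5";

JsonSerializer.Deserialize<JsonElement>(wellFormed);    // parses fine

try
{
    JsonSerializer.Deserialize<JsonElement>(malformed); // throws
}
catch (JsonException ex)
{
    // A running job drops such an event and raises the warning instead.
    Console.WriteLine($"Deserialization failed: {ex.Message}");
}
```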

 When a Stream Analytics job receives a malformed message from an input, it drops the message and notifies you with a warning. A warning symbol is shown on the **Inputs** tile of your Stream Analytics job. This warning sign exists as long as the job is in running state:

 ![Azure Stream Analytics inputs tile](media/stream-analytics-malformed-events/stream-analytics-inputs-tile.png)

-Enable the diagnostics logs to view the details of the warning. For malformed input events, the execution logs contain an entry with the message that looks like:
-```
-Could not deserialize the input event(s) from resource <blob URI> as json.
-```
-
-### What caused the deserialization error
-You can take the following steps to analyze the input events in detail to get a clear understanding of what caused the deserialization error. You can then fix the event source to generate events in the right format to prevent you from hitting this issue again.
+Enable the diagnostics logs to view the details of the error and the message that caused the error (payload). There are multiple reasons why deserialization errors can occur. For more information regarding specific deserialization errors, see [Input data errors](data-errors.md#input-data-errors). If diagnostic logs are not enabled, a brief notification will be available in the Azure portal.

-1. Navigate to the input tile and click on the warning symbols to see the list of issues.
+![Input details warning notification]()

-2. The input details tile displays a list of warnings with details about each issue. The example warning message below includes the partition, offset, and sequence numbers where there is malformed JSON data.
-
-![Stream Analytics warning message with offset](media/stream-analytics-malformed-events/warning-message-with-offset.png)
-
-3. To find the JSON data with the incorrect format, run the CheckMalformedEvents.cs code available in the [GitHub samples repository](https://github.com/Azure/azure-stream-analytics/tree/master/Samples/CheckMalformedEventsEH). This code reads the partition ID, offset, and prints the data that's located in that offset.
-
-4. Once you read the data, you can analyze and correct the serialization format.
-
-5. You can also [read events from an IoT Hub with the Service Bus Explorer](https://code.msdn.microsoft.com/How-to-read-events-from-an-1641eb1b).
+In cases where the message payload is greater than 32 KB or is in binary format, run the CheckMalformedEvents.cs code available in the [GitHub samples repository](https://github.com/Azure/azure-stream-analytics/tree/master/Samples/CheckMalformedEventsEH). This code reads the partition ID, offset, and prints the data that's located in that offset.
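
For reference, here is an editorial sketch of the same idea as CheckMalformedEvents.cs: it is not the repository's code, but an equivalent using the current Azure.Messaging.EventHubs SDK, with placeholder connection values and the partition ID and offset taken from the warning details.

```csharp
using System;
using System.Text;
using Azure.Messaging.EventHubs.Consumer;

// Placeholders: connection details plus the partition ID and offset
// reported in the job's warning details.
string connectionString = "<event-hubs-connection-string>";
string eventHubName = "<event-hub-name>";
string partitionId = "0";
long offset = 123456;

await using var consumer = new EventHubConsumerClient(
    EventHubConsumerClient.DefaultConsumerGroupName, connectionString, eventHubName);

await foreach (var partitionEvent in consumer.ReadEventsFromPartitionAsync(
    partitionId, EventPosition.FromOffset(offset)))
{
    // Print the raw payload sitting at that offset, then stop.
    Console.WriteLine(Encoding.UTF8.GetString(partitionEvent.Data.EventBody.ToArray()));
    break;
}
```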

 ## Job exceeds maximum Event Hub Receivers
+
 A best practice for using Event Hubs is to use multiple consumer groups to ensure job scalability. The number of readers in the Stream Analytics job for a specific input affects the number of readers in a single consumer group. The precise number of receivers is based on internal implementation details for the scale-out topology logic and is not exposed externally. The number of readers can change when a job is started or during job upgrades.

 The error shown when the number of receivers exceeds the maximum is:
@@ -82,7 +70,6 @@ To add a new consumer group in your Event Hubs instance, follow these steps:

 7. When you created the input in the Stream Analytics job to point to the Event Hub, you specified the consumer group there. $Default is used when none is specified. Once you create a new consumer group, edit the Event Hub input in the Stream Analytics job and specify the name of the new consumer group.
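
On the client side, step 7 amounts to naming the group when the consumer is created. A minimal editorial sketch, where `asa-input` is a hypothetical consumer group created for this job:

```csharp
using Azure.Messaging.EventHubs.Consumer;

// "asa-input" is a hypothetical consumer group created for this job.
// When no group is specified, clients fall back to
// EventHubConsumerClient.DefaultConsumerGroupName, which is "$Default".
await using var consumer = new EventHubConsumerClient(
    "asa-input", "<event-hubs-connection-string>", "<event-hub-name>");
```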

-
 ## Readers per partition exceeds Event Hubs limit

 If your streaming query syntax references the same input Event Hub resource multiple times, the job engine can use multiple readers per query from that same consumer group. When there are too many references to the same consumer group, the job can exceed the limit of five and throw an error. In those circumstances, you can further divide the readers by using multiple inputs across multiple consumer groups, using the solution described in the following section.
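
The limit of five mentioned here is the Event Hubs service quota of five concurrent readers per partition per consumer group. The sketch below is editorial, not from this commit or the article; with placeholder connection values and partition `0` assumed to exist, it reproduces the underlying quota error directly with the Event Hubs SDK.

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Consumer;

// Placeholders: connection details; partition "0" is assumed to exist.
string connectionString = "<event-hubs-connection-string>";
string eventHubName = "<event-hub-name>";

// Open six concurrent readers on one partition in one consumer group.
// Event Hubs allows at most five, so one reader fails with QuotaExceeded.
var readers = new List<Task>();
for (int i = 0; i < 6; i++)
{
    var consumer = new EventHubConsumerClient(
        EventHubConsumerClient.DefaultConsumerGroupName, connectionString, eventHubName);
    readers.Add(Task.Run(async () =>
    {
        await foreach (var _ in consumer.ReadEventsFromPartitionAsync("0", EventPosition.Latest))
        {
            // Iterating keeps the receiver link open.
        }
    }));
}

// The first task to complete is the one that faulted.
try { await await Task.WhenAny(readers); }
catch (EventHubsException ex) when (ex.Reason == EventHubsException.FailureReason.QuotaExceeded)
{
    Console.WriteLine($"Reader limit hit: {ex.Message}");
}
```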
