Commit a38b0c9: Update stream-analytics-troubleshoot-input.md (1 parent 54e6257)

1 file changed: articles/stream-analytics/stream-analytics-troubleshoot-input.md (+19 −14)

ms.author: sidram
ms.reviewer: mamccrea
ms.service: stream-analytics
ms.topic: conceptual
ms.date: 03/31/2020
ms.custom: seodec18
---

# Troubleshoot input connections

This article describes common issues with Azure Stream Analytics input connections, how to troubleshoot input issues, and how to correct the issues. Many troubleshooting steps require diagnostic logs to be enabled for your Stream Analytics job. If you do not have diagnostic logs enabled, see [Troubleshoot Azure Stream Analytics by using diagnostics logs](stream-analytics-job-diagnostic-logs.md).

## Input events not received by job

1. Test your input and output connectivity by using the **Test Connection** button for each input and output.

2. Examine your input data.

   1. Use the [**Sample Data**](stream-analytics-sample-data-input.md) button for each input. Download the input sample data.

   1. Inspect the sample data to understand the schema and [data types](https://docs.microsoft.com/stream-analytics-query/data-types-azure-stream-analytics).

   1. Check [Event Hub metrics](../event-hubs/event-hubs-metrics-azure-monitor.md) to ensure events are being sent. Message metrics should be greater than zero if Event Hubs is receiving messages.

3. Ensure that you have selected a time range in the input preview. Choose **Select time range**, and then enter a sample duration before testing your query.
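
The schema inspection in step 2 can be sketched locally. This is an illustrative helper only, assuming the downloaded sample is newline-delimited JSON; the field names are hypothetical:

```python
import json

def summarize_schema(ndjson_sample: str) -> dict:
    """Map each field name to the set of JSON value types seen in the sample."""
    schema: dict = {}
    for line in ndjson_sample.splitlines():
        if not line.strip():
            continue
        for key, value in json.loads(line).items():
            schema.setdefault(key, set()).add(type(value).__name__)
    return schema

sample = '{"deviceId": "d1", "temperature": 21.5}\n{"deviceId": "d2", "temperature": 22}'
print(summarize_schema(sample))
# e.g. {'deviceId': {'str'}, 'temperature': {'float', 'int'}}
```

Fields whose values mix types across events (here, `temperature` arriving as both float and int) are a common source of downstream type-conversion surprises.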

## Malformed input events cause deserialization errors

Deserialization issues occur when the input stream of your Stream Analytics job contains malformed messages. For example, a malformed message could be caused by a missing parenthesis or brace in a JSON object, or an incorrect timestamp format in the time field.
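
As a quick local illustration (not part of Stream Analytics itself), both kinds of malformed message can be reproduced with a few lines of code; the `time` field name and the ISO 8601 check are assumptions for this sketch:

```python
import json
from datetime import datetime

def check_event(payload: str, time_field: str = "time") -> str:
    """Classify a raw event payload the way a JSON deserializer would see it."""
    try:
        event = json.loads(payload)
    except json.JSONDecodeError:
        return "malformed JSON"  # e.g. a missing brace or parenthesis
    try:
        datetime.fromisoformat(event[time_field])
    except (KeyError, TypeError, ValueError):
        return "bad or missing timestamp"
    return "ok"

print(check_event('{"time": "2020-03-31T12:00:00", "value": 1}'))  # ok
print(check_event('{"time": "2020-03-31T12:00:00", "value": 1'))   # malformed JSON
print(check_event('{"time": "03/31/2020", "value": 1}'))           # bad or missing timestamp
```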

When a Stream Analytics job receives a malformed message from an input, it drops the message and notifies you with a warning. A warning symbol is shown on the **Inputs** tile of your Stream Analytics job and remains as long as the job is in a running state:
![Azure Stream Analytics inputs tile](media/stream-analytics-malformed-events/stream-analytics-inputs-tile.png)

Enable diagnostics logs to view the details of the error and the message (payload) that caused the error. There are multiple reasons why deserialization errors can occur. For more information about specific deserialization errors, see [Input data errors](data-errors.md#input-data-errors). If diagnostic logs are not enabled, a brief notification will be available in the Azure portal.

![Input details warning notification](media/stream-analytics-malformed-events/warning-message-with-offset.png)

In cases where the message payload is greater than 32 KB or is in binary format, run the CheckMalformedEvents.cs code available in the [GitHub samples repository](https://github.com/Azure/azure-stream-analytics/tree/master/Samples/CheckMalformedEventsEH). This code reads the partition ID and offset, and prints the data located at that offset.
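
The pre-check idea can be sketched locally as follows. The helper below is illustrative only, not code from the sample repository, and it assumes a raw payload in bytes:

```python
MAX_PAYLOAD_BYTES = 32 * 1024  # the 32 KB threshold mentioned above

def classify_payload(payload: bytes) -> str:
    """Illustrative pre-check: can this payload be inspected directly?"""
    if len(payload) > MAX_PAYLOAD_BYTES:
        return "too large: inspect via partition ID and offset"
    try:
        payload.decode("utf-8")
    except UnicodeDecodeError:
        return "binary: inspect via partition ID and offset"
    return "text: inspectable directly"

print(classify_payload(b'{"value": 1}'))     # text: inspectable directly
print(classify_payload(b"\xff\xfe\x00"))     # binary: inspect via partition ID and offset
print(classify_payload(b"x" * (33 * 1024)))  # too large: inspect via partition ID and offset
```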

## Job exceeds maximum Event Hub receivers

A best practice for using Event Hubs is to use multiple consumer groups for job scalability. The number of readers in the Stream Analytics job for a specific input affects the number of readers in a single consumer group. The precise number of receivers is based on internal implementation details for the scale-out topology logic and is not exposed externally. The number of readers can change when a job is started or during job upgrades.

The error shown when the number of receivers exceeds the maximum is:

`The streaming job failed: Stream Analytics job has validation errors: Job will exceed the maximum amount of Event Hub Receivers.`

> [!NOTE]
> When the number of readers changes during a job upgrade, transient warnings are written to audit logs. Stream Analytics jobs automatically recover from these transient issues.

### Add a consumer group in Event Hubs

To add a new consumer group in your Event Hubs instance, follow these steps:

1. Sign in to the Azure portal.

2. Locate your Event Hub.

3. Select **Event Hubs** under the **Entities** heading.

…

![Add a consumer group in Event Hubs](media/stream-analytics-event-hub-consumer-groups/new-eh-consumer-group.png)

7. When you created the input in the Stream Analytics job to point to the Event Hub, you specified the consumer group there. **$Default** is used when none is specified. Once you create a new consumer group, edit the Event Hub input in the Stream Analytics job and specify the name of the new consumer group.

## Readers per partition exceeds Event Hubs limit

If your streaming query syntax references the same input Event Hub resource mult…

Scenarios in which the number of readers per partition exceeds the Event Hubs limit of five include the following:

* Multiple SELECT statements: If you use multiple SELECT statements that refer to the **same** event hub input, each SELECT statement causes a new receiver to be created.
* UNION: When you use a UNION, it's possible to have multiple inputs that refer to the **same** event hub and consumer group.
* SELF JOIN: When you use a SELF JOIN operation, it's possible to refer to the **same** event hub multiple times.
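
As a rough sanity check, you can tally the query's references against the limit of five. The helper below is purely illustrative; the real receiver count depends on the internal scale-out details noted earlier:

```python
EVENT_HUBS_READER_LIMIT = 5  # maximum readers per partition per consumer group

def receivers_per_partition(references_to_same_input: int) -> int:
    """Each SELECT, UNION branch, or SELF JOIN side that reads the same
    event hub input through the same consumer group adds one receiver."""
    return references_to_same_input

def exceeds_limit(references_to_same_input: int) -> bool:
    return receivers_per_partition(references_to_same_input) > EVENT_HUBS_READER_LIMIT

# Five references to the same input and consumer group: at the limit, still valid.
print(exceeds_limit(5))  # False
# A sixth reference pushes the job over the limit and triggers the validation error.
print(exceeds_limit(6))  # True
```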

The following best practices can help mitigate scenarios in which the number of readers per partition exceeds the Event Hubs limit of five.
