Other common reasons that result in input deserialization errors are:
1. Integer column having a value greater than 9223372036854775807.
2. Strings instead of an array of objects or line-separated objects (see the sample payloads after this list). Valid example: *[{'a':1}]*. Invalid example: *"'a' :1"*.
3. Using Event Hubs capture blob in Avro format as input in your job.
4. Having two columns in a single input event that differ only in case. Example: *column1* and *COLUMN1*.
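For illustration, here are hypothetical payloads for the second reason. Only the array-of-objects and line-separated shapes deserialize as JSON input; a quoted string does not:

```
Valid (array of objects):
[{"a":1},{"b":2}]

Valid (line-separated objects):
{"a":1}
{"b":2}

Invalid (a quoted string rather than objects):
"'a' :1"
```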
## Partition count changes
The partition count of an event hub can be changed. If the partition count of the event hub is changed, the Stream Analytics job needs to be stopped and started again.
The following errors are shown when the partition count of the event hub is changed while the job is running.
A best practice for using Event Hubs is to use multiple consumer groups for job scalability. The number of readers in the Stream Analytics job for a specific input affects the number of readers in a single consumer group. The precise number of receivers is based on internal implementation details for the scale-out topology logic and is not exposed externally. The number of readers can change when a job is started or during job upgrades.
The following error messages are shown when the number of receivers exceeds the maximum. The error message includes a list of existing connections made to Event Hubs under a consumer group. The tag `AzureStreamAnalytics` indicates that the connections are from Azure Streaming Service.
```
The streaming job failed: Stream Analytics job has validation errors: Job will exceed the maximum amount of Event Hubs Receivers.
The following information may be helpful in identifying the connected receivers: Exceeded the maximum number of allowed receivers per partition in a consumer group which is 5. List of connected receivers –
```

To add a new consumer group in your Event Hubs instance, follow these steps:
3. Select **Event Hubs** under the **Entities** heading.
4. Select the event hub by name.
5. On the **Event Hubs Instance** page, under the **Entities** heading, select **Consumer groups**. A consumer group with name **$Default** is listed.
6. Select **+ Consumer Group** to add a new consumer group.
7. When you created the input in the Stream Analytics job to point to the event hub, you specified the consumer group there. **$Default** is used when none is specified. Once you create a new consumer group, edit the event hub input in the Stream Analytics job and specify the name of the new consumer group.
## Readers per partition exceed Event Hubs limit
If your streaming query syntax references the same input event hub resource multiple times, the job engine can use multiple readers per query from that same consumer group. When there are too many references to the same consumer group, the job can exceed the limit of five readers per partition and throw an error. In those circumstances, you can divide the readers across multiple consumer groups by using multiple inputs, as described in the following section.
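As a hedged illustration (the input and output names here are hypothetical), a job like the following references the same input in six statements, so it can require six readers per partition from one consumer group and exceed the limit of five:

```sql
-- Hypothetical job: every statement reads from the same input, SameInput,
-- and therefore from the same consumer group. Each reference can add a
-- reader per partition, so six references can exceed the limit of five.
SELECT * INTO OutputOne FROM SameInput
SELECT * INTO OutputTwo FROM SameInput
SELECT * INTO OutputThree FROM SameInput
SELECT * INTO OutputFour FROM SameInput
SELECT * INTO OutputFive FROM SameInput
SELECT * INTO OutputSix FROM SameInput
```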
Scenarios in which the number of readers per partition exceeds the Event Hubs limit of five include the following:
### Create separate inputs with different consumer groups
You can create separate inputs with different consumer groups for the same event hub. The following UNION query is an example where *InputOne* and *InputTwo* refer to the same Event Hubs source. Any query can have separate inputs with different consumer groups; the UNION query is only one example.
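A minimal sketch of such a query follows. *InputOne* and *InputTwo* are assumed to be two job inputs that point to the same event hub but specify different consumer groups, and *Output* is a placeholder output name:

```sql
-- Because InputOne and InputTwo use different consumer groups, each
-- consumer group stays within the five-readers-per-partition limit.
WITH CombinedInput AS (
    SELECT * FROM InputOne
    UNION
    SELECT * FROM InputTwo
)
SELECT * INTO Output FROM CombinedInput
```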