articles/stream-analytics/stream-analytics-troubleshoot-input.md
Lines changed: 9 additions & 3 deletions
@@ -1,11 +1,11 @@
 ---
 title: Troubleshooting Inputs for Azure Stream Analytics
 description: This article describes techniques to troubleshoot your input connections in Azure Stream Analytics jobs.
-author: ajetasin
-ms.author: ajetasi
+author: an-emma
+ms.author: raan
 ms.service: stream-analytics
 ms.topic: troubleshooting
-ms.date: 04/08/2022
+ms.date: 01/17/2023
 ms.custom: seodec18
 ---
@@ -47,6 +47,12 @@ Other common reasons that result in input deserialization errors are:
 3. Using Event Hub capture blob in Avro format as input in your job.
 4. Having two columns in a single input event that differ only in case. Example: *column1* and *COLUMN1*.
 
+## Partition count changes
+
+The partition count of an event hub can be changed. If the partition count is changed, the Stream Analytics job must be stopped and started again so that it picks up the new partitions.
+
+The following errors are shown when the partition count of the event hub is changed while the job is running.
 
 A best practice for using Event Hubs is to use multiple consumer groups for job scalability. The number of readers in the Stream Analytics job for a specific input affects the number of readers in a single consumer group. The precise number of receivers is based on internal implementation details for the scale-out topology logic and is not exposed externally. The number of readers can change when a job is started or during job upgrades.