
Commit 8a7999c

Merge pull request #224151 from an-emma/patch-11
Update
2 parents a78a292 + ea17888

File tree

1 file changed (+9 −3 lines)


articles/stream-analytics/stream-analytics-troubleshoot-input.md

Lines changed: 9 additions & 3 deletions
@@ -1,11 +1,11 @@
 ---
 title: Troubleshooting Inputs for Azure Stream Analytics
 description: This article describes techniques to troubleshoot your input connections in Azure Stream Analytics jobs.
-author: ajetasin
-ms.author: ajetasi
+author: an-emma
+ms.author: raan
 ms.service: stream-analytics
 ms.topic: troubleshooting
-ms.date: 04/08/2022
+ms.date: 01/17/2023
 ms.custom: seodec18
 ---
@@ -47,6 +47,12 @@ Other common reasons that result in input deserialization errors are:
 3. Using Event Hub capture blob in Avro format as input in your job.
 4. Having two columns in a single input event that differ only in case. Example: *column1* and *COLUMN1*.
 
+## Partition count changes
+The partition count of an event hub can be changed. If the partition count is changed, the Stream Analytics job must be stopped and restarted.
+
+The following error is shown when the partition count of the event hub is changed while the job is running:
+`Microsoft.Streaming.Diagnostics.Exceptions.InputPartitioningChangedException`
+
 ## Job exceeds maximum Event Hub receivers
 
 A best practice for using Event Hubs is to use multiple consumer groups for job scalability. The number of readers in the Stream Analytics job for a specific input affects the number of readers in a single consumer group. The precise number of receivers is based on internal implementation details for the scale-out topology logic and is not exposed externally. The number of readers can change when a job is started or during job upgrades.
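The stop-and-restart remedy described for partition count changes can be scripted. A minimal sketch using the Azure CLI `stream-analytics` extension, assuming you are already logged in; the resource group and job names below are placeholders, not values from this commit:

```shell
# Assumption: Azure CLI is installed and authenticated; the
# stream-analytics extension provides the "az stream-analytics job" group.
az extension add --name stream-analytics --upgrade

# Stop the job so it can pick up the event hub's new partition count.
az stream-analytics job stop \
  --resource-group my-resource-group \
  --job-name my-asa-job

# Restart from the last output event time so no events are skipped
# (other start modes: JobStartTime, CustomTime).
az stream-analytics job start \
  --resource-group my-resource-group \
  --job-name my-asa-job \
  --output-start-mode LastOutputEventTime
```

Starting with `LastOutputEventTime` resumes processing where the job left off, which is usually what you want after an infrastructure change rather than reprocessing the stream from the job start time.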
